Google Cloud SQL import - HTTPError 403: Insufficient Permission - google-cloud-sql

I would like to import a MySQL dump file into a MySQL database instance with the gcloud import tool, but I am getting an error:
ubuntu@machine:~/sql$ gcloud sql instances import sql-56-test-8ef0cb104575 gs://dbf/bt_ca_dev_tmp-2017-01-19.sql.gz
ERROR: (gcloud.sql.instances.import) HTTPError 403: Insufficient Permission
What exact permissions am I missing? I can create a SQL instance with the registered service account, but I am not able to import data.

You have an issue with permissions.
Create a bucket if you don't have one:
`gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]`
Describe the SQL instance you are importing into and copy the service account:
`gcloud sql instances describe [INSTANCE_NAME]`
Add the service account to the bucket ACL as a writer
`gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:W gs://[BUCKET_NAME]`
Add the service account to the import file as a reader
`gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:R gs://[BUCKET_NAME]/[IMPORT_FILE_NAME]`
Import the file:
gcloud sql import csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME] \
--database=[DATABASE_NAME] --table=[TABLE_NAME]
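Since the question imports a gzipped SQL dump rather than a CSV, the SQL variant of the import command would apply after the ACL steps above. A sketch using the bucket, file and instance names from the question; the service account address and database name are placeholders:
# grant the instance's service account access to the bucket and the dump file
gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:W gs://dbf
gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:R gs://dbf/bt_ca_dev_tmp-2017-01-19.sql.gz
# gzip-compressed dumps are accepted directly
gcloud sql import sql sql-56-test-8ef0cb104575 gs://dbf/bt_ca_dev_tmp-2017-01-19.sql.gz \
    --database=[DATABASE_NAME]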

According to the documentation, this should do the trick. The weird thing is that it needs write permissions on the bucket:
gsutil iam ch serviceAccount:"${SERVICE_ACCOUNT}":roles/storage.legacyBucketWriter gs://${BUCKET_NAME}
gsutil iam ch serviceAccount:"${SERVICE_ACCOUNT}":roles/storage.objectViewer gs://${BUCKET_NAME}
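A small sketch of how the two variables above might be filled in beforehand; the instance and bucket names are placeholders:
# look up the instance's service account and set the variables used above
SERVICE_ACCOUNT=$(gcloud sql instances describe [INSTANCE_NAME] \
    --format='value(serviceAccountEmailAddress)')
BUCKET_NAME=[BUCKET_NAME]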

It's a permissions problem with the Cloud SQL service account on the Cloud Storage bucket you're trying to use. To solve it, grant the Storage Legacy Bucket Reader, Storage Legacy Object Owner, and Storage Object Viewer roles to the service account email that you get from
gcloud sql instances describe <YOUR_DB_NAME> | grep serviceAccountEmailAddress
To do this, go to Cloud Storage / your bucket in the Google Cloud Console, open the Permissions tab, click ADD, enter the serviceAccountEmailAddress, and assign the roles listed above.
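If you prefer the command line over the Console, the same three roles can presumably be granted with gsutil; a sketch with the service account and bucket as placeholders:
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:roles/storage.legacyBucketReader gs://[BUCKET_NAME]
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:roles/storage.legacyObjectOwner gs://[BUCKET_NAME]
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:roles/storage.objectViewer gs://[BUCKET_NAME]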

Related

How to import an SQL file into Google Cloud SQL with binary mode enabled?

I have a database that is giving this error:
ASCII '\0' appeared in the statement, but this is not allowed unless option --binary-mode is enabled and mysql is run in non-interactive mode. Set --binary-mode to 1 if ASCII '\0' is expected.
I'm importing the database through the console with gcloud sql import sql mydb gs://my-path/mydb.sql --database=mydb, but I don't see any flags for binary mode in the documentation. Is it possible at all?
Optional: is there a way to set this flag when importing through MySQL Workbench? I haven't seen anything about it there either, but maybe I'm missing some setting. If there is a way to set that flag, then I can import my database through MySQL Workbench.
Thank you.
Depending on where the source database is hosted, on Cloud SQL or in an on-premises environment, the proper flags are set during the export so that the dump file is compatible with the target instance.
Since you would like to import a file that has been exported from an on-premises environment, mysqldump is the suggested way to perform the export.
First, create a dump file as suggested in the documentation. Make sure to pay attention to the following 2 points:
Do not export customer-created MySQL users. This will cause the import to the new instance to fail. Instead, manually create the MySQL users you need on the new instance.
Make sure that you have configured the appropriate flags so that the dump file contains all the details you need, e.g. triggers, stored procedures, etc.; a sketch of such a mysqldump call follows this list.
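A sketch of a mysqldump invocation with commonly used flags; host, user and database name are placeholders, and --hex-blob is worth adding when tables contain binary data (binary data in a plain dump is also a frequent cause of the ASCII '\0' error above):
mysqldump --host=[HOST] --user=[USER] -p \
    --databases [DATABASE_NAME] \
    --hex-blob --single-transaction \
    --routines --triggers --events \
    --set-gtid-purged=OFF > [DATABASE_NAME].sql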
Then, create a Cloud Storage Bucket and upload the dump file to the bucket.
Before proceeding with the import, grant the Storage Object Admin role to the service account of the target Cloud SQL instance. You may do that with the following command:
gsutil iam ch serviceAccount:[SERVICE-ACCOUNT]:objectAdmin gs://[BUCKET-NAME]
You may locate the aforementioned Service Account in the Cloud SQL instance Overview, or by running the following command:
gcloud sql instances describe [INSTANCE_NAME]
The service account will be mentioned in the serviceAccountEmailAddress field.
Now you are able to do the import either from the Console, with the gcloud command, or via the REST API.
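For the gcloud route, the import command would look something like this; all names are placeholders:
gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET_NAME]/[DUMP_FILE_NAME] \
    --database=[DATABASE_NAME]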
More details in Google documentation
Best Practices for importing/exporting data

Google Cloud SQL Restore BAK file

I am new to Google Cloud. I created a Cloud SQL instance and I need to restore the data from a .bak file. I have the .bak file in a GCS bucket, and I am trying to restore it using Microsoft Management Studio -> Task -> Restore, but I'm not able to access the file.
Can anyone help me with the procedure on how to restore from a .bak file?
You need to give the Cloud SQL service account access to the bucket where the file is saved.
In Cloud Shell, run the following:
gcloud sql instances describe [INSTANCE_NAME]
In the output, search for the "serviceAccountEmailAddress" field and copy the service account email.
Then, again in Cloud Shell, run the following:
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:legacyBucketWriter gs://[BUCKET_NAME]
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:objectViewer gs://[BUCKET_NAME]
That should give the service account permission to access the bucket and retrieve the file. Here is the guide on doing the import; keep in mind that doing the import will overwrite all the data on the DB.
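For reference, the import step from that guide boils down to a single command along these lines; instance, bucket, file and database names are placeholders:
gcloud sql import bak [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME].bak \
    --database=[DATABASE_NAME]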
Also remember that:
You cannot import a database that was exported from a higher version of SQL Server. For example, if you exported a SQL Server 2017 Enterprise version, you cannot import it into a SQL Server 2017 Standard version.

What is the best way to back up an individual PostgreSQL database (not the instance) in Google Cloud SQL?

I have a PostgreSQL instance in Google Cloud SQL with multiple databases and want to back up an individual database to Google Cloud Storage. What would be the best way to achieve this?
You can export a dump file of a single database from your instance to a Cloud Storage bucket:
Exporting data using Cloud SQL
Create a bucket:
gsutil mb -p [PROJECT_NAME] -l [LOCATION_NAME] gs://[BUCKET_NAME]
Describe the instance you are exporting from:
gcloud sql instances describe [INSTANCE_NAME]
Copy the serviceAccountEmailAddress field
Add the service account to the bucket ACL as a writer:
gsutil acl ch -u [SERVICE_ACCOUNT_ADDRESS]:W gs://[BUCKET_NAME]
Export the database:
gcloud sql export sql [INSTANCE_NAME] gs://[BUCKET_NAME]/sqldumpfile.gz \
--database=[DATABASE_NAME]
If you do not need to retain the permissions provided by the ACL you set previously, remove the ACL:
gsutil acl ch -d [SERVICE_ACCOUNT_ADDRESS] gs://[BUCKET_NAME]
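Since the instance holds multiple databases, one way to back up each database into its own dump file is a small loop over the database list; a sketch with placeholder instance and bucket names:
# export every database on the instance to its own dated dump file
for db in $(gcloud sql databases list --instance=[INSTANCE_NAME] --format='value(name)'); do
  # you may want to skip system/template databases here
  gcloud sql export sql [INSTANCE_NAME] "gs://[BUCKET_NAME]/${db}-$(date +%F).gz" --database="${db}"
done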
I ended up creating a Cloud Function that exports the SQL data to Cloud Storage. This function can be triggered by Cloud Pub/Sub using Cloud Scheduler.

Reading a bucket from another project in Cloud Shell

Because Firestore does not have a way to clone projects, I am attempting to achieve the equivalent by copying data from one project into a GCS bucket and reading it into another project.
Specifically, using Cloud Shell I populate the bucket with data exported from Firestore project A and attempt to import it into Firestore project B. The bucket belongs to Firestore project A.
I am able to export the data from Firestore project A without any issue. When I attempt to import into Firestore project B with the Cloud Shell command
gcloud beta firestore import gs://bucketname
I get the error message:
project-b@appspot.gserviceaccount.com does not have storage.buckets.get access to bucketname
I have searched high and low for a way to grant the storage.buckets.get permission to project B, but am not finding anything that works.
Can anyone point me to how this is done? I have been through the Google docs half a dozen times and am either not finding the right information or not understanding the information that I find.
Many thanks in advance.
To import from project A into project B, the service account of project B must have the right permissions on the Cloud Storage bucket in project A.
In your case, the service account is:
project-ID@appspot.gserviceaccount.com
To grant the right permissions you can use this command on the Cloud Shell of project B:
gsutil acl ch -u project-ID@appspot.gserviceaccount.com:OWNER gs://[BUCKET_NAME]
gsutil -m acl ch -r -u project-ID@appspot.gserviceaccount.com:OWNER gs://[BUCKET_NAME]
Then, you can import using the firestore import:
gcloud beta firestore import gs://[BUCKET_NAME]/[EXPORT_PREFIX]
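Note that if the bucket has uniform bucket-level access enabled, the acl commands above are rejected; in that case a bucket-level IAM binding is the rough equivalent (a sketch, and roles/storage.admin is broader than strictly necessary):
gsutil iam ch serviceAccount:project-ID@appspot.gserviceaccount.com:roles/storage.admin gs://[BUCKET_NAME]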
I was not able to get the commands provided by "sotis" to work; however, his answer certainly got me heading down the right path. The commands that eventually worked for me were:
gcloud config set project [SOURCE_PROJECT_ID]
gcloud beta firestore export gs://[BUCKET_NAME]
gcloud config set project [TARGET_PROJECT_ID]
gsutil acl ch -u [RIGHTS_RECIPIENT]:R gs://[BUCKET_NAME]
gcloud beta firestore import gs://[BUCKET_NAME]/[TIMESTAMPED_DIRECTORY]
Where:
* SOURCE_PROJECT_ID = the name of the project you are cloning
* TARGET_PROJECT_ID = the destination project for the cloning
* RIGHTS_RECIPIENT = the email address of the account to receive read rights
* BUCKET_NAME = the name of the bucket that stores the data.
Please note, you have to manually create this bucket before you export to it.
Also, make sure the bucket is in the same geographic region as the projects you are working with.
* TIMESTAMPED_DIRECTORY = the name of the data directory automatically created by the "export" command
I am sure that this is not the only way to solve the problem, however it worked for me and appears to be the "shortest path" solution I have seen.

gcloud Export to Google Storage Bucket from Cloud SQL instance

Running this command:
gcloud sql instances export myinstance gs://my_bucket_name/filename.csv -d "mydatabase" -t "mytable"
Giving me the following error:
ERROR: (gcloud.sql.instances.import) ERROR_RDBMS
I have manually run console uploads to the bucket, which go fine. I am able to log in to the SQL instance and run queries, which makes me think that there are no permission issues. Has anybody ever seen this type of error and knows a way around it?
Note: I have googled for possible situations, and most of them point to either SQL or bucket permission issues.
Never mind. I figured out that I need to make an OAuth connection (using the JSON token generated from the gcloud APIs/Credentials section) to the instance before interacting with it.
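For anyone hitting the same thing, one way to authenticate gcloud with a downloaded JSON key before running the export is along these lines; the key file path is a placeholder:
# authenticate gcloud as the service account whose JSON key you downloaded
gcloud auth activate-service-account --key-file=[KEY_FILE].json
# confirm which account is now active
gcloud auth list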