I am new to Google Cloud. I created a Cloud SQL instance and I need to restore the data from a .bak file. I have the .bak file in a GCS bucket, and I am trying to restore it using SQL Server Management Studio -> Task -> Restore, but I'm not able to access the file.
Can anyone help me with the procedure on how to restore from a .bak file?
You need to give the Cloud SQL service account access to the bucket where the file is saved.
On Cloud Shell run the following:
gcloud sql instances describe [INSTANCE_NAME]
In the output, search for the field "serviceAccountEmailAddress" and copy the SA email.
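If you prefer, gcloud can print just that field directly (a small convenience, assuming a reasonably recent gcloud version):
gcloud sql instances describe [INSTANCE_NAME] --format='value(serviceAccountEmailAddress)'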
Then, again in Cloud Shell, run the following:
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:legacyBucketWriter gs://[BUCKET_NAME]
gsutil iam ch serviceAccount:[SERVICE_ACCOUNT_EMAIL]:objectViewer gs://[BUCKET_NAME]
That should give the service account permission to access the bucket and retrieve the file. Here is also the guide on doing the import; keep in mind that the import will overwrite all the data in the DB.
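For reference, a minimal sketch of that import step once the permissions are in place (the bracketed placeholders are yours to fill in):
gcloud sql import bak [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME].bak --database=[DATABASE_NAME]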
Also remember that:
You cannot import a database that was exported from a higher version of SQL Server. For example, if you exported a SQL Server 2017 Enterprise version, you cannot import it into a SQL Server 2017 Standard version.
Using the script below, I was able to load the data into the table from local files.
db2 load from SOME/LOCAL/File.txt of asc modified by reclen=123 method L \(1 11, 12 14\) REPLACE INTO schema.tablename
However, I want to load the file from another server. I don't want to transfer the files from the other server to the Db2 server just so I can use the command above. I found that DB2REMOTE can be used for remote files in this documentation, but I'm not sure how to execute it successfully.
Do I also need to do this? I don't have the right IAM role or the credentials to do so. Can I just skip this and proceed to connect to the other server only?
This is the script I'm trying with DB2REMOTE:
db2 load from 'DB2REMOTE://centos#123.456.789.0:/folders/directory/file.txt' of asc modified by reclen=123 method L \(1 11, 12 14\) REPLACE INTO schema.tablename
Thank you in advance!
DB2REMOTE is for accessing cloud object storage (e.g. Amazon S3, IBM Cloud Object Storage) from some Db2 commands.
If you are not using cloud object storage, then mount the remote directory locally with appropriate permissions and specify the local mountpoint in the Db2 load command.
You can remote mount with SSHFS or similar, once it is installed and properly configured. This is not programming, but rather administration and configuration.
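As a rough sketch, assuming SSHFS is installed and /mnt/remote_loaddir is a mount point you create yourself (the host and path are taken from your example):
mkdir -p /mnt/remote_loaddir
sshfs centos@123.456.789.0:/folders/directory /mnt/remote_loaddir
db2 load from /mnt/remote_loaddir/file.txt of asc modified by reclen=123 method L \(1 11, 12 14\) REPLACE INTO schema.tablename
When the load finishes you can unmount with fusermount -u /mnt/remote_loaddir.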
I would be happy to get your help/feedback regarding a data load.
Goal:
Load source data from a Postgres database, which is located in GCP project A to another Postgres database, which is located in GCP project B.
Challenge:
Get a connection to the Postgres DB in GCP project A (I have an IAM account with sufficient rights to run a COPY TO / COPY FROM command) and copy the table either to a CSV file or create a dump that can be inserted into the other Postgres DB in GCP project B.
How do I connect to the database with this IAM email account (e.g. if I create a key, where should I store the JSON key file, and would that approach even be feasible)?
Another way I've researched was to use psycopg2, so that I could use cursor.copy_expert (which doesn't need any superuser rights or Postgres user credentials) to copy the data, but I didn't succeed in connecting to the database with psycopg2 due to challenges with the Cloud SQL proxy.
Another idea was to use pg_dump or gcloud sql export csv.
I would be curious whether some of you have faced a similar challenge, how you solved it, and what the best way/practice might be.
You can try the Database Migration Service. You can set up a continuous migration configuration and use Cloud SQL for PostgreSQL.
Hello, after a lot of searching I've come to these solutions:
If you need a continuous copy, you need to use the Database Migration Service; check this documentation.
If you need a one-shot copy:
you can restore your instance; see the bottom of this documentation page
you can create a bucket and back up your instance to it, then import it from the other project (see the sketch below)
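A minimal sketch of that bucket-based approach, assuming the source instance's service account can write to the bucket and the target instance's service account can read from it (all bracketed names are placeholders):
gcloud sql export sql [SOURCE_INSTANCE] gs://[BUCKET_NAME]/dump.sql --database=[DATABASE_NAME] --project=[PROJECT_A]
gcloud sql import sql [TARGET_INSTANCE] gs://[BUCKET_NAME]/dump.sql --database=[DATABASE_NAME] --project=[PROJECT_B]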
I have a database that is giving this error:
ASCII '\0' appeared in the statement, but this is not allowed unless option --binary-mode is enabled and mysql is run in non-interactive mode. Set --binary-mode to 1 if ASCII '\0' is expected.
I'm trying to import the database through the console with gcloud sql import sql mydb gs://my-path/mydb.sql --database=mydb, but I don't see any flag for binary mode in the documentation. Is it possible at all?
Optionally: is there a way to set this flag when importing through MySQL Workbench? I haven't seen anything about it there either, but maybe I'm missing some setting. If there is a way to set that flag, then I can import my database through MySQL Workbench.
Thank you.
Depending on where the source database is hosted, on Cloud SQL or in an on-premises environment, the proper flags are set during the export so that the dump file is compatible with the target database.
Since you would like to import a file that has been exported from an on-premises environment, mysqldump is the suggested way to perform the export.
First, create a dump file as suggested in the documentation. Make sure to pay attention to the following 2 points (an example export command follows them):
Do not export customer-created MySQL users. This will cause the import to the new instance to fail. Instead, manually create the MySQL users you wish to keep on the new instance.
Make sure that you have configured the appropriate flags so that the dump file contains all the details you need, e.g. triggers, stored procedures, etc.
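A hedged example of such an export from the source server (host, user and file names are placeholders; --hex-blob in particular hex-encodes binary values, which avoids the raw '\0' bytes that trigger the binary-mode error from the question):
mysqldump --databases [DATABASE_NAME] -h [HOST] -u [USERNAME] -p --hex-blob --single-transaction --set-gtid-purged=OFF --routines --triggers --events --default-character-set=utf8mb4 > [DUMP_FILE].sql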
Then, create a Cloud Storage Bucket and upload the dump file to the bucket.
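For example (bucket and file names are placeholders):
gsutil mb -p [PROJECT_ID] gs://[BUCKET-NAME]
gsutil cp [DUMP_FILE].sql gs://[BUCKET-NAME]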
Before proceeding with the import, grant the Storage Object Admin role to the service account of the target Cloud SQL instance. You may do that with the following command:
gsutil iam ch serviceAccount:[SERVICE-ACCOUNT]:objectAdmin gs://[BUCKET-NAME]
You may locate the aforementioned Service Account in the Cloud SQL instance Overview, or by running the following command:
gcloud sql instances describe [INSTANCE_NAME]
The service account will be mentioned at the serviceAccountEmailAddress field.
Now you are able to do the import either from the Console, using the gcloud command, or via the REST API.
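For instance, with gcloud it would look roughly like this (placeholders again):
gcloud sql import sql [INSTANCE_NAME] gs://[BUCKET-NAME]/[DUMP_FILE].sql --database=[DATABASE_NAME]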
More details in the Google documentation:
Best Practices for importing/exporting data
I'm trying to export my database created in Google Cloud SQL and import it into a new external server.
I tried to create a SQL backup through the Google console, downloaded it, copied it to the new server via FileZilla, and then launched the following command:
psql -U postgres -d ciclods-db -1 -f Backup-db_Cloud_SQL_Export_2019-03-23\ \(17_01_19\)
but I get this output:
ERROR: role "cloudsqladmin" does not exist
REVOKE
ERROR: role "cloudsqlsuperuser" does not exist
GRANT
What is the right procedure to follow in these cases?
I resolved the same problem by locating and deleting the two lines in the exported SQL file that mention "cloudsqladmin". My app does not use it anyway.
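If it helps, a hedged one-liner for that cleanup (it drops every line mentioning either role; [EXPORT_FILE].sql is a placeholder for your exported file, and the .bak suffix keeps a backup copy):
sed -i.bak -e '/cloudsqladmin/d' -e '/cloudsqlsuperuser/d' [EXPORT_FILE].sql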
To do this task you can follow the official GCP guide about how to export data from Cloud SQL [1]. In that document they give you the option to export the data into a dump file or CSV files, which can be used by other tools.
[1] https://cloud.google.com/sql/docs/mysql/import-export/exporting
In order to create the export file, you have to do it from the command line and use additional flags. As per the documentation's "Exporting data to a SQL dump file", there is a section on exporting data from an externally-managed database server.
There you can also find the option to export the data into a CSV file.
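For example, the CSV export looks roughly like this (instance, bucket, database and table names are placeholders):
gcloud sql export csv [INSTANCE_NAME] gs://[BUCKET_NAME]/[FILE_NAME].csv --database=[DATABASE_NAME] --query='SELECT * FROM [TABLE_NAME]'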
I am trying to restore a DB2 backup file into my BlueMix DashDB service. How do I go about doing this?
You cannot restore your DB2 backup image into dashDB for several reasons.
In an entry-level, shared dashDB instance you only have access to one schema in a physical database shared by others.
Even if you have a dedicated instance, you need 1) access to the database local disk to upload the image and 2) sufficient privileges (at least SYSMAINT authority) to perform the restore. I doubt either will be available to you.
What you can do is run db2look and db2move locally to extract your database DDL statements and data respectively. You can then run the extracted DDL script against dashDB provided you replace the original schema name(s) with the one available to you in dashDB and, after creating the tables, load your data into them.
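A rough sketch of that extraction, run on the machine hosting the source database ([SOURCE_DB] and [SCHEMA_NAME] are placeholders):
db2look -d [SOURCE_DB] -z [SCHEMA_NAME] -e -o ddl.sql
db2move [SOURCE_DB] export -sn [SCHEMA_NAME]
db2look writes the DDL statements to ddl.sql, and db2move export produces IXF data files that you can later load into the tables you create in dashDB.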