Failed to import gs://bucket_name/Cloud.sql - google-cloud-storage

I have everything needed for my database in phpMyAdmin and exported the database from it. The export was saved as Cloud.sql, and I then uploaded this SQL file to Google Cloud Storage following this guide: https://developers.google.com/cloud-sql/docs/import_export.
Now, after importing the contents of the .sql file using the Import option in the instance's actions, it shows the green working indicator for a while and then stops. When I check the logs, it shows
Failed to import gs://bucket_name/Cloud.sql: An unknown problem occurred (ERROR_RDBMS)
I can't work out the reason behind the error, since the message isn't clear. How can this be solved?

Google Cloud SQL probably doesn't know which database the commands in gs://bucket_name/Cloud.sql apply to.
From https://groups.google.com/forum/#!topic/google-cloud-sql-discuss/pFGe7LsbUaw:
The problem is that the dump doesn't contain the name of the database to use. If you add a 'USE XXX' at the top of the dump, where XXX is the database you want to use, I would expect the import to succeed.
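A minimal sketch of that fix, assuming the target database is named mydb (replace the names with your own; a tiny stand-in file takes the place of the real Cloud.sql here):

```shell
# Stand-in for the exported dump; in practice you would use the real Cloud.sql.
printf 'CREATE TABLE t (id INT);\n' > Cloud.sql

# Prepend a USE statement so the import knows which database the commands target.
printf 'USE mydb;\n' | cat - Cloud.sql > Cloud_with_use.sql

head -n 1 Cloud_with_use.sql   # first line is now the USE statement
```

Then re-upload Cloud_with_use.sql to the bucket and import that file instead.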

I had a few issues that were spitting out the ERROR_RDBMS error.
It turns out that Google actually does surface more precise errors now, but you have to go here:
https://console.cloud.google.com/sql/instances/{DATABASE_NAME}/operations
There you will see a description of why the operation failed.
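The same operation details can also be pulled from the command line. A sketch, assuming a hypothetical instance name my-instance and an installed, authenticated gcloud CLI:

```shell
# Hypothetical instance name; replace with your own.
INSTANCE=my-instance

# Lists recent operations, including the error detail behind ERROR_RDBMS.
if command -v gcloud >/dev/null 2>&1; then
  gcloud sql operations list --instance="$INSTANCE" --limit=10
else
  echo "gcloud CLI not installed"
fi
```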

Related

ERROR: cannot execute SELECT in a read-only transaction when connecting to DB

When trying to connect to my Amazon PostgreSQL DB, I get the above error. With pgAdmin, I get "error saving properties".
I don't see why simply connecting to a server would perform any write actions.
There are several reasons why you can get this error:
The PostgreSQL cluster is in recovery (or is a streaming replication standby). You can find out if that is the case by running
SELECT pg_is_in_recovery();
The parameter default_transaction_read_only is set to on. Diagnose with
SHOW default_transaction_read_only;
The current transaction has been started with
START TRANSACTION READ ONLY;
You can find out if that is the case using the undocumented parameter
SHOW transaction_read_only;
If you understand all that but still wonder why you are getting this error, even though you are not aware of attempting any data modification, it means that the application you use to connect is trying to modify something (though pgAdmin shouldn't do that).
In that case, look into the log file to find out what statement causes the error.
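The three diagnostics above can be run in one go from the terminal. A sketch with placeholder connection settings (myhost, myuser, mydb are assumptions; a reachable server and psql 9.6+ are required for the queries to actually run):

```shell
# Placeholder connection settings; adjust to your Amazon RDS endpoint.
if command -v psql >/dev/null 2>&1; then
  OUT=$(psql -h myhost -U myuser -d mydb \
        -c "SELECT pg_is_in_recovery();" \
        -c "SHOW default_transaction_read_only;" \
        -c "SHOW transaction_read_only;" 2>&1) || true
else
  OUT="psql not installed"
fi
echo "$OUT"
```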
This was a bug which is now fixed; the fix will be available in the next release.
https://redmine.postgresql.org/issues/3973
If you want to try it now, you can use a nightly build: https://www.postgresql.org/ftp/pgadmin/pgadmin4/snapshots/2019-02-17/

Parse Migration - How to migrate parse data to localhost mongoDB?

I have been trying to migrate my Parse data to a localhost MongoDB, but to no avail. There are a total of 12 steps, as described at https://parse.com/migration#database
I am currently still at step 1 and have encountered some difficulties. I managed to set up MongoDB on my computer (localhost). Then I went to "app settings" in Parse to start the data migration. Parse asked me to paste the MongoDB connection URL, which I entered as "mongodb://localhost/". However, I got the error "no reachable servers". On my localhost, I am running MongoDB from my terminal.
Any advice on this? This is my first time doing a data migration and trying out MongoDB. Any help will be greatly appreciated!
Cheers
In your Parse dashboard, go to App settings -> General. On this page you will find the "Export app data" button. Click it and Parse will send you an email with the database data as CSV; use that to import your data into your local database (with RockMongo, for example, or mongoimport).
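For the CSV route, a sketch of the import with mongoimport (the database and collection names are placeholders, the CSV is a stand-in for the export Parse emails you, and a local mongod must be running for the commented command to work):

```shell
# Stand-in for the CSV that Parse emails you.
cat > export.csv <<'EOF'
objectId,score
abc123,10
def456,7
EOF

# With a local mongod running, the import would look like:
#   mongoimport --db mydb --collection scores --type csv --headerline --file export.csv

head -n 1 export.csv   # header row, consumed by --headerline
```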

DB2 10.1 table data copy to IBM VSE 7.4 table

We have an application with DB2 10.1 as its database.
A new requirement came in where we need to interface a few tables to a HOST system running IBM DB2 VSE 7.4.
I tried to execute the LOAD command with the CLIENT option, but it gives the error "SQL1325N The remote database environment does not support the command or one of the command options."
The command is: "D:\tempdata>db2 load client from app.tbl of ixf insert into host.tbl"
Many posts say that LOAD is not allowed from 10.1 to VSE/z/OS.
Another option I tried is IMPORT, but it is too slow, and we need to delete the records every time since TRUNCATE is not available.
Replication could be considered as an option, but we would like to avoid it.
Can anyone suggest a way to achieve this? Can LOAD be used or not?
It seems LOAD is not allowed from a remote machine, though I wonder what the CLIENT option in LOAD is for, then.
We finally decided to use the IMPORT utility after deleting the HOST DB2 records. We had to execute the DELETE and IMPORT commands on parts of the table: if we tried to import or delete a big table in one go, we got a "log file full" error.
Hope this helps.
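The chunked delete-and-import approach can be sketched as follows. The file app.del, the row counts, and the commented db2 invocation are assumptions for illustration (a delimited export rather than IXF):

```shell
# Stand-in for a large exported table (1000 rows).
seq 1 1000 > app.del

# Split into 250-row batches so each IMPORT commits a bounded amount of work
# and the transaction log does not fill up.
split -l 250 app.del batch_

ls batch_*   # four batch files: batch_aa through batch_ad
# Each batch would then be imported separately, e.g.:
#   db2 "IMPORT FROM batch_aa OF DEL COMMITCOUNT 1000 INSERT INTO host.tbl"
```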

How can you get super privilege for a google sql cloud instance?

I am working on Google App Engine. In Google Cloud SQL I have created an instance, and whenever I import my SQL file into the instance, it shows me an error like the one below:
ERROR 1227 (42000) at line 1088: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
Operation failed with exitcode 1
What can I do to get the SUPER privilege for my Cloud SQL instance?
You can't have SUPER root privileges in Cloud SQL due to its restrictions [1]. Here [2] are some tips for importing files that might help.
[1] https://cloud.google.com/sql/faq
[2] https://cloud.google.com/sql/docs/import-export#import
The statement
DEFINER=`username`@`%`
is the issue in your backup dump.
The workaround is to remove all such entries from the SQL dump file and then import the data from the GCP console.
Use this command to edit the dump file and generate a cleaned one:
cat DUMP_FILE_NAME.sql | sed -e 's/DEFINER=`<username>`@`%`//g' > NEW-CLEANED-DUMP.sql
After removing the entries and completing the cleanup successfully, you can try reimporting.
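A self-contained sketch of that cleanup, using a stand-in dump (the backtick-quoted sed pattern is generalized to match any user and host):

```shell
# Stand-in dump containing a DEFINER clause; use your real export instead.
cat > dump.sql <<'EOF'
/*!50013 DEFINER=`admin`@`%` SQL SECURITY DEFINER */
CREATE TABLE t (id INT);
EOF

# Strip every DEFINER=`user`@`host` occurrence from the dump.
sed -e 's/DEFINER=`[^`]*`@`[^`]*`//g' dump.sql > cleaned-dump.sql

grep -c 'DEFINER' cleaned-dump.sql || true   # 0 occurrences remain
```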
You can edit the SQL file you are importing and remove the DEFINER entries from it.
I had the same problem a few days ago.
I deleted "DEFINER=`your username`@`localhost`" from the MySQL dump and tried the import again afterwards; it worked.

Google Cloud MySql Instance An unknown error occurred when importing

I created a MySQL instance and everything is running on it. I used the process outlined in the Cloud SQL import documentation to create a mysqldump file from my local production MySQL instance running on Windows.
mysqldump --databases xxxxx -uroot -p --hex-blob --default-character-set=utf8 > d:\database_file_feb2_2014.sql
I uploaded the file to Cloud Storage, and every time I try to import it I get an unknown error.
Things I have checked:
1) Made sure the USE database; command was in the file after the CREATE DATABASE IF NOT EXISTS databasename; command.
2) Made sure I was using the --hex-blob option.
I also created an export of a smaller test DB, which was only 4.5MB instead of the 6GB file I was trying to import, and ran its first few lines from the SQL prompt, which worked fine.
I am still unable to isolate at which line the import is breaking, or why.
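One way to narrow that down is to bisect the dump: import each half separately and recurse into whichever half reproduces the error. A sketch with a small stand-in file in place of the real 6GB dump:

```shell
# Stand-in for the real dump; with the actual file, import each half
# separately and recurse into whichever half triggers the error.
seq 1 100 | sed 's/^/-- statement /' > dump.sql

total=$(wc -l < dump.sql)
head -n $((total / 2)) dump.sql > first_half.sql
tail -n +$((total / 2 + 1)) dump.sql > second_half.sql

wc -l first_half.sql second_half.sql
```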
When I try to view the log from the old console, I get "An error has occurred. Please retry later."
I have dumped the whole instance and recreated it, and I still get the same error.
Origin OS: Windows Server 2003 R2
Running Mysql 5.1
Any advice on how I can troubleshoot this and move forward?
Thank you