Importing a large .sql file into a database - repeated timeout error in phpMyAdmin import

I have a .sql file (a database dump) which I am trying to import using phpMyAdmin, and I keep getting a timeout error.
The file is 46.6 MB (zipped).
Please note I am not on XAMPP but am using a GoDaddy phpMyAdmin installation to manage the database.
What I've tried:
Re-downloaded the file as a zip file and tried importing it. It still failed.
In the phpMyAdmin import screen, I tried unselecting the option "Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be a good way to import large files, however it can break transactions.)" I also tried importing the db with it selected, but both attempts failed. Which should it be?
What else can I do?

Nothing worked except SSH.
What you need:
The username and password of the database you are importing into
cPanel username and password, plus the hosting server's IP address (for PuTTY)
I had to upload the .sql file to a folder under public_html.
Download PuTTY.
In PuTTY I needed the hosting server's IP address as well as the cPanel username and password (so have those handy).
Once connected, you have to enter your cPanel password.
Use the cd command to change to the directory where you placed your .sql file.
Once there, use the following command:
mysql -p -u user_name database_name < file.sql
(Note: replace 'user_name', 'database_name', and 'file.sql' with your actual names.)
You will be prompted for your database password, and then your database will be imported.
Useful link: https://www.siteground.co.uk/kb/exportimport-mysql-database-via-ssh/
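If the dump is still zipped when you upload it, a minimal sketch of the same import, assuming unzip is available on the server (the archive, user, and database names below are placeholders, as above):
# unzip the archive first, then feed the .sql file to mysql;
# you will be prompted for the database user's password
unzip file.zip
mysql -p -u user_name database_name < file.sql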

You can try unzipping the file locally and importing the uncompressed .sql file; the overhead of uncompressing the file in memory could be the problem for phpMyAdmin. Generally, though, what Shadow said is correct and you should use some other means of import (like the command-line client). You could also use the phpMyAdmin UploadDir feature to put the file in a special folder on the server that phpMyAdmin can access directly. This can help with a lot of the resource limits the web server imposes.
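If you want to try the UploadDir route, it is a single directive in phpMyAdmin's config.inc.php; a minimal sketch, assuming a folder named 'upload' inside the phpMyAdmin directory that you create and make readable by the web server:
// files placed in this folder appear in phpMyAdmin's import screen
$cfg['UploadDir'] = 'upload';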

Related

How to back up a Postgres db hosted in the cloud with pgAdmin 4?

I'm hosting my db using AWS RDS and I'm trying to back up tables. However, once it's finished backing up, where is the downloaded file on my computer?
It doesn't seem like there's a path to save the file.
I've checked a couple of answers and others are having the same issue:
https://stackoverflow.com/a/29246636/11110509
The "Filename" element in that dialog box lets you pick a directory as well as file name. That is where it is. If you just typed in a filename without giving a path, then on Windows it is probably in your user's "Documents" folder.

PostgreSQL: Error importing CSV file from shared network folder

My goal is to import a CSV file into a PostgreSQL database.
My file is located in a network shared folder and I have no option to move it to a local folder.
My folder is located at:
"smb://file-srv/doc/myfile.csv"
When I run this PostgreSQL script:
COPY tbl_data
FROM 'smb://file-srv/doc/myfile.csv' DELIMITER ',' CSV;
I get this error:
ERROR: could not open file "smb://file-srv/doc/myfile.csv" for reading: No such file or directory
SQL state: 58P01
I have no problem accessing and opening the file myself.
I am using PostgreSQL 9.6 under Ubuntu 16.04.
Please advise how to fix this problem.
Update
When I try to access the file as the postgres user I get the same error:
postgres@file-srv:~$ cat smb://file-srv/doc/myfile.csv
cat: 'smb://file-srv/doc/myfile.csv': No such file or directory
As I mentioned, when I use the mounted folder I created, I can access the file.
It is about permissions. You have to check read access on the file and folders.
Also, logging in with superuser access may solve your problem.
In short, this is a permissions issue: Your network share is likely locally mounted to your user's UID, while the PostgreSQL server is running as the postgres user.
Second, when you log into your database, there is not an overlap between the database's users and the system's users, even if you have the same username. This means that when you request a file from your network share, the DB user, in this case postgres, does not have the necessary permissions.
To see this, and assuming you have root access on the box in question, you might try to become the postgres user and see that you cannot access the file:
$ sudo su - postgres
$ cat /run/user/.../smb.../yourfile.csv
Permission denied
The fix to your issue will involve, somehow, making the file or share accessible to the postgres user. Copying is certainly the quickest way, but that's off the table. You could mount the share (perhaps as read-only) as the postgres user; you might do this in fstab (see the sketch at the end of this answer).
However, unless this is going to be an automated detail that happens regularly, this seems like heroics. Without more information as to why you can't copy locally, I suggest copying the file locally.
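If you do go the mount route, a rough sketch (the mount point and credentials file are assumptions; the share path and table come from your question):
# mount the share read-only so the postgres OS user can read it (requires cifs-utils)
sudo mkdir -p /mnt/file-srv-doc
sudo mount -t cifs //file-srv/doc /mnt/file-srv-doc -o ro,uid=postgres,gid=postgres,credentials=/root/.smbcredentials
Then point COPY at the mounted path instead of the smb:// URL, e.g. COPY tbl_data FROM '/mnt/file-srv-doc/myfile.csv' DELIMITER ',' CSV;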

How to COPY local file to remote database

I have a remote PostgreSQL database and a local CSV file which I need to add to the database. I'm trying to do it with PyCharm.
Thus, I'm trying to copy data from a local file to a remote database.
If the database is local, then this command works:
COPY master_relationsextra(code, serial_number, member_type, characteristic, price_list)
FROM '/Users/name/Desktop/AUTOI.csv' with CSV HEADER delimiter ';' encoding 'ISO-8859-1';
But for the remote database it doesn't work.
Any advice on how I can do that?
I'm using PyCharm, so I did it with PyCharm's help; PyCharm ran all the queries and commands for me. I did it as follows:
Connected to the remote database from the PyCharm database pane
Right-clicked on the table and chose Import from File
Chose all the rules and ran the import
That did the trick for me.
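For reference, the same local-to-remote copy can be done outside PyCharm with psql's client-side \copy; a minimal sketch (the host, user, and database names are placeholders, while the table, column list, and file come from the command in the question):
# connect to the remote server from the machine that has the CSV file
psql -h your-remote-host -U your_user -d your_database
# at the psql prompt, \copy reads the local file and streams it to the remote server
\copy master_relationsextra(code, serial_number, member_type, characteristic, price_list) from '/Users/name/Desktop/AUTOI.csv' with (format csv, header, delimiter ';', encoding 'ISO-8859-1')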

Export Postgres table to csv

I am trying to export my Postgres table to a CSV file on my desktop and I get this error:
ERROR: could not open file "C:\Users\blah\Desktop\countyreport.csv" for writing: Permission denied
SQL state: 42501
This is my query, which I believe uses the correct syntax:
COPY countyreport TO 'C:\\Users\\blah\\Desktop\\countyreport.csv' DELIMITER ',' CSV HEADER;
According to the user manual:
Files named in a COPY command are read or written directly by the
server, not by the client application.
https://www.postgresql.org/docs/current/static/sql-copy.html
The common mistake is to believe that filesystem access will happen as the (client) user, but it doesn't. It's normal to run the PostgreSQL server as its own user, so actions carried out by the server are done as a different OS user from the client. The server is usually run as the OS user postgres.
Assuming that you are running the server on your local machine, the simplest way to fix it would be to give postgres access to your home directory or desktop. This can be done by changing the Windows security settings on your home directory.
Before you do this, stop and think: is this what you are looking for? If the server is in development, will it always run on the user's machine? If not, you may need to use COPY to write to stdout instead; see the manual for information on this, and the sketch below.
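A rough sketch of the stdout route, run from the client machine (the server, user, and database names are placeholders; the table and output path come from your command):
# the server only writes to its stdout; psql receives the data and the shell redirects it to your desktop
psql -h your_server -U your_user -d your_database -c "COPY countyreport TO STDOUT WITH (FORMAT csv, HEADER)" > C:\Users\blah\Desktop\countyreport.csv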

Problem after migrating Magento

I am trying to create an exact mirror of a Magento production server on my local server for further development, but I have run into a few issues.
On the production server, our Magento is configured to run without displaying the index.php, but after attempting a migration to my local server, the index.php is required to access any links. Additionally, when I select a category to visit (for example), I am directed to http://localhost/category.html instead of http://localhost/my-magento-store.com/index.php/category.html
The other issue I've noticed is that I am unable to log in to the admin section. After entering the correct login credentials, I am redirected to the login screen again without any error messages.
I am running a MAMP stack on the local server, and here is what I have done:
Created a tar of the entire production server
Created a database backup in Magento System > Tools > Backups
Downloaded and extracted tar into local directory
Imported the database dump into local MySQL using Alexey Ozerov's BigDump script. (The .sql file is 1.3 million lines.)
Changed the values of web/unsecure/base_url and web/secure/base_url in the core_config_data table. (As I don't have a self-signed SSL cert, I put http://localhost:8888/my-magento-store/ for both values.)
Emptied the contents of var/cache and var/session
Changed permissions to 755 for all files on local dev server
Navigated to http://localhost:8888/my-magento-store/ but got the "Index of /" page instead.
Navigated to http://localhost:8888/my-magento-store/index.php and got an error.
Followed these steps to solve the error, reloaded the page, and the home page loaded correctly.
Any ideas?
URL Rewriting depends on your .htaccess file, so there are a couple of things to check:
web/seo/use_rewrites in core_config_data should be true (see the sketch below).
When you created your tarball, did it include the dot files in the root directory, especially .htaccess? If you used tar -cvf archive.tar * then it may have missed them (a nice "feature" of *nix).
Check that your MAMP httpd.conf has AllowOverride All, otherwise your local .htaccess will be ignored.
I'm not familiar with MAMP, but it's possible that it's having a problem reading or interpreting your .htaccess, though this is unlikely. I'd focus on options 1 through 3 first.
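A quick sketch for checking items 1 and 2 (the MySQL user and database names are placeholders; run these on the local machine, from the Magento root for the tar command):
# 1. confirm the rewrite flag in core_config_data (a value of 1 means rewrites are enabled)
mysql -u user_name -p magento_db -e "SELECT path, value FROM core_config_data WHERE path = 'web/seo/use_rewrites';"
# 2. re-create the tarball with a '.' target so dot files such as .htaccess are included
tar -cvf archive.tar .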
HTH,
JD