How to add an escape character (") with the gcloud sql import csv command? - postgresql

I want to upload data into a Google Cloud SQL instance from CSV files stored in a GCS bucket. I am using a PostgreSQL database and importing the CSV files with the gcloud sql import csv command in my shell script. The issue is that some CSV files contain " characters, and to handle them I want to specify " as an escape character, but the gcloud sql import csv command doesn't have any flag for an escape character. Does anybody have any idea about that?

As per the documentation, to import CSV files into Cloud SQL for PostgreSQL, the CSV file has to follow a specific format.
We can also see that the command you're using has no parameter that fits your requirements.
Instead, as an alternative, I'd use some sort of lint tool or text editor and try to remove the conflicting characters in bulk, if possible.
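For instance, a minimal shell sketch of that clean-up, assuming the stray quotes can simply be stripped (i.e., they are not needed to delimit fields containing commas); the bucket, instance, database, and table names are placeholders:

# Download the file, strip the offending " characters, and re-upload
gsutil cp gs://my-bucket/data.csv .
sed 's/"//g' data.csv > data_clean.csv
gsutil cp data_clean.csv gs://my-bucket/data_clean.csv
# Import the cleaned file as before
gcloud sql import csv my-instance gs://my-bucket/data_clean.csv --database=mydb --table=mytable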

Related

PostgreSQL unable to read CSV files on my desktop

I am trying to import a CSV file into PostgreSQL; however, I keep getting an error that no such file or directory exists.
This is the line of code I execute:
copy mu_data from 'users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV
HEADER;
Can anyone suggest how to fix this?
copy is a command run on the server side. So unless your Postgres server happens to be on your localhost, the file very likely doesn't exist from the server's point of view.
So one solution is to transfer the file to the server's filesystem somehow. Or, if you're using the psql command-line tool (or at least can use it for this task), you can use the \copy command there, which reads the file on the client.
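A minimal sketch of that \copy route, run from the shell through psql; the database name mydb is a placeholder, and the path is the one from the question (with \copy, a relative path is resolved on the client machine):

psql -d mydb -c "\copy mu_data FROM 'users/mysurname/Desktop/FILE.CSV' DELIMITER ',' CSV HEADER"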

Import dataset into MongoDB

I am trying to insert this database into MongoDB using Studio 3T. I can import the BSON without any issues (the countries and timezones) by selecting the parent folder and using the "BSON - mongodump folder" option. However, I cannot figure out how to import the split cities dataset.
I have tried all the options available in Studio 3T and attempted to change the filename to .gz; however, it always fails to import. I don't know what file format the cities are in.
Normally I do not have any issues importing, but I cannot figure out how to do this one. How would I achieve this?
The source DB is here https://github.com/VinceG/world-geo-data
This data is nothing but a big .bson file that has been gzipped and split into various parts. I was not able to import the .bson file successfully as-is. However, I could at least unzip the file without an error using the following commands and GZip for Windows:
REM Concatenate the split parts back into a single gzip archive (Windows cmd)
copy /b city_split_aa+city_split_ab+city_split_ac+city_split_ad+city_split_ae cities.bson.gz
REM gzip -d finds cities.bson.gz and decompresses it to cities.bson
gzip -d cities.bson
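From there, a possible follow-up, assuming the standard MongoDB database tools are installed (the database and collection names here are placeholders), is to restore the reassembled file from the command line:

mongorestore --db geo --collection cities cities.bson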

MongoDB - Error: field names cannot start with $ [$oid]

I'm facing an issue while trying to import .json files that were exported using the mongoexport command.
The generated .json files contain the character $ in some fields, such as $oid and $numberLong():
{"_id":{"$oid":"55aff0e7b3bdf92b314b6fa6"},"activated":true,"authRole":"USER","authToken":"5bdad308-4a11-4890-8c3e-82c29530f1bc","birthDate":{"$date":"2015-08-06T03:00:00.000Z"},"comercialPhone":"99999994","email":"test#mail.com","mobilePhone":"99999999","name":"Test Test","password":"$2a$10$y","validationToken":"b2cd0d71-cb47-405d-bf7f-e46e1a8706e4","version":{"$numberLong":"35"}}
However, this format is not accepted when importing the files. It seems to be the strict mode of Extended JSON, but I'd like to generate JSON files using the shell format, which shows $oid as ObjectId.
Is there any workaround for this?
Thanks!
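One possible workaround, assuming the standard mongoimport tool is available: unlike the shell, mongoimport parses the strict Extended JSON that mongoexport emits ($oid, $date, $numberLong), so importing the files with it should not trigger the $-prefix error. The database, collection, and file names below are placeholders:

mongoimport --db mydb --collection users --file users.json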

Mass import .csv files into PostgreSQL

I am using PostgreSQL 9.4 and trying to import all files in a specific folder into an existing table using the following command:
COPY xxx_table FROM '/filepath/filename_*.csv' DELIMITER ',' CSV HEADER;
with * marking the variable part of the file name.
However, it results in an error. I have found similar questions on here; however, none of them relates to the "COPY" command or, alternatively, to using psql.
Any help on this would be highly appreciated.
You need to first create the table manually, then copy the data from your CSV file into the table.
Example:
\copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV
You can also specify the columns to read:
Example:
\copy zip_codes(ZIP,CITY,STATE) FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV
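Note that neither COPY nor \copy expands the * wildcard for you. A minimal sketch of one way to cover the "all files in a folder" part, assuming a bash shell and using the table and path from the question (the database name mydb is a placeholder):

for f in /filepath/filename_*.csv; do
    psql -d mydb -c "\copy xxx_table FROM '$f' DELIMITER ',' CSV HEADER"
done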

Importing CSV file into PostgreSQL

Using the MySQL Administrator GUI tool, I have exported some data tables retrieved from an SQL dump file to CSV files.
I then tried to import these CSV files into a PostgreSQL database using the Postgres COPY command. I've tried entering:
COPY articles FROM '[insert .csv dir here]' DELIMITERS ',' CSV;
and also the same command without the delimiters part.
I get an error saying
ERROR: invalid input syntax for integer: "id"
CONTEXT: COPY articles, line 1, column id: "id"
In conclusion, my question is: what are some thoughts on and solutions to this problem? Could it possibly be something to do with the way I created the CSV files, or have I made a rookie mistake elsewhere?
If your file has header columns, just add the HEADER qualifier to the COPY statement, as per the documentation, to skip that line:
http://www.postgresql.org/docs/8.4/static/sql-copy.html
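For example, the statement from the question becomes (keeping the asker's path placeholder, and using the singular DELIMITER spelling accepted by COPY):

COPY articles FROM '[insert .csv dir here]' DELIMITER ',' CSV HEADER;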