I am using a Mac laptop and I am trying to copy a local CSV file into a PostgreSQL table. I have used the DELIMITER option, and the following query completes without errors:
copy c2013_levinj.va_clips_translation
from local '/Users/jacoblevin/Desktop/clips_translation_table.csv'
Delimiter ',' skip 1 rejectmax 1;
However, each time the query is submitted, I receive a message that says "0 rows fetched." I have tried dropping the table and re-creating it, as well as running a "select *" query. Suffice it to say, I have been unable to pull in any data. Does anyone have any idea what's wrong? Thanks in advance.
What happens if you try this:
copy c2013_levinj.va_clips_translation
from local '/Users/jacoblevin/Desktop/clips_translation_table.csv'
WITH CSV HEADER;
That should be more robust and do what you want.
Just posting this question and the solution since it took forever for me to figure this out.
Using a CSV file, I was trying to import data into PostgreSQL with pgAdmin. I kept running into the same "extra data after last expected column" error.
Solution that worked for me (instead of using the Import module):
copy tablename (columns) FROM 'file location .csv' CSV HEADER
Since some of the data included multiple commas within a cell, each comma was being counted as the start of a new column.
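The reason CSV HEADER helps is that COPY's CSV mode honors standard quoting, so commas inside a quoted field are not treated as separators. A minimal sketch with Python's csv module (the rows here are hypothetical, not from the question) shows what a correctly quoted file looks like:

```python
import csv
import io

# Hypothetical rows: the second field of row 1 contains commas.
rows = [
    ("id", "description"),
    (1, "red, green, and blue"),
    (2, "plain"),
]

buf = io.StringIO()
writer = csv.writer(buf)  # default dialect quotes any field containing a comma
writer.writerows(rows)

print(buf.getvalue())
```

A file written this way wraps the comma-laden field in double quotes, so COPY ... CSV HEADER parses it as one column instead of four.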
I am trying to delete records from Teradata and then write into the table, to avoid duplicates.
I have tried this several ways, none of which work.
1) I tried deleting while reading the data, which gives a syntax error like "'(' expected between table and delete":
spark.read.format('jdbc') \
    .option('driver', 'com.TeradataDriver') \
    .option('user', 'user') \
    .option('pwd', 'pwd') \
    .option('dbtable', 'delete from table') \
    .load()
I also tried the following, which gives a syntax error like "something expected between '(' and delete":
.option('dbtable', '(delete from table) as td')
2) I tried deleting while writing the data, which also does not work:
df.write.format('jdbc') \
    .option('driver', 'com.TeradataDriver') \
    .option('user', 'user') \
    .option('pwd', 'pwd') \
    .option('dbtable', 'table') \
    .option('preactions', 'delete from table') \
    .save()
A possible solution is to call a stored procedure that deletes the data:
import teradata

host, username, password = '', '', ''
udaExec = teradata.UdaExec(appName="test", version="1.0", logConsole=False)
with udaExec.connect(method="odbc",
                     system=host,
                     username=username,
                     password=password,
                     driver="Teradata Database ODBC Driver 16.20",
                     charset='UTF16',
                     transactionMode='Teradata') as connect:
    connect.execute("CALL db.PRC_DELETE()", queryTimeout=0)
I have a CSV file with 108 columns which I am trying to import into my PostgreSQL table. Obviously I don't want to specify every column in my CREATE TABLE statement. But when I enter
\COPY 'table_name' FROM 'directory' DELIMITER ',' CSV HEADER;
this error message shows up: "ERROR: extra data after last expected column". With a few columns I know how to fix this problem, but, like I said, I don't want to specify all 108 columns. By the way, my table doesn't contain any columns at all yet. Any help on how I could do that? Thanks!
When dealing with problems like this, I often cheat. Plenty of tools exist online for converting CSV to SQL, https://www.convertcsv.com/csv-to-sql.htm being one of them.
Copy/paste your CSV, then copy/paste the generated SQL. Not the most elegant solution, but it will work as a one-off.
Now, if you're looking to repeat this process regularly (automated, I hope), then Python may be an interesting language to explore for quickly writing a script to do this for you, then scheduling it as a cron job (or whatever method you prefer for invoking it automatically) with the correct input (the CSV file).
Please feel free to let me know if I've misunderstood your original question, or if I can provide any more help give me a shout and I'll do my best!
I am wondering how to create or export a CSV file from SQL? Is there any function for that similar to pgsql2shp?
I would appreciate your ideas, tips, or solutions.
You can save a complete table as a file using this command:
COPY tablename TO STDOUT CSV
Ref: https://www.postgresql.org/docs/current/static/sql-copy.html
You can give this a try, but I believe there may be some syntax changes depending on the version.
COPY (SELECT foo, bar FROM whatever) TO '/tmp/dump.csv' WITH CSV HEADER;
If you use pgAdmin, you can export any query you run to a CSV file.
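If you'd rather script the export, rows fetched through any database driver can be written out with Python's csv module. A minimal sketch (the hard-coded rows and header stand in for a real cursor fetch, which is an assumption here):

```python
import csv
import io

# In practice these would come from e.g. cursor.fetchall() on a
# psycopg2 connection; hard-coded here as a stand-in.
header = ["id", "name"]
rows = [(1, "foo"), (2, "bar")]

buf = io.StringIO()  # swap in open("dump.csv", "w", newline="") for a real file
writer = csv.writer(buf)
writer.writerow(header)
writer.writerows(rows)

print(buf.getvalue())
```

This gives you control over the header, delimiter, and quoting from the client side, without needing write access to the server's filesystem as server-side COPY TO a file does.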
I have to import a file from an external source into a PostgreSQL table.
I tried to do it with \copy from, but I keep getting errors (additional columns) in the middle of the file.
Is there a way to tell PostgreSQL to ignore lines containing errors during a "\copy from"?
Thanks
Give it a try with PostgreSQL Loader instead.
No. Either all the data is correct or no data is loaded at all; those are the two options you have in PostgreSQL.
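Since \copy itself is all-or-nothing, a common workaround is to pre-filter the file, keeping only lines with the expected column count, and then \copy the cleaned file. A rough sketch (the expected column count and sample input are assumptions):

```python
import csv
import io

EXPECTED_COLS = 3  # assumed column count of the target table


def filter_rows(src_text):
    """Split CSV text into (good_rows, bad_rows) by column count."""
    good, bad = [], []
    for row in csv.reader(io.StringIO(src_text)):
        (good if len(row) == EXPECTED_COLS else bad).append(row)
    return good, bad


good, bad = filter_rows("a,b,c\n1,2,3\n1,2,3,4\n")
print(len(good), len(bad))  # the extra-column line lands in bad
```

Write the good rows to a new file for \copy, and keep the bad rows in a reject file so you can inspect and fix them separately.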