COPY command does not export json format properly - postgresql

I am trying to export a column of type jsonb using the COPY command as follows:
payload
-------
{"test": "testing"}
sudo -u postgres psql test_database -c "COPY (SELECT payload FROM test_table) TO STDOUT CSV"
The output gives me quoted text, which is not valid JSON:
"{""test"":""testing""}"
How can I get valid JSON output?

You've chosen CSV output format, which escapes quotes that way. COPY does not produce JSON.
Do not use COPY to get the output; instead see store postgresql result in bash variable or How to return a value from psql to bash and use it?:
psql -U postgres -d test_database -AXqtc "SELECT payload FROM test_table;"
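Here -A gives unaligned output, -t prints tuples only (no header or footer), -q suppresses informational messages, -X skips any .psqlrc, and -c runs the single command. For the sample row above, the output should then be the raw value, one row per line:
{"test": "testing"}
If you need the result in a file, just redirect the output (a sketch, the file name is made up):
psql -U postgres -d test_database -AXqtc "SELECT payload FROM test_table;" > payload.txt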

Related

How to pass sql file in "\copy" psql command line

I am trying to find a way to pass a file to psql while using '\copy'. There is a question posted here, use sql file in "\copy" psql command line, asking a similar thing, but the accepted solution doesn't actually pass a file to psql; it passes the contents of the file, so it cannot be used when the contents of the sql file exceed the maximum allowed length of the psql command.
e.g. something like this psql -c data_base "\copy \file_path <file.sql> To './test.csv' With CSV"
Using copy rather than \copy, you can do it like this:
(echo "copy ("; cat file.sql ; echo ") to STDOUT with CSV")| psql -X > ./test.csv

Syntax error in "psql", command does not get executed

I am using timescaledb.
The documentation I am following is Using PostgreSQL's COPY to migrate data from a csv file to timescale db. The name of the csv file is test.csv.
I created a db named test; the name of the table is test1. The table is a hypertable as per the timescaledb documentation.
The table's structure and csv files structure are the same.
While executing the following command in cmd, I do not get a result, only an additional - symbol in the console prompt, test-#
psql -d test -c "\COPY test1 FROM C:\Users\DEGEJOS\Downloads\test.csv CSV"
If I put a ; after the command,
psql -d test -c "\COPY test1 FROM C:\Users\DEGEJOS\Downloads\test.csv CSV";
I get a syntax error at Line 1.
How can I solve this error and insert data from the csv file into the db?
You are trying to run psql with \COPY inside a psql session; that is why you get an error in the second call, since there is no psql keyword in SQL. psql is an executable.
To follow the instructions from Timescale, you need to call the command directly in CMD, i.e., call:
psql -d test -c "\COPY test1 FROM C:\Users\DEGEJOS\Downloads\test.csv CSV"
If you are in C:\Users\DEGEJOS as in your screenshot, it will look like:
C:\Users\DEGEJOS> psql -d test -c "\COPY test1 FROM C:\Users\DEGEJOS\Downloads\test.csv CSV"
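If the csv file has a header row, you can tell COPY to skip it with the HEADER option; quoting the path also avoids surprises with spaces. A sketch, assuming a reasonably recent psql:
psql -d test -c "\COPY test1 FROM 'C:\Users\DEGEJOS\Downloads\test.csv' WITH (FORMAT csv, HEADER)"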

execute a postgresql query from a file and write the output to another csv file

I need to execute a query from a file and write its output to a CSV file. I want to keep the delimiter as ;. I have tried the queries below.
psql -h localhost -p 5432 -U postgres -d postgres -f \path\to\sqlQuery.sql -o \path\to\result\result.csv
This command puts the result into a file, but it is in psql's tabular result format.
psql -h localhost -p 5432 -U postgres -d postgres -f copy(\path\to\sqlQuery.sql) to \path\to\result.csv csv header;
The above command gives me a syntax error.
I'm looking for a way to use the COPY command and \f or -f together so that I can execute the query from a file and also write the output to another CSV file with the specified delimiter.
Change your SQL script to use COPY:
COPY (/* your query */) TO STDOUT
(FORMAT 'csv', DELIMITER ';');
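Then run the script with psql and capture standard output in the csv file, for example (a sketch using the paths from the question):
psql -h localhost -p 5432 -U postgres -d postgres -q -X -f \path\to\sqlQuery.sql > \path\to\result\result.csv
With COPY ... TO STDOUT the data goes to psql's standard output, so the redirection gives you plain csv rows with ';' as the delimiter instead of the tabular psql format.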

How to restore pg_dump file into postgres database

So I regularly back up and restore databases and schemas using pgadmin4. I would like to do it with a batch file using commands such as pg_dump and pg_restore. However, I always fail at this and could use some help. The way I try to dump one schema (with data) is the following:
pg_dump -U postgres -F c -d database -n schema > mw2
Then I try to restore it with pg_restore:
pg_restore -U postgres -d otherdatabase -n schema mw2
First I tried to use .dump instead of tar, but the result stays the same: an empty schema in my database.
Or the following error message:
pg_restore: [archiver] input file does not appear to be a valid archive
https://www.postgresql.org/docs/current/static/app-pgrestore.html
--format=format
Specify format of the archive. It is not necessary to specify the format, since pg_restore will determine the format automatically. If specified, it can be one of the following: custom, directory and tar.
https://www.postgresql.org/docs/current/static/app-pgdump.html
--format=format
Selects the format of the output. format can be one of the following:
p, plain: Output a plain-text SQL script file (the default).
The others are custom, directory and tar.
In short: you used the default plain format, which is meant to be used with psql, not pg_restore. So either specify a different format with pg_dump or use your file like this:
psql -f mw2.tar
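If you want to keep using pg_restore, create the dump in the custom format and let pg_dump write the file itself with -f instead of relying on shell redirection, for example (a sketch based on the names in the question):
pg_dump -U postgres -F c -d database -n schema -f mw2.dump
pg_restore -U postgres -d otherdatabase -n schema mw2.dump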

psql - save results of command to a file

I'm using psql's \dt to list all tables in a database and I need to save the results.
What is the syntax to export the results of a psql command to a file?
From psql's help (\?):
\o [FILE] send all query results to file or |pipe
The sequence of commands will look like this:
[wist#scifres ~]$ psql db
Welcome to psql 8.3.6, the PostgreSQL interactive terminal
db=>\o out.txt
db=>\dt
Then any db operation output will be written to out.txt.
Enter '\o' to revert the output back to console.
db=>\o
The psql \o command was already described by jhwist.
An alternative approach is using the COPY TO command to write directly to a file on the server. This has the advantage that it's dumped in an easy-to-parse format of your choice -- rather than psql's tabulated format. It's also very easy to import to another table/database using COPY FROM.
NB! This requires superuser or pg_write_server_files privileges and will write to a file on the server.
Example: COPY (SELECT foo, bar FROM baz) TO '/tmp/query.csv' (format csv, delimiter ';')
Creates a CSV file with ';' as the field separator.
As always, see the documentation for details
Use the -o parameter of the psql command.
-o, --output=FILENAME send query results to file (or |pipe)
psql -d DatabaseName -U UserName -c "SELECT * FROM TABLE" -o /root/Desktop/file.txt
\copy, which is a psql command, works for any user. I don't know whether it works for \dt or not, but the general syntax is reproduced from the following link: Postgres SQL copy syntax
\copy (select * from tempTable limit 100) to 'filenameinquotes' with csv header delimiter as ','
The above will save the output of the select query to the filename provided, as a csv file.
EDIT:
For my psql server, which is an older version (v8.5), the following command works:
copy (select * from table1) to 'full_path_filename' csv header;
Use the below query to store the result in a CSV file
\copy (your query) to 'file path' csv header;
Example
\copy (select name,date_order from purchase_order) to '/home/ankit/Desktop/result.csv' csv header;
Hope this helps you.
If you got the following error
ufgtoolspg=> COPY (SELECT foo, bar FROM baz) TO '/tmp/query.csv' (format csv, delimiter ';');
ERROR: must be superuser to COPY to or from a file
HINT: Anyone can COPY to stdout or from stdin. psql's \copy command also works for anyone.
you can run it in this way:
psql somepsqllink_or_credentials -c "COPY (SELECT foo, bar FROM baz) TO STDOUT (format csv, delimiter ';')" > baz.csv
COPY tablename TO '/tmp/output.csv' DELIMITER ',' CSV HEADER;
This command stores the entire table as a csv file on the server.
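If you lack the server-side file privileges mentioned above, the client-side \copy equivalent writes the file on your own machine instead (a sketch):
\copy tablename to '/tmp/output.csv' with (format csv, header, delimiter ',')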
I assume that there exists some internal psql command for this, but you could also run the script command from the util-linux-ng package:
DESCRIPTION
Script makes a typescript of everything printed on your terminal.
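A rough sketch of how that could be used (the file name is made up): start a recording with script, work in psql as usual, then exit the shell to stop recording; psql_session.txt then contains everything that was printed, including the \dt output.
script psql_session.txt
psql db
db=> \dt
db=> \q
exit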
This approach will work with any psql command from the simplest to the most complex without requiring any changes or adjustments to the original command.
NOTE: For Linux servers.
Save the contents of your command to a file
MODEL
read -r -d '' FILE_CONTENT << 'HEREDOC'
[COMMAND_CONTENT]
HEREDOC
echo -n "$FILE_CONTENT" > sqlcmd
EXAMPLE
read -r -d '' FILE_CONTENT << 'HEREDOC'
DO $f$
declare
    curid INT := 0;
    vdata BYTEA;
    badid VARCHAR;
    loc VARCHAR;
begin
    FOR badid IN SELECT some_field FROM public.some_base LOOP
        begin
            select 'ctid - '||ctid||'pagenumber - '||(ctid::text::point) [0]::bigint
            into loc
            from public.some_base where some_field = badid;
            SELECT file||' '
            INTO vdata
            FROM public.some_base where some_field = badid;
        exception
            when others then
                raise notice 'Block/PageNumber - % ',loc;
                raise notice 'Corrupted id - % ', badid;
                --return;
        end;
    end loop;
end;
$f$;
HEREDOC
echo -n "$FILE_CONTENT" > sqlcmd
Run the command
MODEL
sudo -u postgres psql [some_db] -c "$(cat sqlcmd)" >>sqlop 2>&1
EXAMPLE
sudo -u postgres psql some_db -c "$(cat sqlcmd)" >>sqlop 2>&1
View/track your command output
cat sqlop
Done! Thanks! =D
Approach for docker
via psql command
docker exec -i %containerid% psql -U %user% -c '\dt' > tables.txt
or query from sql file
docker exec -i %containerid% psql -U %user% < file.sql > data.txt
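The same pattern works for exporting csv straight out of the container, for example (a sketch; the database name %db% is made up here, and the query reuses the COPY TO STDOUT example from above):
docker exec -i %containerid% psql -U %user% -d %db% -c "COPY (SELECT foo, bar FROM baz) TO STDOUT WITH CSV HEADER" > baz.csv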