Can a CSV in a text variable be COPY-ied in PostgreSQL?

For example, say I've got the output of:
SELECT
$text$col1, col2, col3
0,my-value,text
7,value2,string
0,also a value,fort
$text$;
Would it be possible to populate a table directly from it with the COPY command?

Sort of. You would have to strip the first two lines and the last line of your example in order to use the data with COPY. You could do this with the PROGRAM keyword:
COPY table_name FROM PROGRAM 'sed -e ''1,2d;$d'' inputfile' WITH (FORMAT csv);
This is direct in that you are doing everything from the COPY command, and indirect in that you are relying on an outside program to filter your input. The FORMAT csv option is needed here because the sample data is comma-separated rather than tab-delimited.
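As a fuller sketch, assuming the output above was saved to /tmp/dump.txt and a matching table exists (both names and the column types are hypothetical):
CREATE TABLE example_table (col1 integer, col2 text, col3 text);
-- strip the first two lines and the last line, as described above, then load as CSV
COPY example_table FROM PROGRAM 'sed -e ''1,2d;$d'' /tmp/dump.txt' WITH (FORMAT csv);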

Related

How to get output of a sql with additional special characters

Long story...
I am trying to generate a crosstab query dynamically and run it as a psql script.
To achieve this, I want the last line of the SQL to be generated and appended to the top portion of the SQL.
The last line of the sql is like this.... "as final_result(symbol character varying,"431" numeric,"432" numeric,"433" numeric);"
The "431", "432", etc. are to be generated dynamically, as these are the pivot columns and they change from time to time...
So I wrote a query to output the text as follows:
psql -c "select distinct '"'||runweek||'" numeric ,' from calendar where runweek between current_runweek()-2 and current_runweek() order by 1;" -U USER -d DBNAME > /tmp/gengen.lst
While the SQL produces the right output, when I run it as a script it fails because of the special characters (', ").
How should I get it working? My plan was to loop through the "lst" file, build the pivot string, append it to the top portion of the SQL, and execute the script. (I'm new to Postgres and don't know dynamic SQL generation and execution, but I am comfortable with UNIX scripting.)
If there is a recommended way to get the output as
("431" numeric, "432" numeric, ...etc) in a single step, it would be greatly appreciated.
Since you're using double quotes around the argument, double quotes inside the argument must be escaped with a backslash:
psql -c "select distinct '\"'||runweek||'\" numeric ,' from calendar where runweek between current_runweek()-2 and current_runweek() order by 1;"
A heredoc can also be used instead of -c. It accepts multi-line formatting, which makes the whole thing more readable.
(psql [arguments] <<EOF
select distinct '"'||runweek||'" numeric ,'
from calendar
where runweek between current_runweek()-2 and current_runweek()
order by 1;
EOF
) > output
By using quote_ident, which is specifically meant to produce a quoted identifier from a text value, you don't even need to add the double quotes yourself. The query could look like:
select string_agg(quote_ident(runweek::text) || ' numeric', ',' order by runweek)
from calendar
where runweek between current_runweek()-2 and current_runweek();
which also solves the problem that your original query has a stray ',' at the end, whereas this form does not.
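For instance, to build the asker's complete last line in one step (a sketch reusing final_result and the symbol column from the question):
select 'as final_result(symbol character varying,'
       || string_agg(quote_ident(runweek::text) || ' numeric', ',' order by runweek)
       || ');'
from calendar
where runweek between current_runweek()-2 and current_runweek();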

How do I replace (select current_timestamp) with a filename that houses this same select statement?

I am using PSQL. My command line is:
$\copy (select current_timestamp) to '/home/myname/outputfile.txt'
I would like to know: how do I replace "(select current_timestamp)" with a filename that houses that same select statement?
ex:
$\copy (My_SQL_FILE.sql) to '/home/myname/outputfile.txt'
I've tried googling but I can't seem to find the answer.
$\copy (my_Sql_file.sql) to '/home/myname/outputfile.txt'
does not work
You want to run a query stored in a file in a \copy statement, i.e. execute that query and store the output in a file? That is doable.
I've come across this use-case myself and wrote psql2csv. It takes the same arguments as psql and additionally takes a query as parameter (or through STDIN) and prints the output to STDOUT.
Thus, you could use it as follows:
$ psql2csv [CONNECTION_OPTIONS] < My_SQL_FILE.sql > /home/myname/outputfile.txt
What psql2csv will basically do is transform your query to
COPY ($YOUR_QUERY) TO STDOUT WITH (FORMAT csv, ENCODING 'UTF8', HEADER true)
and execute that through psql. Note that you can customize a bunch of things, such as headers, separator, etc.
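If you'd rather stick to plain psql, a rough hand-rolled equivalent of the same idea (assuming My_SQL_FILE.sql contains a single SELECT with no trailing semicolon):
QUERY=$(tr '\n' ' ' < My_SQL_FILE.sql)
psql [CONNECTION_OPTIONS] -c "COPY ($QUERY) TO STDOUT WITH (FORMAT csv, HEADER true)" > /home/myname/outputfile.txt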

The value in "sym" file disappears when using splayed tables

I am using the following line:
`:c:/dir/ set .Q.en[`:c:/dir; tablename]
Everything is OK if I don't exit kdb+, but if I do and then try to load the table using
get `dir
all the symbol columns come back as integers. I would really appreciate your help in understanding why this happens.
It looks like you forgot to repeat the table name on the l.h.s. of set.
Try
q)`:c:/dir/tablename/ set .Q.en[`:c:/dir; tablename]
This will correctly save the table columns in the c:/dir/tablename subdirectory and place the sym file alongside. Now you should be able to load both your table and the sym file by using the \l command or by specifying c:/dir on the command line when you restart q:
q c:/dir
or
q
q)\l c:/dir
(no backticks or leading :'s in either of those commands)
If you want to use get on this table, you will have to load sym separately:
q)load`:c:/dir/sym
q)get`:c:/dir/tablename/
(note the leading : in the path specs)
Finally, you may want to take a look at the rsave command which will save your table without you having to write tablename twice.
.Q.en takes 2 params - a file handle and the table data.
Your first param isn't an hsym - it should be a backtick, then a colon, then the path to your db root.
Also, set takes 2 params - the first in this case should be the path to where you want to save, like dir/splayedTableName/.

How to treat a comma as text in output?

There's 1 column that contains commas. When I output my query to csv, these commas break the csv format. What I've been doing to avoid this is a simple
replace(A."Sales Rep",',','')
Is there a better way of doing this so that I can actually get the commas in the final output without breaking the csv file?
Thanks!
You can use the COPY command to get PostgreSQL to build the CSV for you:
COPY -- copy data between a file and a table
Something like one of these:
copy your_table to 'filename' csv
copy your_table to 'filename' csv force quote *
copy your_table to stdout csv force quote *
copy your_table to stdout csv force quote * header
...
You have to be a superuser to copy to a filename, though. If you're inside psql, you can use the \copy command:
Performs a frontend (client) copy. This is an operation that runs an SQL COPY command, but instead of the server reading or writing the specified file, psql reads or writes the file and routes the data between the server and the local file system.
The syntax is pretty much the same:
\copy your_table to 'filename.csv' csv force quote * header
...
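Applied to the query from the question, that might look like this (the table name and extra column are guesses):
\copy (select A."Sales Rep", A.region from sales A) to 'sales.csv' csv force quote * header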
Quote the fields with "
a,this has a , in it,b
would become
a,"this has a, in it",b
and if the fields have BOTH a , and a ", double the quotes:
a,this has a " and , in it,b
becomes
a,"this has a "" and , in it",b

Why won't this ISQL command run through Perl's DBI?

A while back I was looking for a way to insert values into a text field through isql
and eventually found some load command that worked out for me.
It doesn't work when I try to execute it from Perl. I get a syntax error. I have tried two separate methods and both are not working so far.
I have the SQL statement variable printed out at the end of each loop cycle, so I know that the syntax is correct, but it's just not getting across correctly.
Here's the latest snip of code I was testing:
foreach (@files)
{
$STMT = <<EOF;
load from $_ insert into some_table
EOF
$sth = $db1->prepare($STMT);
$sth->execute;
}
@files is an array whose elements are a full path/location of a pipe-delimited text file (ex. /home/xx/xx/xx/something.txt)
The number of columns in the table match the number of fields in the text file and the type-checking is fine (I've loaded test files manually without fail)
The error I get back is:
DBD::Informix::db prepare failed: SQL: -201: A syntax error has occurred.
Any idea what might be causing this?
EDIT to RET's & Petr's answers
$STMT = "'LOAD FROM $_ INSERT INTO table'";
system("echo $STMT | isql $db")
I had to change it to this, because the die command would force an unnatural death and the statement had to be wrapped in single quotes.
Petr is exactly right, the LOAD statement is an ISQL or DB-Access extension, so you can't execute it through DBI. If you have a look at the manual, you'll see it is also invalid syntax for SPL, ESQL/C and so on.
It's not clear whether you have to use perl to execute the script, or perl is just a convenient way of generating the SQL.
If the former, and you want a pure-perl method, you have to prepare an INSERT statement (there's just one table involved by the look of it?), and slurp through the file, using split to break it up into columns and executing the prepared insert.
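A minimal sketch of that pure-perl route (connection details are placeholders, three columns are assumed, and real Informix escape handling is ignored):
use strict;
use warnings;
use DBI;

my $db1 = DBI->connect('dbi:Informix:mydb', '', '', { RaiseError => 1 });
my @files = ('/home/xx/xx/xx/something.txt');

# one placeholder per column of some_table; three are assumed here
my $ins = $db1->prepare('INSERT INTO some_table VALUES (?,?,?)');

foreach my $file (@files) {
    open my $fh, '<', $file or die "Cannot open $file: $!";
    while (my $line = <$fh>) {
        chomp $line;
        $line =~ s/\|\z//;                     # rows end with a trailing pipe
        $ins->execute(split /\|/, $line, -1);  # -1 keeps trailing empty fields
    }
    close $fh;
}
$db1->disconnect;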
Otherwise, you can generate the SQL using perl and execute it through DB-Access, either directly with system or by wrapping both in either a shell script or DOS batch file.
System call version
foreach (@files) {
    my $stmt = "LOAD FROM $_ INSERT INTO table;\n";
    system("echo $stmt | dbaccess $database") == 0
        or die "Statement $stmt failed: $?\n";
}
In a batch script version, you could write all the SQL into a single script, ie:
perl -e 'print "LOAD FROM \047$_\047 INSERT INTO table;\n" for @ARGV' file1 [ file2 ... ] > loadfiles.sql
isql database loadfiles.sql
NB, the comment about quotes on the filename is only relevant if the filename contains spaces or metacharacters, the usual issue.
Also, one key difference in behaviour between isql and dbaccess is that when executed in this manner, dbaccess does not stop on error, but isql will. To make dbaccess stop processing on error, set DBACCNOIGN=1 in the environment.
Hope that's helpful.
This is because your query is not an SQL query; it is an isql command that tells isql to parse the input file and generate INSERT statements.
If you think about it, the server can be on a completely different machine and has no idea what file you are talking about or how to access it.
So you basically have two options:
call isql and pipe the LOAD command to it - very ugly
parse the file yourself and generate the INSERT statements
Please note that the file Notes/load.unload is distributed with DBD::Informix and contains guidelines on how to handle UNLOAD operations using Perl, DBI and DBD::Informix. Somewhat to my chagrin, I see that it says "T.B.D." (more or less) for the LOAD section.
As other people have stated, the LOAD and UNLOAD statements are faked by various client-side tools to look like SQL statements, but the Informix server does not support them itself, mainly because of the issue with getting the file from a client machine (perhaps a PC) to the server machine (perhaps a Solaris machine).
To simulate the LOAD statement, you would need to analyze the INSERT INTO Table part. If it lists columns (INSERT INTO Table(Col03, Col05, Col09)), then you can expect three values in the load data file, and they go into those three columns. You would prepare a statement 'SELECT Col03, Col05, Col09 FROM Table' to get the types of the columns. Otherwise, you need to prepare a statement 'SELECT * FROM Table' to get the complete list of columns (and their types). Given the column names and the number of columns, you can create and prepare a suitable insert statement: 'INSERT INTO Table(Col03, Col05, Col09) VALUES(?,?,?)' or 'INSERT INTO Table VALUES(?,?,?,?,?,?,?,?,?)'. You could (arguably should) include column names in the second one.
With that ready, you now have to parse the unloaded data. There is a document in the SQLCMD program, available from the IIUG Software Archive (which has been around a lot longer than Microsoft's upstart program of the same name), that describes the UNLOAD format in considerable detail. Perl can handle anything Informix uses - witness the UNLOAD information in the load.unload file distributed with DBD::Informix.
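A sketch of that column-discovery step (the table name is hypothetical, $dbh is assumed to be a connected DBD::Informix handle, and column types are available in $sel->{TYPE} if needed):
# discover the column list by preparing a SELECT that returns no rows
my $sel = $dbh->prepare('SELECT * FROM some_table WHERE 1=0');
$sel->execute;
my @cols = @{ $sel->{NAME} };
$sel->finish;

# build the matching INSERT with one placeholder per column
my $ins_sql = sprintf 'INSERT INTO some_table (%s) VALUES (%s)',
    join(',', @cols), join(',', ('?') x @cols);
my $ins = $dbh->prepare($ins_sql);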
A quick bit of Googling showed that the syntax for load puts quote marks around the file name. What if you change your statement to be:
load from '$_' insert into some_table
Since your statement is not using placeholders, you have to put the quotes in yourself, as opposed to using the DBI quoting functionality.