SQLite: set quote character exporting CSV - quoting

I have an SQLite database. If I export it as CSV using sqlitebrowser, there is an option to define a "Quote character". How can I set the quote character when doing the export with a script? The default is " and I want it to be #.
Export script without quote-character definition:
sqlite3 my.db<<EOF
.mode csv
.separator |
.output out.csv
SELECT * FROM myTable;
EOF
Desired output:
#This is#|# a
"table" with#|# three columns, #
# some
newlines,
and...
#|#two#|# "rows". #
I've looked in the documentation and couldn't find such an option. Does it exist?
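I'm not aware of such an option in the sqlite3 shell itself, but a short script can do the quoting on its own. A minimal Python sketch using the stdlib csv module's quotechar parameter (the in-memory table here just stands in for my.db's myTable):

```python
import csv
import sqlite3

# Stand-in for my.db / myTable from the question.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE myTable (a, b, c)")
con.execute("""INSERT INTO myTable VALUES
    ('This is', ' a\n"table" with', ' three columns, ')""")

with open("out.csv", "w", newline="") as f:
    # quotechar='#' plus QUOTE_ALL reproduces the desired #-quoted output;
    # embedded " and newlines pass through untouched.
    writer = csv.writer(f, delimiter="|", quotechar="#",
                        quoting=csv.QUOTE_ALL)
    writer.writerows(con.execute("SELECT * FROM myTable"))
```

Point the connect() call at your real my.db and the output matches the desired format above.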

Related

COPY Postgres table with Delimiter as double byte

I want to copy a Postgres (version 11) table into a CSV file with a double-byte character as the delimiter. Please advise if this can be achieved.
I am trying this:
COPY "Tab1" TO 'C:\Folder\Tempfile.csv' with (delimiter E'অ');
Getting an error:
COPY delimiter must be a single one-byte character
You could use COPY TO PROGRAM. On a Unix system, that could look like:
COPY "Tab1" TO PROGRAM 'sed -e ''s/|/অ/g'' > /outfile.csv' (FORMAT 'csv', delimiter '|');
Choose a delimiter that does not occur in the data. On Windows, perhaps you can write a Powershell command that translates the characters.
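If sed is not available (e.g. on Windows), the same substitution can be done in any scripting language. A rough Python sketch of the post-processing step, assuming '|' never occurs in the data (the in-memory string stands in for the exported file):

```python
import csv
import io

# Stand-in for the exported file; '|' is the one-byte delimiter COPY accepts.
exported = io.StringIO("col1|col2\n1|2\n")
rows = csv.reader(exported, delimiter="|")
# Re-join each row with the double-byte delimiter COPY itself cannot use.
out = "\n".join("অ".join(row) for row in rows)
```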

Postgres copy with strings enclosed in quotes

I have a tab delimited file I am trying to import into Postgres. It looks like this -
"ID" "NBR"
"101931126593" "3"
I can successfully import this file using this command -
\copy service from test.txt with delimiter E'\t' null as 'NULL';
However, this is not omitting the quotes. For example, I want ID to be 101931126593 and not "101931126593".
I have tried this so far, but it still does not import it without the quotes -
\copy service from test.txt with CSV delimiter E'\t' QUOTE E'\b' null as 'NULL';
" is the default quoting character anyway:
\copy service FROM 'test.txt' (FORMAT 'csv', DELIMITER E'\t', NULL 'NULL')
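As a sanity check, Python's csv module shows the same behavior: with tab as the delimiter and the default quote character, the quotes are stripped on read, which is exactly what FORMAT 'csv' does for \copy (sample data from the question):

```python
import csv
import io

# The two sample lines from the question, tab-delimited and double-quoted.
sample = io.StringIO('"ID"\t"NBR"\n"101931126593"\t"3"\n')
rows = list(csv.reader(sample, delimiter="\t"))  # '"' is the default quotechar
```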

How to import CSV file using PowerShell?

I want to import a CSV file (comma delimited, with double quotes) into SQLite using a PowerShell script. I tried:
echo ".import export.csv mytable" | sqlite3.exe
I get this error:
export.csv:327: unescaped " character
I get this error for all lines. In SQLite's command-line shell the same command works:
sqlite> .import export.csv mytable
How can I make this command work using a PowerShell script?
This works for me in both PowerShell 5.1 and v7.0.
$params = @"
.mode csv
.separator ,
.import export.csv mytable
"@
$params | .\sqlite3.exe mydatabase.db
The following single command line works in both
cmd.exe (version 10.0.x.x via ver) and
powershell.exe (version 5.1.x.x via $PSVersionTable)
.\sqlite3.exe my.db ".import file1.csv table1 --csv"
This loads the contents of the CSV file file1.csv into table table1 within the sqlite database file my.db. The --csv flag imports the file, creates the table structure based on the header column names in that file, and makes a rough inference of the data types.
You can import multiple files this way.
e.g.
.\sqlite3.exe my.db ".import file1.csv table1 --csv" ".import file2.csv table2 --csv"
You can chain commands together to immediately open the database for querying by appending ; .\sqlite3.exe my.db.
e.g.
.\sqlite3.exe my.db ".import file1.csv table1 --csv" ".import file2.csv table2 --csv"; .\sqlite3.exe my.db
What about sqlite3.exe ".import export.csv mytable"?
Check the sqlite3.exe documentation on how to pass parameters to it from outside its shell.
You can try the line below:
Invoke-Expression 'sqlite3 yourdatabase.db < ".import export.csv mytable"'
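If the shell quoting keeps getting in the way, you can skip sqlite3.exe entirely and do the import from Python's stdlib instead. A sketch of roughly what .import does (export.csv and mytable are the names from the question; the sample row is made up):

```python
import csv
import sqlite3

# Made-up sample data standing in for export.csv (note the escaped quotes
# that tripped up the piped .import).
with open("export.csv", "w", newline="") as f:
    f.write('id,name\n1,"He said ""hi"""\n')

con = sqlite3.connect(":memory:")
with open("export.csv", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)                     # first row becomes column names
    con.execute("CREATE TABLE mytable (%s)" % ", ".join(header))
    con.executemany("INSERT INTO mytable VALUES (?, ?)", reader)

rows = list(con.execute("SELECT * FROM mytable"))
```

Point connect() at a file path instead of :memory: to get a persistent database.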

In SAS, How Do I Create/Alter Postgres tables?

I have SAS code that will write to a Postgres table if it is already created but still empty. How can I create/alter a Postgres table from SAS (or using a script that pulls in SAS macro variables) if it does not exist or already has data? The number of fields may change. Currently, I use the filename option along with a pipe to write to the Postgres table.
filename pgout pipe %unquote(%bquote(')/data/dwight/IFS6.2/app/PLANO/sas_to_psql.sh
%bquote(")&f_out_schema.%bquote(").&file_name.
%bquote(')
)
;
I've tried using this version, but it does not work:
filename pgout pipe %unquote(%bquote(')/data/dwight/IFS6.2/app/PLANO/sas_to_psql.sh
%bquote('')CREATE TABLE mdo_backend.fob_id_desc
SELECT * FROM &library_name..&file_name.
%bquote(")&f_out_schema.%bquote(").&file_name./('')/
%bquote(')
)
;
This is the script I use:
LOAD_TO_PSQL.SH
#!/bin/bash
. /data/projects/ifs/psql/config.sh
psql -d $DB -tAq -c "COPY $1 FROM STDIN USING DELIMITERS '|'"
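Since the COPY in the script requires the table to already exist, one option is to generate a CREATE TABLE IF NOT EXISTS statement from the field list before piping the data. A rough sketch (the field names and the text type are illustrative, not from the question; in practice the list would come from the SAS dataset's metadata):

```python
# Illustrative field list; in practice it would come from the SAS dataset's
# metadata (e.g. via PROC CONTENTS output).
fields = ["fob_id", "fob_desc"]
ddl = "CREATE TABLE IF NOT EXISTS mdo_backend.fob_id_desc (%s)" % \
      ", ".join("%s text" % f for f in fields)
```

The resulting statement can be sent through the same psql call before the COPY runs.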

mass import .csv files into postgresql

I am using PostgreSQL 9.4 and trying to import all files in a specific folder into an existing table using the following command:
COPY xxx_table FROM '/filepath/filename_*.csv' DELIMITER ',' CSV HEADER;
with * marking the variable part of the file name.
However, it results in an error. I have found similar questions on here, but none of them relate to the COPY command or, alternatively, to using psql.
Any help on this would be highly appreciated.
You first need to create the table manually, then copy data from your CSV file into the table:
Example:
\copy zip_codes FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV
You can also specify the columns to read:
Example:
\copy zip_codes(ZIP,CITY,STATE) FROM '/path/to/csv/ZIP_CODES.txt' DELIMITER ',' CSV
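COPY (and \copy) take a single literal file name, not a wildcard, which is why the filename_*.csv pattern errors out. The expansion has to happen client-side. A sketch that emits one \copy command per matching file, ready to feed to psql (the temp directory stands in for /filepath/):

```python
import glob
import os
import tempfile

# Temp files standing in for /filepath/filename_*.csv.
tmpdir = tempfile.mkdtemp()
for n in (1, 2):
    open(os.path.join(tmpdir, "filename_%d.csv" % n), "w").close()

# One \copy command per file; pipe these into psql.
commands = [
    "\\copy xxx_table FROM '%s' DELIMITER ',' CSV HEADER" % path
    for path in sorted(glob.glob(os.path.join(tmpdir, "filename_*.csv")))
]
```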