How to dump data from tables in ClickHouse using mRemoteNG?

I use mRemoteNG to access my ClickHouse database. How can I dump table data/content from ClickHouse in CSV format?

You can use clickhouse-client to export the required data to a file:
clickhouse-client --host ch_server --user user_name --password user_password \
--query="SELECT * FROM db_name.table_name FORMAT CSVWithNames" \
> "/tmp/table_name_01.csv"
See the available output formats in the documentation: Formats for Input and Output Data.
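If the table is large, you can also stream the output straight through gzip instead of writing a plain file. This is a minimal sketch reusing the hypothetical host, user and table names from above:
clickhouse-client --host ch_server --user user_name --password user_password \
--query="SELECT * FROM db_name.table_name FORMAT CSVWithNames" \
| gzip > "/tmp/table_name_01.csv.gz"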

Related

How to take the pg_dump of a table in csv format while copying it to S3 bucket

I need to back up tables to S3 in CSV format. I already tried taking a pg_dump of the tables, but those files are in .sql format. I also tried the gz format, but they all contain a SQL dump of the data. I have tried this command:
pg_dump -v -h <hostname> -d <dbname> -t <tablename> -U <username>| gzip | aws s3 cp - s3://xxx/xxx/tablename.csv.gz
Is it possible to get the tables exactly in CSV file format, so it would be easy to build QuickSight reports from them? We can't use the pg_dump output directly to create the QuickSight reports.
Any help would be much appreciated.
Note: the table size is between 50 and 100 GB.
You can use the COPY command to dump the data as CSV.
Server Side Export:
COPY table_name TO '/absolute/path/to/filename.csv' CSV HEADER;
Client Side Export:
\copy table_name to 'relative/path/to/filename.csv' csv header;
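To get the CSV into S3 directly, you can also stream a client-side \copy through gzip into the aws CLI. This is a sketch reusing the placeholder hostname, user and bucket path from the question; it assumes the aws CLI is installed and configured:
psql -h <hostname> -U <username> -d <dbname> \
  -c "\copy (SELECT * FROM tablename) TO STDOUT WITH CSV HEADER" \
  | gzip | aws s3 cp - s3://xxx/xxx/tablename.csv.gz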

COPY command does not export JSON format properly

I am trying to export a jsonb column using the COPY command as follows:
payload
-------
{"test": "testing"}
sudo -u postgres psql test_database -c "COPY (SELECT payload FROM test_table) TO STDOUT CSV"
The output gives me quoted text which is not a correct json format:
"{""test"":""testing""}"
How can I get correct JSON format?
You've chosen CSV output format, which escapes quotes that way. COPY does not produce JSON.
Do not use COPY to get the output; instead, see store postgresql result in bash variable or How to return a value from psql to bash and use it?:
psql -U postgres -d test_database -AXqtc "SELECT payload FROM test_table;"
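For example, to capture the unescaped JSON value in a shell variable (a minimal sketch using the database and table names from the question):
payload=$(psql -U postgres -d test_database -AXqtc "SELECT payload FROM test_table;")
echo "$payload"    # prints {"test": "testing"} without CSV quoting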

How to write the file output to the server

In MySQL I can do select * from some_table into outfile 'myfile.csv'
Is there something similar in MongoDB? I cannot find relevant information in the documentation. I'm working with MongoDB 2.2.3.
You might want to use mongoexport, which produces a JSON or CSV export of data stored in a MongoDB instance:
mongoexport --db dbName --collection collectionName --jsonArray --pretty --query '{"key": "value"}' --out output.json
For more details, please refer to the MongoDB documentation.
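If you specifically want CSV, mongoexport needs --type=csv together with an explicit field list. A sketch with hypothetical field names:
mongoexport --db dbName --collection collectionName --type=csv --fields field1,field2 --out output.csv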

How to restore pg_dump file into postgres database

I regularly back up and restore databases and schemas using pgAdmin 4. I would like to do it with a batch file using commands such as pg_dump and pg_restore. However, I always fail to get this working and could use some help. The way I try to dump one schema (with data) is the following:
pg_dump -U postgres -F c -d database -n schema > mw2
Then I try to restore it with pg_restore:
pg_restore -U postgres -d otherdatabase -n schema mw2
First I tried to use .dump instead of tar, but the result stays the same: an empty schema in my database.
Or the following error message:
pg_restore: [archiver] input file does not appear to be a valid archive
https://www.postgresql.org/docs/current/static/app-pgrestore.html
--format=format
Specify format of the archive. It is not necessary to specify the format, since pg_restore will determine the format automatically. If specified, it can be one of the following: custom, directory and tar.
https://www.postgresql.org/docs/current/static/app-pgdump.html
--format=format
Selects the format of the output. format can be one of the following:
p / plain: Output a plain-text SQL script file (the default).
The others are custom, directory and tar.
In short: you used the default plain format, which is meant for use with psql, not pg_restore. So either specify a different format with pg_dump, or feed your file to psql:
psql -f mw2.tar
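For completeness, here is a minimal sketch of the custom-format round trip, using the database, schema and file names from the question (the .dump extension is just a convention):
pg_dump -U postgres -F c -d database -n schema -f mw2.dump
pg_restore -U postgres -d otherdatabase -n schema mw2.dump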

I want to export a script for my database in postgres

I've created a database in pgAdmin, in which I have 1 public schema and 4 custom schemas. These schemas contain several functions, sequences and tables. Now I want to export a script that can create the same database with the same structure, but without any data. Please help me with this.
You are looking for pg_dump with the (default) plain format.
If you have access to the command line, run this:
pg_dump --create --clean --schema-only -U your_user -d db_name > file_name.sql
Then on the server where you need to create the same database, run this:
psql -U db_user -d db_name < file_name.sql
The --clean option tries to drop all objects before creating them.
The --create option begins the output with a command to create the database itself and reconnect to the created database.
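If you only want some of the schemas in the script, you can repeat the -n switch; the schema names below are hypothetical:
pg_dump --schema-only -U your_user -d db_name -n public -n sales -n billing > schema_only.sql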