Condition-based pg_restore - postgresql

I have searched for an answer to the following question for a long time and found nothing except how to restore specific tables. I have a directory-format dump of a PostgreSQL database. What I want to do is restore this database filtered by some data condition, like: WHERE [SomeField] > 10. Is it possible? And if yes, could you please advise me how to do this? Thank you
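pg_restore itself has no row-level filter, so a WHERE condition cannot be applied during the restore. A common workaround is to restore the table into a scratch database first and then copy across only the matching rows. A minimal sketch, assuming the directory dump lives in ./dumpdir and the table is called mytable with a column somefield (hypothetical names):

    # restore only the table of interest into a scratch database
    createdb scratch
    pg_restore -d scratch -t mytable ./dumpdir

    # stream just the rows matching the condition into the target database
    psql -d scratch -c "COPY (SELECT * FROM mytable WHERE somefield > 10) TO STDOUT" \
      | psql -d target -c "COPY mytable FROM STDIN"

The pipe avoids a temporary file; if the subset is large, you may prefer to COPY to a file first and load it in a second step.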

Related

How to take a backup of specific data from a table and restore it into an existing database and table?

I am working on a PostgreSQL database and I need to take a backup of specific data from a table and restore it into the same or a different database. I am looking for a way to accomplish this using command-line tools or scripts.
I have tried the pg_dump and pg_restore command-line utilities, but they seem to allow only backing up and restoring an entire database or table, rather than specific rows within a table.
Is there a way to take a backup of specific data from a table in PostgreSQL and restore it into a database? If so, could you provide an example of the command or script that would accomplish this?
Thanks in advance for your help!
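Since pg_dump's --table option still dumps the whole table, the usual approach is COPY with an arbitrary query: export just the rows you need, then load them elsewhere. A minimal sketch, assuming a table mytable and a filter column id (hypothetical names):

    # export only the matching rows to a CSV file
    psql -d sourcedb -c "\copy (SELECT * FROM mytable WHERE id > 10) TO 'subset.csv' CSV"

    # load them into the same table in another database
    psql -d targetdb -c "\copy mytable FROM 'subset.csv' CSV"

psql's client-side \copy writes the file on the machine running psql; server-side COPY would instead need appropriate privileges and a path on the database server.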

How to produce a PostgreSQL dump with multi-valued INSERTs instead of COPY, and batch the inserts into explicit transactions?

I cannot find any info on how exactly I should batch my inserts into groups of 100 inserts per transaction while producing a database dump via the pg_dump utility. How exactly do I do this? I failed to find any relevant parameters in
> man pg_dump
to do this. Even the most elaborate answer on StackOverflow on the topic, by @CraigRinger, does not describe how a dump with extended (multi-row) INSERTs may be produced. Could anyone please share their recipe here?
--rows-per-insert was added to pg_dump in v12. Before that, there was no clean way to do this.
With plain-text dumps, transaction control is decided by how you replay the dump, not by how you take it. If you just stream the dump into psql with no options, each INSERT is naturally its own transaction.
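A minimal sketch, assuming pg_dump from PostgreSQL 12 or newer and databases named mydb and newdb (hypothetical names):

    # emit INSERT statements with up to 100 rows each instead of COPY
    pg_dump --rows-per-insert=100 mydb > dump.sql

    # on replay, wrap the whole restore in a single transaction
    psql --single-transaction -d newdb -f dump.sql

If you really need one transaction per 100 INSERTs rather than one overall, you would have to post-process the plain-text dump yourself (e.g. inserting BEGIN/COMMIT lines with a small script), since pg_dump has no option for that.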

MongoDB in Luigi Python

I would like to know if there is a way to write output to MongoDB in Luigi. I see in the documentation that they support files (local FS, HDFS), S3, and PostgreSQL, but not MongoDB. If not, could someone explain why not? Maybe it is a bad idea to have it? I would like to store the data in a database, because then I can explore it by querying it. However, I am already using MongoDB, and I would not like to install another database. I do not need a relational database, since I am using the database only to store and query (NoSQL) without relationships, so the best option is MongoDB.
Basically, I need a task to read the data and save it in the database; the next task then takes this output and processes the data.
Any recommendation, suggestion or clarification is more than welcome. Thanks!
You can try using mortar-luigi.
Check out its documentation for MongoDB tasks and an example.

DB2 system tables for change logs?

I'm using DB2 LUW. I'm looking at how often data is being changed or added in our database, and I was curious whether there is a system table where I might find this information.
It depends a bit on what you mean. You will, for example, find the number of commits and rollbacks in sysibmadm.snapdb, and the number of rows written per table in sysibmadm.snaptab. If you take snapshots of those on a regular basis, you get an idea of how often data is updated. Was that what you had in mind, or are you looking for something else?
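A minimal sketch using the DB2 command line processor, assuming a database named mydb (hypothetical name) and that the monitoring views are available:

    db2 connect to mydb

    # overall commit/rollback counters for the database
    db2 "SELECT commit_sql_stmts, rollback_sql_stmts FROM sysibmadm.snapdb"

    # rows written per table, busiest tables first
    db2 "SELECT tabschema, tabname, rows_written FROM sysibmadm.snaptab ORDER BY rows_written DESC"

These counters are cumulative since the last database activation, so comparing two snapshots taken some time apart gives the change rate over that interval.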

Suggest a Postgres tool to find the difference in both the schema and the data

Dear all,
Can anyone suggest a Postgres tool for Linux that finds the difference between two given databases?
I tried apgdiff 2.3, but it gives the difference in terms of schema, not data,
and I need both!
Thanks in advance!
Comparing data is not easy, especially if your database is huge. I created a Python program that can dump a PostgreSQL database schema to a file that can easily be compared with a third-party diff program: http://code.activestate.com/recipes/576557-dump-postgresql-db-schema-to-text/?in=user-186902
I think this program can be extended by dumping all table data into separate CSV files, similar to those used by the PostgreSQL COPY command; a sketch of the idea follows below. Remember to add the same ORDER BY to the SELECT ... queries. I have created a tool that reads SELECT statements from a file and saves the results in separate files; this way I can control which tables and fields I want to compare (not all fields can be used in ORDER BY, and not all are important to me). Such a configuration can easily be created using the "dump schema" utility.
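A minimal sketch of that CSV-diff idea, assuming both databases contain a table mytable with a sortable key id (hypothetical names):

    # dump the same query, with the same ORDER BY, from both databases
    psql -d db1 -c "\copy (SELECT * FROM mytable ORDER BY id) TO 'db1_mytable.csv' CSV"
    psql -d db2 -c "\copy (SELECT * FROM mytable ORDER BY id) TO 'db2_mytable.csv' CSV"

    # any data difference now shows up as a plain text diff
    diff db1_mytable.csv db2_mytable.csv

The deterministic ORDER BY is what makes the diff meaningful; without it, row order may differ between the two dumps even when the data is identical.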
Check out DBSolo. It does both object and data compares and can create a sync script based on the results. It's free to try and $99 to buy. My guess is the 99 bucks will be money well spent to avoid trying to come up with your own software to do this.
Data Compare
http://www.dbsolo.com/help/datacomp.html
Object Compare
http://www.dbsolo.com/help/compare.html
apgdiff https://www.apgdiff.com/
It's an open-source solution. I used it before for checking differences between dumps. Quite useful.
[EDIT]
It diffs the schema only.