How to run a SQL script without losing data in pgAdmin? - postgresql

I am working on a PostgreSQL database. Over the past few months I have added new columns to many tables on my test server according to new requirements, and I have also changed some functions.
Now I want to apply the same changes to my live server without losing any data.
I have taken a backup of the test server schema using the answer below:
https://stackoverflow.com/a/7804825/7223676
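Not from the original post, but a minimal sketch of one safe way to do this, assuming the changes are purely additive (new columns and replaced functions); the host, database, table, and function names (test-host, live-host, mydb, orders, order_total) are placeholders to adjust:

# 1. Dump the schema (no data) from the test server for reference/review.
pg_dump --schema-only --no-owner -h test-host -U postgres mydb > test_schema.sql

# 2. Take a safety backup of the live database before changing anything.
pg_dump -Fc -h live-host -U postgres mydb > live_backup.dump

# 3. Apply the changes as an idempotent script on the live server.
psql -h live-host -U postgres mydb <<'SQL'
BEGIN;
-- Adding a column never removes existing rows or values (IF NOT EXISTS needs PostgreSQL 9.6+).
ALTER TABLE orders ADD COLUMN IF NOT EXISTS delivery_note text;
-- CREATE OR REPLACE only swaps the function body; table data is untouched.
CREATE OR REPLACE FUNCTION order_total(p_order_id integer) RETURNS numeric
LANGUAGE sql AS $$ SELECT sum(amount) FROM order_items WHERE order_id = p_order_id $$;
COMMIT;
SQL

Because ADD COLUMN and CREATE OR REPLACE FUNCTION only extend the schema, existing rows are preserved; the pg_dump backup is there in case anything else in the script goes wrong.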

Related

Recreate SQL Commands from db

I created a db long ago using Django. Now, as we are migrating the application, I need all the CREATE TABLE SQL queries that Django would have run to create the entire db for our service (which has around 70-80 tables, each with roughly 30-70 columns).
Both the old and the new servers use Postgres for their databases.
But the technology stack is completely different (a 3rd-party proprietary application will host the service instead of Django).
If I start to write all the tables again from scratch, it will take at least a week or two.
Is there any way, either from Postgres or from Django, to generate the CREATE TABLE SQL schema for the entire db, keeping all the relationships as they are?
Also, I have to make minor modifications to that schema as per customer requirements.
P.S. pg_dump won't work, as I need the actual schema itself so it can be reviewed by the client.
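Not part of the original thread, but for reference: pg_dump's --schema-only mode emits plain, human-readable DDL (CREATE TABLE statements, constraints, foreign keys) rather than a binary archive, which may already be reviewable. Database and file names below are placeholders:

# Plain-text DDL for the whole database, without owner/grant noise.
pg_dump --schema-only --no-owner --no-privileges mydb > schema_for_review.sql

# Limit the dump to a single schema if the db has several.
pg_dump --schema-only --schema=public mydb > public_schema.sql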

PostgreSQL external transaction feature

I have a big web application and tests that make requests to the app running in a sandbox. After each test I used to roll back the database using db migrate rollback && db migrate && db seed. But now that the number of tests has grown, this takes too much time. So I am looking for a way to wrap a batch of database commands in a transaction and cancel that transaction after the test finishes, without modifying the app source code (or to achieve the same effect another way). Maybe there are PostgreSQL parameters or extensions for this?
I found another way:
I can make a dump once and then drop and restore that dump before every subsequent run, which is much faster.
See this topic:
Truncating all tables in a Postgres database
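A minimal sketch of that dump-once / restore-per-run idea, assuming a PostgreSQL database named testdb (all names are placeholders):

# One time: seed the database, then take a compressed snapshot.
pg_dump -Fc testdb > seeded.dump

# Before each test run: recreate the database from the snapshot.
dropdb --if-exists testdb
createdb testdb
pg_restore -d testdb seeded.dump

Creating the test database from a pre-seeded template database (createdb --template=testdb_template testdb) can be faster still, since the copy happens at the file level, but the template must have no open connections while it is being copied.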

Maintaining a development database with exactly the same schema

I'm trying to run a separate database for development, as my initial product release is coming up, so I'd like to know how to maintain two different databases. I'm using PostgreSQL as my DBMS.
I want the development database and the production database to have exactly the same schema. Is there a way to do this automatically? If I have to do it manually, what would be the best way to update the schema?
Thank you.
I want development database and production database to have exactly the same schema.
Then just create the two databases with the same schema(s).
Or you can read more about template databases: https://www.postgresql.org/docs/11/manage-ag-templatedbs.html.
The idea is that when you create a new database it is actually copied from template1, so you can edit template1 and every new database will have the schemas/tables you need.
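A short sketch of the template approach; the objects placed in template1 and the database names are placeholders:

-- Put shared objects into template1 first, e.g.:
--   psql template1 -c 'CREATE TABLE app_settings (key text PRIMARY KEY, value text);'
-- Every database created afterwards starts as a copy of the template:
CREATE DATABASE dev_db;                       -- implicitly copied from template1
CREATE DATABASE prod_db TEMPLATE template1;   -- or name the template explicitly

Note that editing template1 only affects databases created after the edit; existing databases are not updated retroactively.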

How can I obtain the creation date of a DB2 database without connecting to it?

How can I obtain the creation date or time of an IBM's DB2 database without connecting to the specified database first? Solutions like:
select min(create_time) from syscat.tables
and:
db2 list tables for schema SYSIBM
require me to connect to the database first, like:
db2 connect to dbname user userName using password
Is there another way of doing this through a DB2 command instead, so I wouldn't need to connect to the database?
Can the db2look command be used for that?
Edit 01: Background Story
Since more than one person asked why I need to do this and for what reasons, here is the background story.
I have a server with a DB2 DBMS that many people and automated scripts use to create databases for temporary tasks and tests. These databases are never meant to keep data for long. However, for one reason or another (e.g. a developer not cleaning up after themselves, or tests being stopped forcefully before they can do the cleanup), some databases never get dropped, and they accumulate until the hard disk eventually fills up. So the idea of the app is to look up the age of each database and drop it if it's older than, say, 6 months.
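Not from the original post: if connecting turns out to be unavoidable, the commands already quoted above can at least be automated into the age check. A rough sketch (the parsing of db2 list db directory output is simplified and may need adjusting):

for db in $(db2 list db directory | awk '/Database name/ {print $NF}'); do
    db2 connect to "$db" >/dev/null
    # -x suppresses column headings, leaving just the value.
    created=$(db2 -x "select min(create_time) from syscat.tables")
    echo "$db catalog created around: $created"
    db2 connect reset >/dev/null
done

As in the query quoted in the question, the oldest CREATE_TIME in SYSCAT.TABLES is used here as a proxy for when the database was created.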

Live sync between SQL Server 2008 R2 and MongoDB with Express

I have created a custom script in Express that migrates a SQL Server database to MongoDB.
But I am facing problems with live syncing between the two databases.
Currently I have added a column updated_by to both databases.
Then I fetch the latest updated_by row from the MongoDB and SQL Server databases.
Then I check the date difference and, based on it, I update my MongoDB database.
There are lots of db tables, and I am finding it difficult to identify which table has been updated.
Is there any log in SQL Server 2008 R2 that states which table was updated and at what time?
I need a mechanism where any data update in a db table is immediately synced to the corresponding rows in my MongoDB.
Any other suggestions on live data syncing are also welcome.
Thanks in advance. :)
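For illustration only, a sketch of the per-table polling query this describes, assuming updated_by actually holds a last-modified timestamp (as the date comparison implies) and that dbo.customers is a placeholder table name:

-- @last_synced_at is the newest value already present in MongoDB.
DECLARE @last_synced_at datetime = '2019-01-01T00:00:00';

SELECT *
FROM dbo.customers            -- repeat per synced table
WHERE updated_by > @last_synced_at
ORDER BY updated_by;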
When I had a similar requirement to sync between a relational DB (MySQL) and a non-relational DB (MongoDB),
I followed the steps below, which may help others in the future. The concept is generally called Change Data Capture (CDC).
Capture the changes (for MySQL I am using triggers); a sketch of this step follows at the end.
Transform the changes into a suitable form,
i.e. from the RDBMS to the non-RDBMS.
Apply the changes.
Remember to sync structural changes of the databases and the corresponding implementations.
The following link may help:
https://www.flydata.com/blog/what-change-data-capture-cdc-is-and-why-its-important/
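The answer above uses MySQL triggers; on the asker's SQL Server 2008 R2 source, the capture step could look roughly like this. The change_log table, dbo.customers, and its id column are placeholders, not anything from the original posts:

-- A change-log table that a sync job can poll and push into MongoDB,
-- deleting entries once they have been processed.
CREATE TABLE dbo.change_log (
    id          int IDENTITY(1,1) PRIMARY KEY,
    table_name  sysname  NOT NULL,
    row_id      int      NOT NULL,
    operation   char(1)  NOT NULL,   -- 'I', 'U' or 'D'
    changed_at  datetime NOT NULL DEFAULT GETDATE()
);
GO

CREATE TRIGGER trg_customers_capture
ON dbo.customers
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Rows in "inserted" were inserted or updated.
    INSERT INTO dbo.change_log (table_name, row_id, operation)
    SELECT 'customers', i.id, CASE WHEN d.id IS NULL THEN 'I' ELSE 'U' END
    FROM inserted i LEFT JOIN deleted d ON d.id = i.id;

    -- Rows only in "deleted" were deleted.
    INSERT INTO dbo.change_log (table_name, row_id, operation)
    SELECT 'customers', d.id, 'D'
    FROM deleted d LEFT JOIN inserted i ON i.id = d.id
    WHERE i.id IS NULL;
END;
GO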