I have a table in an MS Access db that I want to export to a PostgreSQL database. Every two months or so, I want to move all records from the Access table to a table in Postgres.
Right now, I am using the Export to ODBC option in Access to do this, but each time it creates an entirely new table in Postgres. Is there a way for me to routinely append the records in the Access table to an existing table in my Postgres database? I have come across the option of a foreign data wrapper (FDW), but I am not familiar with how to install or use it.
I am new to using PostgreSQL, and have little to no experience working with databases other than Access, so any input/advice would be greatly appreciated.
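For reference, the kind of thing I'm hoping for is an ordinary Access append query run against a linked Postgres table, roughly along these lines (I haven't tested this, and the table and column names are made up):

    INSERT INTO public_orders ( id, order_date, amount )
    SELECT id, order_date, amount
    FROM LocalOrders;

where public_orders would be the existing Postgres table linked into Access over ODBC and LocalOrders the local Access table.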
Related
I have a database with 20 tables. Currently, we are using the pg_dump command daily to archive our database.
A few tables in this database are very big. We are working to make a light version of this database for testing purposes and small tickets.
So I need a way to use the pg_dump command and save all tables with only 1000 rows in each table. I tried to find something like that on Google, but without success.
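For illustration, the closest workaround I can think of is a schema-only dump plus a per-table limited copy, roughly like this (the database and table names are just examples):

    -- schema only, no data (shell):  pg_dump --schema-only mydb > light_schema.sql
    -- then, per table, export at most 1000 rows from psql:
    COPY (SELECT * FROM my_big_table LIMIT 1000)
    TO STDOUT WITH (FORMAT csv, HEADER);

but doing that by hand for 20 tables is exactly what I'd like to avoid.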
I created a database long ago using Django. Now that we are migrating the application, I need all the CREATE TABLE SQL statements that Django ran to create the entire database for our service (which has around 70-80 tables, each with 30-70 columns on average).
Both the old and new servers use Postgres for their databases.
But the technology stack is completely different (a third-party proprietary application will host the service instead of Django).
If I start to write all the tables again from scratch, it will take at least a week or two.
Is there any way, either from Postgres or from Django, to generate the CREATE TABLE SQL schema for the entire database while keeping all the relationships as they are?
Also, I have to make minor modifications to that schema per customer requirements.
P.S. - pg_dump won't work, as I need the actual schema itself to get it reviewed by the client.
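If it helps with the client review, a plain column-level listing can also be pulled straight out of Postgres, independent of Django; a small sketch against information_schema (assuming the tables live in the public schema):

    -- one row per column, readable enough to hand over for review
    SELECT table_name, column_name, data_type, is_nullable
    FROM information_schema.columns
    WHERE table_schema = 'public'
    ORDER BY table_name, ordinal_position;

That doesn't give the CREATE TABLE statements themselves, though, which is what I'm really after.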
I have to create an app that transfers data from Snowflake to Postgres every day. Some tables in Postgres are truncated before the migration, and all data from the corresponding Snowflake table is copied. For other tables, only data after the last timestamp in Postgres is copied from Snowflake.
This job has to run sometime at night, not while customers are using the service during the day.
What is the best way to do this?
Do you have constraints limiting your choices in:
ETL or bulk data tooling
Development languages?
According to this site, you can create a foreign data wrapper in PostgreSQL for Snowflake.
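As a very rough sketch of what that looks like (the extension name, options and credentials below are all placeholders - check the wrapper's own documentation for the real ones):

    -- install the wrapper, point it at Snowflake, and expose one table locally
    CREATE EXTENSION snowflake_fdw;                            -- placeholder extension name
    CREATE SERVER snowflake_srv
        FOREIGN DATA WRAPPER snowflake_fdw
        OPTIONS (account 'my_account', warehouse 'my_wh');     -- placeholder options
    CREATE USER MAPPING FOR CURRENT_USER
        SERVER snowflake_srv
        OPTIONS (user 'etl_user', password 'secret');          -- placeholder credentials
    CREATE FOREIGN TABLE sf_orders (
        id         bigint,
        updated_at timestamp
    ) SERVER snowflake_srv OPTIONS (table_name 'ORDERS');      -- placeholder table

Once the foreign table exists, the nightly incremental load is just an INSERT ... SELECT filtered on the last timestamp already present in Postgres.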
I've been asked to modify an Access database by putting the data themselves into a Postgres database while keeping the old Access file as a frontend. So far everything has worked just fine, with every linked table, query and form working just like before when viewed.
The issue is, however, that all of the forms call on MS Access queries that users can insert data into, but after the tables were migrated into PostgreSQL, those queries no longer allow data inserts, which means the forms no longer allow data inserts either. I can edit rows that have already been entered, and I can insert new rows directly into the linked tables, but I cannot create new rows through the queries. This is as a superuser.
I have made Access queries in the past that allowed data entry to a Postgres database, but I don't have access to those files now, and I can't for the life of me figure out what I did differently back then.
Highly appreciate any leads. Couldn't find anything on this. Using MS Access 2010 and PostgreSQL 9.1
Solved
Andre pointed out that these MS Access queries must include the primary key to give the option of creating new rows. Once I added the id field to the query, the forms worked like they did before.
The answer, supplied by Andre, is that simple MS Access queries allow for inserts into PostgreSQL if the queries include the primary key of the queried table. Cheers!
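In other words, a query shaped like this becomes insertable again once the primary key column is part of the select list (the names below are just examples):

    SELECT id, customer_name, order_date
    FROM public_customers;

Leave the id out of the select list and the recordset stays read-only for new rows.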
I am working on a project where I need to programmatically validate and/or compare a database schema between product releases.
I am using Perl and am looking for a cross-platform method to collect the database schema. I am currently able to perform database queries by utilizing the dbisql.exe command and then parsing the results.
I am wondering if there is potentially a stored procedure or set of queries that I can run to collect the database schema.
It appears that the dbunload.exe command could be used to generate a SQL regeneration script; however, I suspect that output may be difficult to parse.
Any feedback would be greatly appreciated.
If you would like to retrieve the DB schema data at a really low level, you could query the corresponding system tables. They are in the SYS namespace, especially SYSTABLE (for all tables) and SYSCOLUMN (for all columns in those tables).
Check the ASA SQL Reference Handbook for the schema of those system tables.
With Perl's DBI you can easily fire queries on those tables. But you will have to create some local storage for the schema to compare the query results with.
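For example, a single join between those two catalog tables already gives a usable table/column listing (the column names here are from memory of the ASA catalog, so double-check them against the reference):

    -- one row per column of every base table
    SELECT t.table_name, c.column_name, c.domain_id, c.width, c.nulls
    FROM SYS.SYSTABLE t
    JOIN SYS.SYSCOLUMN c ON c.table_id = t.table_id
    WHERE t.table_type = 'BASE'
    ORDER BY t.table_name, c.column_id;

The result of a query like that is much easier to store and diff from Perl than the dbunload output.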
Sybase Central v3.0 can export DDL for all DB objects;
and I think SC v6.0 can't connect to ASA 11 :(