I know this may be an odd question, but I was wondering if there's a tool that can convert an Access database to PostgreSQL, including the table structure. I've found some third-party tools, such as dBForce, that import the data but not the structure.
Thanks
I'm trying to build an app using Node/Express and RDS PostgreSQL on the back-end to get some more experience with these two technologies. More specifically, I'm looking to build this using the node-postgres package and without the aid of an ORM. I currently have a .sql file in my app that contains the desired schema.
What would be considered "best practice" when implementing a schema for the first time? For example, is it considered better to import a schema via the command line, use something like pgAdmin, or throw a bunch of "CREATE TABLEs" into queries through node-postgres?
Thanks in advance for the help!
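For what it's worth, a common middle ground is to keep the schema in a version-controlled .sql file and make it idempotent, so the same file can be applied from the command line (`psql -f schema.sql`) or once at app startup via a single node-postgres query. A minimal sketch, with made-up table and column names:

```sql
-- schema.sql -- hypothetical tables; adjust names/columns to your app.
BEGIN;

CREATE TABLE IF NOT EXISTS users (
    id         serial PRIMARY KEY,
    email      text NOT NULL UNIQUE,
    created_at timestamptz NOT NULL DEFAULT now()
);

CREATE TABLE IF NOT EXISTS posts (
    id      serial PRIMARY KEY,
    user_id integer NOT NULL REFERENCES users (id),
    body    text NOT NULL
);

COMMIT;
```

You can then apply it with `psql -d yourdb -f schema.sql`, or read the file in Node and pass its contents to one `pool.query()` call; the `IF NOT EXISTS` clauses keep repeated runs harmless.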
I have not found a real sync solution from PostgreSQL to a new graph database like Neo4j, so I've decided to use PostgreSQL itself: I sync the normalized tables with JSON tables kept in a second database (under a different name) on the same PostgreSQL server. That way I have the best of both worlds, a SQL and a NoSQL database.
When I see that plain SQL is faster than the graph-style queries I can choose between them, and in the future, when I move the NoSQL tables to a real graph database like Neo4j, I'll be able to keep them synced without changing the app, since it can use either of the two synced databases.
Has anyone done this already, or is it a dumb idea? Has anyone used libraries to automatically sync from PostgreSQL to Neo4j, and in the other direction too? Or must I write the sync scripts from scratch if I want to keep two databases in sync?
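Not an answer to the Neo4j part, but the in-database half of what you describe (mirroring a normalized row as JSON) can be done with a trigger. A rough sketch, with invented table and column names:

```sql
-- Hypothetical normalized table and its jsonb mirror.
CREATE TABLE person (
    id   serial PRIMARY KEY,
    name text NOT NULL
);

CREATE TABLE person_doc (
    id  integer PRIMARY KEY,
    doc jsonb NOT NULL
);

-- Keep the mirror up to date on every insert/update.
CREATE OR REPLACE FUNCTION sync_person_doc() RETURNS trigger AS $$
BEGIN
    INSERT INTO person_doc (id, doc)
    VALUES (NEW.id, to_jsonb(NEW))
    ON CONFLICT (id) DO UPDATE SET doc = EXCLUDED.doc;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER person_to_doc
AFTER INSERT OR UPDATE ON person
FOR EACH ROW EXECUTE FUNCTION sync_person_doc();
```

Note `EXECUTE FUNCTION` needs PostgreSQL 11+ (older versions use `EXECUTE PROCEDURE`). Syncing onward to an external Neo4j instance would still need a separate process reading those changes.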
I am wondering if there is a way to import data from an HTTP source from within a PL/pgSQL function.
I am porting an old system that harvests data from a website. Rather than maintaining a separate set of files to manage the downloading of the data, I was hoping to put the import routines directly into stored procedures.
I do know how to import data with COPY, but that requires the data to already be available locally. Is there a way to download the data with PL/pgSQL? Or am I out to lunch?
Related: How to import CSV file data into a PostgreSQL table?
Depending on what you're after, the Postgres extension www_fdw might work for you: http://pgxn.org/dist/www_fdw/
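The general shape of using it looks roughly like this; check the extension's own docs for the exact server options, as the URL and columns below are placeholders:

```sql
-- Sketch only: option names and columns depend on the www_fdw version/service.
CREATE EXTENSION www_fdw;

CREATE SERVER example_www_server
    FOREIGN DATA WRAPPER www_fdw
    OPTIONS (uri 'http://example.com/api');

CREATE FOREIGN TABLE example_www_data (
    title text,
    link  text
) SERVER example_www_server;

-- WHERE conditions are translated into request parameters by the wrapper.
SELECT * FROM example_www_data;
```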
If you want to download custom data over HTTP, then PostgreSQL's extensive support for procedural languages might be handy. Here is an example of connecting to the Google Translate service from a Postgres function written in Python:
https://wiki.postgresql.org/wiki/Google_Translate
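Along the same lines, a minimal PL/Python function that just fetches a URL could look like this (the function name and the target table in the comment are my own placeholders; plpython3u is an "untrusted" language, so installing it requires superuser rights):

```sql
-- Requires: CREATE EXTENSION plpython3u;
CREATE OR REPLACE FUNCTION fetch_url(url text) RETURNS text AS $$
    import urllib.request
    # Fetch the page body and return it as text.
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode('utf-8')
$$ LANGUAGE plpython3u;

-- The result can then be parsed/inserted from plain SQL, e.g.:
-- INSERT INTO raw_pages (body) SELECT fetch_url('https://example.com/data.csv');
```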
I do not have much experience with PostgreSQL, but I would like some clues on how to do the following:
I intend to feed a PostgreSQL/PostGIS database with data that lives in an Informix database, which I can access via ODBC.
In short, I want to run a SELECT against the Informix database and import the results directly into PostgreSQL/PostGIS.
From what I've read, it seems possible to do this via dblink. Is that so?
Where can I get detailed information about this process?
I would suggest dumping the data from whatever DB you have into a text format like CSV, then using the COPY command to load the data into PostgreSQL.
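For example, assuming a hypothetical target table and dump file (adjust names, columns, and path to your data):

```sql
-- Server-side COPY: the file must be readable by the PostgreSQL server process.
COPY my_table (col1, col2)
FROM '/tmp/informix_dump.csv'
WITH (FORMAT csv, HEADER);

-- Or client-side via psql, reading the file from your own machine:
-- \copy my_table (col1, col2) FROM 'informix_dump.csv' WITH (FORMAT csv, HEADER)
```

The `\copy` variant is handy when you can't put files on the database server itself.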
Any idea how to go about doing that with a tool (preferred)? Are there any alternative ways to do it?
You can check out the migration studio from EnterpriseDB here, although I have no experience with it.
There is no comparison to doing it yourself though - if you're not familiar with Postgres then this will get you familiar, and if you are, then aside from the data entry aspect, this should be old hat.
Use the MaxDB tools to generate a SQL text export of the database, then import that file into PostgreSQL. With luck, you won't need any prior processing of the data dump.