How to manage multiple database connections? - postgresql

I want a nice, clean way of accessing my Postgres databases from Python.
Currently, I have a file called database.py containing imports that look like this:
from DatabaseA import dbA
from DatabaseB import dbB
DatabaseA and DatabaseB each have their own methods for performing various queries against their respective database connection.
Now, in any file where I need to use a database, I add import database and then do something like:
database.dbA.create_table(table_name)
database.dbB.do_something(some_args)
My database scripts (e.g. DatabaseA) each get their connection details from a config.ini file.
This all works, but I'd like it to be more object oriented.
It would be nice to have a master DB "manager" and then access the respective DB by passing in a configuration name (so that the manager can look up the required connection details and return the correct database object). For example:
DBManager().get_db("DbA_config_name").create_table("test_table")
Where "DbA_config_name" is the name of the configuration in the config.ini file (accessed using configparser).
What patterns exist for this type of thing in Python?
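For illustration, here is a minimal sketch of one common pattern for this: a factory/registry class that reads config.ini with configparser and hands out cached per-database wrappers. The section layout, the option keys, and the psycopg2 driver are assumptions, not part of the original setup:

import configparser
import psycopg2  # assumed driver; any DB-API module would work


class Database:
    """Wraps one connection; per-database query methods live here."""
    def __init__(self, conn):
        self._conn = conn

    def create_table(self, table_name):
        # Illustrative only -- interpolating identifiers like this
        # is unsafe with untrusted input.
        with self._conn.cursor() as cur:
            cur.execute(f"CREATE TABLE IF NOT EXISTS {table_name} (id serial PRIMARY KEY)")
        self._conn.commit()


class DBManager:
    def __init__(self, config_path="config.ini"):
        self._config = configparser.ConfigParser()
        self._config.read(config_path)
        self._cache = {}  # one Database per config section

    def get_db(self, name):
        # "name" is a section in config.ini, e.g. "DbA_config_name"
        if name not in self._cache:
            section = self._config[name]
            conn = psycopg2.connect(
                host=section["host"],
                port=section.getint("port", fallback=5432),
                dbname=section["dbname"],
                user=section["user"],
                password=section["password"],
            )
            self._cache[name] = Database(conn)
        return self._cache[name]

With that in place, DBManager().get_db("DbA_config_name").create_table("test_table") works as described, and adding a database only means adding a section to config.ini.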

Related

How to create a copy of a linked database in LibreOffice Base

I have a connection from a Microsoft Access database to LibreOffice Base using JDBC. When I save the database in Base, it retains the connection.
I want to create a copy as a standalone Base database which includes all the imported tables, so that it no longer looks to the Access file for its data.
How do I achieve this clone?

How do I dump data from an Oracle database without access to the database's file system

I am trying to dump the schema and data from an existing Oracle DB and import it into another Oracle DB.
I have tried using the "Export Wizard" provided by SQL Developer.
I found answers using Oracle Data Pump, however I do not have access to the filesystem of the DB server.
I expect to get a file that I can copy and import into another DB.
Without Data Pump, you have to make some concessions.
The biggest concession is you're going to ask a Client application, running somewhere on your network, to deal with a potentially HUGE amount of data/IO.
Within reasonable limits, you can use the Tools > Database Export wizard to build a series of SQLPlus-style scripts, both DDL (CREATEs) and DATA (INSERTs).
Once you have those scripts, you can use SQLPlus, SQLcl, or SQL Developer to run them on your new/target database.
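For example, if the wizard wrote a script called export.sql (a hypothetical name; the connect string is also a placeholder), you could run it against the target from any client machine with:
sqlplus target_user/target_pass@//target-host:1521/target_service @export.sql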

Where is the DATA_PUMP_DIR in SQL Developer

I'm trying to import a .dmp file using the Data Pump Import tool in Oracle SQL Developer.
I'm connected to an Oracle database running in a container on my local machine.
When I get to the step where I specify where the dump file is to import, where should I place the .dmp file?
DATA_PUMP_DIR is a default Oracle directory object. It isn't part of SQL Developer; the import tool is really just giving you a GUI equivalent of running impdp from the command line.
You can find the operating system location that Oracle directory object points to by querying the data dictionary:
select directory_path from all_directories where directory_name = 'DATA_PUMP_DIR';
The path that query returns is on the database server (in your case that'll be inside your container too), and your dump file needs to go there.
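If, as here, the database runs in a Docker container, one way to get the file there is docker cp; the container name and destination below are placeholders, the real destination being whatever path the query above returns:
docker cp mydump.dmp my-oracle-container:/opt/oracle/admin/ORCL/dpdump/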
You might want to create additional directory objects pointing to other locations, and grant suitable privileges to users to be able to access them; but they all need to be on the DB server and read/writable by the Oracle process owner on that server.
(They could be remote filesystems mounted on the server, they don't necessarily have to be local storage, but that's another issue and more operating-system specific. Again, in your case, you might be able to share a folder on your local machine with the container, if you don't want to copy the file into the container.)
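For reference, creating such a directory object and granting access looks like this (the name, path and user are placeholders):
create directory dump_dir2 as '/some/path/on/the/db/server';
grant read, write on directory dump_dir2 to your_user;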

IBM WCS and DB2 : Want to export all catentries data from one DB and import into another DB

Basically I have two environments, Production and QA. On the QA DB the data is not the same as it is on Production, so my QA team is unable to test properly. I therefore want to import all catentries/product-related data into the QA DB from Production. I have searched a lot but have not found any solution for this.
Maybe I need to find all the product-related tables, export them one by one and then import them into the dev DB, but I'm not sure.
Can anyone please guide me on this? How can I do this following best practices?
I am using DB2.
The WebSphere Commerce data model is documented, which will help you identify all related tables. You can then use the DB2 utility db2move to export (and later load) those tables in one shot. For example,
db2move yourdb export -sn yourschema -tn catentry,catentrel,catentdesc,catentattr
Be sure to list all tables you need, separated by commas with no spaces. You can specify patterns to match table names:
db2move yourdb export -sn yourschema -tn "catent*,listprice"
db2move will create a file db2move.lst that lists all extracted tables, so you can load all data with:
db2move yourQAdb load -lo replace
run from the same directory as the export.

How to access the database imported through Data Pump

I just imported a data dump with the command below:
IMPDP user/pass FULL=Y DUMPFILE=BIRDV24012014.DMP LOGFILE=BIRDV24012014.log;
The dump has been restored; the issue is I don't know how to connect to the database I just imported. What service or TNS entry does it reside under, and how can I query it?
You didn't import a database, you imported the contents of your file into your existing database. If you could successfully run impdp user/pass then your ORACLE_SID etc. is already set and you should be able to log in and query with sqlplus user/pass.
If you've come from another RDBMS background you may be confusing 'database' with 'schema'. Depending on what was in the dump, you've probably created a load of schema objects and data under the USER schema (or whatever your real 'user' value was).
The import makes no difference to this, but if you want to access the database from another client (e.g. from another machine, or over JDBC) then you'll need to check your listener configuration to get the hostname/IP address and port it's listening on, and get the service name for the database; all of which can be obtained from lsnrctl services if you have permission to run that. You can then use those values for a JDBC URL, or in a tnsnames.ora entry, or ODBC, etc.
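For example, with a hostname of dbhost, port 1521 and service name orclpdb (all placeholders; take the real values from lsnrctl services), a thin-driver JDBC URL would look like:
jdbc:oracle:thin:@//dbhost:1521/orclpdb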
Look at your ORACLE_SID environment variable; there you'll find the instance ID. If you ran the IMPDP tool as the oracle OS user, you should also be able to connect to the database using
sqlplus / as sysdba
If all fails, look at your /etc/oratab file to see which instances are available on this host.
On another note, your command seems incomplete. Data Pump requires a DIRECTORY parameter to know where to look for the dump file you specified.
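A complete invocation would look something like this, assuming the dump file sits in whatever location the default DATA_PUMP_DIR directory object points to (swap in your own directory object if you created one):
IMPDP user/pass FULL=Y DIRECTORY=DATA_PUMP_DIR DUMPFILE=BIRDV24012014.DMP LOGFILE=BIRDV24012014.log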