Why does MySQL Workbench allow just one self-contained file to be imported into a DB? - mysql-workbench

I wonder why MySQL Workbench allows only one self-contained file to be imported into a database, whereas when you export a MySQL database in Workbench, you can export everything in a single step.
I think this is quite unreasonable and there's got to be a way to import multiple files at once.
There is no way to select multiple SQL scripts in the import dialog.
Does that mean that if I have 100 SQL scripts to import, I have to do this 100 times?
Your help would be much appreciated! :)
My local machine: Windows 10
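If you really do have 100 scripts, one common workaround is to bypass the Workbench wizard and feed each file to the mysql command-line client instead. A minimal Python sketch of that idea, assuming the client is on your PATH and using hypothetical folder, credential, and database names:

import subprocess
from pathlib import Path

SCRIPT_DIR = Path(r"C:\sql_scripts")                      # hypothetical folder holding the .sql files
MYSQL_CMD = ["mysql", "-u", "root", "-pSECRET", "mydb"]   # hypothetical credentials and database name

# Feed every script, in name order, to the mysql client's stdin.
for script in sorted(SCRIPT_DIR.glob("*.sql")):
    print("importing", script.name)
    with open(script, "rb") as f:
        subprocess.run(MYSQL_CMD, stdin=f, check=True)

Concatenating all the scripts into a single file and importing that one file through the wizard achieves the same thing.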

Related

How do I dump data from an Oracle database without access to the database's file system

I am trying to dump the schema and data from an existing Oracle DB and import it into another Oracle DB.
I have tried using the "Export Wizard" provided by SQL Developer.
I found answers using Oracle Data Pump; however, I do not have access to the filesystem of the DB server.
I expect to get a file that I can copy and import into another DB.
Without Data Pump, you have to make some concessions.
The biggest concession is you're going to ask a client application, running somewhere on your network, to deal with a potentially HUGE amount of data/IO.
Within reasonable limits, you can use the Tools > Database Export wizard to build a series of SQL*Plus-style scripts, both DDL (CREATEs) and DATA (INSERTs).
Once you have those scripts, you can use SQL*Plus, SQLcl, or SQL Developer to run them on your new/target database.
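If the wizard leaves you with a whole folder of those scripts, you don't have to run them one by one. A rough Python sketch of driving SQL*Plus over them, assuming sqlplus is on your PATH and using a hypothetical connect string and folder name:

import subprocess
from pathlib import Path

CONNECT = "scott/tiger@//dbhost:1521/orclpdb"   # hypothetical target connection string
EXPORT_DIR = Path("export_scripts")             # folder holding the wizard-generated scripts

# Run every generated script; the DDL (CREATE) scripts should run before the DATA (INSERT) ones.
for script in sorted(EXPORT_DIR.glob("*.sql")):
    # The trailing "exit" makes sqlplus quit after the script instead of waiting at the prompt.
    subprocess.run(["sqlplus", "-S", CONNECT, "@" + str(script)],
                   input="exit\n", text=True, check=True)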

Where is the DATA_PUMP_DIR in SQL Developer

I'm trying to import a .dmp file using the Data Pump Import tool in Oracle SQL Developer.
I'm connected to an Oracle database running in a container on my local machine.
When I get to the step where I specify where the dump file is to import, where should I place the .dmp file?
DATA_PUMP_DIR is a default Oracle directory object. It isn't part of SQL Developer; the import tool is really just giving you a GUI equivalent of running impdp from the command line.
You can find the operating system location that Oracle directory object points to by querying the data dictionary:
select directory_path from all_directories where directory_name = 'DATA_PUMP_DIR';
The path that returns is on the database server (in your case that'll be inside your container too), and your dump file needs to go there.
You might want to create additional directory objects pointing to other locations, and grant suitable privileges to users to be able to access them; but they all need to be on the DB server and read/writable by the Oracle process owner on that server.
(They could be remote filesystems mounted on the server, they don't necessarily have to be local storage, but that's another issue and more operating-system specific. Again, in your case, you might be able to share a folder on your local machine with the container, if you don't want to copy the file into the container.)
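If you do go the route of an extra directory object, it comes down to two SQL statements run as a privileged user. A hedged python-oracledb sketch with entirely hypothetical credentials, path, directory name, and grantee (the same statements can of course be run from SQL Developer directly):

import oracledb  # python-oracledb driver

# Hypothetical privileged connection to the containerised database.
conn = oracledb.connect(user="system", password="oracle", dsn="localhost/XEPDB1")
cur = conn.cursor()

# The path must exist on the database server (i.e. inside the container) and be
# readable/writable by the Oracle software owner.
cur.execute("CREATE OR REPLACE DIRECTORY my_dump_dir AS '/opt/oracle/dumps'")
cur.execute("GRANT READ, WRITE ON DIRECTORY my_dump_dir TO app_user")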

Setting up environment for SQL queries

I know the basic syntax of queries but otherwise I'm a beginner with SQL.
I have an SQL file (.sql) and I downloaded a couple of programs (pgAdmin and SQL Workbench).
I have no idea how to get from where I am now to actually writing queries and finding information. How do I set up so I can actually import my SQL file and start writing queries?
pgAdmin is the default GUI for PostgreSQL.
SQL Workbench is a free, DBMS-independent, cross-platform SQL query tool.
Either way, you need to connect to a database to actually run queries. The DBMS can either run on your local machine, or you can connect to a remote server - where you need access privileges, of course.
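To make that last step concrete: if you go the PostgreSQL/pgAdmin route, the sketch below (Python with psycopg2) connects to a local server, loads the downloaded .sql file, and runs a first query. The database name, credentials, file name, and table name are hypothetical, and it assumes the file contains plain SQL statements rather than psql meta-commands:

import psycopg2

# Hypothetical local connection details.
conn = psycopg2.connect(host="localhost", dbname="mydb", user="postgres", password="secret")
cur = conn.cursor()

# Load the .sql file (CREATE/INSERT statements) into the database.
with open("data.sql") as f:
    cur.execute(f.read())
conn.commit()

# From here on you can start writing your own queries.
cur.execute("SELECT count(*) FROM some_table")
print(cur.fetchone())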

How to export data from SQL Server to PostgreSQL?

I need to export all tables from SQL Server to PostgreSQL.
What I tried: I exported from the SQL Server IDE, but at some stage it gives an error about the data types being different.
Question: How can I export data from SQL Server to PostgreSQL? Does COPY do the job? If yes, then how can I export all tables, including their records?
You can't just export data from MS SQL Server and import it into PostgreSQL, because the syntax and data types are not the same, but you can use a tool to migrate the data from MS SQL to PostgreSQL.
See more in this topic:
migrate data from MS SQL to PostgreSQL?
Use https://dbeaver.io/
Create MS SQL and PostgreSQL database connections (login)
Create the target tables in PostgreSQL (same structures as in MS SQL)
Press F5 to see the new tables
Right-click on the new tables -> 'Import Data' -> You will see the 'Data Transfer' window
Choose the 'Table' type, then click 'Next' -> You will see 'Select input object', where you can choose tables from the MS SQL connection
Just click 'Next', check the settings that you need, and you're done :D
First export the schema into a file and run it against PostgreSQL until you've removed all incompatibilities.
You could try to do the same with the data you want to export but you may be better off writing a Python script to migrate it.
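A bare-bones sketch of such a Python script, assuming the pyodbc and psycopg2 packages, an installed SQL Server ODBC driver, and hypothetical connection details and table/column names; a real migration would add type mapping, batching, and error handling on top of this:

import pyodbc
import psycopg2
from psycopg2.extras import execute_values

# Hypothetical source (SQL Server) and target (PostgreSQL) connections.
src = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;DATABASE=srcdb;UID=sa;PWD=secret")
dst = psycopg2.connect(host="localhost", dbname="dstdb", user="postgres", password="secret")
src_cur, dst_cur = src.cursor(), dst.cursor()

# Copy one table; the target table must already exist with compatible column types.
src_cur.execute("SELECT id, name, created_at FROM dbo.customers")
rows = [tuple(r) for r in src_cur.fetchall()]
execute_values(dst_cur, "INSERT INTO customers (id, name, created_at) VALUES %s", rows)
dst.commit()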
There is an absolutely simple way using the built-in SSIS tool via Management Studio. You can find the detailed answer here.
Use https://dbeaver.io/, as An Le mentioned.
After 40 years of DB development, migrating DB data is still a challenge. DBeaver is a free tool to use for data migration. But you still have to migrate the schema.
Exporting data from DBeaver
From the context menu of your SQL Server database or schema, select Tools > Create new Task > Common > Data Export.
You will generate SQL insert files or CSV files. For migration between database types, use CSV files.
Cons of SQL Server Migration Tool
Unable to migrate rows containing booleans.
The export ended up in errors when migrating data with bool columns, complaining that a value is not boolean, although both the source and destination columns were of boolean type.
Unable to continue with the next tables after one table's migration fails.
SQL Server: a single error stops the whole migration, even for tables that are not related to the initial error.
Configuring the tool over and over again while trying to export your data is a waste of time. The SQL Server migration task does not save the configuration of the source and destination connections, and the wizard is not user friendly; spending your time on it is frustrating. I assume the migration project has been abandoned for at least 10 years.

Oracle 10g Enterprise Edition Backup to .sql file

I have Oracle 10g Enterprise Edition. I want to perform a table backup and a full database backup to .sql files, to be able to move them to another server/PC ...
If possible, an explanation with images ...
If you want to move to a different Oracle server, you should use expdp to export your data, and impdp to import it on the other database. Here's a WIKI entry that might help you. It's the easy way of moving data around Oracle servers.
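For illustration, a rough sketch of what the two invocations might look like when driven from a Python script; the connect strings, schema name, and file names are hypothetical, and note that the dump file is written to a server-side directory object (for example DATA_PUMP_DIR), not to a .sql file:

import subprocess

# Hypothetical export from the source database ...
subprocess.run(["expdp", "scott/tiger@source_db", "schemas=SCOTT",
                "directory=DATA_PUMP_DIR", "dumpfile=scott.dmp", "logfile=scott_exp.log"], check=True)

# ... and import into the target database, after copying the dump file over.
subprocess.run(["impdp", "scott/tiger@target_db", "schemas=SCOTT",
                "directory=DATA_PUMP_DIR", "dumpfile=scott.dmp", "logfile=scott_imp.log"], check=True)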
On the other hand, if you want to move your data to a different DATABASE SERVER (not Oracle branded), you would need some sort of translator in between.