In the GUI version of SQL Developer you can export a database schema (Tools - Database Export). The result is a SQL file ("export.sql" by default).
Now I have read that since version 4 there is a command-line version of SQL Developer called "sdcli.exe". Is it possible to do the same task with it? The result should be a SQL file containing the DDL of the schema.
Unfortunately I wasn't able to find a suitable command via "-help", but maybe I'm just overlooking it.
I know there are other ways to export a database schema, but in my use case I would prefer a human-readable SQL file, and I'm trying to avoid writing my own script ;-)
Related
There is a statement log in Oracle SQL Developer:
Is there any way to export the statements as plain text, or to log them to a file?
UPDATE: The reason I want to collect the statements in a file is for easy diffing (to compare the expected export against a truncated one). I have a schema whose export is not performed completely by 'Tools -> Database export': indexes, constraints, packages, and synonyms are missing from the resulting file, although they are clearly present in the database and visible in SQL Developer.
No, just copy and paste.
You could always do a client-side JDBC trace, or a database session trace, if you want the statements to go to a file.
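For the session-trace half, a minimal sketch (assuming you have the ALTER SESSION privilege; note the trace file is written on the database server, not on your client):

    -- Oracle: trace the current session's SQL to a server-side file
    ALTER SESSION SET tracefile_identifier = 'sqldev_capture';  -- optional tag, makes the file easy to find
    ALTER SESSION SET sql_trace = TRUE;

    -- ... run the statements you want captured ...

    ALTER SESSION SET sql_trace = FALSE;
    -- on 11g+ the trace directory can be found with:
    --   SELECT value FROM v$diag_info WHERE name = 'Diag Trace';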
I am trying to import a big table with more than 100 columns into pgAdmin. Is there any way to import the table without first creating those 100 columns manually in pgAdmin? That would be a considerably time-consuming task.
You are not importing data into pgAdmin, you are importing it into Postgres, and using pgAdmin to help you in that task. Graphical tools like pgAdmin are, at heart, just convenience wrappers around the actual functionality of the database, and everything they do can be done in other ways.
In the case of a simple task like creating a table, the relevant SQL syntax is well worth learning. It will work in any database tool, will even work (with some minor changes) on other SQL databases (e.g. MySQL), can be saved in version control, and can be manipulated with an editor of your choice.
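For example, a sketch of creating and loading such a table in plain SQL (table name, column list, and file path are all made up, and the column list is abridged):

    -- create the table once, in any SQL client
    CREATE TABLE big_import (
        id         integer,
        created_at timestamp,
        amount     numeric(12,2)
        -- ... remaining columns ...
    );

    -- server-side load; the file must be readable by the Postgres server process
    COPY big_import FROM '/data/big_table.csv' WITH (FORMAT csv, HEADER);

    -- client-side alternative from psql, if the file lives on your machine:
    -- \copy big_import FROM 'big_table.csv' WITH (FORMAT csv, HEADER)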
You could even go so far as to write a script in the language of your choice that generates the SQL for you based on some other data (e.g. the headings of the CSV file) - although make sure you don't run it on third-party data without checking the result, or without taking extreme care over code injection and other security concerns!
The Postgres manual has an introduction to tables and creating them which would be a good place to start.
I rely on SQL Developer to edit and export a schema.
It works like a charm, and I can run the import with sqlplus.
I have tried using sqlplus to generate the same schema export, without success.
I cannot use the Oracle expdp tool, because I need an ASCII file to be able to diff it.
So the only option I have is SQLDeveloper.
I would like to automate the export (data + DDL) with a cron job on a Linux box, but I can't find a way to use SQL Developer from the command line to generate the export.
Any clue?
Short answer: no.
For just the schema side of things you may want to check out show create table equivalent in oracle sql, which will get you the SQL source of the DDL.
Are you sure you want an ASCII file for an automated export of an entire DB, though? I would be surprised if you really want to diff a full database export. This SO answer may help a little.
If you really want a full data dump plus DDL, you will have to write your own script that gets the DDL as described in the first link, then runs a select * against each table and turns each row into a SQL insert. A sketch of the DDL half is below.
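For the DDL half, the usual approach is DBMS_METADATA run as a sqlplus script; a minimal sketch (the object-type list and formatting settings will need tuning for your schema):

    -- schema_ddl.sql: run in sqlplus, spools the current schema's DDL to a file
    SET LONG 2000000 PAGESIZE 0 LINESIZE 32767 TRIMSPOOL ON FEEDBACK OFF
    -- have each statement end with a ';' so the file can be replayed
    EXEC dbms_metadata.set_transform_param(dbms_metadata.session_transform, 'SQLTERMINATOR', TRUE);

    SPOOL schema_ddl.sql
    SELECT dbms_metadata.get_ddl(REPLACE(object_type, ' ', '_'), object_name)
    FROM   user_objects
    WHERE  object_type IN ('TABLE', 'INDEX', 'SEQUENCE', 'VIEW',
                           'PACKAGE', 'PACKAGE BODY', 'SYNONYM')
    AND    generated = 'N';  -- skip system-generated objects
    SPOOL OFF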
I'm running PostgreSQL 9.3 and want to import some daily-generated CSV files into specific tables.
I started playing with an FDW (foreign data wrapper) pointed at a specific CSV file, which I can query and then append/upsert into a table.
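For reference, a minimal version of that setup with the built-in file_fdw (all table names, columns, and the path are hypothetical):

    -- one-time setup (file_fdw ships with PostgreSQL contrib)
    CREATE EXTENSION IF NOT EXISTS file_fdw;
    CREATE SERVER csv_files FOREIGN DATA WRAPPER file_fdw;

    -- foreign table bound to a *fixed* file name and path
    CREATE FOREIGN TABLE staging_daily (
        id     integer,
        value  numeric
    ) SERVER csv_files
      OPTIONS (filename 'C:/import/daily.csv', format 'csv', header 'true');

    -- append into the real table
    INSERT INTO sales_daily SELECT * FROM staging_daily;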
But I have two more needs:
- The file's generation date and source branch are present in the filename, and only there.
I need to extract this information and insert it into the table as well.
- As expected, the file names are not fixed, so the FDW doesn't know where to read the data from.
I thought about solving this with some Unix-style tools (although my Postgres runs on Windows): for each file in a list (from a previously created index), a script would rename the file to the fixed name the FDW expects and pass the branch and date as parameters to a psql.exe command line, which would then run the actual import.
This would work (a sketch follows), but it sounds a bit like a hack and not a very "elegant" solution.
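For what it's worth, the import script itself can stay in plain SQL: the wrapper renames the file to the fixed name the foreign table points at, then calls something like psql.exe -v branch=B01 -v filedate=2014-05-01 -f import.sql (all names and values hypothetical), where import.sql is:

    -- import.sql: copy the fixed-name foreign table into the real table,
    -- stamping every row with the branch and date passed in via -v
    INSERT INTO sales_daily (branch, file_date, id, value)
    SELECT :'branch', :'filedate'::date, id, value
    FROM   staging_daily;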
Does anyone have a better suggestion?
Thanks!
Any idea how to go about doing that through a tool (preferred)? Any alternative ways to do it?
You can check out the migration studio from EnterpriseDB here, although I have no experience with it.
There is no comparison to doing it yourself though - if you're not familiar with Postgres then this will get you familiar, and if you are, then aside from the data entry aspect, this should be old hat.
Use the MaxDB tools to generate a SQL text export of the database, then import that file into PostgreSQL; with luck you won't need any prior processing of the data dump.
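Assuming the export really is a plain SQL script, the PostgreSQL import side is just a psql call (database and file names hypothetical):

    -- from inside psql, connected to the target database:
    \set ON_ERROR_STOP on
    \i export.sql

    -- or directly from the shell:
    --   psql -d targetdb -f export.sql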