Using mysqldump to migrate a subset of a database - select

I've been using both mysql and mysqldump to teach myself how to get data out of one database, but I consider myself a moderate MySQL newbie, just for the record.
What I'd like to do is get a subset of one database into a brand new database on a different server, so I need to produce both the db/table creation SQL and the INSERT statements that populate the data records.
Let's say my original db has 10 tables with 100 rows each. My new database will have 4 of those tables (all original columns), but a further-refined dataset of 40 rows each. Those 40 rows are isolated with some not-so-short SELECT statements, one for each table.
I'd like to produce .sql file(s) that I can call from mysql to load/insert my exported data. How can I generate those SQL files? I have HEARD that you can pass something like a select statement to mysqldump, but I haven't seen relevant examples with select statements as long as mine.
Right now I can produce SQL output that is just the result set with column names, but no insert code, etc.
Assistance is GREATLY appreciated.

You will probably have to use mysqldump to dump your tables one at a time, using the where clause:
-w, --where='where-condition'
Dump only selected records. Note that quotes are mandatory:
"--where=user='jimf'" "-wuserid>1" "-wuserid<1"
For example:
mysqldump database1 table1 --where='rowid<10'
See docs: http://linux.die.net/man/1/mysqldump
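For example, a dump of one table with a longer condition might look like this (the host, credentials, and all names below are placeholders; the --where value is just the WHERE clause of your long SELECT, as long as it only references columns of the table being dumped):

# One per-table dump; by default the output file includes the
# DROP TABLE IF EXISTS / CREATE TABLE statements followed by the
# INSERTs, which covers both schema creation and data.
mysqldump --host=old-server.example.com --user=me -p \
  original_db customer \
  --where="status='active' AND signup_date>='2010-01-01' AND region IN ('east','west')" \
  > customer.sql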

mysqldump each table with a where clause, like Dennis said above, one table at a time, and then merge the scripts with cat:
cat customer.db order.db list.db price.db > my_new_db.db
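To replay the merged script on the new server, something like this should work (the hostname and database name are placeholders; the dump creates the tables but not the database itself):

# Create the empty target database, then load the merged dump into it.
mysql --host=new-server.example.com --user=me -p -e "CREATE DATABASE new_db"
mysql --host=new-server.example.com --user=me -p new_db < my_new_db.db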

Related

Backup file and query issue

Implement an SQL script that finds the differences between the contents of a relational table BOOK and a backup table with the same name plus a _DOC suffix.
The script must first list the rows added to the relational table BOOK after the backup file was created, then the rows deleted from it, and finally the rows changed in it after the backup file was created.
In brief, the script must first list all added rows, then all deleted rows, and finally all changed rows in the relational table BOOK. It is allowed to use more than one SELECT statement to implement this task.
I created the data and the backup file, but I am unable to write the query.
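A minimal sketch of the three comparisons, assuming the backup table is named BOOK_DOC and both tables share a primary key column id (the column names here are assumptions, since the question doesn't give the schema):

-- Rows added to BOOK after the backup (in BOOK but not in BOOK_DOC)
SELECT b.*
FROM BOOK b
LEFT JOIN BOOK_DOC d ON d.id = b.id
WHERE d.id IS NULL;

-- Rows deleted from BOOK after the backup (in BOOK_DOC but not in BOOK)
SELECT d.*
FROM BOOK_DOC d
LEFT JOIN BOOK b ON b.id = d.id
WHERE b.id IS NULL;

-- Rows changed after the backup (same key, different contents;
-- repeat the comparison for each non-key column, e.g. a title column)
SELECT b.*
FROM BOOK b
JOIN BOOK_DOC d ON d.id = b.id
WHERE b.title <> d.title;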

How to get a history of latest postgres table writes regardless of which table it is

Assume I don't know which tables have been written to (not read from; I mean writes). Can I find out the names of the tables that were last written to?
I don't want constant reporting, just a query I can run after testing newly added code.
Once I know the names, I'm set; I can just query them using normal SQL and see the records. But I need to know which tables, in a 200-table database. Something like:
select names of last 10 tables that have been written to
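Postgres has no built-in per-table "last written" timestamp, but the pg_stat_user_tables statistics view does keep cumulative write counters, so one workable sketch is to reset the counters, run the new code, and then list the tables whose counters moved (note that pg_stat_reset() clears all statistics for the current database, including the numbers autovacuum relies on):

-- Before the test run: zero the statistics counters for this database.
SELECT pg_stat_reset();

-- ... exercise the newly added code ...

-- After the test run: every table written to shows nonzero counters.
SELECT relname,
       n_tup_ins AS inserts,
       n_tup_upd AS updates,
       n_tup_del AS deletes
FROM pg_stat_user_tables
WHERE n_tup_ins + n_tup_upd + n_tup_del > 0
ORDER BY n_tup_ins + n_tup_upd + n_tup_del DESC
LIMIT 10;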

Auto generate script for CREATE TABLE including all indices, constraints, etc (not via SSMS)

I have a data anonymization process that takes a production copy of a database and turns it into an anonymized copy by UPDATE-ing some columns.
Some of the tables contain several million rows, so instead of UPDATE-ing the columns, which is very log-intensive, I went down the route of:
SELECT
    Id,
    CAST('Redacted' AS NVARCHAR(255)) AS [ColumnRequiringAnonymization]
INTO MyTable_New
FROM MyTable;

EXEC sp_rename 'MyTable', 'MyTable_old';
EXEC sp_rename 'MyTable_New', 'MyTable';
DROP TABLE MyTable_old;
The problem with this approach is that the "new" table no longer has any of the keys, indices and other dependent objects. I have figured out the keys and indices using SPs to generate the DROP and CREATE scripts. The SPs are based on manually written SQL as can be seen e.g. in this answer.
The next problem is that we have a schemabound view on top of this table, which has indices and a full-text index on its own. The number of SPs to generate scripts is growing and I am sure there will be mistakes.
Is there a way to completely script out a table/view using SQL commands only? I.e. just like SSMS does when you click "Script table as - CREATE to", but within a stored procedure?
Right-click on the database and select Tasks; there is a Generate Scripts option there. Just follow the prompts, or Google for additional information.
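For the catalog-driven route the question is already on, here is a minimal sketch that rebuilds CREATE INDEX statements for one table from sys.indexes (it assumes SQL Server 2017+ for STRING_AGG, reuses the question's MyTable, and deliberately skips PK/unique constraints, INCLUDE columns, filters, and partitioning, so treat it as a starting point rather than a complete scripter):

SELECT 'CREATE '
     + CASE WHEN i.is_unique = 1 THEN 'UNIQUE ' ELSE '' END
     + i.type_desc + ' INDEX ' + QUOTENAME(i.name)
     + ' ON ' + QUOTENAME(OBJECT_NAME(i.object_id)) + ' ('
     + STRING_AGG(QUOTENAME(c.name)
         + CASE WHEN ic.is_descending_key = 1 THEN ' DESC' ELSE '' END,
         ', ') WITHIN GROUP (ORDER BY ic.key_ordinal)
     + ');' AS create_index_sql
FROM sys.indexes i
JOIN sys.index_columns ic ON ic.object_id = i.object_id AND ic.index_id = i.index_id
JOIN sys.columns c ON c.object_id = ic.object_id AND c.column_id = ic.column_id
WHERE i.object_id = OBJECT_ID('MyTable')
  AND i.type > 0                  -- skip the heap entry
  AND i.is_primary_key = 0        -- PKs are scripted as constraints
  AND i.is_unique_constraint = 0  -- ditto for unique constraints
  AND ic.is_included_column = 0
GROUP BY i.object_id, i.name, i.is_unique, i.type_desc;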

Copy table data from one database to another

I have two databases on the same server and need to copy data from a table in the first db to a table in the second. A few caveats:
Both tables already exist (i.e. I must not drop the 'copy-to' table first; I need to just add the data to the existing table)
The column names differ. So I need to specify exactly which columns to copy, and what their names are in the new table
After some digging I have only been able to find this:
pg_dump -t tablename dbname | psql otherdbname
But the above command doesn't take into account the two caveats I listed.
For a table t, with columns a and b in the source database, and x and y in the target:
psql -d sourcedb -c "copy t(a,b) to stdout" | psql -d targetdb -c "copy t(x,y) from stdin"
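Since both databases are on the same server, another sketch is the dblink extension, which lets the INSERT and the cross-database SELECT happen in one statement (this assumes dblink is available to install, and the text column types below are placeholders for the real ones):

-- Run this while connected to targetdb.
CREATE EXTENSION IF NOT EXISTS dblink;

INSERT INTO t (x, y)
SELECT a, b
FROM dblink('dbname=sourcedb',
            'SELECT a, b FROM t')
     AS src(a text, b text);  -- declare the remote row shape here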
I'd use an ETL tool for this. There are free tools available that can help you change column names, and they are widely used and tested. Most tools allow external schedulers like the Windows Task Scheduler or cron to run transformations on whatever time schedule you need.
I personally have used Pentaho PDI for similar tasks in the past and it has always worked well for me. For your requirement I'd create a single transformation that first loads the table data from the source database, modify the column names in a "Select Values"-step and then insert the values into the target table using the "truncate" option to remove the existing rows from the target table. If your table is too big to be re-filled each time, you'd need to figure out a delta load procedure.

Select query on multiple databases

What I am trying to do is verify a URL. I just need to be able to select that single value from all databases that we have currently in SQL Server 2008. All the databases are the same, just multiple instances of the same database for different users.
I am looking to pull one item from one table in each database.
Each database contains a table SETTINGS and within that table a value for MapIconURL. I need that value from each table from within each database. I am looking at around 30 or so databases that would have this value.
So I found the "undocumented" stored proc sp_MsForEachDb and have it working... to a point.
The code I am using is this:
EXEC sp_MsForEachDb 'use ?; SELECT "?" as databasename,SETTINGSKEYID, SECTION, NAME, INIVALUE, DESCRIPTION
FROM ?.dbo.SETTINGS
WHERE [NAME] = "MapIconURL"'
I have noticed that it is not selecting all the databases; it is also selecting the master database as well as other system databases, and I am thinking that may be why it is not covering all of mine. Is there a way to exclude the system databases?
If the number (and name) of the databases is fixed, then you could simply do:
SELECT MapIconUrl FROM db1.dbo.SETTINGS
UNION ALL
SELECT MapIconUrl FROM db2.dbo.SETTINGS
...
However, if either the number or the names of the databases are not fixed, then you have to build the query dynamically.
First, run a query such as SELECT name FROM master..sysdatabases to get the names of the databases.
Then, loop over the result set (in T-SQL, use a CURSOR) and build your query and execute it (in T-SQL, use sp_executesql).
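A minimal sketch of that dynamic approach, stitching one UNION ALL query together instead of using an explicit cursor (it assumes every user database has a dbo.SETTINGS table shaped like the question's, with the URL in INIVALUE; the database_id > 4 filter is what excludes master, tempdb, model, and msdb):

DECLARE @sql NVARCHAR(MAX) = N'';

-- Build one SELECT per user database, joined with UNION ALL.
SELECT @sql = @sql
    + CASE WHEN @sql = N'' THEN N'' ELSE N' UNION ALL ' END
    + N'SELECT ''' + name + N''' AS databasename, INIVALUE FROM '
    + QUOTENAME(name) + N'.dbo.SETTINGS WHERE [NAME] = ''MapIconURL'''
FROM sys.databases
WHERE database_id > 4;  -- skip the system databases

EXEC sp_executesql @sql;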