Azure SQL DB - Export data from a DB and insert into another DB?

I use Azure SQL DB (Single DB, Basic, DTU, Provisioned).
There are two different DBs, say, DB-1 and DB-2.
For DB-1, I have Admin access.
For DB-2, I have read-only access. (No access to create a new table.)
The two DBs have no links. I access them using SSMS.
The requirement:
In DB-2, there is a table [EMP] with 1000 rows.
Only 250 of them need to be exported and inserted into a new table in DB-1 (with all columns).
How can I achieve this in SSMS?
Thanks in advance!

There is no way to do this in SSMS alone. If this is an ad-hoc project, I would query the records, copy and paste them into Excel, use Excel to build the rows of an INSERT statement, then paste them into an INSERT statement run against DB-1.
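A minimal sketch of that approach, with hypothetical column names (the question doesn't list the columns of [EMP]) and TOP (250) standing in for whatever criterion actually selects the 250 rows:

-- Run against DB-2: grab the rows to move (replace the column list and ORDER BY with your own)
SELECT TOP (250) EmpId, EmpName, DeptId
FROM dbo.EMP
ORDER BY EmpId;

-- Run against DB-1: create the target table, then paste the VALUES rows built in Excel
CREATE TABLE dbo.EMP_Subset (EmpId int, EmpName nvarchar(100), DeptId int);

INSERT INTO dbo.EMP_Subset (EmpId, EmpName, DeptId)
VALUES (1, N'Alice', 10),
       (2, N'Bob', 20); -- ...and so on for the remaining rows built in Excel

A single INSERT ... VALUES statement in SQL Server accepts up to 1,000 row constructors, so all 250 rows fit in one statement.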
If this is something that will need to be sustainable, I'd recommend looking into Azure Data Factory.

Related

Can I use Azure Data Factory for ETL Testing as well?

If I have to do data migration and ETL testing for data in Azure SQL DBs, can I use Azure Data Factory?
If yes, please provide some links explaining how, or some tutorials or pages where I can find more details.
Thanks in advance!
Sunil
Yes, you can do ETL testing by using Data Factory. I copied all tables from one database to another database using the process below.
First, I listed all the tables with the following query:
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_SCHEMA = 'dbo'
I then created a pipeline and added a Lookup activity that retrieves the tables of the database by running this query.
I created linked services for the source and the sink. On success of the Lookup activity, I added a ForEach activity (with Sequential enabled) and placed a Copy activity inside it. I then created a dynamic dataset using the linked service created for the source; to retrieve all the tables from the database, I parameterized it as follows:
I created schema and table parameters on the source dataset. The dynamic content for the schema is @dataset().schema and for the table is @dataset().table; inside the ForEach, the value passed for schema is @item().TABLE_SCHEMA and for table is @item().TABLE_NAME.
I created a dataset using the linked service created for the sink, added the same parameters and values as for the source, and enabled the auto create table option.
I executed the pipeline, and it ran successfully.
All tables were copied to the target database.
In this way, you can copy all the tables from one database to another.

Transfer data from Redshift to PostgreSQL

I tried searching for this but couldn't find an answer.
What is the best way to copy data from Redshift to a PostgreSQL database?
Using a Talend job, any other tool, code, etc.
Either way, I want to transfer data from Redshift to a PostgreSQL database.
You can also suggest any third-party database tool if it has this kind of functionality.
Also, as far as I know, we can do this using AWS Data Migration Service, but I'm not sure whether our source and destination databases meet its criteria.
Can anyone please suggest something better?
The way I do it is with a Postgres foreign data wrapper and dblink.
This way, the Redshift table is available directly within Postgres.
Follow the instructions here to set it up: https://aws.amazon.com/blogs/big-data/join-amazon-redshift-and-amazon-rds-postgresql-with-dblink/
The important part of that link is this code:
CREATE EXTENSION postgres_fdw;
CREATE EXTENSION dblink;
CREATE SERVER foreign_server
FOREIGN DATA WRAPPER postgres_fdw
OPTIONS (host '<amazon_redshift_ip>', port '<port>', dbname '<database_name>', sslmode 'require');
CREATE USER MAPPING FOR <rds_postgresql_username>
SERVER foreign_server
OPTIONS (user '<amazon_redshift_username>', password '<password>');
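Once the server and user mapping exist, you can query Redshift directly through dblink. A minimal sketch, assuming a hypothetical Redshift table some_table with columns id and name:
SELECT *
FROM dblink('foreign_server'::text,
            'SELECT id, name FROM some_table'::text)
     AS t1(id bigint, name character varying(50));
The column list after AS must match the types returned by the Redshift query.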
For my use case, I then set up a Postgres materialized view with indexes, based on that.
create materialized view if not exists your_new_view as
SELECT some,
columns,
etc
FROM dblink('foreign_server'::text, '
<the redshift sql>
'::text) t1(some bigint, columns bigint, etc character varying(50));
create unique index if not exists index1
on your_new_view (some);
create index if not exists index2
on your_new_view (columns);
Then, on a regular basis, I run (on Postgres):
REFRESH MATERIALIZED VIEW your_new_view;
or
REFRESH MATERIALIZED VIEW CONCURRENTLY your_new_view;
(The CONCURRENTLY form needs a unique index on the materialized view, which is what index1 above provides.)
In the past, I managed to transfer data from one PostgreSQL database to another by doing a pg_dump and piping the output as an SQL command to the second instance.
Amazon Redshift is based on PostgreSQL, so this method should work, too.
You can control whether pg_dump should include the DDL to create tables, or whether it should just load the data (--data-only).
See: PostgreSQL: Documentation: 8.0: pg_dump
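A sketch of that pipe, with placeholder hosts, user, and table name, and --data-only so only the rows (not the DDL) are transferred; since Redshift has diverged from stock PostgreSQL, treat this as a starting point rather than a guaranteed recipe:
pg_dump --data-only -t my_table -h <source_host> -U <user> source_db | psql -h <target_host> -U <user> target_db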

Get information about schema, tables, primary keys

How can I get the names of the schemas, tables, and primary keys?
How can I find out which authorizations I have?
The only information I have is obtained by the command below:
db2 => connect
Database Connection Information
Database server = DB2/AIX64 11.1.3.3
SQL authorization ID = mkrugger
Local database alias = DBRCF
You can use the command line (interactive command line processor), if you want, but if you are starting out then it is easier to use a GUI tool.
An example of a free GUI is IBM Data Studio, and there are many more (any GUI that works with JDBC should work with Db2 on Linux/Unix/Windows). These are easy to find online and download if you are permitted.
To use the Db2 command line processor (CLP), which is what you show in your question, here are some example commands:
list tables for all
list tables for user
list tables for schema ...
describe table ...
describe indexes for table ...
Reference for LIST TABLES command
You can also use plain SQL to read the catalog views, which describe the schemas, tables, and primary keys as a series of views.
Look in the free online documentation for details of views like SYSCAT.TABLES, SYSCAT.COLUMNS, SYSCAT.INDEXES, and hundreds of other views.
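For example, a sketch that lists each table's primary key columns for one schema, assuming the schema is MKRUGGER (the default schema usually matches the authorization ID; substitute your own):
SELECT t.TABSCHEMA, t.TABNAME, k.COLNAME, k.COLSEQ
FROM SYSCAT.TABLES t
JOIN SYSCAT.TABCONST c
  ON c.TABSCHEMA = t.TABSCHEMA AND c.TABNAME = t.TABNAME AND c.TYPE = 'P'
JOIN SYSCAT.KEYCOLUSE k
  ON k.CONSTNAME = c.CONSTNAME AND k.TABSCHEMA = c.TABSCHEMA AND k.TABNAME = c.TABNAME
WHERE t.TABSCHEMA = 'MKRUGGER'
ORDER BY t.TABNAME, k.COLSEQ;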
Depending on which Db2 product is installed locally, there is a range of other command-line tools. One in particular is db2look, which lets you extract all of the DDL of the database (or a subset of it) into a plain text file if you prefer that.
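A sketch of a db2look invocation, using the database alias and authorization ID from the question (substitute your own, and adjust the output file name):
db2look -d DBRCF -e -z MKRUGGER -o ddl_extract.sql
Here -e extracts the DDL, -z limits it to one schema, and -o writes the result to a file.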

Linking MS Access table to PG Admin schema

I would like to link an MS Access table to a table in pgAdmin, if that is possible, for use in a Postgres query. I have searched for an answer, but all I can find are answers for listing Postgres tables in Access, which is almost the opposite of what I want to do.
I want to be able to access the data entered in an Access form without having to continually import the data into a table in pgAdmin.
I'm not even sure that is possible, but any method that is easier than importing the table into pgAdmin every day would be useful.
Thanks
Gary
Try the PostgreSQL OGR Foreign Data Wrapper. It's built for spatial data, but it works perfectly well with non-spatial tables. If you have the PostGIS extension installed, you will already have it.
https://github.com/pramsey/pgsql-ogr-fdw
There are several examples on that page, but the command
ogr_fdw_info -s <pathToAccessFile> -l <tablename>
will return a CREATE SERVER and a CREATE FOREIGN TABLE statement, which you can edit as required and then run in pgAdmin.
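The generated statements look roughly like the sketch below. The path, table, and column names are hypothetical, and the format string depends on which GDAL/OGR driver handles your .mdb or .accdb file, so use whatever ogr_fdw_info actually prints for your database:
CREATE SERVER access_server
  FOREIGN DATA WRAPPER ogr_fdw
  OPTIONS (datasource '/path/to/database.mdb', format 'ODBC');
CREATE FOREIGN TABLE access_form_data (
  id integer,
  entered_on timestamp,
  notes varchar
) SERVER access_server
  OPTIONS (layer 'FormData');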

Transferring data from one database to another (Postgres)

I have two otherwise identical databases on two different machines (with different data, that is), and I want to transfer the contents of one table to the corresponding table in the other database. How do I do that from pgAdmin? I'm new to PostgreSQL; with MySQL and phpMyAdmin I could do this easily by exporting SQL, which gave me a text file with a bunch of INSERT INTO statements. Is there an equivalent in pgAdmin?
Yes, backup using "PLAIN" format (SQL statements) and then (when connected to the other DB) open the file and run it.
Or you could select "COMPRESS" format in the "backup" dialogue, and then you could use the restore dialogue.
Also there's an equivalent of phpMyAdmin for Postgres, called "phppgadmin". Select the table in question and then use the "Export" tab.
pg_dump from the command line
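For example (placeholder hosts, user, and table name), --inserts produces exactly the kind of file full of INSERT statements described in the question:
pg_dump --data-only --inserts -t your_table -h host1 -U your_user source_db > your_table.sql
psql -h host2 -U your_user target_db -f your_table.sql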