T-SQL: How do I move data between SQL Server instances?

I have two SQL Server instances, each of which has an identical schema. One is running in SQL Azure, the other is a standard SQL Server 2008 instance. I need to copy the data from the Azure database to my local instance.
Essentially I want to do this:
INSERT INTO LOCAL_TABLE (col1, col2)
SELECT col1, col2
FROM AZURE_TABLE
How would I do that?

To move data between SQL Server instances when one of them is SQL Azure, you have a couple of options:
SQL Azure Data Sync
Using SSIS
Write your own code that moves the data, most likely using the SqlBulkCopy class.
If you just want to copy all the data, you could also use the SQL Azure Migration Wizard - omit the option for copying the schema and let it copy only the data.
EDIT
And, as in the original answer from Matthew PK, you could link to your SQL Azure server from your on-prem server, but this is only an option when you just want to do some ad-hoc testing. I would not use this approach in production for constantly syncing data.

You could accomplish that in a single statement using linked servers.
http://msdn.microsoft.com/en-us/library/ms188279(v=sql.100).aspx
EDIT: Here is a link which appears to explain how to link to SQL Azure:
http://blogs.msdn.com/b/sqlcat/archive/2011/03/08/linked-servers-to-sql-azure.aspx
EDIT: Here is a write-up on connecting to Azure with SSMS
http://www.silverlighthack.com/post/2009/11/11/Connecting-to-SQL-Azure-with-SQL-Server-Management-Studio-2008-R2.aspx
Otherwise I believe you need to do it in two statements.
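For illustration, a linked-server setup along those lines might look like the following sketch; the server, database, table, and login names are placeholders, not values from the question:

```sql
-- Register the SQL Azure server as a linked server (all names are hypothetical).
EXEC sp_addlinkedserver
    @server     = N'AZUREDB',
    @srvproduct = N'',
    @provider   = N'SQLNCLI',
    @datasrc    = N'myserver.database.windows.net',
    @catalog    = N'MyAzureDb';

-- Map a local login to a SQL Azure login (SQL Azure expects the user@server form).
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'AZUREDB',
    @useself     = 'FALSE',
    @rmtuser     = N'myuser@myserver',
    @rmtpassword = N'mypassword';

-- Copy the data in a single statement via the four-part name.
INSERT INTO dbo.LOCAL_TABLE (col1, col2)
SELECT col1, col2
FROM AZUREDB.MyAzureDb.dbo.AZURE_TABLE;
```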

Linked servers are not officially supported with SQL Azure. However, here are a couple of resources that are supported and will help you do what you are looking for:
1) Check out the SQL Azure DAC examples: http://sqldacexamples.codeplex.com/
2) The other option is SQL Azure Data Sync.

Use a product like "SQL Data Compare" from Redgate. I am not an Azure user, but I would guess it works; I have used it for a few years and it's pretty solid.

Is this a one-time copy or ongoing?
If one-time, use the SQL Azure Migration Wizard (from CodePlex).
If ongoing, use SQL Azure Data Sync.
You can also verify that the schema is compliant with SQL Server Data Tools in Visual Studio: set the target platform to SQL Azure or to SQL Server 2012 or 2008, then build, and any schema errors will show up.

Implement Oracle external table like functionality in Azure managed postgresql

Currently we are using Oracle 19c external table functionality on-prem whereby CSV files are loaded to a specific location on DB server and they get automatically loaded into an oracle external table. The file location is mentioned as part of the table DDL.
We have a requirement to migrate to Azure managed PostgreSQL. From the PostgreSQL documentation, functionality similar to Oracle external tables can be achieved in standalone PostgreSQL using "foreign tables" via the file_fdw extension. But in Azure managed PostgreSQL we cannot use this, since we do not have access to the DB file system.
One option I came across was Azure Data Factory, but that looks like an expensive option. Expected volume is about ~1 million record inserts per day.
Could anyone advise on possible alternatives? One option I was considering is a scheduled shell script running on an Azure VM which loads the files into PostgreSQL using psql commands like \copy. Would that be a good option for the volume to be supported?
Regards
Jacob
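A minimal sketch of the \copy approach described in the question, assuming a hypothetical staging table and file path. psql meta-commands like \copy read the file on the client side, so this works without file-system access on the managed server:

```sql
-- load_events.sql: run from the Azure VM, e.g. via cron:
--   psql "host=myserver.postgres.database.azure.com dbname=mydb user=myuser" -f load_events.sql
-- Table name, column list, and file path are hypothetical.
\copy staging.events (event_id, event_time, payload) from '/data/incoming/batch1.csv' with (format csv, header true)
```

At roughly one million rows per day, batched \copy loads like this are usually comfortable; the main things to add are error handling and archiving of processed files.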
One last option that could be simple to implement for the migration is EnterpriseDB (EDB), which avoids vendor lock-in and is free of cost.
See the video link below for the migration procedure steps.
https://www.youtube.com/watch?v=V_AQs8Qelfc

IdentityServer4 entity framework SQL Server connection string

I am trying to follow the quickstart to set up SQL Server (not the LocalDb version of SQL Server that comes with Visual Studio) as my data store. It looks like two databases will be needed - one for configuration and the other for operational data. But my problem is that I couldn't figure out what database names I should use. I created two databases using names I came up with and ran the scripts I downloaded from the quickstart to create all the tables. Now, when I try to connect, I assume I will need to specify the database names in my connection string, won't I? What should I use to replace the original connection string provided by the quickstart: "Data Source=(LocalDb)\MSSQLLocalDB;database=IdentityServer4.Quickstart.EntityFramework-4.0.0;trusted_connection=yes;"?
You can have one database for both. But in general I would keep the configuration part in memory if the number of clients is small. Why spend hours keeping the config in a database for just a few clients and resources?
Better to just keep the users and persisted grants in a database.
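If you do point both stores at a single SQL Server database, any standard SqlClient connection string works; the server and database names below are placeholders, and the database name is simply whatever you created yourself (the quickstart name was only an example):

```text
Data Source=.\SQLEXPRESS;Initial Catalog=IdentityServer4;Integrated Security=true;
```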

Moving EntityFramework database to Azure, MigrationHistory table is lost

I'm in the process of migrating our on-premises hosted ASP.NET application into the cloud and I'm running into an issue with the migration of my database.
We're using EF5 for database interaction and we've enabled Code First Migrations in order to deal with database changes. However, whenever I generate a BACPAC file, the __MigrationHistory table is ignored. And when I use the SSIS tool to migrate data between my local and Azure databases, the values in the identity columns are changed regardless of whether I select 'enable identity insert', messing up my entire database.
What am I doing wrong here? I've tried various routes and none seem to have the desired effect.
Thanks,
Patrick
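For the identity-column problem specifically, a per-table copy with IDENTITY_INSERT enabled preserves the original key values; the table and server names in this sketch are placeholders:

```sql
-- Preserve identity values while copying rows (names are hypothetical).
SET IDENTITY_INSERT dbo.Orders ON;

-- An explicit column list is required while IDENTITY_INSERT is ON.
INSERT INTO dbo.Orders (OrderId, CustomerName, CreatedOn)
SELECT OrderId, CustomerName, CreatedOn
FROM [SOURCESERVER].[SourceDb].dbo.Orders;

SET IDENTITY_INSERT dbo.Orders OFF;
```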

How can I create & edit a SQL Azure database using SQL Server 2008 R2?

I have a SQL Azure database, and creating and editing the database using the portal is a very tedious task due to its user interface. When I connect it with my local SQL Server 2008 R2, I am not able to edit or create tables from there.
Is there any way to make this possible? Please give me a solution.
At this time, the two options available are the web user interface (which will be improved over time) and SQL Server Management Studio (using queries; no user interface), for which SQL Azure support will also improve over time.
In the end I found a third-party client to manage SQL Azure: RazorSQL - an awesome tool! I have written about it on my blog; see here.
Navicat is a commercial application that offers access.
http://www.navicat.com/products/navicat-for-sqlserver
Personally, I vastly prefer it to the Microsoft web interface.

How do I setup DB2 Express-C Data Federation for a Sybase data source?

I wish to make fields in a remote public Sybase database outlined at http://www.informatics.jax.org/software.shtml#sql appear locally in our DB2 project's schema. To do this, I was going to use data federation; however, I can't seem to install the data source library (the Sybase-specific file libdb2ctlib.so for Linux) because only DB2 and Informix work out of the box with DB2 Express-C v9.5 (the version we're currently running; I also tried the latest v9.7).
From unclear IBM documentation and forum posts, the best I can gather is that we would need to spend $675 on http://www-01.ibm.com/software/data/infosphere/federation-server/ to get support for Sybase, but budget-wise that's a bit out of the question.
So is there a free method using previous tool versions (as it seems DB2 Information Integrator was rebranded as InfoSphere Federation Server) to setup DB2 data wrappers for Sybase? Alternatively, is there another non-MySQL approach we can use, such as switching our local DBMS from DB2 to PostgreSQL? Does the latter support data integration/federation?
DB2 Express-C does not allow federated links to any remote database, not even other DB2 databases. You are correct that InfoSphere Federation Server is required to federate DB2 to a Sybase data source. I don't know if PostgreSQL supports federated links to Sybase.
Derek, there are several ways in which one can create a federated database. One is by using the federated database capability that is built into DB2 Express-C. However, DB2 Express-C can only federate data from specific data sources, i.e. other DB2 databases and industry-standard web services. To add Sybase to this list you must purchase the IBM InfoSphere Federation Server product.
The other way is to leverage DB2's capability to create user-defined functions in DB2 Express-C that use the OLE DB API to access other data sources. Because OLE DB is a Windows-based technology, only DB2 servers running on Windows can do this. You create a table UDF that you can then use anywhere you would expect a table result set, e.g. in a view definition. For example, you could define a view that uses your UDF to materialize the results. These results would come from a query (via OLE DB) of your Sybase data (or any other OLE DB compliant data source).
You can find more information here: http://publib.boulder.ibm.com/infocenter/idm/v2r2/index.jsp?topic=/com.ibm.datatools.routines.doc/topics/coledb_cont.html
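A sketch of the table-UDF approach described above; the function name, columns, query, and OLE DB connection string are all hypothetical and would need to match the actual Sybase OLE DB provider and schema:

```sql
-- DB2 (Windows only) table UDF over OLE DB; EXTERNAL NAME is '!query!connect-string'.
CREATE FUNCTION sybase_markers ()
  RETURNS TABLE (marker_key INTEGER, symbol VARCHAR(60))
  LANGUAGE OLEDB
  EXTERNAL NAME '!SELECT marker_key, symbol FROM markers
                 !Provider=ASEOLEDB;Data Source=sybase.example.org:4100;Catalog=mydb';

-- Use the UDF wherever a table result set is expected, e.g. in a view.
CREATE VIEW v_markers AS
  SELECT marker_key, symbol
  FROM TABLE (sybase_markers()) AS t;
```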