I have an on-premises database and a copy of the same database in SQL Azure. When the on-premises database gets updated, the SQL Azure database should be updated as well, and only the changed fields should be transferred; the rest should remain the same. How can this be achieved in minimal time?
There is a no-code solution called SQL Azure Data Sync (CTP2), but you need to request access, which has unfortunately been paused for now (http://connect.microsoft.com/sqlazurectps).
You could try using the Sync Framework. Have a look at this article: http://blogs.msdn.com/b/sync/archive/2010/08/31/sql-server-to-sql-azure-synchronization-using-sync-framework-2-1.aspx
Just a note: neither the Sync Framework nor SQL Azure Data Sync does column-level change tracking or synchronization. When a column in a row is changed, the entire row is sent during synchronization.
As Paras mentioned, SQL Azure Data Sync is in the CTP stage (CTP2 now, with CTP3 expected to come out this summer).
Sync Framework 2.1, however, already supports syncing with SQL Azure.
Check out Synchronizing with SQL Azure using Sync Framework for links to various walkthroughs and samples.
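As a rough illustration of what those walkthroughs cover, here is a minimal sketch of provisioning a sync scope and running an upload sync with Sync Framework 2.1; the ProductsScope name, the Products table, and the connection strings are placeholders, not something from your database:

```csharp
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

class SyncToAzureSketch
{
    static void Main()
    {
        // Placeholder connection strings - replace with your own.
        using (var onPremConn = new SqlConnection("Data Source=.;Initial Catalog=StoreDb;Integrated Security=True"))
        using (var azureConn = new SqlConnection("Server=tcp:myserver.database.windows.net;Database=StoreDb;User ID=user@myserver;Password=...;Encrypt=True"))
        {
            // Describe the scope: which tables take part in the sync (hypothetical Products table).
            var scopeDesc = new DbSyncScopeDescription("ProductsScope");
            scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("Products", onPremConn));

            // Provision both ends (creates the tracking tables, triggers and stored procedures).
            var onPremProvisioning = new SqlSyncScopeProvisioning(onPremConn, scopeDesc);
            if (!onPremProvisioning.ScopeExists("ProductsScope")) onPremProvisioning.Apply();

            var azureProvisioning = new SqlSyncScopeProvisioning(azureConn, scopeDesc);
            if (!azureProvisioning.ScopeExists("ProductsScope")) azureProvisioning.Apply();

            // Run the sync: upload changes from the on-premises database to SQL Azure.
            // As noted above, changed rows are sent whole - not individual columns.
            var orchestrator = new SyncOrchestrator
            {
                LocalProvider = new SqlSyncProvider("ProductsScope", onPremConn),
                RemoteProvider = new SqlSyncProvider("ProductsScope", azureConn),
                Direction = SyncDirectionOrder.Upload
            };
            SyncOperationStatistics stats = orchestrator.Synchronize();
        }
    }
}
```

Scheduling that program (for example as a Windows task) gives you periodic incremental sync; only rows changed since the last run are exchanged.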
My company has a lot of data (database: PostgreSQL), and the new requirement is to add a search feature on top of it; we have been asked to use Azure Cognitive Search.
I want to know how we can transform the data and send it to the Azure search engine.
There are a few cases we have to handle:
1. How do we transfer and upload the existing data into the search engine's index?
2. What is the easiest way to update the data in the search engine when new records are added to our production database? (For now we are using Java back-end code to transform the data and update the index, but it is very time consuming.)
3. What is the best way to handle a change to the existing database structure? How do we update the indexer without doing a lot of work recreating the indexers every time?
Is there any way we can automatically update the index whenever there is a change in the database records?
You can either write code to push data from your PostgreSQL database into the Azure Search index via the /docs/index API, or you can configure an Azure Search indexer to do the data ingestion. The upside of configuring an indexer is that you can also have it monitor the data source on a schedule for updates and reflect those updates in the search index automatically, for example via the SQL Integrated Change Tracking Policy.
PostgreSQL is a supported data source for Azure Search indexers, although that data source is in preview (not yet generally available).
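For the indexer route, the data source and indexer are created through the Azure Cognitive Search REST API. The sketch below is only an outline, assuming a hypothetical orders-ds data source, an existing orders-index index, and placeholder service name, key, and connection string; it shows an Azure SQL data source with the SQL Integrated Change Tracking Policy mentioned above, while a PostgreSQL data source (preview) would instead use a high-water-mark change detection policy:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class CreateIndexerSketch
{
    // Placeholder service URL, API version and admin key - replace with your own.
    const string SearchService = "https://my-search-service.search.windows.net";
    const string ApiVersion = "2020-06-30";
    const string AdminKey = "<admin-api-key>";

    static async Task Main()
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Add("api-key", AdminKey);

        // 1. Data source pointing at the database, with change detection enabled so the
        //    indexer only picks up new and modified rows on each scheduled run.
        var dataSource = @"{
            ""name"": ""orders-ds"",
            ""type"": ""azuresql"",
            ""credentials"": { ""connectionString"": ""<connection-string>"" },
            ""container"": { ""name"": ""Orders"" },
            ""dataChangeDetectionPolicy"": {
                ""@odata.type"": ""#Microsoft.Azure.Search.SqlIntegratedChangeTrackingPolicy""
            }
        }";
        await PutAsync(http, $"{SearchService}/datasources/orders-ds?api-version={ApiVersion}", dataSource);

        // 2. Indexer that pulls from the data source into an existing index every 15 minutes.
        var indexer = @"{
            ""name"": ""orders-indexer"",
            ""dataSourceName"": ""orders-ds"",
            ""targetIndexName"": ""orders-index"",
            ""schedule"": { ""interval"": ""PT15M"" }
        }";
        await PutAsync(http, $"{SearchService}/indexers/orders-indexer?api-version={ApiVersion}", indexer);
    }

    static async Task PutAsync(HttpClient http, string url, string json)
    {
        var response = await http.PutAsync(url, new StringContent(json, Encoding.UTF8, "application/json"));
        Console.WriteLine($"{url} -> {(int)response.StatusCode}");
    }
}
```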
Besides the answer above, which involves coding on your end, there is a solution you can implement using the Azure Data Factory PostgreSQL connector with a custom query that tracks recent records, together with a pipeline activity that sinks to an Azure Blob Storage account.
Then, within Data Factory, you can link a pipeline activity that copies to an Azure Cognitive Search index and add a trigger so the pipeline runs at specified times.
Once the staged data is in the storage account in delimitedText format, you can also use the built-in Azure Blob indexer with change tracking enabled.
I use model-first with EF, and I want to have an automatically generated gap DDL script when I change my model. We had this in the past with the "Entity Framework Database Generation Power Pack", but I read that it is not supported in VS2012.
Has anything changed about that?
For those who don't understand this need, I would like to point out that in production environments the development team doesn't have access to the database. We must create DDL deployment scripts that preserve the data and the whole database without any recreation, and send them to the production support team.
You should have a look at Database.SetInitializer, which mainly determines what happens if there is no database present when the application is started for the first time, and at Migrations, which can be used to update the database when a new application version (which requires an updated database) has been deployed. If the built-in support for migrations isn't enough, you also have the ability to add raw SQL to handle migrating to a new version.
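A minimal sketch of how those two pieces fit together, assuming a hypothetical BlogContext/Post model (the real model, context, and migrations configuration would of course be your own):

```csharp
using System;
using System.Data.Entity;
using System.Data.Entity.Migrations;

// Hypothetical model and context used for illustration only.
public class Post
{
    public int Id { get; set; }
    public string Title { get; set; }
    public DateTime? PublishedDate { get; set; }
    public bool IsPublished { get; set; }
}

public class BlogContext : DbContext
{
    public DbSet<Post> Posts { get; set; }
}

// Normally generated for you by Enable-Migrations.
public class Configuration : DbMigrationsConfiguration<BlogContext> { }

public static class ApplicationStart
{
    public static void Init()
    {
        // Apply any pending migrations at start-up instead of dropping and recreating the database.
        Database.SetInitializer(new MigrateDatabaseToLatestVersion<BlogContext, Configuration>());
    }
}

// Example migration (normally generated with Add-Migration); raw SQL can be mixed in
// when the generated operations alone are not enough to preserve existing data.
public partial class AddPublishedFlag : DbMigration
{
    public override void Up()
    {
        AddColumn("dbo.Posts", "IsPublished", c => c.Boolean(nullable: false, defaultValue: false));
        // Raw SQL step: back-fill the new column from existing data.
        Sql("UPDATE dbo.Posts SET IsPublished = 1 WHERE PublishedDate IS NOT NULL");
    }

    public override void Down()
    {
        DropColumn("dbo.Posts", "IsPublished");
    }
}
```

Note that this is the Code First Migrations pipeline; a pure model-first (EDMX) workflow does not generate these migration classes from the model automatically.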
Is there any way to migrate and synchronize data between an on-premises SQL Server database and a SQL Azure database apart from the Sync Framework?
The Sync Framework works fine for small databases, but when it comes to large databases it isn't working. Is there any possible way to migrate and synchronize using Change Data Capture and SSIS?
You might want to clarify what you mean by "it's not working". Are you having issues during the initial sync or the incremental sync? What sync direction do you require: upload, download, or bidirectional? Do you get an error?
There are many ways to do the migration.
For synchronization, apart from the Sync Framework, you may also look at SQL Azure Data Sync (which is largely based on the Sync Framework too).
And yes, you can use Change Data Capture and SSIS if you want, but note that SQL Azure doesn't have the same Change Data Capture feature as SQL Server, so you'll be fine using CDC and SSIS for on-premises to SQL Azure sync only. You'll have to figure out another way to do change tracking on the SQL Azure side.
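To illustrate the on-premises half of that, here is a minimal sketch of reading the net changes recorded by CDC for a hypothetical dbo.Orders table (capture instance dbo_Orders), which an SSIS package or your own code could then apply to the SQL Azure side; the connection string and column names are placeholders:

```csharp
using System;
using System.Data.SqlClient;

class CdcNetChangesSketch
{
    static void Main()
    {
        // Placeholder connection string for the on-premises, CDC-enabled database.
        using (var conn = new SqlConnection("Data Source=.;Initial Catalog=StoreDb;Integrated Security=True"))
        {
            conn.Open();

            // Read all net changes for the dbo_Orders capture instance between the minimum
            // and maximum LSNs. A real job would persist the last LSN it processed and start
            // from there on the next run instead of sys.fn_cdc_get_min_lsn.
            const string query = @"
                DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_Orders');
                DECLARE @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
                SELECT * FROM cdc.fn_cdc_get_net_changes_dbo_Orders(@from_lsn, @to_lsn, 'all');";

            using (var cmd = new SqlCommand(query, conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // __$operation: 1 = delete, 2 = insert, 4 = update. Apply the row to the
                    // SQL Azure copy accordingly (e.g. parameterized INSERT/UPDATE/DELETE or MERGE).
                    int operation = reader.GetInt32(reader.GetOrdinal("__$operation"));
                    Console.WriteLine("Operation {0} on key {1}", operation, reader["OrderId"]);
                }
            }
        }
    }
}
```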
I have an app that runs at the store level, extracting data out of a POS system. The app asks the POS system for data, which in turn produces some .DBF files. The app loads the DBFs into memory and saves the data to SQL Server 2008 Express at the store level. This happens at three different stores.
I have been looking into the Microsoft Sync Framework, but have not come across any good examples of how to sync tables in one direction only, from each of the stores into a single database at the corporate level.
The data at each of the stores is managed by the app (delete, update, insert).
Does anyone know a good article that I could read about synchronizing SQL data?
Thank you.
Try the following link: http://www.codeproject.com/KB/database/DTS_SQLExpress.aspx
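For the one-direction requirement specifically, the Sync Framework lets you set the sync direction per session, so each store can simply upload its changes into the corporate database. A minimal sketch, assuming both ends have already been provisioned for a hypothetical "StoreScope" scope (placeholder server names throughout):

```csharp
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data.SqlServer;

class StoreToCorporateSync
{
    static void Main()
    {
        // Placeholder connection strings for the three store databases and the corporate hub.
        string[] storeConnStrings =
        {
            @"Data Source=STORE1\SQLEXPRESS;Initial Catalog=PosDb;Integrated Security=True",
            @"Data Source=STORE2\SQLEXPRESS;Initial Catalog=PosDb;Integrated Security=True",
            @"Data Source=STORE3\SQLEXPRESS;Initial Catalog=PosDb;Integrated Security=True"
        };
        string corporateConnString = "Data Source=CORP;Initial Catalog=PosWarehouse;Integrated Security=True";

        foreach (string storeConnString in storeConnStrings)
        {
            using (var storeConn = new SqlConnection(storeConnString))
            using (var corpConn = new SqlConnection(corporateConnString))
            {
                var orchestrator = new SyncOrchestrator
                {
                    LocalProvider = new SqlSyncProvider("StoreScope", storeConn),
                    RemoteProvider = new SqlSyncProvider("StoreScope", corpConn),
                    // Upload = changes flow from the store (local) to corporate (remote) only.
                    Direction = SyncDirectionOrder.Upload
                };
                orchestrator.Synchronize();
            }
        }
    }
}
```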
My small team used the ASP.NET MVC 2.0 / Entity Framework 4.0 (model-first approach) / Windows Server 2008 R2 / SQL Server 2008 R2 stack in our web site project. We have already completed the development process and have come to the web deployment stage. At this stage we are faced with a problem: OK, we'll use the VS2010 features for the initial server/database deployment, but what do we do in the future? Obviously some of our models may be modified after publishing in order to satisfy new requirements, and of course our server database will contain user data sets, articles, etc. Is there any approach to updating the server database with the new model modifications, without dropping the database and converting the data from the old instance to the new one?
So far we have found only the DAC/DACPAC approach for updating the server database, but we don't know how to tie automatic EF model generation to DAC.
Maybe another solution exists? Is there any standard way to resolve this kind of situation? Any advice?
Thanks
I'd be interested to know whether you have found a solution to this yet.
Have you tried simply generating a database based on your EF model, and then using a schema comparison tool such as SQL Compare to deploy the changes between the EF-generated database and your target production server?
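If it helps, one cheap way to get that EF-generated database to compare against is to let EF emit the full create script for the current model and run it against an empty scratch database; this sketch assumes a hypothetical MyEntities ObjectContext generated from the model-first EDMX:

```csharp
using System.IO;

class GenerateSchemaScript
{
    static void Main()
    {
        // MyEntities is the ObjectContext generated from the .edmx (an assumption for this sketch).
        using (var context = new MyEntities())
        {
            // Emit the full DDL that EF would use to create a database for the current model.
            string createScript = context.CreateDatabaseScript();
            File.WriteAllText("CurrentModelSchema.sql", createScript);
        }

        // Run CurrentModelSchema.sql against an empty scratch database, then point a schema
        // comparison tool (e.g. SQL Compare) at the scratch database and the production
        // database to produce an incremental deployment script that preserves the data.
    }
}
```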