What options do I have to import my tables and data from SQL Server to SQL Azure? - sql-server-2008-r2

I'm looking for the easiest way to import data from SQL Server to SQL Azure.
I'd like to work locally; is there a way to keep my local database synchronized to SQL Azure all the time?
The thing is, I wouldn't want to manually push every new table in my local database up to SQL Azure.

I HIGHLY recommend using the SQL Database Migration Wizard: http://sqlazuremw.codeplex.com/ It is the best free tool I've used so far. It's simple and works much more easily than the SSMS and Visual Studio built-in tools. I think the Red-Gate tools now work well with SQL Azure too, but I haven't had a chance to use them.

Have you looked at SQL Data Sync? A new October update just came out today.
http://msdn.microsoft.com/en-us/library/hh456371

Microsoft SQL Server Data Tools (SSDT) was developed by Microsoft to make it easy to deploy your DB to Azure.
Here's a detailed explanation: http://msdn.microsoft.com/en-us/library/windowsazure/jj156163.aspx or http://blogs.msdn.com/b/ssdt/archive/2012/04/19/migrating-a-database-to-sql-azure-using-ssdt.aspx
and here's how to automate the process of publishing: http://www.anujchaudhary.com/2012/08/sqlpackageexe-automating-ssdt-deployment.html
Also worth a look: the SQL Server Data Tools Team Blog
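The automated publish described in the last link boils down to a single SqlPackage.exe call once the SSDT project has been built into a .dacpac. A minimal sketch; the server, database, and credential values below are placeholders, not real endpoints:

```bat
rem Sketch only: all server, database, and credential values are placeholders.
rem Publish a compiled .dacpac from an SSDT project to a SQL Azure database.
SqlPackage.exe /Action:Publish ^
  /SourceFile:bin\Release\MyDatabase.dacpac ^
  /TargetServerName:myserver.database.windows.net ^
  /TargetDatabaseName:MyDatabase ^
  /TargetUser:myadmin@myserver ^
  /TargetPassword:MyP@ssword
```

Wired into a build step, this is what makes the deployment repeatable instead of click-driven.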

There are a few ways to migrate databases; I would recommend using the Generate Scripts wizard.
Here are the steps to follow
http://msdn.microsoft.com/en-us/library/windowsazure/ee621790.aspx
There are also other tools, such as the Microsoft Sync Framework.
You'll find more information about it here:
http://msdn.microsoft.com/en-us/library/windowsazure/ee730904.aspx
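Once the wizard has produced a script (it has an advanced option to target the SQL Azure engine type, so use that), you can run the generated .sql file against your SQL Azure database with sqlcmd. A sketch; the server, database, user, and file names are placeholders:

```bat
rem Sketch only: server, database, credentials, and file name are placeholders.
rem Run the script produced by the Generate Scripts wizard against SQL Azure.
sqlcmd -S myserver.database.windows.net -d MyDatabase ^
  -U myadmin@myserver -P MyP@ssword -i MyDatabaseScript.sql
```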

Related

How to share a postgreSQL database?

I'm currently working on a project with some colleagues. One of them linked a database created on her PostgreSQL server to our Visual Studio project, but we don't know how she can share the database with the rest of us, or how we can modify the database without having it locally.
We're using PostgreSQL 14.
One option is to create your database in a cloud provider such as AWS.
You can take a look at: https://aws.amazon.com/rds/postgresql/
This way all of you will be able to access the database.
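Once the database lives in RDS, each colleague connects to the same endpoint with an ordinary client instead of running a local server. A sketch; the endpoint, user, and database names are placeholders (the real endpoint comes from the RDS console, and the instance's security group must allow your IPs):

```shell
# Sketch only: the RDS endpoint, user, and database names are placeholders.
psql -h mydb.abc123xyz.eu-west-1.rds.amazonaws.com -p 5432 -U appuser -d projectdb
```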

Execute PostgreSQL statements upon deployment to Elastic Beanstalk

I am working on an application that has source code stored in GitHub, build and test is done by CodeShip, and hosting is done in Amazon Elastic Beanstalk.
I'm at a point where seed data is needed on the development database (PostgreSQL in Amazon RDS) and it is changing regularly in development.
I'd like to execute several SQL statements that are stored in GitHub when a deployment takes place. I haven't found a way to do this with the tools we're using, so I'm wondering if there are some alternatives.
If these are the same SQL statements, then you can simply create an .ebextension (see documentation) that will execute them after each deploy.
If the SQL statements are dynamic per deploy, then I'd recommend a database migrations management tool. I'm familiar with Rails, which has that by default, but there's also a standalone migrations tool for non-Rails projects. Google can suggest many other options.
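For the first case, the .ebextensions file can look roughly like this. A sketch only: the sql/seed.sql path and the DATABASE_URL variable are assumptions (by default Elastic Beanstalk exposes an attached RDS instance as RDS_HOSTNAME, RDS_DB_NAME, and so on, so you may need to assemble the connection string from those instead):

```yaml
# .ebextensions/01-seed.config
# Sketch only: the seed file path and the DATABASE_URL variable are assumptions.
container_commands:
  01_seed_database:
    command: "psql $DATABASE_URL -f sql/seed.sql"
    leader_only: true  # run on a single instance per deployment, not the whole fleet
```

container_commands run after the new version is staged but before it is live, which is usually the right moment for seed data.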

How to import sql developer connections-file in 0xDBE

I'm trying 0xDBE from JetBrains. Does anybody know how I can import my old Oracle SQL Developer connections file?
SQL Developer exports connections in XML. I do not believe that DataGrip currently has a connection importer that can read SQL Developer's connection structure.
To make things a little easier, you can use the TNS entries housed in SQL Developer's app/client/product/.... path and load those into DataGrip.
You can try the DataGrip plugin "sqldeveloper connections importer".
Also, follow the feature request in our bug tracker.

Pentaho CE stack deployment from test to production environment

I have a task to find out a way to deploy Pentaho BI CE server, Kettle jobs/transformation from testing to production environment servers.
It is pretty clear how to deploy the Pentaho BI server, but the problem is that I do not have a clear understanding of how I can transfer all my data (schemas, CDE dashboards, Saiku queries, etc.) from one environment to another. Should I download all the data and just upload it to the production server? That does not seem like a viable solution.
For Kettle jobs/transformations the best approach would be to use repositories, but I still cannot figure out how they could be moved from testing to prod environments.
So my question is - what is the best approach to deploy Pentaho to different environments?
P.S. I am still very new to Pentaho and all BI stuff, so if I made some technical mistakes, please correct me.
Download/Upload is the usual procedure. You can also take a look at the Repository Synchronizer in the marketplace. That might be helpful to ease part of the work.
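The download/upload step can also be scripted with the import-export tool that ships with the Pentaho server. A sketch; the URLs, credentials, and paths are placeholders, and the exact flags may vary between Pentaho versions:

```shell
# Sketch only: URLs, credentials, and paths are placeholders.
# Export the /public folder (dashboards, Saiku queries, ...) from the test server...
./import-export.sh --export --url=http://test-server:8080/pentaho \
  --username=admin --password=password \
  --path=/public --file-path=/tmp/pentaho-content.zip --withManifest=true

# ...then import the archive into the production server.
./import-export.sh --import --url=http://prod-server:8080/pentaho \
  --username=admin --password=password \
  --path=/public --file-path=/tmp/pentaho-content.zip --overwrite=true
```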

Synchronizing EF Code First to SQL Azure

I am trying to find the best way to synchronize/migrate EF Code First databases from localhost to SQL Azure without using EFCF Migrations. I know that I could use this approach, but I want to look at different, less automagic options.
The following process, or variations of such, is the one I'd like to follow:
Develop locally, letting EFCF build the database on localhost
Synchronize the local database with the stage database on SQL Azure using some tool
Run tests in the staging environment
Synchronize/migrate the database (local or stage) to the production database on SQL Azure
Using MySQL, this is a breeze: MySQL Workbench can synchronize a schema model to the database in question, plain and simple. In this case I don't have a schema model per se, but the database on localhost generated by EFCF could be considered the schema.
Which tools are available to perform this task? Is it possible to do this using SSMS?
Update: How I did it:
After the tip from Craig to use a Visual Studio 2012 Database Project, I did the following:
Created an empty VS 2012 database project and set its target platform to SQL Azure
Did a new schema compare, source database = local db and target = database project
Updated the target. This brought the database project up to speed
Did a new compare, source= database project and target = SQL Azure stage db
Updated the target. This brought the SQL Azure stage db up to speed
This was exactly what I was looking for
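Once that flow is stable, the same snapshot-and-publish idea can be scripted with SqlPackage.exe instead of clicking through Schema Compare each time. A sketch; all server, database, and credential values are placeholders:

```bat
rem Sketch only: server, database, and credential values are placeholders.
rem 1. Snapshot the local EFCF-generated schema into a .dacpac.
SqlPackage.exe /Action:Extract /SourceServerName:localhost ^
  /SourceDatabaseName:MyAppDb /TargetFile:MyAppDb.dacpac

rem 2. Publish that snapshot to the SQL Azure stage database.
SqlPackage.exe /Action:Publish /SourceFile:MyAppDb.dacpac ^
  /TargetServerName:mystage.database.windows.net ^
  /TargetDatabaseName:MyAppDbStage ^
  /TargetUser:myadmin@mystage /TargetPassword:MyP@ssword
```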
The Visual Studio 2012 database project can do it, I do it all the time.
It's not free, but Red-Gate's SQL Compare would handle the schema replication