Copy Dynamics CRM on-premises to another instance

What actions are needed to clone Dynamics CRM on-premises from the production environment to the test environment?
Is it just:
open the production SQL Server,
back up the PROD_MSCRM database,
move the backup file to the test SQL Server,
restore the backup file to the Test_MSCRM database?
Can someone confirm whether this is correct, and whether there is anything else I should do?

Yes. After the restore you also need to create a new Dynamics CRM organization in the test deployment. You do this using the management tooling (Deployment Manager): create the organization and choose to import the existing database.
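In case it helps, here is a minimal sketch of the SQL half of that process in PowerShell, assuming the SqlServer module is available; the server names, share path, file paths, and logical file names are placeholders, and the final organization import is still done through Deployment Manager:

```powershell
Import-Module SqlServer

# Back up PROD_MSCRM on the production SQL Server to a share both servers can reach.
Backup-SqlDatabase -ServerInstance "PRODSQL" -Database "PROD_MSCRM" `
    -BackupFile "\\fileshare\crm\PROD_MSCRM.bak"

# Restore it on the test SQL Server as Test_MSCRM, relocating the data/log files.
# Check the real logical file names first with RESTORE FILELISTONLY.
$data = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile("mscrm", "D:\Data\Test_MSCRM.mdf")
$log  = New-Object Microsoft.SqlServer.Management.Smo.RelocateFile("mscrm_log", "D:\Data\Test_MSCRM_log.ldf")
Restore-SqlDatabase -ServerInstance "TESTSQL" -Database "Test_MSCRM" `
    -BackupFile "\\fileshare\crm\PROD_MSCRM.bak" -RelocateFile @($data, $log)

# Finally, in Deployment Manager on the test CRM server, create a new organization
# and choose to import the existing (restored) Test_MSCRM database.
```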

Related

CI/CD of a database across multiple instances - Azure DevOps

I have written a step to deploy a database build to its respective database, but at the moment each step can only deploy to one database. Is it possible to deploy the same database build to multiple databases in a single step of a release?
AFAIK, there is no built-in task to do it, but you could create a PowerShell or batch script that loops over your target databases and runs sqlpackage.exe to deploy the DACPAC to each of them in a single release step:
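As a rough sketch (the server name, credentials, database list, and paths below are placeholders, and sqlpackage.exe is assumed to be installed on the agent):

```powershell
# Deploy the same DACPAC to several databases in one release step.
$SqlPackage = "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe"
$Dacpac     = "$env:SYSTEM_DEFAULTWORKINGDIRECTORY\drop\MyDatabase.dacpac"
$Server     = "myserver.example.com"
$Databases  = @("CustomerA_DB", "CustomerB_DB", "CustomerC_DB")

foreach ($db in $Databases) {
    Write-Host "Deploying $Dacpac to $Server/$db"
    & $SqlPackage /Action:Publish `
        /SourceFile:$Dacpac `
        /TargetServerName:$Server `
        /TargetDatabaseName:$db `
        /TargetUser:"deploy" `
        /TargetPassword:$env:SQL_DEPLOY_PASSWORD
    if ($LASTEXITCODE -ne 0) { throw "Deployment to $db failed" }
}
```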
You could check the similar thread Deployment to several databases using SQL Server Data Tools and Team Foundation Server for details.
Besides, there are many extensions in the Marketplace that can do this, so choose one that meets your requirements:
https://marketplace.visualstudio.com/search?term=sql&target=AzureDevOps&category=All%20categories&visibilityQuery=all&sortBy=Relevance
Hope this helps.

Is there any way to implement CI/CD for on-premises PostgreSQL using Azure DevOps Pipelines?

I want to create a one-click deployment in Azure Pipelines to move PostgreSQL changes from the dev to the QA environment, similar to what we do with a SQL Server Database Project, where a PowerShell script deploys the changes to the remote server.
I have tried the pg_dump and psql commands, which create a dump file and restore it on the remote server, but this does not perform diffing (i.e. comparing the source and destination databases and replicating only the missing changes).
You've stumbled upon one of the features lacking in the Postgres ecosystem. One of the more elegant ways to solve migrations using Postgres' own tooling is to package up your migrations as a Postgres Extension. This requires you to generate the deployment scripts yourself, but it is a neat way of applying and packaging up the deployments.
There are a number of commercial tools that will assist in this process, such as Datical, Liquibase, and Flyway. Note, some of these still require you to generate the change statements yourself, some attempt to create them for you.
Generating change statements is a whole different animal, and I recommend you look at schema-diffing tools for Postgres to find what best suits your needs.
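If you do end up writing the change scripts yourself, one simple (hand-rolled, not extension-based) way to apply them from a pipeline is a small PowerShell runner around psql; the host, credentials, paths, and table name here are placeholders:

```powershell
# Apply numbered .sql migration scripts exactly once, tracking them in a bookkeeping table.
$PgHost   = "qa-postgres.example.com"
$Database = "appdb"
$env:PGPASSWORD = $env:PG_DEPLOY_PASSWORD   # supplied by the pipeline

# Ensure the bookkeeping table exists.
psql -h $PgHost -U deploy -d $Database -c "CREATE TABLE IF NOT EXISTS schema_migrations (version text PRIMARY KEY, applied_at timestamptz DEFAULT now());"

# Apply each migration file in name order (e.g. 001_init.sql, 002_add_index.sql).
Get-ChildItem ".\migrations\*.sql" | Sort-Object Name | ForEach-Object {
    $version = $_.BaseName
    $already = psql -h $PgHost -U deploy -d $Database -tAc "SELECT 1 FROM schema_migrations WHERE version = '$version';"
    if (-not $already) {
        Write-Host "Applying $($_.Name)"
        psql -h $PgHost -U deploy -d $Database -v ON_ERROR_STOP=1 -f $_.FullName
        if ($LASTEXITCODE -ne 0) { throw "Migration $($_.Name) failed" }
        psql -h $PgHost -U deploy -d $Database -c "INSERT INTO schema_migrations (version) VALUES ('$version');"
    }
}
```

Tools such as Flyway work on essentially this model, just with far more robustness around locking, checksums, and versioning.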

Migrate from SQL Server 2008 R2 to Azure SQL v12

I have a set of SQL Server 2008 R2 databases. I need to migrate them to Azure SQL V12. Steps I have completed so far:
Updated the .sqlproj to reflect Microsoft Azure SQL Database V12
Completed all the schema changes in the solution, which included adding a master key, creating a credential object, and adding reference tables (for all the cross-referenced tables between databases). I also took care of the scripts/schema related to tempdb tables.
Built the .sqlproj projects (SystemDB and ClientDB).
Also deployed this updated schema from the solution to Standard-tier Azure SQL databases, to validate that the schema is correct for the target.
Now, how do I go about moving the data from my original database tables into the newly created Azure SQL databases?
I have tried to follow the document here (Recipe 3), but my new .sqlproj project does not deploy to the copy of the databases that is still on SQL Server 2008 R2. What am I missing here? Is there an Azure SQL-compatible SQL Server version that I need to move my original databases to first?
Any help will be greatly appreciated.
Thanks.
You can export the database to a .bacpac file, upload it to Azure storage, and import it into Azure SQL Database. Here are the detailed steps:
1. Assess the database for compatibility using the latest version of the Data Migration Assistant (DMA).
2. Prepare any necessary fixes as Transact-SQL scripts.
3. Make a transactionally consistent copy of the source database being migrated, and ensure no further changes are being made to the source database (or manually apply any such changes after the migration completes). There are many ways to quiesce a database, from disabling client connectivity to creating a database snapshot.
4. Deploy the Transact-SQL scripts to apply the fixes to the database copy.
5. Export the database copy to a .bacpac file on a local drive.
6. Import the .bacpac file as a new Azure SQL database using any of several BACPAC import tools, with SqlPackage.exe being the recommended tool for best performance.
You can find more details in the online document.
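For steps 5 and 6, a sketch of the SqlPackage.exe commands could look like this; the install path, server names, database names, and credentials are placeholders:

```powershell
$SqlPackage = "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe"   # adjust to your install

# Step 5: export the quiesced on-premises copy to a .bacpac file.
& $SqlPackage /Action:Export `
    /SourceServerName:"ONPREMSQL" `
    /SourceDatabaseName:"ClientDB_Copy" `
    /TargetFile:"C:\temp\ClientDB.bacpac"

# Step 6: import the .bacpac into Azure SQL Database as a new database.
& $SqlPackage /Action:Import `
    /SourceFile:"C:\temp\ClientDB.bacpac" `
    /TargetServerName:"yourserver.database.windows.net" `
    /TargetDatabaseName:"ClientDB" `
    /TargetUser:"sqladmin" `
    /TargetPassword:$env:AZURE_SQL_PASSWORD
```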

Switching databases as an Azure project develops, but the Azure Website keeps pointing to the linked resource

I think I may be missing something and hope you can advise
I have been developing a project using VS2013 with EF6. I use Visual Studio each time I want to deploy the latest version of the system to my Azure Website.
The Azure Website has a linked database resource (SQL Azure database).
This has been going great. However, yesterday I decided to create a Virtual Machine and move the SQL database to a dedicated Azure Virtual Machine. So I did this and now I have a new database as well as the old linked resource one
So, i'm ready to publish the APP and set the new database settings on the VM.
I changed the connection string in the publish wizard and published being sure to have the right settings, i.e. use this connection string at runtime and execute code first migrations etc
However, it took me a while to realise that the APP on the cloud server I just published too is still pointing to the OLD linked resource Azure database
I'm not sure what else I have to do to, I thought it was only about changing the publish setting for the database connection string
Am I missing something, should I delete the linked resource in the Azure Website settings, if i do would that make it work. Just weird because like I say i'm publishing the site again with new settings, or does Azure read the portals publish settings and somehow overidde what I want it to point to database wise
Please advise, many thanks
John
PS: I can connect fine to the new database from my local Management Studio. I get no errors; I'm just not sure how to tell Azure to use the connection string in the publish profile other than what I am already doing.
The "linked resource" in the Windows Azure management portal should have no impact on your application's functionality. It is really just a way to help you understand / visualize the resources your application is using.

How to manage database context changes in production / CI

I've spent the past few months developing a webApi solution that I'm ready to push up to Azure and hook into an Azure SQL Database. It was built with EF Code First.
I'm wondering what standard approaches there are to making changes to the database while in production. I've been using database initializers up to this point but they all blow away data and re-seed.
I have a feeling this question is too broad for a concise answer, so I'd like to ask: what terminology / processes / resources should a developer look into when designing a continuous integration workflow for a solution built with EF Code First and ASP.NET WebAPI, hosted as an Azure Service and hooked up to Azure SQL?
On the subject of database migration, there was an interesting article on ASP.NET about this subject: Strategies for Database Development and Deployment.
Also since you are using EF Code First you will be able to use Code First Migrations here for database changes. This will allow you to better manage the changes you make to the database.
I'm not sure how far you want to go with continuous integration but since you are using Azure it might be worth it to have a look at Continuous delivery to Windows Azure by using Team Foundation Service. Although it relies on TFS in the cloud it's of course also possible to configure it with for example Jenkins. However this does require a bit more work.
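To make the Code First Migrations suggestion concrete, the usual EF6 flow in the Package Manager Console (which is PowerShell) looks roughly like this; the migration name is just an example:

```powershell
# Run once per context to switch from initializers to migrations
# (creates a Migrations folder with a Configuration class).
Enable-Migrations

# After each model change, scaffold a migration describing the delta.
Add-Migration AddCustomerEmailColumn

# Apply any pending migrations to the database in the active connection string
# (typically your local/dev database) without dropping data.
Update-Database

# For production / CI you normally don't run Update-Database by hand; instead
# generate a SQL script from the initial state and hand it to the deployment.
Update-Database -Script -SourceMigration $InitialDatabase
```

The generated script (or migrate.exe, which ships with the EF package) can then be run as a step in the deployment pipeline rather than relying on an initializer that reseeds the database.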
I use this technique:
1- Create a clone database for your development environment if it doesn't already exist.
2- Make the necessary changes in your dev environment and dev database.
3- Deploy to your staging environment.
4- If you added static data that should also exist in your prod database, use a tool like SQLDataExaminer to find the data differences and execute the corresponding inserts, updates, and deletes. Use Schema Compare in VS2012 to find the schema differences between your dev and prod environments, selecting dev as the source and prod as the target, and execute the generated script against prod (see the sqlpackage sketch after this list for a command-line alternative).
5- Swap the environments.
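If you prefer to script step 4 rather than run Schema Compare in Visual Studio interactively, sqlpackage.exe can produce the same kind of upgrade script; the server names, database names, and paths below are placeholders:

```powershell
$SqlPackage = "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe"

# Capture the dev schema as a DACPAC.
& $SqlPackage /Action:Extract `
    /SourceServerName:"DEVSQL" /SourceDatabaseName:"App_Dev" `
    /TargetFile:"C:\temp\App_Dev.dacpac"

# Generate (but do not execute) the T-SQL script that would bring prod in line with dev.
& $SqlPackage /Action:Script `
    /SourceFile:"C:\temp\App_Dev.dacpac" `
    /TargetServerName:"PRODSQL" /TargetDatabaseName:"App_Prod" `
    /OutputPath:"C:\temp\dev_to_prod_upgrade.sql"

# Review dev_to_prod_upgrade.sql, then run it against prod once you are happy with it.
```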