Dacpac publish: how to ignore the CREATE DATABASE statement? - azure-devops

Because we use a shared database, we don't have admin privileges; we only have a schema ID.
In the dacpac build or the sqlpackage.exe publish, is there any way to ignore the CREATE DATABASE statement?
2020-03-23T21:48:54.5875811Z ##[error]*** Could not deploy package.
2020-03-23T21:48:54.5930784Z ##[error]Error SQL72014: .Net SqlClient Data Provider: Msg 262, Level 14, State 1, Line 1 CREATE DATABASE permission denied in database 'master'.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE [$(Datab
2020-03-23T21:48:54.6200608Z ##[error]aseName)] COLLATE SQL_Latin1_General_CP1_CI_AS
I tried the options below with sqlpackage.exe, and they are not working:
/p:CreateNewDatabase=False /p:ExcludeObjectTypes=Users;Logins;RoleMembership;Permissions;Credentials;DatabaseScopedCredentials
Is it possible to hack the dacpac SQL script and comment out the CREATE DATABASE statement, so that from the next version onwards it gets skipped because the deployment is incremental? I'd appreciate your ideas.
Is there any alternative to DACPAC deployment for Azure SQL Data Warehouse?

You can use sqlpackage.exe to generate an update script (/a:Script) between your dacpac and the target database, and then remove the unneeded instructions. Here is an example of how to create the pipeline: Azure Pipelines - Generating DB Script with SqlPackage.exe
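As a rough sketch (the server, database, path, and variable names below are placeholders, not values from the question), the Script action call could look like this:
# Generate an incremental upgrade script instead of deploying; nothing is
# executed against the target at this point. All names are placeholders.
& sqlpackage.exe /Action:Script `
  /SourceFile:"$(Build.ArtifactStagingDirectory)\MyDatabase.dacpac" `
  /TargetServerName:"myserver.database.windows.net" `
  /TargetDatabaseName:"MyDatabase" `
  /TargetUser:"$(SqlUser)" /TargetPassword:"$(SqlPassword)" `
  /p:CreateNewDatabase=False `
  /OutputPath:"$(Build.ArtifactStagingDirectory)\upgrade.sql"
# Review upgrade.sql and remove the CREATE DATABASE block before running it
# with a login that only has rights inside the existing database.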

Related

How do you run many SQL commands against an Azure SQL database using an Azure Automation PowerShell runbook?

I'm using Azure Automation to move an Azure SQL database from one resource to another (from Prod to Dev, for example). After the database is copied, I would then like to run a SQL script that adds some users and permissions. This means I need to run a handful of commands like "CREATE USER ..." and "ALTER ROLE ...". Most examples I've found use PowerShell to execute a single SQL command, but using that code to run many commands seems like it would result in an excessively long PowerShell script. In the on-prem world, I probably would have a .sql file that gets executed. Any suggestions on how to achieve this easily using PowerShell in Azure Automation? Thanks!
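One possible approach (a minimal sketch, assuming the SqlServer module is imported into the Automation account; the server, database, credential, and file names are placeholders) is to keep the statements in a .sql file and run it with Invoke-Sqlcmd inside the runbook:
# Pull a stored credential and run every statement in the file against the
# copied database. The file could be downloaded to the sandbox (e.g. from
# blob storage) earlier in the runbook.
$cred = Get-AutomationPSCredential -Name 'SqlDeployCredential'
Invoke-Sqlcmd -ServerInstance 'myserver.database.windows.net' `
  -Database 'MyDevDatabase' `
  -Username $cred.UserName `
  -Password $cred.GetNetworkCredential().Password `
  -InputFile "$env:TEMP\users-and-roles.sql"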

Azure SQL Database DevOps Release pipeline returns Error CREATE DATABASE permission denied

I have created build and release pipelines for Azure SQL Database according to these instructions.
I am able to publish my database project directly from Visual Studio.
My Build pipeline works without problems.
However, on the release pipeline I am getting the following error:
Error SQL72014: .Net SqlClient Data Provider: Msg 262, Level 14, State 1, Line 1 CREATE DATABASE permission denied in database 'master'.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE [$(Datab
aseName)] COLLATE SQL_Latin1_General_CP1_CI_AS;
Any idea what I may be doing wrong or how to find where the problem is?
Feeling a bit stupid right about now. I finally found how to debug the pipeline (set the variable system.debug = true), and it told me that it couldn't find the database I had defined, because I had mistyped its name.
However, the error without the debugging option was NOT helpful :D So if anyone is getting this error, they should recheck their connection information.

Entity Framework Migration Azure DevOps Release Pipeline

I'm trying to run migrations in an Azure DevOps release pipeline, because I want to run my DB scripts automatically on every release. My release pipeline does not have the source code; I only have compiled DLLs.
When I execute this command on my local machine, it runs successfully. How can I convert this command so I can use it with the DLLs?
dotnet ef database update --project MyEntityFrameworkProject --context MyDbContext --startup-project MyStartupProject
Another approach is to generate a migration script (a regular SQL script) during the build pipeline and make this script part of your artifact. To do so, run the following command:
dotnet ef migrations script --output $(build.artifactstagingdirectory)\sql\migrations.sql -i
Note the -i flag, which makes this script safe to run multiple times against the same database (idempotent).
Once you have this script as part of your artifact, you can run it against the database in your release pipeline using the built-in Azure SQL Database Deployment task.
Check this link for more info
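Putting the build side together, the script step could look roughly like this (a sketch reusing the project and context names from the question; the tool install is only needed if dotnet-ef isn't already available on the agent):
# Install the EF Core CLI on the agent if needed, then emit an idempotent
# migration script into the artifact staging folder.
dotnet tool install --global dotnet-ef
dotnet ef migrations script --idempotent `
  --project MyEntityFrameworkProject `
  --startup-project MyStartupProject `
  --context MyDbContext `
  --output "$(Build.ArtifactStagingDirectory)\sql\migrations.sql"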
EDIT: As #PartickBorkowicz pointed out, there are some issues related to the fact that the database is not available in the usual way from the Build/Release Agent's perspective. Here are some additional tips on how to work without a database and without a connection string stored anywhere in your code.
1. Build pipeline
If you do nothing, the Build Agent will require a database connection in order to run dotnet ef migrations script. But there is one trick that allows you to work without a database and a connection string: IDesignTimeDbContextFactory.
It's enough to create a class like this:
public class YourDesignTimeContextFactory : IDesignTimeDbContextFactory<YourDbContext>
{
    public YourDbContext CreateDbContext(string[] args)
    {
        var databaseConnectionString = "Data Source=(LocalDB)\\MSSQLLocalDB;Initial Catalog=LocalDB;Integrated Security=True;Pooling=False";
        var builder = new DbContextOptionsBuilder<YourDbContext>();
        builder.UseSqlServer(databaseConnectionString);
        return new YourDbContext(builder.Options);
    }
}
Once it is present in your project (you don't need to register it anywhere), your Build Agent will be able to generate the SQL script with the migration logic without access to an actual database.
2. Release pipeline
Now you have the SQL script generated as part of the artifact from the build pipeline. The release pipeline is the point at which you want this script to run against the actual database, so you'll need a connection string for it. To handle this securely, you should not store it anywhere in the code. A good way is to keep the password in Azure Key Vault. There is a built-in task in Azure release pipelines called Azure Key Vault; it fetches your secrets, which you can then use in the next step, Azure SQL Database Deployment. You just need to set up the options:
AuthenticationType: Connection String
ConnectionString: $(SqlServer--ConnectionString) - this value depends on your secret name in Key Vault
Deploy type: SQL Script File
SQL Script: $(System.DefaultWorkingDirectory)/YourProject/drop/migrations/script.sql - this depends on how you set up your artifact in the build pipeline
This way you're able to generate migrations without access to a database, and to run the migration SQL without storing your connection string anywhere in the code.
If you don't want to include your source code with the artifacts, it turns out you can get away with the undocumented dotnet exec command, as in the following example (assuming the web application is called WebApp):
set rootDir=$(System.DefaultWorkingDirectory)\WebApp\drop\WebApp.Web
set efPath=C:\Program Files\dotnet\sdk\NuGetFallbackFolder\microsoft.entityframeworkcore.tools\2.1.1\tools\netcoreapp2.0\any\ef.dll
dotnet exec --depsfile "%rootDir%\WebApp.deps.json" --additionalprobingpath %USERPROFILE%\.nuget\packages --additionalprobingpath "C:\Program Files\dotnet\sdk\NuGetFallbackFolder" --runtimeconfig "%rootDir%\WebApp.runtimeconfig.json" "%efpath%" database update --verbose --prefix-output --assembly "%rootDir%\AssemblyContainingDbContext.dll" --startup-assembly "%rootDir%\AssemblyContainingStartup.dll" --working-dir "%rootDir%"
Note that the Working Directory (hidden under Advanced) of this command line task must be set to where the artifacts are (rootDir above).
Another option is to install the Build & Release Tools extension and use its "Deploy Entity Framework Core Migrations from a DLL" task.
You can read more info here, here and here.

Azure SQL Database Deployment task

We have a set of Azure SQL Database Deployment tasks set up in Azure DevOps, with deploy type SQL DACPAC File, using the Hosted VS2017 agent. Most pass; however, some result in this error:
2019-04-04T23:51:59.1581965Z Initializing deployment (Start)
2019-04-04T23:52:26.0452995Z Initializing deployment (Complete)
2019-04-04T23:52:26.0453268Z Analyzing deployment plan (Start)
2019-04-04T23:52:26.1340183Z Analyzing deployment plan (Complete)
2019-04-04T23:52:26.1346216Z Updating database (Start)
2019-04-04T23:52:31.2433080Z Creating Name...
2019-04-04T23:52:37.8176073Z Updating database (Failed)
2019-04-04T23:52:37.9828381Z ##[error]*** Could not deploy package.
2019-04-04T23:52:37.9918864Z ##[error]Error SQL72014: .Net SqlClient Data Provider: Msg 42019, Level 16, State 4, Line 1 CREATE DATABASE operation failed. Internal service error.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE [$(D
2019-04-04T23:52:38.0103747Z ##[error]atabaseName)] COLLATE SQL_Latin1_General_CP1_CI_AS;
Error SQL72014: .Net SqlClient Data Provider: Msg 0, Level 20, State 0, Line 0 A severe error occurred on the current command. The results, if any, should be discarded.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE [$(DatabaseName)] COLLATE SQL_Latin1_General_CP1_CI_AS;
Each of these tasks uses the same:
Authentication Type
Azure SQL Server
Login
Password
Firewall set IPAddressRange 0.0.0.0 / 255.255.255.255
Not sure why this is occurring when the same login details are used and the deployment succeeds for some databases but not others.
We have deployed the dacpac locally and it works with no issues.
Any ideas as to why this is occurring?

VS403250: The dacpac or source database is not a detached TFS Collection database

I am working on migrating projects from an on-premises TFS server to VSTS; for that, I followed this link.
But when I run the command TfsMigrator import /importFile:C:\TFSDataImportFiles\import.json, I get the error VS403250: The dacpac or source database is not a detached TFS Collection database, even though I detached the collection from my on-premises TFS server.
Can you please tell me how to resolve the above error as soon as possible?
I got confused about the difference between detaching the database and detaching the collection. You have to detach the collection from either the TFS admin console or the tfsconfig command-line utility on the application tier. I prefer the command line. FYI: my first attempt to re-attach the collection using the admin console got hung up and appeared to run for six hours before I restarted the app tier and it finally errored out. Attaching a 160 GB collection with 48 projects took only a few minutes when it worked correctly using tfsconfig.
You create your dacpac or back up the collection after detaching the collection. Then, if it is a dry run, re-attach the collection.
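For reference, the command-line detach is a one-liner along these lines (a sketch; the TfsConfig.exe install path and the collection name are placeholders for your environment):
# Run on the TFS / Azure DevOps Server application tier.
& "C:\Program Files\Azure DevOps Server 2019\Tools\TfsConfig.exe" `
  collection /detach /collectionName:"DefaultCollection"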
According to the troubleshooting document, you need to detach the collection database and then generate the DACPAC again:
VS403250: The dacpac is not a detached TFS Collection database.
The DACPAC is not built off a detached collection. The collection database will need to be detached and the DACPAC generated again.
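For the "generate the DACPAC again" step, the migration guide drives sqlpackage.exe against the detached collection database. A rough sketch (the connection string and paths are placeholders, and the /p: properties should be checked against the current documentation):
# Extract a DACPAC, including table data, from the detached collection DB.
& sqlpackage.exe /Action:Extract `
  /SourceConnectionString:"Data Source=localhost;Initial Catalog=Tfs_DefaultCollection;Integrated Security=True" `
  /TargetFile:"C:\dacpac\Tfs_DefaultCollection.dacpac" `
  /p:ExtractAllTableData=true `
  /p:IgnoreUserLoginMappings=true `
  /p:IgnorePermissions=true `
  /p:Storage=Memory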
I am having the exact same issue; I have already detached the collection and have also created the DACPAC file.
[Info #11:26:21.170] Import couldn't be started due to an error!
[Error #11:26:21.171] + [Error] VS403250: The dacpac or source database is not a detached Azure DevOps Server Collection database. Please refer to the troubleshooting documentation for more details; https://aka.ms/AzureDevOpsImportTroubleshooting
[Info #11:26:21.172] See our documentation for assistance: https://aka.ms/AzureDevOpsImportTroubleshooting
[Error #11:26:21.180] Microsoft.VisualStudio.Services.WebApi.SourceIsNotADetachedDatabaseException: VS403250: The dacpac or source database is not a detached Azure DevOps Server Collection database. Please refer to the troubleshooting documentation for more details; https://aka.ms/AzureDevOpsImportTroubleshooting
at Microsoft.VisualStudio.Services.WebApi.VssHttpClientBase.<HandleResponseAsync>d__53.MoveNext()
I followed all the steps mentioned in https://learn.microsoft.com/en-us/azure/devops/migrate/migration-import-large-collections?view=azure-devops as my DB is huge.
For the dry run, I created a collection on a new server and restored the DB, and that was successful, but if I do it from the actual on-premises server where I have Azure DevOps Server, I get this error.
Screenshot of the server