How to pass a variable to SQL deploy on Azure DevOps?

I need the post-deploy script, which is part of the database project, to set some environment-specific things depending on where it runs.
How do I pass an environment variable that the script can access?
Here is what I'm trying to do.
The YAML file:
- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: Install database
  inputs:
    TaskType: 'dacpac'
    DacpacFile: '**/app-db.dacpac'
    TargetMethod: 'server'
    ServerName: '(localdb)\MSSQLLocalDB'
    DatabaseName: 'app-dev'
    AuthScheme: 'windowsAuthentication'
The log:
Starting: Install database
==============================================================================
Task : SQL Server database deploy
Description : Deploy a SQL Server database using DACPAC or SQL scripts
Version : 0.3.23
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/deploy/sql-dacpac-deployment-on-machine-group
==============================================================================
*** Could not deploy package.
Warning SQL72013: The following SqlCmd variables are not defined in the target scripts: env.
Error SQL72014: .Net SqlClient Data Provider: Msg 137, Level 15, State 2, Line 14 Must declare the scalar variable "@env".
Error SQL72045: Script execution error. The executed script:
IF (@env = 'DEV')
BEGIN
...
END

Here is the complete solution, which I didn't find anywhere else:
Declare a variable for the dacpac. In Visual Studio, open the project properties page, go to the SQLCMD Variables tab, and add the env variable there.
In the SQL script, write the variable inside a string so it won't break the build with a SQL syntax error. It is not a SQL variable; the string '$(env)' is replaced as a token before the script runs:
if ('$(env)' = 'DEV') ...
You can still test it in Visual Studio by publishing the project.
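For illustration, a minimal sketch of what a post-deploy script using this pattern could look like; the table name and values here are hypothetical, only the quoted '$(env)' token comes from the solution above:

IF ('$(env)' = 'DEV')
BEGIN
    -- hypothetical DEV-only seed data; '$(env)' is replaced by SqlPackage before execution
    INSERT INTO dbo.AppSetting ([Name], [Value])
    VALUES ('ShowDebugPanel', 'true');
END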
Finally, go to the deploy task definition and set up the variable like this:
- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: Install database
  inputs:
    TaskType: 'dacpac'
    DacpacFile: '**/app-db.dacpac'
    TargetMethod: 'server'
    ServerName: '(localdb)\MSSQLLocalDB'
    DatabaseName: 'app-dev'
    AuthScheme: 'windowsAuthentication'
    AdditionalArguments: /v:env=DEV
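If the same dacpac goes to several environments, the value doesn't have to be hard-coded; a sketch, assuming a hypothetical pipeline variable named targetEnv defined per stage or via a variable group:

variables:
  targetEnv: 'DEV'  # hypothetical; could differ per stage or come from a variable group

steps:
- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: Install database
  inputs:
    TaskType: 'dacpac'
    DacpacFile: '**/app-db.dacpac'
    TargetMethod: 'server'
    ServerName: '(localdb)\MSSQLLocalDB'
    DatabaseName: 'app-dev'
    AuthScheme: 'windowsAuthentication'
    AdditionalArguments: /v:env=$(targetEnv)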

Related

Use Azure DevOps variable in azure-pipelines.yml PowerShell script

I think I may be approaching this the wrong way. I have an azure-pipelines.yml file where I am deploying infrastructure through this pipeline using Terraform. So far, the pipeline installs Terraform in the environment with no issue. I am trying to run terraform init using a PowerShell script, and am running into an error. Within the PowerShell script command I am trying to reference a pipeline variable for access_key and secret_key. When executing the pipeline, I am getting the error "no valid credential sources for S3 Backend found". This is happening, most likely, because I am referencing the variables I have set incorrectly. I have also set the variables in my Terraform variables file, but I think that may not be necessary since I am trying to read them in from the pipeline variables. Below is the code for azure-pipelines.yml and the error I am getting in the pipeline output. Any advice would be appreciated.
azure-pipelines.yml
trigger:
- master
pool:
  vmImage: ubuntu-latest
stages:
- stage: TerraformInstall
  displayName: Terraform
  jobs:
  - job: InstallTerraform
    displayName: Install Terraform
    steps:
    - task: charleszipp.azure-pipelines-tasks-terraform.azure-pipelines-tasks-terraform-installer.TerraformInstaller@0
- stage: Init
  displayName: Init
  jobs:
  - job: init
    displayName: Terraform init
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: 'terraform init -var access_key=${env:ACCESS_KEY} -var secret_key=${env:SECRET_KEY}'
Error
==============================================================================
Task : PowerShell
Description : Run a PowerShell script on Linux, macOS, or Windows
Version : 2.200.0
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/utility/powershell
==============================================================================
Generating script.
========================== Starting Command Output ===========================
/usr/bin/pwsh -NoLogo -NoProfile -NonInteractive -Command . '/home/vsts/work/_temp/6e333d67-4373-4ae7-bc4b-96cc38572961.ps1'
Initializing the backend...
╷
│ Error: error configuring S3 Backend: no valid credential sources for S3 Backend found.
│
│ Please see https://www.terraform.io/docs/language/settings/backends/s3.html
│ for more information about providing credentials.
│
│ Error: NoCredentialProviders: no valid providers in chain. Deprecated.
│ For verbose messaging see aws.Config.CredentialsChainVerboseErrors
│
│
│
╵
##[error]PowerShell exited with code '1'.
Finishing: PowerShell
Without knowing more about how your variables are set, it's hard to give a complete solution.
First, I don't think the Terraform CLI command init accepts input variables (correct me if I'm wrong). Just making a guess here: you're passing ACCESS_KEY and SECRET_KEY to be used with your Terraform backend provider. If that's the case, see this Stack Overflow answer on how to do that. To summarize what that answer says:
Create a separate .tfvars file that stores the variables that will be used for your backend provider
Use that .tfvars file with terraform init like so:
terraform init -backend-config=backend.tfvars
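As an Azure Pipelines step, that could look like the following sketch; the file name backend.tfvars and its keys are assumptions, not something from the question:

# backend.tfvars would contain key = value pairs consumed by the S3 backend, e.g.:
#   access_key = "<aws-access-key>"
#   secret_key = "<aws-secret-key>"
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: 'terraform init -backend-config=backend.tfvars'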
Terraform aside, if you're using Azure DevOps Library variable groups, you can pass the variables to your script using the example below. Again, I don't think this will help you initialize your Terraform backend.
Note: you may need to play with the quotes depending on the OS of your agent.
This example assumes you have created an ADO library variable group named YourVariableGroupNameHere and have created two variables in that group named ACCESS_KEY and SECRET_KEY.
- stage: Init
  displayName: Init
  jobs:
  - job: init
    variables:
    - group: YourVariableGroupNameHere
    displayName: Terraform init
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: 'terraform init -var access_key=$(ACCESS_KEY) -var secret_key=$(SECRET_KEY)'
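One caveat: if ACCESS_KEY and SECRET_KEY are marked as secret in the variable group, they are not automatically exposed to scripts as environment variables, so it is safer to map them explicitly through the task's env section. A sketch, using terraform init's -backend-config key=value form:

- task: PowerShell@2
  env:
    ACCESS_KEY: $(ACCESS_KEY)  # explicit mapping is required for secret variables
    SECRET_KEY: $(SECRET_KEY)
  inputs:
    targetType: 'inline'
    script: 'terraform init -backend-config="access_key=$env:ACCESS_KEY" -backend-config="secret_key=$env:SECRET_KEY"'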

Entity Framework Core Migration in Azure Pipeline

Does anyone know why a connectionString is required when running an EF migration from YAML but not from the Developer Command Prompt?
I believe it might have something to do with the connectionString value not being pulled from the pipeline variable.
I am setting the connection string here:
builder.Services.AddDbContext<myContext>(options =>
    options.UseSqlServer(builder.Configuration.GetConnectionString("MyDB")));
I have tried the following names for my pipeline variable:
MyDB
ConnectionStrings_MyDB
ConnectionStrings:MyDB
The following YAML gives me the error "Value cannot be null. (Parameter 'connectionString')"
- task: DotNetCoreCLI@2
  displayName: Create SQL Scripts
  inputs:
    command: custom
    custom: 'ef '
    arguments: migrations script --output $(sqlOutputPath) --idempotent --project $(solution)
However, running the following command from the Developer Command Prompt executes successfully:
dotnet ef migrations script --output complete.sql --idempotent --project myproject
It turned out that I had forgotten to add the Azure Key Vault reference to my project; therefore it was not even looking at Key Vault for the connection string. Also, the secret name needs to be ConnectionStrings--MyDB.
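If you would rather stay with plain pipeline variables instead of Key Vault, .NET configuration maps double underscores in environment variable names to the : separator, so a hedged sketch (MyDbConnectionString is a hypothetical pipeline variable):

- task: DotNetCoreCLI@2
  displayName: Create SQL Scripts
  env:
    ConnectionStrings__MyDB: $(MyDbConnectionString)  # __ maps to ConnectionStrings:MyDB
  inputs:
    command: custom
    custom: 'ef '
    arguments: migrations script --output $(sqlOutputPath) --idempotent --project $(solution)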

Env variable not being recognized

CICD: Azure DevOps
Task: Azure CLI
Script Location: Inline Script
Task Version: 1
Agent: Self-hosted
Language: Go
I am facing a strange issue when I pass an env variable from the Azure pipeline.
Command (inline script):
$env:ENV="FOO"
Output:
2021-09-03T13:49:28.9213455Z
2021-09-03T13:49:28.9214265Z c:\Agent\_work\r1\a>$env:ENV="FOO"
2021-09-03T13:49:28.9219788Z The filename, directory name, or volume label syntax is incorrect.
2021-09-03T13:49:28.9298991Z ##[error]Script failed with error: Error: The process 'c:\Agent\_work\_temp\azureclitaskscript1630676963575.bat' failed with exit code 1
The same command works perfectly fine in PowerShell on a local VM, but I'm not sure why it doesn't work from the pipeline. Any suggestions?
Note: the directory path itself is definitely correct.
The script appears to run as a .bat file, and $env:ENV=... is not a supported batch command.
$env:ENV="FOO"
This is PowerShell syntax.
You need to tell the Azure CLI task to run the script with PowerShell.
In Azure CLI task V1, there seems to be no option to achieve this.
I suggest using Azure CLI task version 2:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: xxx
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: '$env:ENV="FOO"'
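If staying on V1 is a hard requirement, note that its inline script runs as a .bat file on Windows agents, so the batch equivalent would be set. A sketch, assuming the rest of the task inputs stay unchanged:

- task: AzureCLI@1
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: xxx
    scriptLocation: inlineScript
    inlineScript: 'set ENV=FOO'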

How to fix an Azure PowerShell "hello world" task that fails with error: Could not find the modules: 'Az.Accounts' with Version: '6.2.3'

I'm setting up an Azure DevOps pipeline in which I want to manipulate resources in my subscription. For this I'm trying to use an AzurePowerShell task.
I trimmed my attempts down to the most basic hello world example that connects to my subscription:
pool:
  vmImage: windows-2019
trigger: none
steps:
- task: AzurePowerShell@4
  displayName: 'hello world'
  inputs:
    azureSubscription: 'azure-connection-dev'
    azurePowerShellVersion: '6.2.3'
    inline: |
      Write-Output "Hello"
      Write-Output "world"
When I trigger this pipeline, I expect it to print "Hello world", but instead it fails with:
==============================================================================
Task : Azure PowerShell
Description : Run a PowerShell script within an Azure environment
Version : 4.157.4
Author : Microsoft Corporation
Help : [Learn more about this task](https://go.microsoft.com/fwlink/?LinkID=613749)
==============================================================================
##[error]Could not find the modules: 'Az.Accounts' with Version: '6.2.3'. If the module was recently installed, retry after restarting the Azure
What is wrong with the above hello world example?
The error message is unambiguously telling you the exact problem: the hosted agent doesn't have Az 6.2.3 on it. The available versions are documented. If you need a version that's not installed by default, you'll need to install it with Install-Module first.
The Azure PowerShell task starts by logging in to your Azure subscription with the specified service connection. Since you're telling it to use an unavailable version, it tries to load the module and immediately fails.
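A sketch of the corrected task, assuming the latest preinstalled Az version is acceptable (ScriptType and Inline are the task's inputs for inline scripts):

- task: AzurePowerShell@4
  displayName: 'hello world'
  inputs:
    azureSubscription: 'azure-connection-dev'
    azurePowerShellVersion: 'LatestVersion'
    ScriptType: 'InlineScript'
    Inline: |
      Write-Output "Hello"
      Write-Output "world"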

Get a SQL Server (Express) during a pipeline build on Azure DevOps

I'm setting up a pipeline for an ASP.NET application. During the integration tests task I need to connect to a SQL Server. How can I tell the pipeline that I need a SQL service?
I have tried multiple Microsoft-hosted agent pools (Windows Server 1803, Hosted 2017 & 2019).
With Windows Server 1803 the issue is:
The operating system of the container does not match the operating system of the host.
I would like to correctly set up a temporary SQL Server for running the tests.
I have used LocalDB instead.
I run this script before my integration tests task:
SqlLocalDB.exe create "DeptLocalDB"
SqlLocalDB.exe share "DeptLocalDB" "DeptSharedLocalDB"
SqlLocalDB.exe start "DeptLocalDB"
SqlLocalDB.exe info "DeptLocalDB"
To connect with PowerShell: Invoke-Sqlcmd -Query "SELECT GETDATE() AS TimeOfQuery;" -ServerInstance "(localdb)\.\DeptSharedLocalDB"
To connect with sqlcmd: sqlcmd -S (localdb)\.\DeptSharedLocalDB
To connect from a C# app (connectionString): "Data Source=(localdb)\.\DeptSharedLocalDB;Initial Catalog=DeptLocalDB;Integrated Security=True;"
If someone knows how to mount a SQL Server in a container on an Azure pipeline, it would be appreciated. Thanks for reading.
Chocolatey is installed on windows-latest.
So, if you define in your YAML file:
pool:
  vmImage: windows-latest
you can then install SQL Server Express using choco:
- script: choco install sql-server-express
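Putting the two snippets together, a minimal sketch of the relevant part of such a pipeline:

pool:
  vmImage: windows-latest
steps:
- script: choco install sql-server-express
  displayName: 'Install SQL Server Express'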
azure-pipelines.yml:
pool:
  vmImage: 'windows-latest'
...
- task: PowerShell@2
  displayName: 'start mssqllocaldb'
  inputs:
    targetType: 'inline'
    script: 'sqllocaldb start mssqllocaldb'
Afterwards, the following connection string is valid:
Data Source=(localdb)\\MSSQLLocalDB;Initial Catalog=my-db;Integrated Security=True;
Source:
https://www.jannikbuschke.de/blog/azure-devops-enable-mssqllocaldb/
In my case (some .NET Core integration tests that needed LocalDB), just using the windows-latest image did the job:
pool:
vmImage: 'windows-latest'
The image comes with Visual Studio, which includes LocalDB.
Here is a full example of how I achieved running integration tests using SQL Express in Azure DevOps. I had to use the YAML-based pipelines so I could use simonauner's approach of installing SQL Express via Chocolatey. Note that I had to install the EF Core tools since I use .NET Core 3.1 in this pipeline. Also note that I generate an EF Code First migration SQL file on the fly, so that the rigged SQL Express instance is filled with contents.
Deploy a SQL Express instance in Azure DevOps, then install the tools, generate an EF Code First migration SQL script, and run it to update the database with schema and seed data.
# ASP.NET Core (.NET Framework)
# Build and test ASP.NET Core projects targeting the full .NET Framework.
# Add steps that publish symbols, save build artifacts, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/dotnet-core
trigger:
- feature/testability
pool:
  vmImage: 'windows-latest'
variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
steps:
- script: choco install sql-server-express
- task: NuGetToolInstaller@1
- task: VisualStudioTestPlatformInstaller@1
  displayName: 'Visual Studio Test Platform Installer'
  inputs:
    versionSelector: latestStable
- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    command: build
    projects: '**/*.csproj'
    arguments: '--configuration Debug' # Update this to match your need
- script: 'dotnet tool install --global dotnet-ef'
  displayName: 'Install EF Core tools (dotnet-ef)'
- script: 'dotnet ef migrations script -i -o %BUILD_ARTIFACTSTAGINGDIRECTORY%\migrate.sql --project .\SomeAcme\SomeAcme.csproj'
  displayName: 'Generate EF Code First Migrations migrate.sql'
- script: 'sqlcmd -S .\SQLEXPRESS -Q "CREATE DATABASE [SomeAcmeDb]"'
  displayName: 'Create database SomeAcmeDb in Azure DevOps SQL Express'
- script: 'sqlcmd -i %BUILD_ARTIFACTSTAGINGDIRECTORY%\migrate.sql -S .\SQLEXPRESS -d SomeAcmeDb'
  displayName: 'Run migrate.sql on SQL Express in Azure DevOps'
# PowerShell
# Run a PowerShell script on Linux, macOS, or Windows
- task: PowerShell@2
  inputs:
    targetType: 'inline' # Optional. Options: filePath, inline
    #filePath: # Required when targetType == FilePath
    #arguments: # Optional
    script: 'gci -recurse -filter *.dll' # Required when targetType == Inline
    #errorActionPreference: 'stop' # Optional. Options: stop, continue, silentlyContinue
    #failOnStderr: false # Optional
    #ignoreLASTEXITCODE: false # Optional
    #pwsh: false # Optional
    #workingDirectory: # Optional
- task: VSTest@2
  displayName: 'VsTest - testAssemblies'
  inputs:
    testAssemblyVer2: |
      **\*SomeAcme.Tests.dll
      !**\*TestAdapter.dll
      !**\obj\**
    vsTestVersion: toolsInstaller
    testFiltercriteria: 'Category=IntegrationTest'
    runInParallel: false
    codeCoverageEnabled: false
    testRunTitle: 'XUnit tests SomeAcme solution integration test starting'
    failOnMinTestsNotRun: true
    rerunFailedTests: false
I need LocalDB in order to run database unit tests, and I also require at least SQL Server 2016 SP2 because my team is using some of the newer language features.
On a windows-2019 image, simply use choco or PowerShell to execute the following command:
choco upgrade sqllocaldb
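As a pipeline step, that could look like this sketch; -y is added here to skip Chocolatey's confirmation prompt:

pool:
  vmImage: 'windows-2019'
steps:
- script: choco upgrade sqllocaldb -y
  displayName: 'Upgrade SQL Server LocalDB'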