Azure DevOps pipeline for SSDT project cannot find the variables when it does not need to create objects - azure-devops

I am using Azure DevOps to deploy my SSDT project. I am trying to update my Azure SQL Data Warehouse, where I have a DATABASE SCOPED CREDENTIAL and an EXTERNAL DATA SOURCE.
I found this article and followed its steps: https://techcommunity.microsoft.com/t5/azure-synapse-analytics/how-to-securely-manage-load-credentials-with-ssdt-azure-key/bc-p/1397979
In my release pipeline I have the following settings to deploy my SSDT project. As you can see, I am using values from my Azure Key Vault.
- task: AzureKeyVault@1
  inputs:
    azureSubscription: '<My Azure Subscription>'
    KeyVaultName: '<My Key Vault>'
    SecretsFilter: '*'
...
- task: SqlAzureDataWarehouseDacpacDeployment@1
  inputs:
    azureSubscription: '<My Azure Subscription>'
    AuthenticationType: 'server'
    ServerName: 'ABC.database.windows.net'
    DataWarehouse: '$(SynapseName)'
    SqlUsername: '$(SynapseSQLUsername)'
    SqlPassword: '$(SynapseSQLPassword)'
    deployType: 'DacpacTask'
    DeploymentAction: 'Publish'
    DacpacFile: 'SQL_ASynapse\bin\Release\SQL_ASynapse.dacpac'
    AdditionalArguments: '/p:IgnoreAnsiNulls=True /p:IgnoreComments=True /v:DatabaseScopeCredentialSecret=$(DatabaseScopeCredentialSecret) /v:DatabaseScopeCredentialIdentity=$(DatabaseScopeCredentialIdentity) /v:ExternalDataSourceMarineTrafficLocation=$(ExternalDataSourceMarineTrafficLocation)'
    IpDetectionMethod: 'AutoDetect'
I am passing values for the three variables used in the two scripts below:
$(DatabaseScopeCredentialSecret)
$(DatabaseScopeCredentialIdentity)
$(ExternalDataSourceMarineTrafficLocation)
I have the following code in two separate SQL files.
ADLSCredential.sql :
CREATE MASTER KEY;
GO
CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH
IDENTITY = '$(DatabaseScopeCredentialIdentity)',
SECRET = '$(DatabaseScopeCredentialSecret)'
;
AzureDataLakeStoreMarineTraffic.sql :
CREATE EXTERNAL DATA SOURCE AzureDataLakeStoreMarineTraffic
WITH (
TYPE = HADOOP,
LOCATION='$(ExternalDataSourceMarineTrafficLocation)',
CREDENTIAL = ADLSCredential
);
When those objects don't yet exist on my DW (Synapse), my pipeline is able to read the values from Azure Key Vault, assign them to my parameters, and create both objects. But the next time it runs, I get the error below:
##[error]*** Could not deploy package.
##[error]Warning SQL72013: The following SqlCmd variables are not defined in the target scripts: DatabaseScopeCredentialSecret DatabaseScopeCredentialIdentity ExternalDataSourceMarineTrafficLocation.
Error SQL72014: .Net SqlClient
It seems that when those scripts don't need to run, SQLCMD has a problem finding the variables, because the values passed in are never consumed.
Is there any way to define a public variable somewhere, or to tell SQLCMD not to pass the values the second time?

I found the problem that I had.
I forgot to declare the variables in the SSDT project. After creating them, everything works well.
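For anyone hitting the same error: SSDT requires each SQLCMD variable to be declared in the project file before `/v:` values can bind to it. A sketch of what that declaration looks like in the `.sqlproj` (the `SqlCmdVar__N` value placeholders follow the usual SSDT convention; your generated project may number them differently):

```xml
<!-- .sqlproj excerpt: declare each SQLCMD variable the deployment scripts reference. -->
<!-- DefaultValue is left empty; the release pipeline supplies the real values via /v: -->
<ItemGroup>
  <SqlCmdVariable Include="DatabaseScopeCredentialIdentity">
    <DefaultValue></DefaultValue>
    <Value>$(SqlCmdVar__1)</Value>
  </SqlCmdVariable>
  <SqlCmdVariable Include="DatabaseScopeCredentialSecret">
    <DefaultValue></DefaultValue>
    <Value>$(SqlCmdVar__2)</Value>
  </SqlCmdVariable>
  <SqlCmdVariable Include="ExternalDataSourceMarineTrafficLocation">
    <DefaultValue></DefaultValue>
    <Value>$(SqlCmdVar__3)</Value>
  </SqlCmdVariable>
</ItemGroup>
```

The same declarations can be made from Visual Studio via the project's Properties > SQLCMD Variables tab, which edits this ItemGroup for you.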

Related

Equivalent predefined variables in azure from gitlab

I am migrating from .gitlab-ci.yml to azure-pipelines.yml.
One of the lines inside the .gitlab-ci.yml uses the variables
$CI_PROJECT_ID, $CI_PIPELINE_ID, $CI_JOB_ID.
I have figured out the equivalent for $CI_PROJECT_ID: in Azure it is $(System.TeamProjectId).
However, I need help figuring out $CI_PIPELINE_ID and $CI_JOB_ID.
Looking forward to some suggestions.
I'm not sure exactly what these variables mean, but consider these on Azure Pipelines:
$(System.DefinitionId) - The ID of the build pipeline.
$(System.JobId) - A unique identifier for a single attempt of a single job. The value is unique to the current pipeline.
$(Build.BuildId) - The ID of the record for the completed build.
You can find all predefined variables here.
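A quick way to check which of these matches your GitLab values is to print them from a throwaway step. This is a minimal sketch (the mapping of $CI_PIPELINE_ID to Build.BuildId is my assumption, since both identify a single run rather than the pipeline definition):

```yaml
steps:
- script: |
    echo "GitLab CI_PROJECT_ID  -> System.TeamProjectId: $(System.TeamProjectId)"
    echo "GitLab CI_PIPELINE_ID -> Build.BuildId:        $(Build.BuildId)"
    echo "GitLab CI_JOB_ID      -> System.JobId:         $(System.JobId)"
  displayName: 'Print GitLab-equivalent identifiers'
```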

Synapse - Can't properly deploy SQL linked server

Hello, I'm having a problem when I try to deploy my Synapse workspace to another environment (e.g. Development to Test).
The problem is that the SQL linked service doesn't seem to deploy properly. The first screenshot is from the development Synapse and the second screenshot is from test. As you can see, the settings of the linked service are completely different.
Development
Test environment
I'm using the standard Synapse deployment task in my DevOps pipeline:
- task: Synapse workspace deployment@1
  displayName: 'Synapse deployment task for workspace: syn$(name)$(environment)'
  inputs:
    TemplateFile: '$(System.DefaultWorkingDirectory)/$(cicd_synapse_workspace_origin)/TemplateForWorkspace.json'
    ParametersFile: '$(System.DefaultWorkingDirectory)/$(cicd_synapse_workspace_origin)/TemplateParametersForWorkspace.json'
    AzureSubscription: '${{ parameters.sercon }}'
    ResourceGroupName: 'rg-$(name)-$(environment)'
    TargetWorkspaceName: syn$(name)$(environment)
    OverrideArmParameters: '-ls_sq_mark_connectionString $(Connectionstring)'
Here I override the linked service with a variable from the DevOps library (it contains the following value: "Integrated Security=False;Encrypt=True;Connection Timeout=30;Data Source=databaseserver;Initial Catalog=database").
One thing I noticed when I looked into the JSON file (TemplateForWorkspace.json) is that the linked service is defined as follows:
"ls_sq_mark_connectionString": {
"type": "secureString",
"metadata": "Secure string for 'connectionString' of 'ls_sq_mark'"
},
Maybe the problem is that it's suddenly a secureString? But I have no idea how to fix this issue.

How do I load values from a .json file into a Devops Yaml Pipeline Parameter

The Microsoft documentation explains the use of parameters in YAML pipeline jobs as:
# File: azure-pipelines.yml
trigger:
- master

extends:
  template: simple-param.yml
  parameters:
    yesNo: false # set to a non-boolean value to have the build fail
But instead of statically specifying the value of yesNo, I'd prefer to load it from a completely separate JSON config file, preferably one that both my Build Job and my Application could share, so that parameters specified for the Application could also be used in the Build Job.
Thus the question:
How do I load values from a .json file into a Devops Yaml Pipeline Parameter?
I've been using this marketplace task:
https://marketplace.visualstudio.com/items?itemName=OneLuckiDev.json2variable
And it's been working great so far. I haven't tried it with builds, but I can't see why it wouldn't work with separate build pipelines/multi-staged builds. There are a few things you have to be aware of or may stumble upon, like double-escaping slashes in directory paths, and you'll have to fetch secrets from someplace else, like traditional variable groups.
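If you'd rather avoid a marketplace task, a script step can do the same translation itself. The sketch below (plain Python; the config keys are hypothetical) turns flat JSON keys into `##vso[task.setvariable]` logging commands, which is what such tasks emit. Note these set runtime *variables*: true template parameters are resolved at compile time and cannot be loaded from a file during the run.

```python
import json

# Hypothetical inline config; in a pipeline this would be read from a
# checked-in file such as config.json.
config_text = '{"yesNo": true, "environment": "test"}'

def to_pipeline_variables(text):
    """Turn flat JSON keys into Azure DevOps ##vso logging commands."""
    commands = []
    for key, value in json.loads(text).items():
        # Booleans are rendered as lowercase true/false, matching YAML conventions.
        rendered = json.dumps(value) if isinstance(value, bool) else str(value)
        commands.append(f"##vso[task.setvariable variable={key}]{rendered}")
    return commands

# Printing the commands from a pipeline script step registers the variables
# for all subsequent steps in the job.
for cmd in to_pipeline_variables(config_text):
    print(cmd)
# prints:
# ##vso[task.setvariable variable=yesNo]true
# ##vso[task.setvariable variable=environment]test
```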

Forcing data seeding during release stage of the pipeline

I am having trouble seeding data into the database using DevOps. I have a YAML with the following build step (I've stripped out irrelevant steps):
- task: CmdLine@2
  inputs:
    script: |
      dotnet tool install --global dotnet-ef --version 3.0
      dotnet tool restore
      dotnet ef migrations script -p $(Build.SourcesDirectory)/$(My.SQLProject)/$(My.SQLProject).csproj -o $(Build.ArtifactStagingDirectory)/migrations/script.sql -i
- task: PublishBuildArtifacts@1
This creates a migration SQL script just fine and pops it into drop.
During release I create the database using an ARM deployment task, and then run the SQL script:
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: 'my-sub'
    ServerName: 'my-server.database.windows.net'
    DatabaseName: 'my-db'
    SqlUsername: 'my-sqluser'
    SqlPassword: 'my-password'
    deployType: SqlTask
    SqlFile: '$(Pipeline.Workspace)/drop/migrations/script.sql'
This works fine - the schema in the DB is created.
I then create the App Service with connection string and the App Service connects to the DB just fine.
The bit I can't seem to get to work is the data seeding. I've googled a lot: there are plenty of articles that talk about creating migrations and SQL scripts and then running the script in DevOps, and plenty that talk about seeding data outside of DevOps, but what I'm struggling with is how to get it to seed the data in DevOps. One odd thing I have noticed is that if I re-run the build/deploy YAML, it then seeds the data without me having to tell it. So I guess there are two questions:
Is data seeding something that MUST (or SHOULD) be done in the App Service code during App Service startup, or is it something that should be instigated during the release pipeline in DevOps? (I'm not the App Service developer. The dev says he thinks it should be happening at app startup. It doesn't, so my thinking is that if he's missing something in his code, perhaps I can say "don't worry, I can kick off the data seeding myself in DevOps".)
If it should be done in Devops, how should it be done? I would have thought that "dotnet ef database update -p " ought to do it, but that doesn't seem to work in the release pipeline.
Many thanks
After some experimentation I have an answer. It's not pretty, but it works. I don't know why the dev's seeding isn't working, but I can force it in DevOps like this:
Deploy the App Service (AzureWebApp@1)
Add the connection strings (AzureAppServiceSettings@1)
Re-deploy the App Service (AzureWebApp@1)
That seems to "force" the seeding. Job done.
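A sketch of that deploy/settings/re-deploy sequence in release YAML (the app name, package path, and connection string name are placeholders, not from the original pipeline):

```yaml
- task: AzureWebApp@1                 # 1. initial deploy
  inputs:
    azureSubscription: 'my-sub'
    appName: 'my-app'
    package: '$(Pipeline.Workspace)/drop/app.zip'

- task: AzureAppServiceSettings@1     # 2. add the connection string
  inputs:
    azureSubscription: 'my-sub'
    appName: 'my-app'
    connectionStrings: |
      [
        {
          "name": "DefaultConnection",
          "value": "$(DbConnectionString)",
          "type": "SQLAzure"
        }
      ]

- task: AzureWebApp@1                 # 3. re-deploy so startup (and seeding) runs
  inputs:                             #    with the connection string in place
    azureSubscription: 'my-sub'
    appName: 'my-app'
    package: '$(Pipeline.Workspace)/drop/app.zip'
```

The re-deploy matters because the app's seeding logic runs at startup; on the first deploy the connection string is not yet set, so seeding fails silently.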

Azure Pipeline with task SonarQube https

I added a SonarQube task to my Azure build pipeline; in order to log in to my SonarQube server I need to run a command which uses an SSL trust store.
My pipeline looks like this:
- task: SonarSource.sonarqube.15B84CA1-B62F-4A2A-A403-89B77A063157.SonarQubePrepare@4
  displayName: 'Prepare analysis on SonarQube'
  inputs:
    SonarQube: abc-sonarqube
    scannerMode: CLI
    configMode: manual
    cliProjectKey: 'abc'
    cliProjectName: 'abc'
    cliSources: src
    extraProperties: |
      sonar.host.url=https://sonarqube.build.abcdef.com
      sonar.ce.javaAdditionalOpts=-Djavax.net.ssl.trustStore=mvn/sonar.truststore -Djavax.net.ssl.trustStorePassword=changeit
I am not sure whether this setting, "sonar.ce.javaAdditionalOpts=-Djavax.net.ssl.trustStore=mvn/sonar.truststore -Djavax.net.ssl.trustStorePassword=changeit", is correct.
I get the error: API GET '/api/server/version' failed, error was: {"code":"UNABLE_TO_VERIFY_LEAF_SIGNATURE"}
PS: my project is an Angular project.
Any solutions?
This issue is likely related to how the configure task works. Even if we add the certificate to the Java trust store, the task that sets the configuration uses a different runtime (not Java, at least) to communicate with the server; that's why you still get that certificate error.
To resolve this issue, you could try to set a global variable, NODE_EXTRA_CA_CERTS, and point it to a copy of the root certificate stored locally in a directory. See this article.
Check the related ticket for some more details.
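In pipeline YAML that could look like the sketch below (the path to the .pem file is an assumption; use wherever your root certificate is checked in):

```yaml
# Make Node-based tasks (like the SonarQube prepare step) trust the extra root CA.
variables:
  NODE_EXTRA_CA_CERTS: '$(Build.SourcesDirectory)/certs/rootCA.pem'
```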
Hope this helps.