Duplicate Override ARM Template Parameters during deployment - azure-devops

I am testing Azure Data Factory deployment using ARM templates, deploying all the ADF entities (pipelines, linked services, datasets, data flows, triggers, etc.) from Development to UAT and Production. Before deploying to UAT and Production, I included the activities below in the Azure DevOps release pipeline:
Stopping the ADF triggers using the built-in 'Azure PowerShell' task of the Azure DevOps release pipeline. In this task I use an inline script to stop the triggers before deployment to the UAT/PROD environment (a minimal sketch of such a script appears after the task details below).
ARM Template deployment = in this task I configured the following values:
Template -> the factory's ARM template, ARMTemplateForFactory.json
Template Parameters -> ARMTemplateParametersForFactory.json
Override template parameters -> when I try to enter the values for the UAT/Prod environment, some parameters show up twice.
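For the trigger-stopping task above, a minimal sketch of the inline script, assuming the Az.DataFactory module and placeholder resource names; it stops every started trigger in the target factory before the ARM deployment runs:
# Hedged sketch: stop all started triggers in the target Data Factory before deployment.
# "uat-rg" and "uat-adf" are placeholder names for the UAT resource group and factory.
$resourceGroup = "uat-rg"
$factoryName   = "uat-adf"
Get-AzDataFactoryV2Trigger -ResourceGroupName $resourceGroup -DataFactoryName $factoryName |
    Where-Object { $_.RuntimeState -eq "Started" } |
    ForEach-Object {
        Stop-AzDataFactoryV2Trigger -ResourceGroupName $resourceGroup `
            -DataFactoryName $factoryName -Name $_.Name -Force
    }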
Q-1. Please find attached the details of each step and the logs. Why are the trigger parameters not coming through in the proper format, and what could be the reason behind this? How should I correct it so that the trigger parameters are taken only once?
Q-2. Also, please let me know: do I need the Azure Function App key for the UAT/Prod environment, to enter that value while overriding the template parameters? Can someone please take a look and guide me on what I am missing here?
Thanks

This question was answered on the Microsoft Q&A platform. Thank you #sachingupta-1921 and #MartinJaffer-MSFT for the workable solution; posting it as an answer here to help other community members.
A-1: Renaming the triggers so that their names contain no spaces resolved the problem mentioned in Q-1.
A-2: If you have different keys for dev/UAT/prod, then you would definitely need to include that value.
Since you have a Key Vault, you could put the Function App key in there and let it be handled by the switch of Key Vaults; then you would not need to enter the Function App key in the parameters.
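If the key does need to be supplied as an override parameter instead, one hedged option is to read it from the environment's Key Vault in the same Azure PowerShell task and expose it as a secret pipeline variable, rather than typing it into the release; the vault and secret names below are hypothetical:
# Hedged sketch: fetch the per-environment Function App key from that environment's
# Key Vault at release time. "kv-uat" and "functionAppKey" are hypothetical names.
$functionKey = Get-AzKeyVaultSecret -VaultName "kv-uat" -Name "functionAppKey" -AsPlainText
# Expose it as a secret variable so the ARM template deployment task can use it
# in its override template parameters (e.g. -functionAppKey $(functionAppKey)).
Write-Host "##vso[task.setvariable variable=functionAppKey;issecret=true]$functionKey"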

Related

Access agent hostname for a build variable

I've got release pipelines defined that have worked. I've got a config transform that writes an API URL to a config file (currently a hardcoded API URL).
What I'd like is to have the config rewritten based on the agent it is being deployed on.
E.g. if the machine being deployed to is TEST-1, I'd like to write https://TEST-1.somedomain.com/api into the config using that transform step.
The .somedomain.com/api part can be static.
I've tried setting the pipeline variable's value to https://${{Environment.Name}}.somedomain.com/api, but it just replaces API_URL in the config with that literal string (it does not populate the machine name in the variable).
Since variables are the source of the values written to the configs during the transform, I'm struggling to see another way to do this.
Some gotchas:
I'm using non-YAML pipeline definitions (I know I've seen people put logic in variable definitions within YAML pipelines).
I can't just use localhost, as the configuration is read by a JavaScript-rich app, which would have the JS trying to connect to localhost rather than to the server.
I'm interested in any way I could solve this problem.
${{Environment.Name}} is not valid syntax for either YAML or classic pipelines.
In classic pipelines it would be $(Environment.Name).
In YAML, $(Environment.Name) or ${{ variables['Environment.Name'] }} would work.
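If what is needed is specifically the hostname of the machine the agent runs on, the predefined Agent.MachineName variable may be closer to the goal; a hedged sketch of an inline PowerShell step (classic pipeline) that builds the URL and overrides API_URL for the later transform step:
# Hedged sketch for a classic release pipeline: build the API URL from the
# deployment agent's machine name and override API_URL for later steps.
$machine = $env:AGENT_MACHINENAME            # predefined Agent.MachineName variable
$apiUrl  = "https://$machine.somedomain.com/api"
Write-Host "##vso[task.setvariable variable=API_URL]$apiUrl"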

Release Pipeline error when using Azure Dacpac Task

I'm new to using Azure release pipelines and have been fighting issues trying to deploy a database project to a new Azure SQL database. Currently the pipeline is giving me the following error...
TargetConnectionString argument cannot be used in conjunction with any other Target database arguments
I've tried deploying with and without the TargetConnectionString included in my publish profile. Any suggestions or something else to try? I'm out of ideas.
TargetConnectionString
Specifies a valid SQL Server/Azure connection string to the target database. If this parameter is specified it shall be used exclusively of all other target parameters. (short form /tcs)
So please remove all other TargetXXX arguments.
(If you don't have any other Target arguments, can you show what arguments you have inline and in the publish profile - without sensitive data, of course.)
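For illustration, a hedged SqlPackage invocation that identifies the target only through /TargetConnectionString, with no other Target* arguments (the file name and connection string are placeholders):
# Hedged sketch: publish a dacpac using only the connection string to identify the target.
# Do not combine /TargetConnectionString with /TargetServerName, /TargetDatabaseName, etc.
& sqlpackage /Action:Publish `
    /SourceFile:"MyDatabase.dacpac" `
    /TargetConnectionString:"Server=tcp:myserver.database.windows.net,1433;Database=MyDb;User ID=deployUser;Password=<secret>"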

Configuration Management in AKS Deployment with Azure Pipelines for Different Environments

I have created an ASP.NET Core Web API and deployed it to a Dev environment (Kubernetes) using Azure Pipelines. How can I update the configuration in the pipeline if I need to publish the same API to another environment (e.g. SIT), since I have different settings/configuration for the Dev and SIT environments?
Kindly guide me.
You can use release variables to do this. Feel free to reach out if you need any assistance.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch
https://learn.microsoft.com/en-us/azure/devops/pipelines/release/?view=azure-devops#how-do-i-specify-variables-i-want-to-edit-when-a-release-is-created
The problem occurs when I want to deploy the same API to other environments like QA/UAT/Prod, since each environment has a separate database.
For this issue, there are several ways to achieve it. You can add the Replace Tokens extension task to the job to replace the database connection string in appsettings.json.
You can define your variable like below:
{
  "ConnectionStrings": {
    "DefaultConnection": "#{connectstring}#"
  }
}
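For reference, a minimal PowerShell sketch of what the token-replacement step does, assuming the #{...}# token format above and a pipeline variable named connectstring exposed as an environment variable (the file path and names are assumptions):
# Hedged sketch of the substitution performed by the Replace Tokens task.
$path    = "appsettings.json"
$content = Get-Content -Path $path -Raw
$content = $content -replace '#\{connectstring\}#', $env:CONNECTSTRING
Set-Content -Path $path -Value $content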
You can refer to this case and lab for details.
Here are some references for the same issue:
Replacing database connection strings in the Docker image
Set Json Property task to replace the ConnectionStrings

Not Able to Publish ADF Incremental Package

As posted earlier in a thread about syncing data from on-premises MySQL to Azure SQL (referring to this article), I found that the lookup component for watermark detection is available for SQL Server only.
So I tried a workaround: while using the "Copy" data flow task, pick up data greater than the last watermark stored from MySQL.
Issue:
I am able to validate the package successfully, but not able to publish it.
Question:
In the Copy data flow task I'm using the query below to get data from MySQL greater than the available watermark.
Can't we use a query like the one below on other relational sources like MySQL?
select * from #{item().TABLE_NAME} where #{item().WaterMark_Column} > '#{activity('LookupOldWaterMark').output.firstRow.WatermarkValue}'
CopyTask SQL Query Preview
Validate Successfully
Error With no Details
Debug Successfully
Errors after following the steps mentioned by Franky:
Azure SQL linked service error (resolved by reconfiguring the connection / editing the credentials in the connection tab)
Source query went blank (resolved by re-selecting the source type and rewriting the query)
Could you verify whether you have access to create a template deployment in the Azure portal?
1) Export the ARM template: in the top-right of the ADFv2 portal, click on ARM Template -> Export ARM Template, extract the zip file and copy the content of the "arm_template.json" file.
2) Create the ARM template deployment: go to https://portal.azure.com/#create/Microsoft.Template and log in with the same credentials you use in the ADFv2 portal (you can also reach this page from the Azure portal by clicking "Create a resource" and searching for "Template deployment"). Now click "Build your own template in the editor", paste the ARM template from the previous step into the editor and save.
3) Deploy the template: click on the existing resource group and select the same resource group as the one your Data Factory is in. Fill out the missing parameters (for this test it doesn't really matter whether the values are valid); the factory name should already be there. Agree to the terms and click Purchase.
4) Verify the deployment succeeded. If not, let me know the error; it might be an access issue, which would explain why your publish fails. (The ADF team is working on providing a better error for this issue.)
Did any of the objects publish into your Data Factory?
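As an equivalent command-line check, a hedged sketch that deploys the exported arm_template.json to the Data Factory's resource group with Azure PowerShell (the resource group and factory names are placeholders):
# Hedged sketch: deploy the exported ARM template to the existing resource group.
# "my-adf-rg" and "my-data-factory" are placeholder names.
New-AzResourceGroupDeployment `
    -ResourceGroupName "my-adf-rg" `
    -TemplateFile ".\arm_template.json" `
    -TemplateParameterObject @{ factoryName = "my-data-factory" }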

Nintex Workflow 2007 - global settings / variables

The problem:
When we create a workflow in the UAT/Stage environment, we need to import it into the Production environment. Then we need to change the environment URLs (web service calls and such) and email addresses.
Is it possible:
To store the URLs and emails in some global configuration from which the Nintex workflow would pick them up, so that whenever we deploy the workflow to Production again we wouldn't need to go into each step and edit its settings?
Got an email from Nintex Support:
Dear Jakub,
Thank you for your e-mail.
The only suggestion I have is to use Workflow Constants. You can configure workflow constants in Central Administration > Application Management > Manage Workflow Constants.
In your workflow you use a lookup reference which points to the appropriate workflow constant. As long as the workflow constant name is the same in both environments, the workflow will pick it up, but the workflow constant will contain the relevant URL for the environment.
Hope this helps.
Kind Regards
Simon Muntz, Software Support Engineer - Team Lead
Nintex Workflow for Everyone® | nintex.com
Alternatively, use a list on your site as a configuration settings table. Set up a list of name/value pairs, then at runtime your workflow can query the list by name and retrieve the required value.