If I have a YAML file in which I set variables that are counters, how can I access such a variable from Blazor Server?
I.e., my YAML may look like this:
variables:
  version.Major: '1'
  version.Minor: $[counter(variables['version.Major'], 1)]
  version.Revision: $[counter(variables['version.Minor'], 1)]
  versionName: '$(version.Major).$(version.Minor).$(version.Revision)'
And I'd like to access versionName from a Blazor Server component...
You will need to burn those variables into something that is part of the Blazor app. There are many options:
overwrite the assembly version (see the sketch below)
add the values to the config file
write the value into some other file
set the value as an extended property of the deployment (in an Azure web app, for example)
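For example, with the assembly-version option, the pipeline could stamp the value at publish time (e.g. dotnet publish /p:Version=$(versionName)) and a component could read it back via reflection. A minimal sketch, assuming an SDK-style project where /p:Version flows into the assembly version:

@* Minimal sketch: display the version the pipeline stamped into the assembly. *@
<p>Version: @versionName</p>

@code {
    private string? versionName;

    protected override void OnInitialized()
    {
        // Reflects whatever version the build burned into the assembly.
        versionName = System.Reflection.Assembly.GetExecutingAssembly()
            .GetName().Version?.ToString();
    }
}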
Variables are a product of Azure DevOps itself; they cannot be read directly from outside the service without the REST API.
To resolve this, the Replace Tokens task from the Marketplace (https://marketplace.visualstudio.com/items?itemName=qetza.replacetokens) should be the correct way.
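A minimal sketch of that approach, assuming a token #{versionName}# has been placed in the app's appsettings.json (task version 5 assumed):

# Pipeline step: substitute #{versionName}# in appsettings.json with the
# pipeline variable before the app is packaged.
- task: replacetokens@5
  inputs:
    targetFiles: '**/appsettings.json'
    tokenPattern: 'default'   # matches #{ ... }#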
We have automated scripts that we would like to build and test on Azure DevOps, but our pipeline cannot run our test scripts on Azure.
We have a database service account that we want to configure on Azure, but we don't know how to go about it. Please assist.
Here is a well-explained video (by Hassan Habib from Microsoft) on exactly how to run a console app (that you create) in an Azure Pipeline which securely gets credentials to immediately do stuff in Azure: https://youtu.be/ht0xhQyF1x4?t=1688
In a handful of minutes, he shows exactly how to:
Link pipeline variables to Key Vault secrets, so that when accessed, the variables do a get() from Key Vault and return that value.
Securely link pipeline variables to Azure environment variables.
Have the console app, as a step in the release pipeline, read the Azure environment variables to get credentials to do stuff in Azure.
In his case, he created an Azure resource group.
In your case, if I'm understanding correctly, you could make a simple console app that runs in the pipeline, gets credentials/connection strings for your database, and uses them to test your scripts (see the sketch below).
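A minimal sketch of such a console app, assuming the pipeline maps the secret into an environment variable named DB_CONNECTION_STRING (a hypothetical name):

// Console app run as a pipeline step; DB_CONNECTION_STRING is assumed to be
// mapped from the Key Vault-backed pipeline variable.
using System;

class Program
{
    static void Main()
    {
        var connectionString =
            Environment.GetEnvironmentVariable("DB_CONNECTION_STRING");

        if (string.IsNullOrEmpty(connectionString))
            throw new InvalidOperationException("DB_CONNECTION_STRING is not set.");

        // Open a connection here and run the test scripts against the database.
        Console.WriteLine("Connection string received; running database tests...");
    }
}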
I have YAML/JSON files, and we have the base server endpoint defined as seen in the screenshot below.
How do we filter only the respective base URL for a specific environment?
For instance:
Server: dev files should be deployed to the DEV environment, Stage files should be deployed to the Stage environment, and so on.
Note: I'm using Azure Pipelines for deployment.
In your current situation, the DevOps pipeline does not have a built-in function/option to do this. We recommend creating a new Generic service connection per environment and using it in the corresponding deploy steps.
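Alternatively, a minimal sketch of scoping the base URL per stage with stage-level variables (stage names and URLs are hypothetical):

# Each stage carries its own baseUrl, so the same deploy steps pick up
# the endpoint for that environment.
stages:
- stage: Dev
  variables:
    baseUrl: 'https://dev.example.com/api'
  jobs:
  - job: Deploy
    steps:
    - script: echo "Deploying against $(baseUrl)"
- stage: Stage
  variables:
    baseUrl: 'https://stage.example.com/api'
  jobs:
  - job: Deploy
    steps:
    - script: echo "Deploying against $(baseUrl)"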
I am new to Azure Data Factory v2 and have a few questions about transforming connection strings / linked services when deploying to multiple environments.
Coming from an SSIS background:
we used to define connection strings as project parameters, which allowed transforming the connection string when deploying the artifacts onto different environments.
How can I accomplish the same using Azure Data Factory v2?
Is there an easy way to do this?
I was trying to set up linked services with connection strings as parameters, which could then be passed along with the triggers. Is this feasible?
This feature is now available; see the URL below. Are you the one who requested the feature? :)
https://azure.microsoft.com/en-us/blog/parameterize-connections-to-your-data-stores-in-azure-data-factory/
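In that style, a linked service can declare parameters and use them inside typeProperties; a minimal sketch (the serverName/dbName parameter names are placeholders):

{
    "name": "AzureSqlDatabase",
    "properties": {
        "type": "AzureSqlDatabase",
        "parameters": {
            "serverName": { "type": "String" },
            "dbName": { "type": "String" }
        },
        "typeProperties": {
            "connectionString": {
                "type": "SecureString",
                "value": "Server=tcp:@{linkedService().serverName}.database.windows.net,1433;Database=@{linkedService().dbName};"
            }
        }
    }
}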
Relating to SSIS (where we would use .dtsconfig configuration files for deployment to different environments), for ADF v2 (and ADF v1 too) we could look into using ARM templates: for every environment (dev, test and prod) a deployment file (.json) can be made, and the deployments scripted with PowerShell. It is possible to use ARM template parameters to parameterize connections to linked services and other environment-specific values. There are also ADF v2-specific PowerShell cmdlets for the creation/deployment of ADF v2 pipelines; a sketch of such a scripted deployment follows below.
You can also use PowerShell to parameterize connections to linked services and other environment-specific values.
With the ADF v2 UI, VSTS Git integration is possible, and so are deployment and integration. VSTS Git integration allows you to choose a feature/development branch, or create a new one, in the VSTS Git repository. Once the changes are merged with the master branch, they can be published to the data factory using the ADF v2 UI.
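A minimal sketch of such a scripted deployment, assuming an exported ARM template with one parameter file per environment (file and resource group names hypothetical):

# Deploy the exported ADF ARM template with the dev parameter file.
New-AzResourceGroupDeployment `
    -ResourceGroupName "rg-adf-dev" `
    -TemplateFile "arm_template.json" `
    -TemplateParameterFile "arm_template_parameters.dev.json"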
I ended up solving this issue by setting up an Azure Key Vault per environment, each having a connection string secret (more details here: https://learn.microsoft.com/en-us/azure/data-factory/store-credentials-in-key-vault):
- dev
  - dev-azure-datafactory
  - dev-key-vault
    - key: db-conn-string
      value: dev-db.windows.net
- qa
  - qa-azure-datafactory
  - qa-key-vault
    - key: db-conn-string
      value: qa-db.windows.net
- production
  - prod-azure-datafactory
  - prod-key-vault
    - key: db-conn-string
      value: prod-db.windows.net
In Azure Data Factory:
Define an Azure Key Vault linked service.
Use the Azure Key Vault linked service while defining the connection string(s) for other linked services (see the sketch below).
This approach removes any changing of parameters in the actual linked service.
The connection string held in Azure Key Vault can be changed as part of your Azure Pipelines deployment (more details here: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment).
Each Azure Data Factory can be given access to its Azure Key Vault using MSI (we automated this with Terraform in our case).
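A minimal sketch of a linked service whose connection string is pulled from the vault (the Key Vault linked service name EnvKeyVault is hypothetical; db-conn-string matches the secret above):

{
    "name": "AzureSqlDatabase",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            "connectionString": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "EnvKeyVault",
                    "type": "LinkedServiceReference"
                },
                "secretName": "db-conn-string"
            }
        }
    }
}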
We need to create a Key Vault and populate it with secrets (or generate them), then reference them as passwords for SQL servers (PaaS) in the next build step. What would be the best approach to do that?
There are many ways to create/update an Azure Key Vault, such as Azure PowerShell, the Azure CLI, and the REST API; there are also Azure PowerShell and Azure CLI tasks in VSTS build/release. So do it with Azure PowerShell or the Azure CLI (see the sketch below).
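A minimal sketch with the current Az PowerShell module (vault, resource group, and secret names hypothetical):

# Create the vault and store a generated SQL password as a secret.
New-AzKeyVault -Name "my-env-kv" -ResourceGroupName "my-rg" -Location "westeurope"
$password = ConvertTo-SecureString "P@ssw0rd-$(Get-Random)" -AsPlainText -Force
Set-AzKeyVaultSecret -VaultName "my-env-kv" -Name "sql-admin-password" -SecretValue $password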
To create/update a variable in the build/release, you can use a logging command (##vso[task.setvariable]value); the variable can then be used in subsequent tasks.
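A minimal sketch of a PowerShell step that reads the secret back and hands it to later tasks (vault and variable names hypothetical):

# Fetch the secret, then expose it to subsequent tasks as a secret variable.
$password = Get-AzKeyVaultSecret -VaultName "my-env-kv" -Name "sql-admin-password" -AsPlainText
Write-Host "##vso[task.setvariable variable=SqlPassword;issecret=true]$password"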
On the other hand, if you just want the variable to be secret, you only need to add a build or release variable and click the lock icon to mark it as secret.
I'm deploying Azure app services with Git continuous deployment and using post-deployment action hooks to log the deployment to a Slack channel. My action hooks are written as PowerShell scripts.
From within my PowerShell scripts, how do I access Azure or Kudu environment variables or app settings? It's clear how to do this via deploy.cmd, but I'm having no luck from PowerShell.
Ideally I'd like to be able to access things like:
Azure app service name
Deployment slot name
Deployment source/target paths
App settings and/or connection strings
OK, figured this out: apparently all of the Azure environment variables available within your web app service are available to PowerShell scripts running as post-deployment actions.
To get the site name within PowerShell:
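# Kudu exposes App Service settings and site metadata as process environment variables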
$siteName = [environment]::GetEnvironmentVariable("WEBSITE_SITE_NAME");
In addition to site name there are dozens of other Azure environment variables plus your app settings and connection strings.
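For instance, app settings surface with an APPSETTING_ prefix and connection strings with prefixes such as SQLAZURECONNSTR_ or CUSTOMCONNSTR_; a short sketch (the setting and connection string names are hypothetical):

$slotName  = $env:WEBSITE_SLOT_NAME            # deployment slot, if any
$mySetting = $env:APPSETTING_MySetting         # app setting "MySetting"
$myConnStr = $env:SQLAZURECONNSTR_MyDb         # SQL Azure connection string "MyDb"

# List everything the Kudu process exposes:
Get-ChildItem env: | Sort-Object Name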