I'm following the new CI/CD guide for ADF: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment-improvements
I then publish the ARM templates generated by the npm export pipeline to my ADF Dev factory using the Azure Resource Group ARM template deployment described here: https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment#script
Looks like this:
- task: AzureResourceGroupDeployment@1
  displayName: 'Azure Deployment: Create Or Update Resource Group action on adf-dev-rg'
  inputs:
    ConnectedServiceName: 'guycarpenter-privatenonprod-Contributor'
    resourceGroupName: 'gc-adf-nasa-prinonprod-dev-rg'
    location: 'East US 2'
    csmFile: '$(Agent.BuildDirectory)/ARMTemplate/ARMTemplateForFactory.json'
    csmParametersFile: '$(Agent.BuildDirectory)/ARMTemplate/ARMTemplateParametersForFactory.json'
After I deploy the new ARM template to my ADF Dev factory, the factory's Git repo configuration gets disconnected.
How should I publish the new ARM template to ADF Dev without disconnecting the repo?
Edit:
I also found that setting includeFactoryTemplate=false avoids the disconnection, but I need it set to true to parameterize ADF for the other environments.
Edit #2:
This solved the problem: https://stackoverflow.com/a/56863897/13570809
There is a known UserVoice request about this:
Retain GIT configuration when deploying Data Factory ARM template
You can vote for this request and follow the feedback there.
Jason replied:
This has been implemented by the repoConfiguration properties in the Azure Resource Manager template for the Data Factory resource. See here for reference: https://learn.microsoft.com/en-us/azure/templates/microsoft.datafactory/2018-06-01/factories
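For illustration, here is a minimal sketch of a factory resource with repoConfiguration set; the organization, project, repository, and branch values are placeholders, not values from this thread:
{
    "type": "Microsoft.DataFactory/factories",
    "apiVersion": "2018-06-01",
    "name": "[parameters('factoryName')]",
    "location": "[parameters('location')]",
    "properties": {
        "repoConfiguration": {
            "type": "FactoryVSTSConfiguration",
            "accountName": "my-devops-org",
            "projectName": "MyProject",
            "repositoryName": "adf-dev",
            "collaborationBranch": "main",
            "rootFolder": "/"
        }
    }
}
With this property present in the deployed template, the Git configuration should survive the deployment.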
I would like to know how to consume or call Terraform modules that live in a project in one organization from a pipeline in a project in another organization using Azure DevOps. I explored several approaches and found one solution, shown below, but my IT team won't let me use it because it breaks subsequent pipelines. Any suggestions, please?
Also, the requirement is that I only need to reference the Terraform modules that are in the other organization, but in my POC the pipeline downloads/checks out the code from that organization/project before I can reference the modules. I would like to just reference the modules instead of checking out the code from the other organization.
Below is the reply from the pipeline team:
Can you exclude this part as it is not ideal and you need to take a different approach?
echo "Git config update start"
MY_PAT=$(yourPAT)
B64_PAT=$(printf "%s"":$MY_PAT" | base64)
git config --global http.extraheader "Authorization: Basic ${B64_PAT}"
echo "Git config update end"
terraform init
terraform plan
You are introducing your credentials into .gitconfig, and that breaks all subsequent pipelines on the agent.
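One way to satisfy that request (a sketch, not part of the original reply) is to scope the auth header to the current process using Git's per-process config variables (available in Git 2.31+), instead of writing it to the agent's global .gitconfig:
echo "Git config update start"
MY_PAT=$(yourPAT)
B64_PAT=$(printf "%s"":$MY_PAT" | base64)
# Exported only for this shell: git invocations spawned by terraform inherit
# these variables, and nothing is persisted on the agent afterwards
export GIT_CONFIG_COUNT=1
export GIT_CONFIG_KEY_0="http.extraheader"
export GIT_CONFIG_VALUE_0="Authorization: Basic ${B64_PAT}"
terraform init
terraform plan
echo "Git config update end"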
POC: The code below clones the entire modules repository from the other organization and we then reference those modules, but I just need to reference the modules directly instead of downloading the code and then calling/referencing the modules.
resources:
  repositories:
  - repository: Modules
    type: git
    name: 'Compute Platforms/CES-Terraform-Automation-Service'
    endpoint: Repo-bp-digital # Azure DevOps service connection
    ref: Modules
  - repository: self
    type: git
    name: 'Cloud Onboarding/terraform-testing-by-vivek'
AFAIK, there's only one option for connecting to a project in another Azure DevOps organization: create a service connection in the organization from which you want to run the pipeline, backed by a PAT token created in the target organization and referenced in the service connection.
I created two organizations, alpha1 and beta2, and a project in each with one YAML script and a task.
I created a PAT token in organization beta2.
Then I created a service connection in the alpha organization, from where I am running the pipeline, referencing the PAT token from the beta organization, like below:
trigger:
- master

variables:
  pythonVersion: '3.8'
  vmImageName: 'ubuntu-latest'

resources:
  repositories:
  - repository: remoteRepo
    type: git
    name: remote-access/shared-common-install
    endpoint: remoteaccesstemp # Service connection name
    ref: refs/heads/main

stages:
- stage: remote_git_test
  jobs:
  - job: git_test
    steps:
    # Run the template from the same repository
    - template: templates/hello-alpha.yaml
    # Check out the remote repository
    - checkout: remoteRepo
      persistCredentials: true
    # Call the template located in the repository in the other organization
    - template: templates/hello-beta.yaml@remoteRepo
Alternatively, you can create a Terraform task in Azure DevOps and call your Terraform module from the other organization with the script below:
terraform init -backend-config="repository=organization-beta2/project-beta2/_git/beta-2" -backend-config="token=Pat-token"
and
provider "azuredevops" {
  org_service_url       = var.org_service_url
  personal_access_token = var.personal_access_token
}
You can add this code to your terraform init script in the repository of the organization from which you're running the pipeline, and reference the template in System.Artifacts.
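If the goal is only to reference the modules, a hedged alternative is to point the module source at the other organization's repo URL directly, so no pipeline checkout of that organization is needed (the subfolder and ref below are hypothetical):
module "compute" {
  # terraform init clones just this repository as a module source
  source = "git::https://dev.azure.com/organization-beta2/project-beta2/_git/beta-2//modules/compute?ref=main"
}
Note that Terraform still needs Git credentials for that clone, which is where the http.extraheader configuration discussed above comes in.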
Even the Azure DevOps REST API does not support connecting to a different Azure DevOps organization.
References:
GitHub - Azure-Samples/azure-pipelines-remote-tasks
Trying to setup an Azure DevOps organization using Terraform (my tech ramblings, by Carlos)
Azure DevOps Git: Fork into another Repo using Azure DevOps REST API (Stack Overflow, by Andi Li-MSFT)
Hello everyone. I'm trying to automate the deployment of our scripts, but I'm new to Azure DevOps and I don't know where to start.
I want to create a pipeline so that every time new code is pushed to the master branch, it is automatically deployed to the destination server.
Here is an example:
We have an instance of Azure DevOps running on one of our servers (server1); this is where our script repo lives. Once code is merged into the master branch, the pipeline should deploy the scripts to e:\scripts on server2.
The repository contains only PowerShell scripts, and we just need to move the files from the repo to the destination server.
The servers run Windows and the Azure DevOps version is Dev17.M153.5.
There are many ways to copy files onto a target machine.
Start with a pipeline that is triggered when files under a specific path in the repository change:
trigger:
  branches:
    include:
    - develop
  paths:
    include:
    - the/nameof/thefolder/withyour/scripts
Next, you'll need to specify which build agent to use. The build agent is responsible for running the pipeline; it can be on the same machine as your Azure DevOps Server, but not always. Check "Settings" (bottom-left) -> "Pipelines > Agent Pools" to get the name of your pool.
pool:
  name: 'Our Build Agents'
Next, the build agent will need to be able to talk to the machine. There are several factors to consider:
Network: Make sure you don't have any firewall rules between your build agent and the target machine.
Permissions: Make sure you have the correct permissions to write files to the remote machine. Ideally, the user account your build agent uses should be an administrator of the target machine.
There are several options for copying files, but the easiest is the built-in CopyFiles task.
steps:
- task: CopyFiles@2
  inputs:
    sourceFolder: $(Build.SourcesDirectory)/the/nameof/thefolder/withyour/scripts
    targetFolder: \\MACHINENAME\E$\Scripts
Other options:
running a PowerShell/bash script to copy the items
rsync, robocopy, etc.
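For example, a minimal sketch of the script-based option, assuming the agent's account can write to the E$ administrative share on server2:
steps:
- powershell: |
    # Copy the scripts folder from the checked-out repo to the remote share
    Copy-Item -Path "$(Build.SourcesDirectory)\the\nameof\thefolder\withyour\scripts\*" `
              -Destination "\\server2\E$\Scripts" -Recurse -Force
  displayName: 'Copy scripts to server2'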
In an Azure DevOps multi-stage YAML pipeline, I defined two repository resources, Demo2 and Demo3, under the resources section. I want to access the changes that happen between builds for the Demo2 and Demo3 repos. On the pipeline summary page there is a "view changes" option that shows the commits from each repo, and I am trying to get those details via the REST API.
I tried to find the details in the Azure DevOps REST API docs and the az devops CLI but couldn't find anything helpful, so I'm reaching out here for help.
resources:
  repositories:
  - repository: Demo2
    type: git
    name: 'Test/Repo2'
  - repository: Demo3
    type: git
    name: 'Test/Repo3'

trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- checkout: Demo2
- checkout: Demo3
- script: echo Hello, world!
  displayName: 'Run a one-line script'
I am afraid there is no documented REST API to get the details behind the "view changes" option.
But we can use F12 (browser developer tools) to grab the URL it calls:
https://dev.azure.com/{organization}/{project}/_traceability/runview/changes?currentRunId={Build Id}
The response comes back as HTML; after converting it to JSON we can get some info about the commits.
If we need to check the content of a commit, we can use the Commits - Get REST API to get the details.
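For example, a minimal sketch of calling Commits - Get from a pipeline step; the repository ID and commit ID placeholders would come from the changes response above:
steps:
- script: |
    curl -u ":$(System.AccessToken)" \
      "https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}?api-version=6.0"
  displayName: 'Get commit details via Commits - Get'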
I am using the Bot Framework Virtual Assistant template to create and configure a bot in Azure.
For this process I have an ARM template in place for creating the resources,
and a deploy PowerShell script (Deploy.ps1) that creates the knowledge base once the QnA Maker resources are created.
In the current implementation, if I execute the script from a local PowerShell session, everything works fine:
Creating resources
Creating the knowledge base
Knowledge base configuration
I am stuck at configuring this setup in Azure DevOps. How do I configure the ARM deployment and the PowerShell script execution in a CI/CD pipeline,
so that once the resources are created through the ARM deployment, the knowledge base creation is triggered automatically?
Any help is appreciated.
First you need to put the ARM template in a source repository (GitHub or Azure Repos). See the document Create a new Git repo in your project.
Then create the pipeline (YAML or Classic). See a YAML example here; for a Classic UI pipeline, check out this example.
Before you can deploy to your Azure subscription, you need to create an Azure Resource Manager service connection to connect your Azure subscription to Azure DevOps. See this thread for an example.
In your pipeline, use the ARM template deployment task to deploy the ARM template and the Azure PowerShell task to execute the deploy PS script. See the example below:
trigger:
- master

pool:
  vmImage: windows-latest

steps:
- task: AzureResourceManagerTemplateDeployment@3
  displayName: 'ARM Template deployment: Subscription scope'
  inputs:
    azureResourceManagerConnection: 'my-azure-sub'
    resourceGroupName: 'azure-resource-group'
    location: 'West Europe'
    csmFile: '**/template.json'
    csmParametersFile: '**/parameter.json'
    deploymentMode: Incremental

- task: AzurePowerShell@5
  displayName: 'Azure PowerShell script: FilePath'
  inputs:
    azureSubscription: 'my-azure-sub'
    ScriptPath: '**/Deploy.ps1'
    azurePowerShellVersion: LatestVersion
See this tutorial for more information.
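If Deploy.ps1 needs values from the ARM deployment (for example, the QnA Maker resource name), the ARM task can expose the template outputs through its deploymentOutputs input; a sketch, assuming your template declares outputs (qnaMakerName below is a hypothetical output name):
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    azureResourceManagerConnection: 'my-azure-sub'
    resourceGroupName: 'azure-resource-group'
    location: 'West Europe'
    csmFile: '**/template.json'
    deploymentMode: Incremental
    deploymentOutputs: 'armOutputs' # template outputs captured as a JSON pipeline variable

- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'my-azure-sub'
    ScriptType: InlineScript
    Inline: |
      # Parse the ARM outputs and hand the values to the knowledge base script
      $outputs = '$(armOutputs)' | ConvertFrom-Json
      Write-Host $outputs.qnaMakerName.value # hypothetical output
    azurePowerShellVersion: LatestVersion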
I'll put my questions into the following points; hope it's clearer now:
The application source code is in the application_code repo.
The pipeline code (YAMLs) is in the pipeline_code repo, because I'd like to version it and don't want to keep it in the application_code repo, to avoid giving the dev team control over it.
Problem statement:
The pipeline YAML won't be triggered unless it's in the source code repository, based on events such as PRs and commits.
Can we trigger or execute a YAML file that is in pipeline_code whenever an event fires in the application_code repo?
I've tried achieving this with a Classic pipeline and a YAML template, but they don't work together, as I can execute a YAML template from a YAML pipeline only, not from a Classic pipeline, like below:
# azure-pipeline.yaml
jobs:
- job: NewJob
- template: job-template-bd1.yaml
Any ideas or a better solution than the above?
The feature Multi-repository support for YAML pipelines will be available soon for Azure DevOps Service. It will support triggering pipelines based on changes made in one of multiple repositories. Please check the Azure DevOps Feature Timeline or here; the feature is expected to roll out in 2020 Q1 for Azure DevOps Service.
For now, you can follow the workaround below using build completion triggers (a pipeline is triggered on the completion of another build).
1. Set up the triggering pipeline
Create an empty Classic pipeline for the application_code repo as the triggering pipeline; it will always succeed and do nothing.
Check Enable continuous integration under the Triggers tab and set up the branch filters.
2. Set up the triggered pipeline
In the pipeline_code repo, use checkout to check out multiple repositories in your pipeline. You can specifically check out the source code of the application_code repo to build. Please refer to the example below:
steps:
- checkout: git://MyProject/application_code_repo@refs/heads/master # Azure Repos Git repository in the same organization
- task: TaskName
  ...
Then, on the YAML pipeline edit page, click the three dots in the top-right corner and click Triggers. Click +Add beside Build completion and select the triggering pipeline created in step 1 as the triggering build.
After finishing these two steps, when changes are made to the application_code repo, the triggering pipeline is executed and completes successfully, and the triggered pipeline is then triggered to run the real build job.
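For reference, once the multi-repository triggers feature mentioned above rolls out, this workaround can be replaced by a trigger declared directly on the repository resource in the pipeline_code repo's YAML; a sketch, assuming both repos live in the same project:
resources:
  repositories:
  - repository: app
    type: git
    name: MyProject/application_code_repo
    trigger:
    - master

steps:
# Check out the application repo whose change fired the trigger
- checkout: app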
Update:
Show Azure DevOps build pipeline status in Bitbucket.
You can add a Python script task at the end of the YAML pipeline to update the Bitbucket build status. You need to set condition: always() so this task runs even when earlier tasks fail.
You can get the build status from the Agent.JobStatus variable, which is exposed to scripts as the AGENT_JOBSTATUS environment variable. See the example below.
For more information, please refer to the document Integrate your build system with Bitbucket Cloud, and also this thread.
- task: PythonScript@0
  condition: always()
  inputs:
    scriptSource: inline
    script: |
      import os
      import requests

      # Agent.JobStatus is surfaced to scripts as AGENT_JOBSTATUS;
      # map it to the states the Bitbucket build-status API expects.
      state_map = {'Succeeded': 'SUCCESSFUL', 'SucceededWithIssues': 'SUCCESSFUL'}
      job_status = os.getenv('AGENT_JOBSTATUS', '')

      # Use the environment variables Azure DevOps provides for the key,
      # name, and url parameters, as well as the commit hash.
      data = {
          'key': os.getenv('BUILD_BUILDID'),
          'state': state_map.get(job_status, 'FAILED'),
          'name': os.getenv('BUILD_DEFINITIONNAME'),
          'url': '%s%s/_build/results?buildId=%s' % (
              os.getenv('SYSTEM_COLLECTIONURI'),
              os.getenv('SYSTEM_TEAMPROJECT'),
              os.getenv('BUILD_BUILDID')),
          'description': 'Azure DevOps build status'
      }
      # Construct the URL with the API endpoint where the commit status should be
      # posted (provide the appropriate owner and slug for your repo).
      api_url = ('https://api.bitbucket.org/2.0/repositories/'
                 '%(owner)s/%(repo_slug)s/commit/%(revision)s/statuses/build'
                 % {'owner': 'emmap1',
                    'repo_slug': 'MyRepo',
                    'revision': os.getenv('BUILD_SOURCEVERSION')})
      # Post the status to Bitbucket. (Include valid credentials here for basic auth.
      # You could also use a team name and API key.)
      requests.post(api_url, auth=('auth_user', 'auth_password'), json=data)