Azure DevOps: Why does a new pipeline commit the YAML file to the default branch?

I created a new pipeline in Azure DevOps, and created a new branch for it.
As a result, DevOps automatically committed the YAML file for the new pipeline to my 'development' branch.
None of the other pipelines I've created have YAML files committed into the repo...
Why does it do this?
Do we have to keep the YAML file there?
It has nothing to do with the source code of the application, so it doesn't seem to make sense why it's stored there.

The YAML file is code that describes how your application is built and deployed, so it is part of the source code. Putting it under source control lets you track version changes, as well as any changes to parameters or variables that are determined or inserted during the build process.
This is in contrast to the older way of doing things, where pipelines were updated via the UI rather than source control and did not benefit from peer reviews, branching and merging, and the additional policies that can be applied to a repo.
On top of this, YAML Pipelines for Releases went GA the other week, which makes YAML in the repo even more powerful: the same YAML files will not only build but also release code.

In Azure DevOps Services we define pipelines either using YAML syntax or through the user interface (Classic). So there are two kinds of pipelines: YAML pipelines and Classic UI pipelines (Classic build and release).
None of the other pipelines I've created have YAML files committed into the repo...
Why does it do this?
It's expected behavior when defining pipelines using YAML syntax: the pipeline is versioned with your code and follows the same branching structure.
One advantage of this: a change to the build process might cause a break or result in an unexpected outcome, and because the change is in version control with the rest of your codebase, you can more easily identify the issue.
To sum up, the YAML pipeline is added to version control by design. If you don't want this behavior, feel free to use Classic Build and Classic Release pipelines instead; that's also a good choice. For the differences between the two formats, check Feature availability. Hope it helps :)
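For illustration, here is a minimal sketch of what such a version-controlled pipeline file (conventionally azure-pipelines.yml at the repo root) might look like; the trigger branch and build step are placeholders, not from the question:

```yaml
# Minimal azure-pipelines.yml sketch; the trigger branch and build step are placeholders.
# Because this file lives in the repo, changes to it go through the same
# branching, pull request, and policy flow as the rest of the code.
trigger:
- development

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: echo "Building $(Build.SourceBranchName) at $(Build.SourceVersion)"
  displayName: 'Build (placeholder)'
```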

Related

Is there a pattern for including Release Pipelines in Azure Dev Ops via Source Control and Automation?

Our team uses Azure DevOps for our source control and release pipelines. The release pipelines are not included in our source control and are created through the ADO UI. These can be very complex; I exported one release and it was 7,500 lines of JSON. I would like these pipeline definitions to be in the same source control as our source code for easy maintenance and review.
I see that there are tools for exporting a pipeline definition to JSON and an option in the UI to create a pipeline by importing JSON. Is there a pattern that uses automation to leverage these import/export functions, so that a pipeline is updated based on the definition in my repo?
We are currently just using the UI to update the pipelines, without any additional version control or review controls. I'm expecting functionality like GitHub Actions, where the pipeline definition is contained within the repo and automation picks up the files.
If you want to version control your pipelines, I suggest you start using the YAML pipelines that exist in Azure DevOps.
These can handle both build and release pipelines. The "Releases" tab in the GUI becomes obsolete, as all pipelines (both build and release) instead show up in the "Pipelines" tab. For release pipelines, use the deployment job type, which gives you access to automatic artifact handling and environment management, as sketched below.
Edit: So to answer the actual question, I do not know of any way to version control the GUI (classic) pipelines. My strong recommendation is to migrate to YAML pipelines.
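As a rough sketch of the deployment job type (the stage, job, and environment names below are placeholders, not from the question):

```yaml
stages:
- stage: Deploy
  jobs:
  - deployment: DeployWeb          # deployment job type instead of a plain job
    environment: Production        # placeholder environment name
    strategy:
      runOnce:
        deploy:
          steps:
          # Pipeline artifacts from earlier stages are downloaded automatically
          # into $(Pipeline.Workspace) for deployment jobs.
          - script: echo "Deploying from $(Pipeline.Workspace)"
```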

Is there any way to simulate dynamic YAML pipelines in Azure DevOps?

What I'm thinking about is having a step in the pipeline that generates a full-blown pipeline to run afterwards.
Apparently this particular thing is not there yet (feature requests here, here). But maybe somebody has some fresh thoughts on workarounds?
Not really. It's a pain that's just a fact of life when working with YAML pipelines. It's especially annoying when trying to work through runtime vs compile time variable resolution issues.
Commit, run, commit, run, commit, run, over and over.
For the dynamic work you could set up a repo with one dummy YAML file and a pipeline registration that targets that file.
From the static pipeline that is responsible for kicking off a dynamic pipeline you then execute two steps:
Create a fresh branch of that "dynamic" YAML file and commit the required dynamic workload (not sure about branch limits, though; you could also decide to reuse a branch).
Kick off this "dynamic" pipeline through the az devops CLI using the static pipeline's access token, as sketched below.
Also see the following documentation:
https://learn.microsoft.com/en-us/azure/devops/cli/azure-devops-cli-in-yaml?view=azure-devops
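A hedged sketch of the second step, assuming the dynamic pipeline is registered as "Dynamic-Runner" and the fresh branch follows a made-up naming scheme (both names are illustrative only):

```yaml
steps:
- script: |
    az pipelines run \
      --name "Dynamic-Runner" \
      --branch "refs/heads/dynamic/$(Build.BuildId)" \
      --organization "$(System.CollectionUri)" \
      --project "$(System.TeamProject)"
  displayName: 'Kick off the dynamic pipeline'
  env:
    # AZURE_DEVOPS_EXT_PAT is the documented way to authenticate the az devops
    # CLI extension non-interactively; here we pass the job's own access token.
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)
```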

Azure DevOps Build Best Practices for Multiple YAML Files

I have a question on general best practices for Azure DevOps. When building a project, we have two build configurations, Debug and Release. At some point in the deployment pipeline across multi-stage environments, these need to change, which from what I understand means two builds.
Is it better to have one yml, with the build config set by a condition on the source branch (i.e. if the source branch contains "release" the config is Release, otherwise it is Debug), or should I have multiple builds and a pipeline triggered by two different artifacts?
It is much better to have one yml. By having only the one build pipeline and using a simple condition, you limit the possibility of bugs arising due to differences in the separate yml definitions. Additionally, it is more maintainable as changes are only made in one place.
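For example (the branch naming convention and the dotnet build step below are assumptions, not from the question), a single yml can pick the configuration at compile time with template expressions:

```yaml
variables:
  # Assumed convention: release/* branches build Release, everything else Debug.
  ${{ if startsWith(variables['Build.SourceBranch'], 'refs/heads/release') }}:
    buildConfiguration: 'Release'
  ${{ if not(startsWith(variables['Build.SourceBranch'], 'refs/heads/release')) }}:
    buildConfiguration: 'Debug'

steps:
- script: dotnet build --configuration $(buildConfiguration)
  displayName: 'Build $(buildConfiguration)'
```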

Deployment configured as YAML as part of a Pipeline

We have been using a YAML file to do our CI in Azure DevOps for a few months with the idea that we would get our release configured using YAML in the future.
Well, that time is now, and I'm confused about how to introduce a CD process. With MyProject-CI.yml being a build pipeline and our releases being Classic pipelines, I assumed that when the time came to express the CD process as YAML we would create a MyProject-CD.yml, triggered by the artifact dropped by the MyProject-CI.yml build.
However, I think that was just a misunderstanding on my part, and what we are supposed to do is convert the original MyProject-CI.yml into a multi-stage pipeline that has the following stages:
Build and Run Unit Tests
Deploy to Development and run WebTests
Deploy to Production and run WebTests
Is the switch to multi-stage CI/CD in one file correct, rather than keeping Build and Release in separate files?
The short answer is yes, you got the idea. A single multi-stage pipeline yml is the way to do both build and deploy, and that is the base intention. Here is an exercise that parallels your case that might help.
As your pipelines get more complex, you will likely get into scenarios with multiple files, as you can template parts of your pipeline for reuse in multiple places, or to enforce conventions from a central location.
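For reference, here is a rough skeleton of that single multi-stage file; the stage names, environment names, and steps are invented for illustration:

```yaml
stages:
- stage: Build
  jobs:
  - job: BuildAndUnitTest
    steps:
    - script: echo "build and run unit tests"    # placeholder

- stage: DeployDev
  dependsOn: Build
  jobs:
  - deployment: DeployAndWebTest
    environment: Development                     # placeholder environment
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deploy to Development and run WebTests"

- stage: DeployProd
  dependsOn: DeployDev
  jobs:
  - deployment: DeployAndWebTest
    environment: Production                      # placeholder environment
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deploy to Production and run WebTests"
```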

Building a previous release using the original Build Pipeline at the time of release

We are following the gitflow model in our project using Azure DevOps Services. I have a classic-editor-based pipeline which builds the Dev and Release/R1.0 branches.
I am going to set up a classic-editor-based pipeline which will build my Release R1.0 from the master branch after merging the Release/R1.0 branch at the end of the release. Let us say this classic-editor-based build pipeline is MyProduct-R1.0.
After the release, I will tag the master branch and delete the Release/R1.0 branch as per the gitflow model. However, I will retain the build pipeline MyProduct-R1.0.
My question is this: suppose that after Release R1.0, once the master branch has moved ahead, I want to build the master branch at the R1.0 tag. How do I do this using the MyProduct-R1.0 pipeline which originally built the R1.0 release?
I know this is possibly a confusing question, but I have tried my best to give it a shot.
Thanks,
Update 2: I think my question is more about the branch specification for my MyProduct-R1.0 release pipeline. I can't specify master, since master will evolve after Release R1.0. I can't specify Release/R1.0, since that branch will be deleted after the release as per the gitflow model. So what branch specification should I provide for my pipeline?
Use YAML builds. There's no mechanism for this with JSON ("classic editor") builds, since JSON builds are versioned separately from source code.
Like Daniel said, YAML is the best path. It handles this gracefully, as the build definition lives with the branch.
In the past, for GUI builds, when I dealt with build definition drift, instead of creating new build definitions I would use conditional steps tied to the branch being built, but that can be painful to go back and update and maintain.
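To illustrate the YAML route (the tag pattern below is an assumption), a pipeline definition checked in next to the code can trigger on release tags, and whatever tag or commit you run it against brings the matching pipeline definition with it:

```yaml
trigger:
  branches:
    include:
    - master
  tags:
    include:
    - 'R*'        # assumed release tag pattern, e.g. R1.0
```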
I think this is what will achieve what I am after: once I do my release and tag it as R1.0, I can always use my pipeline at a later date to build the same R1.0 source code from the master branch, using either its tag or the commit ID.
Note that Dev is currently the default branch for this pipeline, as it is not yet the release pipeline, which I have yet to create.
Am I missing something?