How to configure Azure DevOps Pipeline decorators to run pre-tasks in classic pipelines?

We have a custom Azure DevOps extension that injects SonarQube pipeline tasks into every definition using the Pipeline Decorator feature. These tasks are a mixture of both pre and post tasks.
In YAML-defined pipelines the tasks run perfectly; in Classic pipeline definitions, however, only the post tasks run, even though the classic and YAML pipelines are defined identically (steps, agents, demands, variables, etc.).
As this is a relatively new feature of Azure DevOps, there is a lack of documentation, especially regarding classic pipelines.
Is there something that we could possibly be missing for this to happen?

Is there something that we could possibly be missing for this to happen?
This seems to be an issue on our side, and it only affects the SonarCloud/SonarQube prepare task when it is applied through a decorator.
As you know, a decorator uses a YAML template to define the steps to be inserted at the specified location, and on our backend this template file is processed through the YAML template engine.
By design, after you enable pipeline decorators at the organization level, the Initialize job calls a backend class to get the JobContext, which adds decorator providers to the JobContext. The JobContext then uses these providers to fetch contributions that add pre/post tasks to the job while the job is being prepared to run.
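For context, the steps a decorator injects live in a YAML template referenced from the extension manifest. A minimal sketch of such a template (the task version, inputs, and service connection name here are illustrative placeholders, not taken from the original post):

    # my-decorator.yml - referenced by a contribution in the extension
    # manifest targeting ms.azure-pipelines-agent-job.pre-job-tasks
    # (or ...post-job-tasks for post tasks)
    steps:
      - task: SonarQubePrepare@5               # illustrative task/version
        displayName: Injected SonarQube prepare (pre-job)
        inputs:
          SonarQube: 'SonarQubeConnection'     # hypothetical service connection
          scannerMode: 'MSBuild'
          projectKey: '$(Build.Repository.Name)'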
However, the Sonar prepare task cannot be detected by the engine and injected into the JobContext. The reason I point to this specific task is that, so far, this abnormality only exists in the prepare task of SonarCloud and SonarQube.
Our team will investigate and work on a fix together with the Sonar team.
For now, there are two workarounds you could consider.
Workaround 1:
As mentioned above, the prepare task cannot be detected and injected into the JobContext automatically. The first workaround is therefore to add the task info into the JobContext yourself, by adding the prepare task to the agent job.
The disadvantage is that two prepare tasks will be loaded: one executes in the pre-job phase, and the other runs again after it.
Workaround 2:
Use YAML to build your pipeline until we fix this abnormality, so that the missing prepare task does not cause an error.
I will update the status here once we have any progress.

Related

Is there any way to simulate dynamic YAML pipelines in Azure DevOps?

What I'm thinking about is to have a step in the pipeline to generate a full-blown pipeline to run afterwards.
Apparently this particular thing is not there yet (feature requests here, here). But maybe somebody has some fresh thoughts on workarounds?
Not really. It's a pain that's just a fact of life when working with YAML pipelines. It's especially annoying when trying to work through runtime vs compile time variable resolution issues.
Commit, run, commit, run, commit, run, over and over.
For the dynamic work you could set up a repo with one dummy yaml file and a pipeline registration that targets this yaml file.
From a static pipeline that is responsible for kicking off a dynamic pipeline you then execute two steps:
Create a fresh branch of that "dynamic" yaml file and commit the required dynamic workload
Not sure about branch limits though. You could also decide to reuse a branch.
Kick off this "dynamic" pipeline through az devops cli using the static pipeline's access token
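Roughly, the kickoff could look like this (a sketch only; the pipeline name DynamicPipeline, the generated file name, and the organization/project URLs are made-up placeholders):

    steps:
      - checkout: self
        persistCredentials: true   # lets the script below push with the job's token
      - script: |
          git checkout -b dynamic/$(Build.BuildId)
          # ...generate the dynamic workload into azure-pipelines.dynamic.yml here...
          git add azure-pipelines.dynamic.yml
          git -c user.name=pipeline -c user.email=pipeline@example.com \
            commit -m "Dynamic workload for build $(Build.BuildId)"
          git push origin dynamic/$(Build.BuildId)
        displayName: Commit generated pipeline to a fresh branch
      - script: >
          az pipelines run --name DynamicPipeline
          --branch dynamic/$(Build.BuildId)
          --org https://dev.azure.com/MyOrg --project MyProject
        displayName: Kick off the dynamic pipeline
        env:
          AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)   # the static pipeline's access token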
Also see the following documentation:
https://learn.microsoft.com/en-us/azure/devops/cli/azure-devops-cli-in-yaml?view=azure-devops

Is it possible to run Azure DevOps pipeline tasks in for loop using Classic UI?

I have a pipeline that was configured using the UI (not directly using a YAML file). I have many different configuration variants, and for each of these I have to run the same separate task in this pipeline. I found that it is possible using a YAML file but could not find any information about the UI approach. I thought that I might be able to run a PowerShell task and run the other tasks from there.
As of this time, however, running tasks in a for loop is not supported in Classic UI pipelines. You need to add the tasks multiple times to let them run multiple times.
The good news is that you can use task groups to simplify this process.
A task group allows you to encapsulate a sequence of tasks, already defined in a pipeline, into a single reusable task that can be added to a pipeline, just like any other task. You can choose to extract the parameters from the encapsulated tasks as configuration variables, and abstract the rest of the task information.
Click Task groups for builds and releases for detailed steps.
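For comparison, the YAML approach the question mentions is a compile-time ${{ each }} loop over a parameter; a minimal sketch (the parameter and script names are invented for illustration):

    parameters:
      - name: configurations
        type: object
        default: [debug, release, profiling]

    steps:
      # The loop is unrolled at template-expansion time into one task per configuration
      - ${{ each config in parameters.configurations }}:
          - script: ./run-variant.sh ${{ config }}   # hypothetical script
            displayName: Run task for ${{ config }}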

Deployment configured as YAML as part of a Pipeline

We have been using a YAML file to do our CI in Azure DevOps for a few months with the idea that we would get our release configured using YAML in the future.
Well, that time is now, and I'm confused by how you would introduce a CD process. With MyProject-CI.yml being a Build Pipeline and our Releases being Classic Pipelines, I assumed that when the time came to get the CD process down as YAML we would create a MyProject-CD.yml, triggered by the dropping of an artifact within the MyProject-CI.yml CI.
However, I think that was just a misunderstanding on my part, and what we are supposed to do is convert the original MyProject-CI.yml into a multi-stage pipeline with the following stages:
Build and Run Unit Tests
Deploy to Development and run WebTests
Deploy to Production and run WebTests
Is the switch to a multi-stage CI/CD pipeline in one file correct, rather than Release and Build in separate files?
The short answer is yes, you got the idea. A single multi-stage pipeline yml is the way to do both build and deploy, and that is the intended design. Here is an exercise that parallels your case and might help.
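A skeleton of such a single-file pipeline could look like this (a sketch; the stage, environment, and script contents are placeholders):

    trigger:
      - main

    stages:
      - stage: Build
        jobs:
          - job: BuildAndUnitTest
            steps:
              - script: echo build and run unit tests   # placeholder build/test steps
              - publish: $(Build.ArtifactStagingDirectory)
                artifact: drop

      - stage: DeployDev
        dependsOn: Build
        jobs:
          - deployment: DeployToDevelopment
            environment: Development        # hypothetical environment
            strategy:
              runOnce:
                deploy:
                  steps:
                    - script: echo deploy to Development and run WebTests

      - stage: DeployProd
        dependsOn: DeployDev
        jobs:
          - deployment: DeployToProduction
            environment: Production         # hypothetical environment
            strategy:
              runOnce:
                deploy:
                  steps:
                    - script: echo deploy to Production and run WebTests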
As your pipelines get more complex, you will likely get into scenarios with multiple files, as you can template parts of your pipeline for reuse in multiple places, or to enforce conventions from a central location.

Azure pipelines. Repeat tasks without project rebuilding

I use an Azure pipeline to build my solution.
After the build, I need to generate and upload multiple packages with different assets. Packing is implemented as a number of additional tasks in my agent job.
But I need the ability to generate either only selected packages or all packages, based on specified arguments.
What is the best way to achieve this?
Ideally, there would be two pipelines. The first is the automatic project build. The second should use the result of the first and be able to be started manually, repeatedly, with the desired parameters, so as to avoid rebuilding the project. But I do not know how this can be implemented.
Not sure if I understand the question correctly, but two possible answers would be:
Classic pipelines, with a build pipeline for the project build and a release pipeline for uploading the artifact(s) that the build generates, if using a release pipeline is applicable. The release pipeline can have a CD trigger for the first run and be redeployed manually after that. If you need to change release variables for subsequent deployments, you can create a new release with the same build artifact.
Multi-staged pipeline, with build and upload as different stages, manually redeploy/rerun the upload stage when needed. Build phase generates deployable pipeline artifact(s).
Somehow I think you're looking for a more elaborate solution, as you state that you're already using pipelines. So how about creative use of conditional tasks (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml) using predefined variables like Build.Reason (with the value 'Manual') to exclude the execution of certain tasks in some runs? Then group the tasks you want to reuse into either build templates or task groups. A minimal sketch of that idea is below.
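In this sketch the packagesToBuild variable and the scripts are invented for illustration; on a manual run you would set the variable when queueing the build:

    steps:
      - script: ./build.sh
        displayName: Build solution
        # Skip the rebuild when the run was queued manually
        condition: ne(variables['Build.Reason'], 'Manual')

      - script: ./pack.sh assets-a
        displayName: Pack assets A
        # Run on automatic builds, or on manual runs that request this package
        condition: or(ne(variables['Build.Reason'], 'Manual'), contains(variables['packagesToBuild'], 'assets-a'))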

How to test Concourse pipelines

My team has multiple Concourse pipelines and as we refactor tasks, we've realized the need to test our actual pipelines.
We already test our tasks by using environment variables enabling task scripts to be run locally, but the pipeline yaml is another matter.
What is the best way to accomplish testing of the pipeline itself?
You can use the Concourse Pipeline Resource to monitor the git repository where you keep your pipeline config. Whenever the pipeline resource detects a change, it will automatically run a fly set-pipeline to update the config in your running Concourse installation. From there, it's easy to script tests against the updated pipeline that is now running in your Concourse installation.
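A minimal sketch of that setup (the resource image, target URL, repo, and credential names are assumptions based on the resource's documented usage, not from the original answer):

    resource_types:
      - name: concourse-pipeline
        type: docker-image
        source:
          repository: concourse/concourse-pipeline-resource

    resources:
      - name: pipeline-config
        type: git
        source:
          uri: https://github.com/my-org/pipelines.git   # hypothetical repo
      - name: running-pipelines
        type: concourse-pipeline
        source:
          target: https://concourse.example.com          # hypothetical Concourse URL
          teams:
            - name: main
              username: ((ci-username))
              password: ((ci-password))

    jobs:
      - name: update-pipeline
        plan:
          - get: pipeline-config
            trigger: true                 # fires on every config change
          - put: running-pipelines        # effectively runs fly set-pipeline
            params:
              pipelines:
                - name: my-pipeline
                  team: main
                  config_file: pipeline-config/pipeline.yml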
fly validate-pipeline is pretty useful; running it against pipelines before merging has caught a few bugs in "obviously correct" changes for me.
If you want to test the whole pipeline before merging you need to make sure that the data it's using is static and working (no sense in failing the pipeline if it's the repo that's broken), and that there are no side effects (like notifications) shared between the 'real pipeline' and the 'test pipeline'. I suspect that as long as you're careful with the restrictions, you could make it work, but it would have to be designed in the context of your existing pipelines and infrastructure.