We are using templates for our deploy pipelines. We have three environments we want to deploy to. The problem we are facing is triggering the pipelines.
The ideal scenario would be to deploy automatically to the dev environment, where we run API-level functional tests after the deploy, and to deploy manually to UAT and production once we are satisfied with the manual and automated test results.
The deploy pipelines are the same, and we create them from a template with the environment name as a parameter. Is there a way to tell GoCD to conditionally start a pipeline built from a template (something like "if dev, start automatically, else start manually" in a meta language)?
If you can have the environment name as an environment variable instead of a parameter, you can trigger your deploy pipeline using the pipeline API. So you can keep your deploy pipeline in manual-trigger mode and have another pipeline that polls for changes and calls the pipeline API to trigger your dev-environment deployment automatically. For the UAT and production environments you can use the templatised pipeline with a manual trigger. I don't think we can control the trigger based on a parameter or an environment variable.
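For the polling pipeline's trigger step, the call is a plain authenticated POST to GoCD's pipeline-schedule endpoint. A minimal sketch (server URL, pipeline name, and credentials are placeholders — substitute your own):

```shell
# Sketch of triggering a GoCD pipeline via its API.
# Server URL and pipeline name are assumptions for illustration.
GOCD_SERVER="https://gocd.example.com"
PIPELINE="deploy-dev"
SCHEDULE_URL="$GOCD_SERVER/go/api/pipelines/$PIPELINE/schedule"

echo "POST $SCHEDULE_URL"

# Uncomment to actually trigger the pipeline (needs real credentials):
# curl -X POST "$SCHEDULE_URL" \
#      -u "user:password" \
#      -H 'Accept: application/vnd.go.cd.v1+json' \
#      -H 'X-GoCD-Confirm: true'
```

The dev pipeline itself stays in manual-trigger mode; only this API call starts it.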
We are currently switching over from Classic (UI) Azure Pipelines to YAML pipelines.
We would like to deploy a branch every night to an environment (when changes have occurred) and also be able to deploy to this environment manually. With the Classic pipelines it was easy to set up a scheduled deploy to an environment that could also be triggered manually when desired.
In the new YAML pipelines we currently use the "Business hours" check to make sure the nightly deploy takes place, but we are unable to override the check when a manual deployment is needed.
Is there a way to set up a YAML pipeline where a deploy waits for either a manual approval or a scheduled trigger to occur?
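One pattern that covers both cases is sketched below (stage, environment, and branch names are assumptions): a `schedules` cron trigger with `always: false` runs the deploy nightly only when changes have occurred, manual runs stay available through "Run pipeline", and an Approvals check on the environment provides the manual gate:

```yaml
# Sketch, assuming a "main" branch and an environment named "nightly-env".
schedules:
- cron: "0 2 * * *"          # 02:00 UTC every night
  displayName: Nightly deploy
  branches:
    include:
    - main
  always: false              # skip the run when nothing has changed

stages:
- stage: Deploy
  jobs:
  - deployment: DeployNightly
    environment: nightly-env   # add an Approvals check on this environment
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo "deploying..."
```

With this setup a manual "Run pipeline" bypasses the schedule entirely, so you are no longer fighting the "Business hours" check.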
Until recently I ran Selenium automated tests from the Azure Test Plans using a YAML pipeline for the Build part and a classic Release Definition for the Release part.
Now, the organization I work in does not allow the use of classic Release Definitions from Azure DevOps.
So I extended my YAML Build Pipeline with stages for running my automated tests on different environments.
But I also want to make use of the Azure Test Plans for running my tests.
My problem is that when I want to run a test from my Test Plan, the only possibility for selecting a Release pipeline is the classic Release Definition I have created.
Is there any way to trigger the run of my automated tests from Azure Test Plans, but use a YAML pipeline when selecting the stage?
Thank you!
I tried editing the settings of my Test Plan, but in order to select a stage I have to select a release pipeline, and the only options are the ones that exist in the Pipelines -> Releases window of Azure DevOps.
I have a multi-stage YAML pipeline using which I run my automated tests against different environments(Dev, Test, Pre-prod, etc). I wanted to run these automated tests directly from the test plan by configuring the build and release pipeline using Azure test plan settings.
But I couldn't see my release stages after selecting the build in the build pipeline dropdown. It works fine for my classic UI release pipeline, but not for the YAML multi-stage pipeline.
Does YAML multi-stage support this functionality?
This seems to rely on the "old" releases, but the steps and code it generates could easily be ported over to the new YAML pipelines. You'd have to do it by hand, though.
You could create a temporary "old style" pipeline, configure it to see everything that gets generated, and then duplicate those tasks in your YAML pipeline.
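As a sketch of what that porting might look like: this is the kind of VSTest task a classic release stage wired to a Test Plan typically generates, reproduced by hand in a YAML stage (the plan, suite, and configuration ids are placeholders):

```yaml
# Sketch of a hand-ported test stage; ids below are assumptions.
stages:
- stage: RunAutomatedTests
  jobs:
  - job: Selenium
    pool:
      vmImage: windows-latest
    steps:
    - task: VSTest@2
      inputs:
        testSelector: testPlan     # pull tests from a Test Plan
        testPlan: '123'            # assumed Test Plan id
        testSuite: '456'           # assumed Test Suite id
        testConfiguration: '1'     # assumed configuration id
```

Results still land in Azure Test Plans because the VSTest task publishes them against the selected plan and suite.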
I have four environments that I deploy to.
I also have four different code branches that we use to deploy code from.
We constantly switch the branches we use to deploy on these environments.
One time I want to build and deploy a daily branch on my test environment.
Later I want to build and deploy an enhancements branch on the same test environment.
Next I want to build and deploy the daily branch on my test2 environment.
I think you get the picture.
We are currently using a manual process to pull from the branch we want deployed, then zip it up and push it to AWS code deploy.
Using Azure DevOps Pipelines and Releases, what is the easiest way to switch between different branches on different environments?
I currently have a successful setup in Azure DevOps that performs a Gradle build, creates the artifact, and then lets me push it over to AWS CodeDeploy on one of my environments. I just can't figure out a way to easily switch the branch without creating tons of Azure pipelines and releases.
Thanks all!
When you manually trigger a build pipeline by clicking Queue or Run pipeline, a window is shown that allows you to switch branches.
If you want to automatically deploy different branches to different environments, you can push the build artifacts over to AWS CodeDeploy in a release pipeline and set branch filters. Refer to the steps below:
1. Set a branch filter in the build pipeline so that only the selected branches are built. Check the documentation on triggers for more information.
2. Create a release pipeline to push the build artifacts over to AWS CodeDeploy, and set the artifact filters, which allow only artifacts built from the specified branch to be deployed to the stage.
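If the build ever moves to YAML, the branch-filter part of step 1 can be sketched like this (branch names and artifact path are assumptions):

```yaml
# Sketch: build on pushes to either branch; the published artifact
# records Build.SourceBranch, which the release stage's artifact
# filter uses to decide which environment receives it.
trigger:
  branches:
    include:
    - daily
    - enhancements

steps:
- script: ./gradlew build            # the existing Gradle build
- publish: build/distributions       # assumed artifact output path
  artifact: app
```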
You could use a queue-time variable to specify the branch name you would like to use in your build pipeline. You would need to:
1. Edit your build pipeline and create the variable on the "Variables" tab. Make sure to tick the "Settable at queue time" checkbox.
2. Update the source of your build pipeline to use the new variable in the "Default branch" option.
3. Run your pipeline. Before finally clicking Run, you will be able to specify the desired branch.
Hope this works
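If you move to YAML pipelines, a runtime parameter gives you the same queue-time choice; here's a sketch (the parameter name and branch-checkout approach are assumptions):

```yaml
# Sketch: a runtime parameter chosen in the "Run pipeline" dialog
# that controls which branch gets checked out and built.
parameters:
- name: deployBranch
  displayName: Branch to build and deploy
  type: string
  default: main

steps:
- checkout: self
- script: |
    git fetch origin ${{ parameters.deployBranch }}
    git checkout FETCH_HEAD
  displayName: Check out ${{ parameters.deployBranch }}
```

Because parameters are expanded at queue time, the same pipeline definition serves every branch/environment combination without duplicating pipelines.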
My team has multiple Concourse pipelines and as we refactor tasks, we've realized the need to test our actual pipelines.
We already test our tasks by using environment variables enabling task scripts to be run locally, but the pipeline yaml is another matter.
What is the best way to accomplish testing of the pipeline itself?
You can use the Concourse Pipeline Resource to monitor the git repository where you keep your pipeline config. Whenever the pipeline resource detects a change, it will automatically run a fly set-pipeline to update the config in your running Concourse installation. From there, it's easy to script tests against the updated pipeline that is now running in your Concourse installation.
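A sketch of what that wiring might look like, using the community `concourse-pipeline` resource (the repository URL, target, team, and pipeline names below are placeholders):

```yaml
# Sketch: whenever the pipelines repo changes, push the updated
# config into the running Concourse via the concourse-pipeline resource.
resource_types:
- name: concourse-pipeline
  type: docker-image
  source:
    repository: concourse/concourse-pipeline-resource

resources:
- name: pipelines-repo
  type: git
  source:
    uri: https://github.com/example/pipelines.git   # assumed repo
- name: running-pipelines
  type: concourse-pipeline
  source:
    target: https://concourse.example.com            # assumed target
    teams:
    - name: main
      username: ((username))
      password: ((password))

jobs:
- name: update-pipelines
  plan:
  - get: pipelines-repo
    trigger: true
  - put: running-pipelines
    params:
      pipelines:
      - name: app-pipeline
        team: main
        config_file: pipelines-repo/app-pipeline.yml
```

Downstream jobs can then exercise the freshly set pipeline and assert on its behaviour.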
fly validate-pipeline is pretty useful; running it against pipelines before merging has caught a few bugs in "obviously correct" changes for me.
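For reference, the invocation is just (assuming `fly` is installed and `pipeline.yml` is your pipeline config):

```shell
# Validate the config locally before merging; --strict also fails
# on warnings such as unknown keys.
fly validate-pipeline -c pipeline.yml
fly validate-pipeline --strict -c pipeline.yml
```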
If you want to test the whole pipeline before merging, you need to make sure that the data it uses is static and working (no sense in failing the pipeline if it's the repo that's broken), and that there are no side effects (like notifications) shared between the "real" pipeline and the "test" pipeline. I suspect that as long as you're careful with those restrictions you could make it work, but it would have to be designed in the context of your existing pipelines and infrastructure.