How can we validate an Ansible playbook YAML file from the build pipeline? Yamllint?
Yamllint is installed on the Ubuntu Linux Azure agents, as stated here.
So you can run yamllint inside a Bash task to check your .yml files. Choose whatever parameters suit you.
You can run this pipeline as a build validation, so that no branch is merged into main without validation.
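For illustration, a Bash task along these lines can lint the playbooks (the playbooks/ path and the relaxed line-length rule are just placeholders to adapt):
# yamllint ships on the Ubuntu hosted agents, so the Bash task can call it directly;
# the task fails when yamllint reports errors (non-zero exit code)
yamllint -d "{extends: default, rules: {line-length: {max: 160}}}" playbooks/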
I have a situation where I need to deploy different stylesheets to different environments (Dev/Test etc).
Is there a way I can edit the publish artefact so I can do a release pipeline for each environment?
So I would have a build pipeline that produces theused.css, dev.css, test.css
I would have a release pipeline for Dev & Test
The dev pipeline would edit the artefact by deleting theused.css, then renaming dev.css to theused.css ... likewise for test
Or is there a better way to do this?
If I wanted to rename a file, I would probably write a Bash/PowerShell script and execute it as a task in the pipeline.
Bash task
PowerShell task
Specifically, in your case I would copy/rename dev.css/test.css to theused.css during the deployment step.
I do not know how you deploy, but you could either rename the .css before deploying it to an S3 bucket, for example, or, if you deploy to an on-premises server, copy the file and rename it at the same time.
cp /your/dev.css /your/deployed/path/to/theused.css # copy the file
mv /your/dev.css /your/deployed/path/to/theused.css # move/rename the file
Meant to add this...
What I ended up doing was the following (with different configs, rather than css, but same idea):
In my app I have configs for Dev, Test & Prod (config.json, config-test.json, config-prod.json)
The first thing the app does is load the config when it runs
I build and deploy to dev
The build folder contains the build files including these config files
I have releases for Test and Prod that do the following:
Task 1: delete the config.json
Task 2: copy the appropriate config file, e.g. config-test in the test release pipeline, and rename it config.json
Task 3: deploy build files to the appropriate environment with the new config
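As a rough sketch, Tasks 1 and 2 boil down to a couple of shell commands in the release (the build/ folder name follows the description above; adjust the paths to your artifact layout):
# Task 1: delete the dev config that ships with the build output
rm build/config.json
# Task 2: copy the environment-specific config and give it the expected name (Test release shown)
cp build/config-test.json build/config.json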
I have a requirement to integrate JMeter scripts, checked in to a Git repository, with a DevOps pipeline so that I can run the JMeter scripts on a specific VM in Azure.
Basically, I should have all my .jmx and .csv files in a Git repository, and when I run the pipeline, passing the script name as a parameter, it should run the script on a specific VM (not with a static IP) and copy the .jtl to some storage.
What is the best way to achieve this?
With a DevOps pipeline so that I can run the JMeter scripts using a specific VM in Azure. What is the best way to achieve this?
If the specific VM exists before the current pipeline, you can consider installing a self-hosted agent there.
To do CI/CD using Azure Pipelines, we need at least one agent. If we use a Microsoft-hosted agent, it provides a fresh VM for each run. Since you need to run the script on your own specific VM, I suggest using a self-hosted agent. You can follow the steps here to install an agent on your own VM. (The steps are quite easy and only take a few minutes.)
After making your VM a self-hosted agent, the pipeline will call your VM to run the jobs. Now your original issue turns into how to run JMeter locally from the command line. See similar issues here: Five Ways To Launch a JMeter Test without Using the JMeter GUI and Run .jmx file through command line ....
1. So now we can use a command line task in the pipeline to run the JMeter commands shared in the similar topics above. These jobs are done on your specific VM.
2. I'm not sure which location you want to copy the .jtl to, but you can use the Azure File Copy task to copy files to Microsoft Azure storage blobs or virtual machines (VMs), or a simple copy/xcopy command in your command line task to copy files to another location on the same machine (the specific VM).
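For illustration, a command line task on the self-hosted agent could run something along these lines (this assumes jmeter is on the VM's PATH; the tests/ folder, the $(scriptName) parameter and the target path are placeholders):
# run the checked-in script in non-GUI mode and write the results to a .jtl file
jmeter -n -t "$(Build.SourcesDirectory)/tests/$(scriptName).jmx" -l "$(Build.ArtifactStagingDirectory)/results.jtl"
# copy the .jtl to another location on the same VM (or hand it to the Azure File Copy task instead)
cp "$(Build.ArtifactStagingDirectory)/results.jtl" /mnt/results/results.jtl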
Hope all above helps :)
I have used the following tasks in an Azure CD pipeline.
The "Run Taurus" task is as follows.
Here "_WM WebClient TestArtifacts" is the Git/Azure Repos directory where the .jmx file is kept (in code).
Background
As part of our deployment pipeline we are creating our deployment artifact by running several .xdt transforms on our build artifact, as well as adding several additional files.
As the last step before publishing, we would like to invoke msdeploy.exe to build a "custom" webdeploy package from a folder containing the wwwroot-content - (msdeploy command for creating custom package found in this question Web Deploy - How to create a package with selected items)
We are using hosted agents (win 2017).
We wish to deploy to an Azure AppService.
Question
Is there a task in Azure DevOps, that allows you to invoke MsDeploy.exe manually, such that we can create a custom webdeploy package, before we deploy?
Is there a task in Azure DevOps, that allows you to invoke MsDeploy.exe manually, such that we can create a custom webdeploy package, before we deploy?
I am afraid there is no such dedicated task to invoke MsDeploy.exe. We need to invoke it with a command line task, as Daniel commented.
As we know, the default installation will place msdeploy.exe in:
C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe
To verify the msdeploy path on the hosted agents, I used a copy task with the contents pattern **\msdeploy.exe:
Then I used the Publish build artifacts task to output msdeploy.exe, and I could see the result on the hosted agents vs2017-win2016 and windows-2019:
So, the msdeploy path on the hosted agents vs2017-win2016 and windows-2019 is C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe. We can use a command line task to invoke it.
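For illustration, the command line task could invoke it roughly like this to pack a wwwroot folder into a custom Web Deploy package (the source folder and package path are placeholders):
"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:contentPath="$(Build.ArtifactStagingDirectory)\wwwroot" -dest:package="$(Build.ArtifactStagingDirectory)\CustomPackage.zip"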
Hope this helps.
Here is the exact CommandLine task that worked for me (without parameters though):
Does anyone know how we can download a GitHub repository as part of a Spinnaker pipeline?
We have a few scripts in GitHub, and I want to get those scripts during Spinnaker pipeline execution.
You can use the GitHub artifact.
You can use the Script or Run Command stage to obtain the scripts and execute them.
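As a simple sketch, the stage's script can also just pull the raw files with curl (the token variable, repository path and script name are all placeholders):
# fetch the script from GitHub and execute it inside the stage
curl -sSL -H "Authorization: token $GITHUB_TOKEN" \
  -o deploy.sh \
  "https://raw.githubusercontent.com/<org>/<repo>/main/scripts/deploy.sh"
chmod +x deploy.sh
./deploy.sh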
I wish to execute Octo.exe from a PowerShell script on VSTS, like this:
Octo.exe push --package $_.FullName --replace-existing --server https://deploy.mydomain.com --apiKey API-xxxxxxxx
But I don't know the correct path for Octo.exe, or whether it is present on VSTS at all. Is it possible to install it there? Or will I have to add octo.exe to my source and call it from there?
You can't call the Octo.exe command when using a hosted build agent, and it is impossible to install it on that build agent either.
If you can call Octo.exe without installing it, you can add octo.exe to source control and map it to the build agent (Repository > Mappings); then you can call it via PowerShell. The path could be something like $(build.sourcesdirectory)\Tool\octo.exe, depending on how you map it to the source directory.
If Octo.exe requires installation, you need to set up an on-premises build agent and install Octo on that build agent.
On the other hand, there is the Octopus Deploy Integration extension that you can install and use directly.
Instead of cluttering the source code repository with binaries, the cleanest approach is to use the Octopus REST API to push a package.
An example of how to push a package is provided by Octopus itself.
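As a sketch of that approach (the server URL and API key format follow the question above; the package file name is a placeholder), the push is a plain HTTP call to the built-in package feed:
# upload the package to the Octopus built-in feed via the REST API
curl -X POST "https://deploy.mydomain.com/api/packages/raw?replace=true" \
  -H "X-Octopus-ApiKey: API-xxxxxxxx" \
  -F "data=@MyApp.1.0.0.nupkg"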