I'm very interested in the options that open up for us with the az devops CLI.
I'd like to be able to generate a YAML file locally and run it from the local file using the "az pipelines run" command. Is this possible?
It would allow for very fast iteration when creating pipelines. At present we are making updates to a YAML file in the repo, committing, running, and then reviewing (which isn't as smooth a process as it could be).
Thanks
Unfortunately, this is not possible.
There is a suggestion under review on the Visual Studio Developer Community.
I source multiple artifacts in a release pipeline and do multiple transformations before publishing the files to an SF cluster. Is there a way to look at/debug the content just before publishing, so I can confirm my transformations are working correctly? I am thinking of connecting to Azure storage and publishing those files there to have a look. Is there a better way to look through the content before publishing?
Also, is there a way to look at locked (secured) variable content?
Add a command line / shell script step and run whatever commands you want to investigate the file system.
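For example, in a YAML pipeline a minimal diagnostic step might look like this (a sketch for a Linux agent; the file path is an assumption, so substitute whatever you want to inspect):

```yaml
steps:
# Diagnostic step: list the staging directory and dump a transformed file
# so the content can be checked in the logs before it is published.
- bash: |
    echo "Contents of the artifact staging directory:"
    ls -R "$(Build.ArtifactStagingDirectory)"
    echo "--- transformed config file (path is an assumption) ---"
    cat "$(Build.ArtifactStagingDirectory)/Config/Settings.xml"
  displayName: 'Inspect files before publishing'
```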
I am looking to host .exe files in Azure DevOps. It doesn't seem to have a feature similar to how we host executables or build files in GitHub for other people to download. Is there such a feature to host the executables and have the latest commit tagged?
You can try publishing the executable or build files as Build Artifacts in an Azure DevOps build pipeline.
You can create a pipeline in Azure DevOps and use the Publish Build Artifacts task to store the executable or build files in the pipeline.
See an example here of creating a classic Azure pipeline, and see here for a YAML pipeline example.
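As a rough sketch, the YAML version might look something like this (the build step and paths are placeholders, not a definitive layout):

```yaml
# Minimal sketch: build, stage, and publish an executable as a build artifact.
trigger:
- main

pool:
  vmImage: 'windows-latest'

steps:
- script: echo "build your .exe into $(Build.ArtifactStagingDirectory) here"
  displayName: 'Build the executable (placeholder)'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
  displayName: 'Publish the executable as a build artifact'
```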
When you run the pipeline, you will see the commit hash and the uploaded files on the build summary page, and you can download the files from there.
You can retain these artifacts by clicking Retain in the pipeline run.
You can also change the retention policy for your pipeline; see here for more information.
Go to the Project settings page --> Settings under Pipelines.
I have an Azure DevOps pipeline build that has several steps, and the build is long. Every time there is something wrong with the build, we review the logs and identify issues or come up with theories. In the case of a theory, we have to insert a diagnostic command line (such as listing a directory or showing the contents of a file) in between the steps; in the case of a fix, we add the fix but have to wait for the whole pipeline to rerun to find out. This is causing us to take a lot of time to fix build issues.
If we had access to the state of the agent of an unfinished build, and could just log on using RDP or any other terminal and check the contents and the state of the files on disk, that would save us a lot of hours.
Is there any way in Azure DevOps to do diagnostics of this type?
No, not if you are using a hosted agent. If you are using a self-hosted agent, you can obviously log in to that one. You can, however, implement steps that only run if the build failed, and those steps can attempt to capture the information you are interested in (say, publish the state of the build directory).
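For example, in a YAML pipeline a failure-only diagnostic step might look like this (a sketch; publishing the sources directory is just one choice of what to capture):

```yaml
steps:
# ... your normal build steps here ...

# Runs only when a previous step has failed: capture the state of the
# build directory so it can be inspected after the run.
- task: PublishBuildArtifacts@1
  condition: failed()
  inputs:
    PathtoPublish: '$(Build.SourcesDirectory)'
    ArtifactName: 'failure-diagnostics'
  displayName: 'Publish build directory state on failure'
```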
If you are using Azure DevOps Services, there is a new REST API version out that will let you do a "preview" run of changes to the YAML definitions: https://learn.microsoft.com/en-us/azure/devops/release-notes/2020/sprint-165-update#preview-fully-parsed-yaml-document-without-committing-or-running-the-pipeline
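Per that release note, the request is roughly the following (a sketch; check the linked note for the exact endpoint and api-version before relying on it):

```yaml
# POST https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=5.1-preview
# Request body: previewRun asks the service to return the fully parsed YAML
# document without queuing an actual run.
{
  "previewRun": true
}
```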
I have a requirement to integrate JMeter scripts, checked into a Git repository, with a DevOps pipeline so that I can run the JMeter scripts on a specific VM in Azure.
Basically, I should have all my jmx and csv files in a Git repository, and when I run the pipeline, with the script name as a parameter, it should run the script on a specific VM (not one with a static IP) and copy the jtl to some storage.
What is the best way to achieve this?
"With a DevOps pipeline so that I can run the JMeter scripts using a specific VM in Azure. What is the best way to achieve this?"
If the specific VM exists before the current pipeline, you can consider installing a self-hosted agent there.
To do CI/CD using Azure Pipelines, we need at least one agent. If we use a Microsoft-hosted agent, it provides one fresh VM for us to run jobs on. Since you need to run the script on your own specific VM, I suggest using a self-hosted agent. You can follow the steps here to install an agent on your own VM. (The steps are quite easy and only take several minutes.)
After making your VM a self-hosted agent, the pipeline will call your VM to run the jobs. Now your original issue turns into how to run JMeter locally from the command line. See similar issues here: Five Ways To Launch a JMeter Test without Using the JMeter GUI and Run .jmx file through command line ....
1. So now we can use a command-line task in the pipeline to run the JMeter commands shared in the similar topics above, and these jobs are done on your specific VM (see the sketch after this list).
2. I'm not sure which location you want to copy the jtl to, but you can use the Azure File Copy task to copy files to Microsoft Azure storage blobs or virtual machines (VMs), or use a simple copy/xcopy command in your command-line task to copy files to another location on the same machine (the specific VM).
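Here is a sketch of both steps in YAML, assuming JMeter is installed on the self-hosted agent and is on its PATH ('scriptName' is a hypothetical pipeline variable holding the script name, and the .jtl is published as a build artifact rather than copied to storage, just to keep the example self-contained):

```yaml
steps:
# Run the JMeter script in non-GUI mode on the self-hosted agent.
- script: >
    jmeter -n
    -t "$(Build.SourcesDirectory)/$(scriptName).jmx"
    -l "$(Build.ArtifactStagingDirectory)/results.jtl"
  displayName: 'Run JMeter test (non-GUI)'

# Publish the .jtl so it can be downloaded from the pipeline run.
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/results.jtl'
    ArtifactName: 'jmeter-results'
  displayName: 'Publish .jtl results'
```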
Hope all above helps :)
I have used the following task in an Azure CD pipeline.
The "Run Taurus" task is as follows.
Here "_WM WebClient TestArtifacts" is the Git/Azure Repos directory where the .jmx file is kept (in code).
I'm new to Azure DevOps and I'm trying to understand how to package a release of a PowerShell script project I'm working on.
I'm previously familiar with GitHub and the manual process for drafting a new release of my project repo. I'm now experimenting with Azure DevOps and what I want to achieve is a similar output to GitHub where my repo of PowerShell scripts are packaged into a zip file which I can publish for release.
I'm not familiar with the pipeline process in Azure DevOps or YAML, as a newbie to proper release cycle tools. Previously I've just created scripts and shared them as they are, or dropped them into a GitHub repo and manually packaged a release. I'm not likely to be turning out large numbers of builds, so I've never had to come at this from an automated standpoint, which seems to be the way Azure is driving me, unless I'm missing something?
It's pretty simple. I prefer to do this using the old-fashioned GUI (hint: there is a link when starting a new Build Pipeline that says Use the classic editor), and then convert to YAML after I get my Build Pipeline working.
1) Create your standard Build Pipeline.
2) Add the step to zip your files (the Archive Files task).
3) Add properties to that Archive step: specify the source to zip and the target where you want the zip file to end up.
4) Lastly, convert that single step to a YAML step by clicking the View YAML link in the upper-right corner.
There are a lot of steps I am leaving out, but I hope this leads you in the right direction; a sketch of what the resulting YAML might look like is below.
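For reference, the converted YAML for the archive-and-publish steps might look roughly like this (source and target paths are placeholders, so adjust them to your repo layout):

```yaml
steps:
# Zip the repository's scripts folder into the staging directory.
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)/scripts'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/scripts-$(Build.BuildId).zip'
  displayName: 'Zip the PowerShell scripts'

# Publish the zip so it can be downloaded from the pipeline run.
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'release'
  displayName: 'Publish the zip as a build artifact'
```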