I have configured a private build agent on a self-hosted VM in VSTS.
I want to copy all the test results for each test case to my self-hosted VM. For this I have created a build pipeline.
In the Publish Test Results task, Test result files is set to an .xml file pattern and Search folder is set to the default source repository path.
Test case snapshot:
How can I specify the path of an attachment in a test case and copy/download those artifacts to the VM?
Also, after processing the attachments, I want to copy the output back to the test results.
How can this be achieved?
Any help is highly appreciated.
Thank you
All the necessary files are on the agent machine (build/deployment agent), so if you run the tests during a release, the test result files will be on that agent machine.
The VsTest task will publish the results automatically. On the other hand, if you can't use the VsTest task to run the tests, you can publish the test results through the Publish Test Results task.
If you're not using VsTest for whatever reason, then using deployment groups would be easier. With deployment groups, you don't need to copy files -- the deployment runs directly on the target machines with no intermediary agent machine necessary.
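For reference, a minimal Publish Test Results step in YAML might look like the sketch below; the result format and file pattern are assumptions you would adapt to whatever your test runner actually emits.

steps:
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'                 # assumed; could be 'NUnit', 'XUnit', or 'VSTest'
    testResultsFiles: '**/TEST-*.xml'          # assumed pattern for the runner's output
    searchFolder: '$(System.DefaultWorkingDirectory)'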
I'm just trying to get my first release pipeline underway.
Our current infrastructure setup is that we have a number of on-prem VMs to which I have deployed the Azure agents as per the deployment group setup.
The issue I have at the moment is that the deployment first tries to download the artifact from our build server using a file share.
However, currently the deployment machine can't see the file share. I gather I am supposed to be able to see the file share, but I'm not entirely sure how to share this folder on the build machine.
Am I supposed to just create a share for everyone to see? Or is there a particular user/role that I should share it with?
You need to check which user is used to run the tasks on the agent. You can add a PowerShell task with the inline script whoami.
You need to make sure that account has access to the file share.
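For example, a quick inline PowerShell step (a minimal sketch) that prints the account the agent runs under:

steps:
- task: PowerShell@2
  displayName: 'Show which account runs the agent tasks'
  inputs:
    targetType: 'inline'
    script: 'whoami'   # prints DOMAIN\user; grant this account access to the share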
In addition, when you publish the artifact, you can select a file share to store it and then consume that artifact in the release pipeline. Please check the link for the details.
Here is a screenshot of my "Download artifacts from file share" setting in the release pipeline:
I have a .NET Core application along with unit test cases. For it I have configured a build pipeline in Azure DevOps. In that pipeline I have integrated the SonarQube tasks (Prepare Analysis, Run Code Analysis, and Publish Quality Gate Results).
I can see the report on the SonarQube server after a successful run, but the report does not show the code coverage results or the unit test results, even though I used Cobertura for the unit tests.
Your pipeline seems to contain the right steps, so there can be two issues:
1. Code coverage file is not generated (correctly)
The easiest way to validate whether the code coverage file is generated correctly is to publish it as an artifact and then check the format of the output file. If there is no output file, check that you added /p:CollectCoverage=true --logger trx to the test command. If you are running the build pipeline on Linux, you should also add /p:CoverletOutputFormat=opencover and install the coverlet.collector NuGet package in the .NET test project.
2. Code coverage file is not sent to SonarQube
If you configured step 1 correctly, it is still possible that the generated files are not sent to SonarQube. The best way to see what is going wrong is to check the build logs of the Run Code Analysis and Publish Quality Gate Results steps.
The most common issue is that the Sonar scanner is checking the wrong directory. In the prepare step, specify where the files are located, for example:
sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/**/coverage.opencover.xml
sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx
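Putting both pieces together, a minimal YAML sketch could look like the following; the service connection name and project key are assumptions, and the task versions come from the SonarQube marketplace extension:

- task: SonarQubePrepare@5
  inputs:
    SonarQube: 'MySonarQubeConnection'   # assumed service connection name
    scannerMode: 'MSBuild'
    projectKey: 'my-project'             # assumed project key
    extraProperties: |
      sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/**/coverage.opencover.xml
      sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx
- task: DotNetCoreCLI@2
  inputs:
    command: 'test'
    arguments: '/p:CollectCoverage=true /p:CoverletOutputFormat=opencover --logger trx'
- task: SonarQubeAnalyze@5
- task: SonarQubePublish@5
  inputs:
    pollingTimeoutSec: '300'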
The build pipeline generates the code coverage files, whether with coverlet or OpenCover; the problem is that the Azure DevOps pipeline creates the reports outside the working directory on the build agent. It uses the _temp folder inside _work, whereas SonarQube searches inside the working directory. I am facing this issue with coverlet as well as with the Visual Studio coverage tool. You can read the following two threads:
https://github.com/coverlet-coverage/coverlet/issues/1399
https://github.com/microsoft/azure-pipelines-tasks/issues/11536
I posted an issue on the SonarQube community site but did not get any response so far. The solution, I think, is either to make the test step in the build pipeline change the output directory, or to configure SonarQube somewhere to search the agent's _temp directory, which is above the working directory.
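One way to apply the first workaround, sketched under the assumption that the tests use the coverlet.msbuild package, is to force coverlet to write its report inside the sources directory so the scanner can find it:

- task: DotNetCoreCLI@2
  inputs:
    command: 'test'
    # CoverletOutput redirects the report from the agent's _temp folder
    # into the working directory that the Sonar scanner searches.
    arguments: '/p:CollectCoverage=true /p:CoverletOutputFormat=opencover /p:CoverletOutput=$(Build.SourcesDirectory)/coverage/'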
I am using the ReportGenerator task in my Azure DevOps pipeline to merge Cobertura-based code coverage reports into one, but I end up with empty reports in the pipeline's Code Coverage tab.
Below is my pipeline with three jobs.
Job1 – uses Windows agent pool1, builds Java (clean compile, test, cobertura:cobertura); if the build succeeds, saves the test reports and code coverage reports (XML only) to Azure pipeline artifacts.
Job2 – uses Windows agent pool2, builds .NET Core (restore, test, coverlet reports in Cobertura format); if the build succeeds, saves the test reports and code coverage reports (XML only) to Azure pipeline artifacts.
Job3 – uses Windows agent pool3, downloads the test and coverage reports uploaded by the previous jobs, merges all the Cobertura reports into one using ReportGenerator, and publishes the code coverage reports.
But if I go and look at the pipeline's Code Coverage tab, the assemblies, classes, files, and package names are all there, yet there is no coverage data; when I click on a particular package's class name, it is empty and shows "'/some relative path/abc.java' does not exist (any more)".
Please suggest.
ReportGenerator needs the source code in order to create a complete report.
There is no way to avoid that.
You need to copy the source code or check it out again in the same directory.
Agree with Daniel's answer. Also, if you use Microsoft-hosted agents, each job runs on a different machine, so the jobs don't share the same build sources directory.
You could try deploying a self-hosted agent and using it in the pipeline so that all the jobs share the same build sources directory; then you don't need to check the sources out again.
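A sketch of what the merge job could look like with an explicit checkout, assuming the reportgenerator task from the marketplace extension and placeholder pool and path names:

- job: MergeCoverage
  pool: 'pool3'                      # assumed pool name
  steps:
  - checkout: self                   # re-fetch sources so file paths inside the reports resolve
  - task: DownloadPipelineArtifact@2
    inputs:
      path: '$(Pipeline.Workspace)/coverage'
  - task: reportgenerator@5          # from the ReportGenerator marketplace extension
    inputs:
      reports: '$(Pipeline.Workspace)/coverage/**/*.xml'
      targetdir: '$(Build.SourcesDirectory)/coveragereport'
      reporttypes: 'Cobertura'
  - task: PublishCodeCoverageResults@1
    inputs:
      codeCoverageTool: 'Cobertura'
      summaryFileLocation: '$(Build.SourcesDirectory)/coveragereport/Cobertura.xml'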
I have a requirement to integrate JMeter scripts, checked into a Git repository, with a DevOps pipeline so that I can run them on a specific VM in Azure.
Basically, I should have all my .jmx and .csv files in a Git repository, and when I run the pipeline with the script name as a parameter, it should run that script on a specific VM (one without a static IP) and copy the .jtl to some storage.
What is the best way to achieve this?
If the specific VM exists before the current pipeline runs, you can consider installing a self-hosted agent on it.
To do CI/CD using Azure Pipelines, we need at least one agent. If we use a Microsoft-hosted agent, it provides one fresh VM for us to run jobs. Since you need to run the script on your own specific VM, I suggest using a self-hosted agent. You can follow the steps here to install an agent on your own VM. (The steps are quite easy and only take several minutes.)
After making your VM a self-hosted agent, the pipeline will call your VM to run the jobs. Now your original issue turns into how to run JMeter locally from the command line. See these similar topics: Five Ways To Launch a JMeter Test without Using the JMeter GUI and Run .jmx file through command line ....
1. We can use a command-line task in the pipeline to run the JMeter commands shared in the similar topics above, and these jobs are done on your specific VM (see the sketch after this list).
2. I'm not sure which location you want to copy the .jtl to, but you can use the Azure File Copy task to copy files to Microsoft Azure storage blobs or virtual machines (VMs), or a simple copy/xcopy command in your command-line task to copy files to another location on the same machine (the specific VM).
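For instance, a command-line step on the self-hosted Windows agent could run the test in non-GUI mode and copy the result file; the script-name variable, the paths, and the assumption that jmeter is on the PATH are all placeholders to adapt:

- script: |
    REM Run JMeter in non-GUI mode; $(scriptName) is an assumed pipeline variable
    jmeter -n -t $(Build.SourcesDirectory)\scripts\$(scriptName).jmx -l $(Build.ArtifactStagingDirectory)\results.jtl
    REM Copy the .jtl to another folder on the same VM (adjust the target path)
    xcopy $(Build.ArtifactStagingDirectory)\results.jtl C:\jmeter-results\ /Y
  displayName: 'Run JMeter script and copy the .jtl'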
Hope all above helps :)
I have used the following tasks in an Azure CD pipeline.
The "Run Taurus" task is as follows:
Here, "_WM WebClient TestArtifacts" is the Git/Azure Repos directory where the .jmx file is kept (in code).
I'm building an Azure DevOps pipeline to deploy a custom-built PowerShell application to the several on-prem environments that we support. I configured the required agent pools and installed the agents as a service on the on-prem environments.
Next, I have set up my pipeline in Azure DevOps, selecting a GitRepo:
Build (with the steps: Use NuGet, NuGet Restore, Build Solution, Update Version, Copy Files, and Publish Build Artifact)
Release (with step: Publish Build Artifact)
Some things are unclear for me:
Do I need the Publish Build Artifacts task twice? Can the build pipeline end with the Copy Files step, so that the release pipeline picks up this artifact?
It is my understanding that the release publishes the app to the on-prem environment (in my case). Where can I set a custom path (e.g. C:\deployed_apps) where the app needs to be deployed? When I tested this pipeline, I got errors that the path I created using a variable was not found.
What am I missing in my setup to get this pipeline working?
As @Shayki Abramczyk pointed out, this task is not for deployment; it just uploads your build artifacts to the Azure DevOps server, where your release pipeline can download them directly.
In your case, if you want to deploy your application to several on-prem environments, you need to create a deployment group first. A deployment group is a logical set of deployment target machines that have an agent installed on each one. Your application will be deployed to the machines in a deployment group by the release pipeline. Check here for more details about deployment groups.
After the deployment group is created, you can add a deployment group job by clicking the 3 dots and then specifying your deployment group, as the picture below shows. You can then simply add a Copy Files task or other deployment tasks to deploy your application to your on-prem machines.
In the release pipeline you shouldn't use the Publish Build Artifact task. You put this step at the end of the build; what does it do? It uploads your artifacts to Azure DevOps or to a file share. In the release pipeline you choose the build artifact (in the left pane). The first thing the agent does when the release pipeline starts is download the build artifacts to the agent. Now you need to take them and deploy them to your environments. How? It depends on what kind of application it is (it can be just copying files, it can be a deploy to IIS, etc.).
You can put the path in the Variables tab and use that variable when you deploy the app (with the Copy Files task, for example).
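For example, with a pipeline variable DeployPath set to C:\deployed_apps in the Variables tab, a copy step could look like this sketch; the artifact folder name is an assumption:

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)/_MyBuild/drop'   # assumed artifact alias and name
    Contents: '**'
    TargetFolder: '$(DeployPath)'    # resolves to C:\deployed_apps on the target machine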