Multiple code coverage reports merged, published, showing no data in Azure DevOps pipeline - azure-devops

I am using the ReportGenerator task in my Azure DevOps pipeline to merge Cobertura-based code coverage reports into one, but I end up with empty reports in the pipeline's Code Coverage tab.
Below is my pipeline with three jobs.
Job1 – uses Windows agent pool1, builds Java (clean compile, test, cobertura:cobertura); if the build succeeds, it saves the test reports and code coverage reports (XML only) to Azure Pipelines artifacts.
Job2 – uses Windows agent pool2, builds .NET Core (restore, test, Coverlet reports in Cobertura format); if the build succeeds, it saves the test reports and code coverage reports (XML only) to Azure Pipelines artifacts.
Job3 – uses Windows agent pool3, downloads the test and coverage reports uploaded by the previous jobs, merges all Cobertura reports into one using ReportGenerator, and publishes the code coverage reports.
But if I open the Code Coverage tab for the pipeline run, the assemblies, classes, files, and package names are there, yet there is no coverage data; when I click a particular package or class name it is empty and shows “ ’/some relative path/ abc.java’ does not exist (any more)”.
Please suggest.

ReportGenerator needs the source code in order to create a complete report.
There is no way to avoid that.
You need to copy the source code or check it out again in the same directory.
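A minimal sketch of what the merge job could look like under that constraint; the pool, artifact layout, and ReportGenerator task version below are assumptions, not your exact pipeline:

- job: MergeAndPublish
  pool: pool3
  steps:
  - checkout: self                     # check the sources out again on this agent so ReportGenerator can resolve the file paths recorded in the Cobertura reports
  - download: current                  # pull down the coverage XML artifacts saved by the previous jobs into $(Pipeline.Workspace)
  - task: reportgenerator@5            # ReportGenerator extension task; adjust the version to the one you have installed
    inputs:
      reports: '$(Pipeline.Workspace)/**/coverage*.xml'   # placeholder pattern; adjust to your artifact layout
      targetdir: '$(Build.ArtifactStagingDirectory)/mergedcoverage'
      reporttypes: 'Cobertura'
  - task: PublishCodeCoverageResults@1
    inputs:
      codeCoverageTool: 'Cobertura'
      summaryFileLocation: '$(Build.ArtifactStagingDirectory)/mergedcoverage/Cobertura.xml'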

Agree with Daniel's answer. And if you use Microsoft-hosted agents, each job runs on a different machine, so the jobs don't share the same build sources directory.
You could instead deploy a self-hosted agent and use it for the whole pipeline so that all jobs share the same build sources directory; then you don't need to check the sources out again.
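For example, if every job points at the same self-hosted pool (and, with a single agent in that pool, the same work directory), the sources checked out by the build job are still on disk for the merge job; the pool name below is a placeholder:

jobs:
- job: BuildJava
  pool:
    name: 'MySelfHostedPool'   # placeholder pool name; a single self-hosted agent keeps one work directory
  steps:
  - script: echo build and test steps go here
- job: MergeAndPublish
  dependsOn: BuildJava
  pool:
    name: 'MySelfHostedPool'   # same pool (and machine), so the earlier checkout is still available
  steps:
  - script: echo merge and publish steps go here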

Related

How to see the Code Coverage results in SonarQube using Azure DevOps pipelines

I have a .NET Core application along with unit test cases. For that I have configured a build pipeline in Azure DevOps. In that pipeline I have integrated the SonarQube tasks (prepare analysis, run code analysis, and publish quality gate results).
I can see the report on the SonarQube server after a successful run, but the report contains neither code coverage results nor unit test results, even though I used Cobertura for the unit tests.
Your pipeline seems to contain the right steps, so there can be two issues:
1. Code coverage file is not generated (correctly)
The easiest way to validate whether the code coverage file is generated correctly is to publish it as an artifact, and then check what format the output file is in. If there is no output file, check whether you included /p:CollectCoverage=true --logger trx in the test command. If you are running the build pipeline on Linux, you should also add /p:CoverletOutputFormat=opencover and install the coverlet.collector NuGet package in the .NET test project.
2. Code coverage file is not sent to SonarQube
If you configured step 1 correctly, it is still possible that the generated files are not sent to SonarQube. The best way to see what is going wrong is to check the build logs of the Run Code Analysis and Publish Quality Gate Results steps.
The most common issue is that the SonarScanner is checking the wrong directory. In the prepare step, specify where the files are located, for example:
sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/**/coverage.opencover.xml
sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx
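A hedged sketch putting both points together for a .NET Core project analyzed in MSBuild scanner mode; the service connection name and project key are placeholders, and the report paths mirror the properties above:

- task: SonarQubePrepare@4
  inputs:
    SonarQube: 'my-sonarqube-connection'     # placeholder service connection name
    scannerMode: 'MSBuild'
    projectKey: 'my-project'                 # placeholder
    extraProperties: |
      sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/**/coverage.opencover.xml
      sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx

- script: dotnet test /p:CollectCoverage=true /p:CoverletOutputFormat=opencover --logger trx --results-directory $(Agent.TempDirectory)
  displayName: 'Run tests and produce an OpenCover coverage file'

- task: SonarQubeAnalyze@4                   # Run Code Analysis
- task: SonarQubePublish@4                   # Publish Quality Gate Result (optional)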
The build pipeline does generate the code coverage files, whether Coverlet or OpenCover; the problem is that the Azure DevOps pipeline creates the reports outside the working directory on the build agent: it uses the _temp folder inside _work, whereas SonarQube searches inside the working directory. I am facing this issue with Coverlet as well as the Visual Studio coverage tool. You can read the following two threads:
https://github.com/coverlet-coverage/coverlet/issues/1399
https://github.com/microsoft/azure-pipelines-tasks/issues/11536
I posted an issue on the SonarQube community site but have not received any response so far. The solution, I think, is either to make the test step in the build pipeline change the output directory, or to configure SonarQube somewhere to search the agent's _temp directory, which is above the working directory.
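If the coverage files really do land under the agent's temp folder, one workaround to try (an assumption, not a confirmed fix) is to widen the report paths handed to the scanner so they cover that directory as well:

sonar.cs.opencover.reportsPaths=$(Agent.TempDirectory)/**/coverage.opencover.xml
sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/**/*.trx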

Calculate code metrics as part of build pipeline in Devops

This seems a pretty straightforward thing to do, but I cannot find relevant information. In Visual Studio it is very easy to calculate code metrics for all projects, and I would like to do the same during a build pipeline in Azure DevOps.
Has anyone done something like this?
In Azure DevOps you can review code coverage results; they can be viewed and downloaded on the Code coverage tab.
Publish Code Coverage Results publishes code coverage results to
Azure Pipelines or TFS, which were produced by a build in Cobertura
or JaCoCo format.
Built-in tasks such as Visual Studio Test, .NET Core, Ant, Maven,
Gulp, Grunt, and Gradle provide the option to publish code coverage
data to the pipeline.
Here are some documents you can refer to:
Review code coverage results
Azure DevOps and the Code Coverage
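For example, a minimal publish step for a Cobertura report might look like this (the summary file pattern is a placeholder for wherever your coverage tool writes it):

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'Cobertura'            # or 'JaCoCo'
    summaryFileLocation: '$(Build.SourcesDirectory)/**/coverage.cobertura.xml'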
In addition, you can get code analysis through SonarCloud integrated with Azure DevOps. SonarCloud is a cloud-based code quality and security service.
Here is the lab you can follow.
You can generate code metrics by adding the Microsoft.CodeAnalysis.Metrics NuGet package and building with a parameter that targets the Metrics build target; this will store the results in an XML file.
msbuild /t:Metrics
If you're building a solution with multiple projects, you have to add the NuGet package to every project; otherwise the build will fail because the Metrics target won't be present for the other projects.
The build will produce one xml file per project.
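A sketch of how this could be wired into a pipeline, assuming the Microsoft.CodeAnalysis.Metrics package has already been added to every project:

- task: MSBuild@1
  displayName: 'Generate code metrics'
  inputs:
    solution: '**/*.sln'
    msbuildArguments: '/t:Metrics'   # writes one metrics XML file per project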
You can also use the metrics command-line tool from the Roslyn analyzers repository.
See https://learn.microsoft.com/en-us/visualstudio/code-quality/how-to-generate-code-metrics-data?view=vs-2022

Project code is not being analyzed for sonarqube

I have a repo in Azure DevOps with only one folder, test.
Now, I have set up the task in the way shown below in Azure DevOps, but I cannot see the code getting analyzed in SonarQube; the Code tab shows blank. Could someone help me figure out where I am going wrong? I do not want to give a folder name in sources; I want whatever code I add to the branch to be analyzed.
Edit: I just realized this is happening only for a short-lived feature branch. My SonarQube version is 8.0.
steps:
- task: SonarQubePrepare@4
  inputs:
    SonarQube: 'connection name'
    scannerMode: 'CLI'
    configMode: 'manual'
    cliProjectKey: 'pipeline-sonar-demo'
    cliProjectName: 'pipeline-sonar-demo'
    cliSources: '.'
    extraProperties: |
      # Additional properties that will be passed to the scanner,
      # Put one key=value per line, example:
      sonar.exclusions=**/*.xml
The SonarQube extension provides three tasks that you will use in your build definitions to analyze your projects:
Prepare Analysis Configuration task, to configure all the required
settings before executing the build.
This task is mandatory.
In case of .NET solutions or Java projects, it helps to integrate
seamlessly with MSBuild, Maven and Gradle tasks.
Run Code Analysis task, to actually execute the analysis of the
source code.
This task is not required for Maven or Gradle projects, because
scanner will be run as part of the Maven/Gradle build.
Publish Quality Gate Result task, to display the Quality Gate status
in the build summary and give you a sense of whether the application
is ready for production "quality-wise".
This task is optional.
It can significantly increase the overall build time because it
will poll SonarQube until the analysis is complete. Omitting this
task will not affect the analysis results on SonarQube - it simply
means the Azure DevOps Build Summary page will not show the status
of the analysis or a link to the project dashboard on SonarQube.
It seems you still need to add the Run Code Analysis task. For how to use SonarScanner for Azure DevOps, please refer to the following documentation:
https://docs.sonarqube.org/latest/analysis/scan/sonarscanner-for-azure-devops/
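Putting the pieces together, the CLI-mode sequence would look roughly like the sketch below (your build and test steps go in between; for Maven or Gradle projects the analyze step is not needed because the scanner runs inside the build):

steps:
- task: SonarQubePrepare@4
  inputs:
    SonarQube: 'connection name'
    scannerMode: 'CLI'
    configMode: 'manual'
    cliProjectKey: 'pipeline-sonar-demo'
    cliProjectName: 'pipeline-sonar-demo'
    cliSources: '.'
# ... build and test steps for the project go here ...
- task: SonarQubeAnalyze@4        # the missing "Run Code Analysis" task
- task: SonarQubePublish@4        # optional "Publish Quality Gate Result"
  inputs:
    pollingTimeoutSec: '300'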

Is there a way to integrate a task into Azure DevOps Server?

I have a YAML task that runs tests on .NET solutions in an Azure pipeline.
It is meant to run after a build step and execute unit tests on that assembly.
The output is a simple XML file with test results that needs to be shown in the Summary tab after each build run.
How can I make it recognizable by Azure?
For example, the MSBuild step is recognized and shown in the summary as Build Artifacts, with the option to download them from the Azure UI. How can I make Azure recognize my task and show its artifacts and info too: how many tests ran in the title, and the artifacts when I expand the drop-down?
Summary menu after build run on Azure DevOps server
You can use ##vso[task.uploadsummary]local file path to upload a summary to the build summary. However, you may need to write the contents of the XML file into an MD file first. Please check here to learn more about the upload logging commands.
- powershell: 'Write-Host "##vso[task.uploadsummary]path to test result"'
However, test results are not intended to be displayed on the summary page.
You must have noticed that there is a Tests tab beside the Summary tab. Usually test results are published automatically by the VsTest task and displayed in the Tests tab.
You can also publish your test results using the PublishTestResults task, as mentioned by #4c74356b41. Your test results will then be displayed in the Tests tab.
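For completeness, a sketch of the upload-summary approach: convert the test-result XML into a small Markdown file and upload it with the logging command. The paths and the naive count below are placeholders and depend on the schema of your results file:

- powershell: |
    # Placeholder path: point this at the XML your test runner actually produces
    [xml]$results = Get-Content "$(Agent.TempDirectory)/test-results.xml"
    # Naive count of result nodes; adapt this to the real schema of your results file
    $count = $results.DocumentElement.ChildNodes.Count
    "## Test results`n`nTotal entries: $count" | Out-File "$(Agent.TempDirectory)/test-summary.md"
    Write-Host "##vso[task.uploadsummary]$(Agent.TempDirectory)/test-summary.md"
  displayName: 'Upload test summary to the build summary'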
If you are asking how to make your test results visible in the build summary, there is a task meant specifically for that.
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'   # Options: JUnit, NUnit, VSTest, xUnit, cTest
    testResultsFiles: '**/TEST-*.xml'
Your test results should be compatible with one of the supported formats.

Copy Test results from VSTS to self-hosted Azure VM

I have configured a private build agent / self-hosted VM in VSTS.
I want to copy all test results for each test case to my self-hosted VM. For this I have currently created a build pipeline.
In the Publish Test Results task, Test result files is set to an .xml file and Search folder to the default source repository path.
Test case snapshot:
How can I specify the path of the attachment in the test case and copy/download the artifacts to the VM?
Also, after processing the attachment, I want to copy the output back to the test results.
How can this be achieved?
Any help highly appreciated.
Thank you
All necessary files are on the agent machine (build/deployment agent), so if you run tests during a release, the test result files will be on that agent machine.
The VsTest task will publish the results automatically.
On the other hand, if you can't use the VsTest task to run the tests, you can publish the test results through the Publish Test Results task.
If you're not using VsTest for whatever reason, then using deployment groups would be easier. With deployment groups, you don't need to copy files -- the deployment runs directly on the target machines with no intermediary agent machine necessary.