How to have GoCD restore NuGet packages when building solution - github

I am trying to learn how to use GoCD to implement continuous integration into my company's workflow. I'm using GitHub for my source control and have my pipeline set up to monitor the repository for changes and attempt to rebuild whenever there's a change. All of that is working fine. Now I'm trying to add unit tests to the pipeline and can't get the unit test project to build.
This is the config xml for my build command:
<exec command="%Windir%\Microsoft.NET\Framework64\v4.0.30319\MsBuild.exe" workingdir="UnitTest">
<arg>/T:rebuild</arg>
<arg>/P:Configuration=Release</arg>
<arg>/m</arg>
<runif status="passed" />
</exec>
This is the relevant part of the error that I'm getting:
The missing file is ..\packages\MSTest.TestAdapter.1.1.18\build\net45\MSTest.TestAdapter.props
The unit test project uses some NuGet packages, so it needs these files to build properly, but they are not in the working directory for GoCD. I added a job to my pipeline, before the unit test build, that restores the NuGet packages to the location specified in the error message. Here is the config XML for my restore NuGet command:
<exec command="C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\MSBuild\ReadyRoll\Octopack\build\NuGet.exe" workingdir="UnitTest">
<arg>restore</arg>
<arg>packages.config</arg>
<arg>-PackagesDirectory</arg>
<arg>..\packages</arg>
<runif status="passed" />
</exec>
When I re-ran the pipeline and monitored the working directory, I saw the NuGet files get created when the "restore" step ran; however, building the unit test project deleted the files and I still got the error.
I thought this was because I was using /T:rebuild in my MSBuild command, but it still deleted the files when I changed it to /T:build.
Can anyone give me a tip on how to add a job to my pipeline that builds a project using NuGet packages? I thought about adding the "NuGet.exe restore" command to the project file, but I couldn't figure out how to do it (or whether it can be done).
Thanks in advance!

I figured out what I was doing wrong. I had set up the build and restore commands as separate jobs; I needed to make them both tasks in the same job. I did that, and it worked.
I think this has something to do with how GoCD checks out the material for each job? But since I'm so new to this (and may not be right about the reason), I'm not going to speculate about why this change caused it to work.
If anyone knows the reason, I'd be happy to know.
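For anyone hitting the same thing, this is roughly what the combined job looks like in the config XML, reusing the same two commands as above (the job name is a placeholder):
<job name="UnitTests">
  <tasks>
    <!-- Task 1: restore the NuGet packages into ..\packages -->
    <exec command="C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\MSBuild\ReadyRoll\Octopack\build\NuGet.exe" workingdir="UnitTest">
      <arg>restore</arg>
      <arg>packages.config</arg>
      <arg>-PackagesDirectory</arg>
      <arg>..\packages</arg>
      <runif status="passed" />
    </exec>
    <!-- Task 2: build the unit test project in the same checked-out material -->
    <exec command="%Windir%\Microsoft.NET\Framework64\v4.0.30319\MsBuild.exe" workingdir="UnitTest">
      <arg>/T:build</arg>
      <arg>/P:Configuration=Release</arg>
      <arg>/m</arg>
      <runif status="passed" />
    </exec>
  </tasks>
</job>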

Related

How to see the Code Coverage results in Sonar Qube using Azure DevOps pipelines

I have a .NET Core application along with unit test cases. For that I have configured the build pipeline in Azure DevOps, and in that pipeline I have integrated the SonarQube tasks (prepare analysis, run code analysis, and publish quality gate results).
I can see the report on the SonarQube server after a successful run, but in that report I don't see the code coverage results or the unit test results, even though I used Cobertura for the unit tests.
Your pipeline seems to contain the right steps, so there are two likely issues:
1. Code coverage file is not generated (correctly)
The easiest way to validate whether the code coverage file is generated correctly is to publish it as an artifact and check what format the output file is. If there is no output file, check whether you included /p:CollectCoverage=true --logger trx in the test command. If you are running the build pipeline on Linux, you should also add /p:CoverletOutputFormat=opencover and install the coverlet.collector NuGet package in the .NET test project.
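For example, a test command along these lines (a sketch; the test project path is a placeholder, and the /p:CollectCoverage properties come from the coverlet.msbuild integration, whereas coverlet.collector is driven with --collect:"XPlat Code Coverage" instead):
dotnet test MyApp.Tests/MyApp.Tests.csproj --logger trx /p:CollectCoverage=true /p:CoverletOutputFormat=opencover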
2. Code coverage file is not sent to SonarQube
If you configured step 1 correctly, it is still possible that the generated files are not sent to SonarQube. The best way to see what is going wrong is to check the build logs of the Run Code Analysis and Publish Quality Gate Results steps.
The most common issue is that SonarScanner is checking the wrong directory. In the prepare step, specify where the files are located, like:
sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/**/coverage.opencover.xml
sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx
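If you are using a YAML pipeline, one place these properties can go is the extraProperties input of the prepare task; a rough sketch, assuming the SonarQube extension tasks are installed (the service connection, project key, and project name are placeholders):
- task: SonarQubePrepare@5
  inputs:
    SonarQube: 'MySonarQubeConnection'   # placeholder service connection name
    scannerMode: 'MSBuild'
    projectKey: 'my-project-key'         # placeholder
    projectName: 'My Project'            # placeholder
    extraProperties: |
      sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/**/coverage.opencover.xml
      sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx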
The build pipeline does generate the code coverage files, whether with coverlet or OpenCover. The problem is that the Azure DevOps pipeline creates the reports outside the working directory on the build agent: it uses the _temp folder inside _work, whereas SonarQube searches inside the working directory. I am facing this issue with coverlet as well as the Visual Studio coverage tool. You can read the following two threads:
https://github.com/coverlet-coverage/coverlet/issues/1399
https://github.com/microsoft/azure-pipelines-tasks/issues/11536.
I posted an issue on the SonarQube community site but have not received a response so far. The solution, I think, is either to make the test step in the build pipeline change the output directory, or to configure SonarQube somewhere so it searches the agent's _temp directory, which is above the working directory.
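One possible workaround in that spirit (a sketch I have not verified end to end; the target folder name is hypothetical) is to copy the generated reports from the temp folder back into the sources directory before the Run Code Analysis step, so the scanner finds them in the working directory:
- task: CopyFiles@2
  displayName: 'Copy coverage reports into the working directory'
  inputs:
    SourceFolder: '$(Agent.TempDirectory)'
    Contents: |
      **/*.trx
      **/coverage.opencover.xml
    TargetFolder: '$(Build.SourcesDirectory)/TestResults'   # hypothetical folder name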

How to exclude file or folder from build task in azure devops

I have a VS solution which contains multiple projects, and I'm now configuring a CI/CD pipeline for the solution in Azure.
There is one project I don't want to include in the release.
I'm trying to exclude that project from the restore and build tasks, but it is still included in restore and build.
Does anyone have an idea about this?
I've tested this on my side using the build task, and the exclude pattern is working as expected. Please check the following sample and compare it with yours:
Task:
Log:
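Since the task and log screenshots are not reproduced here, a typical exclude pattern for the Projects field of the restore/build task looks something like this (ProjectToExclude is a placeholder for the project you want to leave out):
**/*.csproj
!**/ProjectToExclude.csproj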

Automated build pipeline Salesforce Azure DevOps

I am trying an automated build process in Azure DevOps for Salesforce. Whenever a change is pushed to the repository, my build is triggered, and it is working fine and pushing the changes to the related sandbox. Here is the proof of the same:
Success Build Process.
The configuration of the build is Build configuration.
The build is working fine as expected. I now want to create a release which will push this change to a different environment, and I don't want this to be automated, hence the option of creating the release. The path to the Ant build file in my release is exactly as it should be, but I am getting this error: Release Error.
The release configuration is Release configuration.
My repository folder structure is Folder structure, and my build.xml is within the deploy folder.
I don't know what I am doing wrong but the release is always failing and giving me the error which says:
Error: Not found antBuildFile: D:\a\r1\a\deploy\build.xml
Based on the first image (Success Build Process), it seems that you have already deployed your changes to that sandbox. Working with metadata deployments in Salesforce is different from Java and .NET; keep in mind that you already have the "executables", and all of those XML files are already the code that you will change on the environment.
The second point is that during the release you are on another agent. Build and Release pipeline runs have their own lifecycles, so the code that exists in the Build pipeline is not available until you publish it as a "drop" artifact; see the Publish Build Artifacts task documentation. So use a copy task to put build.xml in the publish folder, and then you'll be able to use it in the Release pipeline.
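A rough sketch of those two steps in YAML form (the source folder is an assumption based on your folder structure, where build.xml sits in the deploy folder):
- task: CopyFiles@2
  displayName: 'Copy the deploy folder (including build.xml) to staging'
  inputs:
    SourceFolder: 'deploy'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/deploy'

- task: PublishBuildArtifacts@1
  displayName: 'Publish the drop artifact so the release can use it'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'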
When you are executing Ant, go to the /deploy folder and execute your command, or check your Ant version using the ant -version command.

dotnet build vs publish on Azure DevOps

I have a .NET Core 2.0 console app. I can successfully build or publish this app and run it locally. I can also successfully build and publish this app in Azure DevOps. However, if I build the app in Azure DevOps, I cannot run the result.
In Azure DevOps, I tried building using:
dotnet build -c Release -r win-x64 -o app
This generates a small number of files with just the project-related files. It does not include all of the System.*.dll files, etc., which seem excessive for most of my cases. This command works fine when I run it on my local machine: I can click the MyApp.exe file and run my console app. However, if I run the same command on Azure DevOps, the MyApp.exe file that gets generated does not run as expected. Instead, it starts and then immediately quits. Nothing is printed in the console, and I see no errors. The app is very basic, includes a try-catch around everything, and has a Console.ReadLine at the end, so I thought it would stay open.
When I run:
dotnet publish -c Release -r win-x64 -o app
I get the same files, but with all of the System.*.dll files, etc. included. This time, I've noticed that I can successfully run MyApp.exe and it behaves as expected.
Why does dotnet build ... work locally, but not give me the same behavior when I run dotnet build ... in Azure DevOps? It seems I'm forced to use dotnet publish. My issue is that the resulting .zip file goes from ~500 KB to 30 MB, which is a big difference.
An answer from the horse's mouth:
The dotnet build command builds the project and its dependencies into a set of binaries. The binaries include the project's code in Intermediate Language (IL) files with a .dll extension and symbol files used for debugging with a .pdb extension. A dependencies JSON file (*.deps.json) is produced that lists the dependencies of the application. A *.runtimeconfig.json file is produced, which specifies the shared runtime and its version for the application.
If the project has third-party dependencies, such as libraries from NuGet, they're resolved from the NuGet cache and aren't available with the project's built output. With that in mind, the product of dotnet build isn't ready to be transferred to another machine to run.
dotnet build - Builds a project and all of its dependencies.
dotnet publish - Packs the application and its dependencies into a folder for deployment to a hosting system. (PS - this also builds the application before packing)
The description is actually very good considering it's coming directly from Microsoft, so I will not duplicate the words here.
As an exercise create a solution with multiple projects. For one of the projects add a reference to another project. Add some static files which your code references and a few NuGet packages. And run these commands at the solution root level and at the project level and observe the output in the bin folder.
Commands to run:
dotnet build
dotnet publish
dotnet clean to clean the bin folder
Also, run this at the root level and observe the output with the self-contained flag enabled:
dotnet publish -o ./output --runtime win10-x64 --self-contained
More info on self-contained builds
The difference between them is that:
For publish, the necessary assembly files (packages) are included in the output folder, and the app uses those assemblies.
But for build, the app references packages that are in the user's folder. That's why the zip file is only about 500 KB.
Since it references packages that are in the user's folder, the app needs to be built under that same user account; then you can run the app without publishing. So you need to change the build agent's service account to your account (Log On As), then restart the service and queue a new build.
Otherwise, you need to publish the app.
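For reference, the "user's folder" mentioned above is the NuGet global-packages cache; you can see where it lives on a machine with:
dotnet nuget locals global-packages --list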

How to get Jenkins to report after the build script fails?

I use Rake to build my project, and one of the steps is running the unit, integration, and FitNesse tests. If too many of these fail, I fail the Rake script.
That part is working fine.
Unfortunately, after the build fails, Jenkins doesn't publish the HTML reports I generated from the unit, integration, and FitNesse tests, making it a tad difficult to track down the failure reason.
Am I missing a configuration step to get the reports published?
Is Jenkins supposed to skip the post-build steps when the build fails?
It seems like it does for most of the plugins I am using.
You have to tell Jenkins which artifacts to archive in a post-build step (there is a check box under the general 'Post-build actions' heading called 'Archive the artifacts'). Important: the artifact path is determined relative to the workspace directory. Also make sure that the option 'Discard all but the last successful/stable artifact to save disk space' is not checked.
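If you are on a Pipeline job rather than a freestyle job, the equivalent idea (a sketch, assuming your Rake build writes the HTML into a reports/ folder and that the HTML Publisher plugin is installed) is to archive and publish in a post { always { ... } } block, which runs even when the build fails:
pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        // run the Rake build; a failing task here marks the build as failed
        sh 'rake'
      }
    }
  }
  post {
    always {
      // archive and publish the HTML reports even if the build failed
      archiveArtifacts artifacts: 'reports/**', allowEmptyArchive: true
      publishHTML(target: [
        reportDir: 'reports',        // assumed output folder
        reportFiles: 'index.html',   // assumed entry page
        reportName: 'Test Reports',
        allowMissing: true,
        keepAll: true
      ])
    }
  }
}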
Finally figured it out; one of those "I could have had a V8" moments...
I'm using a Rake file to build, and one of its tasks is failing just before some reporting tasks that need to run in order to push the HTML into the correct area to be published.