How to get Jenkins to report after the build script fails? - rake

I use rake to build my project and one of the steps is running the unit, integration and fitnesse tests. If too many of these fail, I fail the rake script.
That part is working fine.
Unfortunately, after the build fails, Jenkins doesn't publish the HTML reports generated from the unit, integration and FitNesse tests, making it a tad difficult to track down the failure reason.
Am I missing a configuration step to get the reports published?
Is Jenkins supposed to skip the post-build steps when the build fails?
It seems like it does for most of the plugins I am using.

You have to tell Jenkins which artifacts to archive in a post-build step (there is a checkbox under the 'Post-build actions' heading called 'Archive the artifacts'). Important: the artifact path is determined relative to the workspace directory. Make sure that the option 'Discard all but the last successful/stable artifact to save disk space' is not checked.
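If the job ever moves from a freestyle configuration to a Pipeline, the same idea can be expressed with a `post { always { ... } }` block, which runs the archiving step regardless of the build result. A minimal sketch, assuming the reports end up under a `reports/` folder in the workspace (the `rake ci` target and the paths are placeholders, not from the original question):

```groovy
// Minimal declarative Pipeline sketch. The 'rake ci' target and the
// 'reports/**' path are assumptions -- adjust to your Rakefile layout.
pipeline {
    agent any
    stages {
        stage('Build and test') {
            steps {
                sh 'rake ci'   // may fail when too many tests fail
            }
        }
    }
    post {
        always {
            // archive the generated HTML reports even when the rake step failed
            archiveArtifacts artifacts: 'reports/**/*.html', allowEmptyArchive: true
        }
    }
}
```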

Finally figured it out; one of those "I could have had a V8" moments...
I'm using a rake file to build, and one of its tasks is failing just before some reporting tasks that need to run in order to push the HTML into the correct area to be published.
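One way around that is to make sure the reporting tasks still run even when the test task fails, for example by wrapping the failing step in a begin/ensure block. A rough Rakefile sketch; the task names (`test:all`, `reports:publish`) are made up for illustration:

```ruby
# Rough Rakefile sketch -- task names and report locations are hypothetical.
desc "Run the tests, but always generate/copy the HTML reports afterwards"
task :ci do
  begin
    Rake::Task["test:all"].invoke        # the step that may fail the build
  ensure
    Rake::Task["reports:publish"].invoke # push HTML into the area Jenkins archives
  end
end
```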

Related

How to see the code coverage results in SonarQube using Azure DevOps pipelines

I have a .NET Core application along with unit test cases. For that I have configured the build pipeline in Azure DevOps. In that pipeline I have integrated the SonarQube tasks (prepare analysis, run code analysis, and publish quality gate results).
I can see the report on the SonarQube server after a successful run, but in that report I don't see the code coverage results or the unit test results, even though I used Cobertura for the unit tests.
Your pipeline seems to contain the right steps, so there are two likely issues:
1. Code coverage file is not generated (correctly)
The easiest way to validate whether the code coverage file is generated correctly is to publish it as an artifact, then check what format the output file is in. If there is no output file, check whether you added /p:CollectCoverage=true --logger trx to the test command. If you are running the build pipeline on Linux, you should also add /p:CoverletOutputFormat=opencover and install the coverlet.collector NuGet package in the .NET test project.
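As a sketch, assuming the DotNetCoreCLI task and the coverlet.msbuild integration (the /p: switches above are MSBuild properties from that package; the collector-based approach uses different switches), the test step could look roughly like this. The project glob and display name are placeholders:

```yaml
# Sketch of a test step that produces a TRX file and an OpenCover coverage file.
# Project pattern and display name are assumptions.
- task: DotNetCoreCLI@2
  displayName: 'Run unit tests with coverage'
  inputs:
    command: test
    projects: '**/*Tests.csproj'
    arguments: '--logger trx /p:CollectCoverage=true /p:CoverletOutputFormat=opencover'
```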
2. Code coverage file is not sent to SonarQube
If you configured step 1 correctly, it is still possible that the generated files are not sent to SonarQube. The best way to see what is going wrong is to check the build logs of the Run Code Analysis and Publish Quality Gate Results steps.
The most common issue is that the SonarScanner is checking the wrong directory. In the prepare step, specify where the files are located, like:
sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/**/coverage.opencover.xml
sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx
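In a YAML pipeline these properties usually go into the `extraProperties` input of the prepare task. A sketch, assuming the SonarQube extension's prepare task (the task major version may differ); the service connection name and project key are placeholders:

```yaml
# Sketch of the prepare step with explicit report paths.
# 'MySonarQube' and 'my-project-key' are hypothetical names.
- task: SonarQubePrepare@4
  inputs:
    SonarQube: 'MySonarQube'
    scannerMode: 'MSBuild'
    projectKey: 'my-project-key'
    extraProperties: |
      sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/**/coverage.opencover.xml
      sonar.cs.vstest.reportsPaths=$(Agent.TempDirectory)/*.trx
```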
The build pipeline does generate the code coverage files, whether with coverlet or OpenCover; the problem is that the Azure DevOps pipeline creates the reports outside the working directory on the build agent. It uses the _temp folder inside _work, whereas SonarQube searches inside the working directory. I am facing this issue with coverlet as well as with the Visual Studio coverage tool. You can read the following two threads:
https://github.com/coverlet-coverage/coverlet/issues/1399
https://github.com/microsoft/azure-pipelines-tasks/issues/11536.
I posted an issue on the SonarQube community site but have not gotten any response so far. The solution, I think, is either to make the test step in the build pipeline change the output directory, or to configure SonarQube somewhere to search the agent's _temp build directory, which is above the working directory.
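If you go with the first option (keeping the reports inside the working directory), one way is to point coverlet's output there explicitly. A sketch, assuming the coverlet.msbuild properties; the output folder name is arbitrary:

```yaml
# Sketch: write the coverage file under the sources directory so the
# scanner's wildcard report paths can find it. Folder name is hypothetical.
- task: DotNetCoreCLI@2
  inputs:
    command: test
    projects: '**/*Tests.csproj'
    arguments: >-
      --logger trx
      /p:CollectCoverage=true
      /p:CoverletOutputFormat=opencover
      /p:CoverletOutput=$(Build.SourcesDirectory)/TestResults/
```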

Automated build pipeline Salesforce Azure DevOps

I am trying out an automated build process in Azure DevOps for Salesforce. Whenever a change is pushed to the repository, my build is triggered; it is working fine and pushing the changes to the related sandbox. Here is the proof of the same:
Success Build Process.
The configuration of the build is Build configuration.
The build is working fine as expected. I now want to create a release which will push this change to a different environment, and I don't want this to be automated, hence the option of creating the release. The build path to the ant file in my release is exactly as it should be but I am getting this error. Release Error.
The release configuration is Release configuration
My repository folder structure is: Folder structure, and my build.xml is within the deploy folder.
I don't know what I am doing wrong but the release is always failing and giving me the error which says:
Error: Not found antBuildFile: D:\a\r1\a\deploy\build.xml
Based on the first image (Success Build Process), it seems that you have already deployed your changes to that sandbox. Metadata deployment in Salesforce is different from Java and .NET: keep in mind that you already have the "executables"; all those XML files are already the code that you will change in the environment.
The second point is that the release runs on another agent. Build and Release pipeline runs have their own lifecycles, so the code that exists in the Build pipeline is not available until you publish it to the "drop" artifact; see the Publish Build Artifacts task documentation. Use a copy task to put build.xml into the publish folder, then you'll be able to use it in the Release pipeline, roughly as sketched below.
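A sketch of those two steps in the build pipeline, assuming the standard CopyFiles and Publish Build Artifacts tasks; the folder names mirror the structure described above but are otherwise assumptions:

```yaml
# Sketch: copy the deploy folder (including build.xml) into the staging
# directory and publish it as the 'drop' artifact for the release to consume.
- task: CopyFiles@2
  inputs:
    SourceFolder: 'deploy'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/deploy'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```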
When you are executing ant, go to the /deploy folder and execute your command, or check your ant version using the ant -version command.

Copying files and deploying to Azure without building using Visual Studio Team Services

I'm attempting to deploy a web site to Azure using VSTS. Basically, I commit code to the Git repo and have it set up to run CI, so it begins building as soon as I commit. However, once it hits the release section, it never copies the code to the Azure web app; rather, it gives me this line:
Info: Updating file ({projectname}\error.txt).
It doesn't copy the files I changed, but rather always just copies this file. I checked and there is indeed an error.txt file in my website directory in Azure, but it is always blank.
This build/deploy process isn't "standard": the build step only downloads from source control, it doesn't build anything, because the site isn't a "web application" but rather just a "web site", meaning it doesn't need to be compiled.
So my build step is as follows:
Get Sources
Run on Agent - this step is empty
so the idea is that it just downloads everything from source control, that's it.
Then, my release step is as follows:
Artefacts are from build step above
deploy to environment 1 (dev)
Azure app service deploy, using "package or folder" as $(System.DefaultWorkingDirectory)/
Any idea what I might be doing wrong here?
So I actually figured this out and will leave this here in case anyone else needs it.
I admit I'm pretty new to the Azure/VSTS world, so maybe someone else is making my mistake as well.
If you don't need to "build" your project, then don't. I resolved it by simply skipping the build step altogether. What I was really after was to just download the files from source control and deploy them as-is.
In your release editor, you can specify which "artifact" you want to use to release, and one of the options is source control, which is what I did.
This would be useful for websites like mine where you don't need to build them (mine is DNN/DotNetNuke, so you don't build it before deploying).

Phabricator-Jenkins plugin issue; Jenkins build will pass, but it will not generate any artifacts

I have Phabricator and Jenkins integrated. However, when I kick off a build in either Phabricator or Jenkins, there are no results. My goal is for Jenkins to generate .jar files as artifacts after a build of the Java code. I did try to configure Jenkins with a post-build action to "Archive the artifacts", but that causes the builds to fail because there are no artifacts. I am doing a parameterized build; the builds pass, but nothing is generated. As I understand it, there should be something generated as a result of a successful build (jars/wars), but that is not happening at all and no artifacts show up in Jenkins.
In Jenkins, the build output is:
Building in workspace /var/lib/jenkins/jobs/DansItem/workspace
[phabricator:ignore-build] No differential ID found.
[phabricator:plugin-provider] 'cobertura' plugin not installed.
[phabricator:non-differential-harbormaster] Sending diffusion result as: SUCCESS
Finished: SUCCESS
There is nothing in the workspace for that build. Looking at the Jenkins documentation, building Java code should result in .jar or .war files, but my Jenkins build creates nothing.
In Phabricator, there is an artifact tab, that is just a link back to Jenkins:
![Phabricator artifacts tab][1]
I have tried setting the post-build action to archive the artifacts, but whenever I set one, the build fails. I have tried just a single wildcard (*), and the job fails saying there are no artifacts found. The build fails regardless of what I set as the files to archive in the post-build action.
![Jenkins Post build actions for "/**/*.jar"][2]
![Jenkins Post build actions for "/**/*.jar"][3]

TeamCity: Best Practices to deploy produced installers (artifacts)

We have a TeamCity server which produces nightly deployable builds. We want our beta testers to have access to these nightly builds.
What are the best practices for doing this? The TeamCity server is not public, it is in our office, so I assume the best approach would be pushing artifacts via FTP or something like that.
Also, I have no clue how to trigger a script when an artifact is created successfully. Does TeamCity provide a way to do that?
I don't know of a way to trigger a script, but I wouldn't worry about that. You can retrieve artifacts via a URL. Depending on what makes sense for your project, you could have a script set up on a scheduler (cron or Windows Task Scheduler) that pulls the artifact and sends it to the FTP site for the beta testers. You can configure it to pull only the latest successful artifact. If you set up the naming right and a build fails, the beta testers won't notice, because the new build number just won't be there; no bad builds would be pushed to them.
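A rough sketch of such a script, assuming TeamCity's artifact download URL scheme (/repository/download/&lt;build configuration ID&gt;/.lastSuccessful/&lt;artifact&gt;); the server name, build configuration ID, artifact name and credentials below are all placeholders:

```sh
#!/bin/sh
# Sketch: pull the latest successful nightly artifact from TeamCity and
# push it to the beta FTP server. All names and credentials are placeholders.
curl -u "$TC_USER:$TC_PASS" -o installer.exe \
  "http://teamcity.example.com/repository/download/MyProject_Nightly/.lastSuccessful/installer.exe"
curl -T installer.exe --user "$FTP_USER:$FTP_PASS" "ftp://ftp.example.com/beta/"
```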
Read the following help page from the documentation. It shows how you send commands from your build script to tell TeamCity to publish the artifacts to a given path.
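For reference, those commands are TeamCity "service messages" written to the build's standard output; a sketch (the artifact path and target directory are placeholders):

```sh
# Sketch: ask TeamCity to publish dist/installer.exe into an 'installers'
# directory of the build's artifacts. Paths are hypothetical.
echo "##teamcity[publishArtifacts 'dist/installer.exe => installers']"
```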
In TeamCity 7.0+ you can use the Deployer plugin. Installation steps can be found here. It also allows uploading artifacts via SMB and SSH.
I suggest you start looking at something like (n)Ant to handle your build process. That way you can handle the entire "build artifacts" -> "publish artifacts" chain in an automated manner. These tools are dependency based, so the artifacts would only be published if the build succeeded.