Phabricator-Jenkins plugin issue; Jenkins build will pass, but it will not generate any artifacts

I have Phabricator and Jenkins integrated. However, when I kick off a build in either Phabricator or Jenkins, there are no results. My goal is for Jenkins to produce .jar files as artifacts after I kick off a build of Java code. I am doing a parameterized build, and the builds pass, but nothing is generated. As I understand it, a successful build of Java code should produce something (jars/wars), yet no artifacts show up in Jenkins. I did try configuring Jenkins with a post-build action to "archive the artifacts", but that causes the builds to fail because there are no artifacts to archive.
In Jenkins, the build output is:
```
Building in workspace /var/lib/jenkins/jobs/DansItem/workspace
[phabricator:ignore-build] No differential ID found.
[phabricator:plugin-provider] 'cobertura' plugin not installed.
[phabricator:non-differential-harbormaster] Sending diffusion result as: SUCCESS
Finished: SUCCESS
```
There is nothing in the workspace for that build. According to the Jenkins documentation, building Java code should result in .jar/.war files, but my build creates nothing.
In Phabricator, there is an artifacts tab, but it is just a link back to Jenkins:
![Phabricator artifacts tab][1]
I have tried setting the post-build action to archive the artifacts, but whenever I set one, the build fails. Even with a single wildcard (*) as the pattern, the job fails saying no artifacts were found; the build fails regardless of which files I tell it to archive.
![Jenkins Post build actions for "/**/*.jar"][2]
![Jenkins Post build actions for "/**/*.jar"][3]

Related

GitHub Actions uploading and downloading artifacts

I have a GitHub Actions workflow which has two jobs:
- a job that compiles the code to a .exe file and uploads it as an artifact;
- a job that runs on Windows/Mac, downloads the .exe file and uses it.
However, all of my pipelines have failed, either because a job was canceled for some reason or because there was an error. For reference, here is my latest pipeline run: https://github.com/dantheking-crypto/MakeALanguage/actions/runs/332305074
I would appreciate it if someone could point out what I did wrong.
I took a look at your pipeline above and it seems your run-on-windows-and-mac job is missing an actions/checkout step. This is why the spark.exe execution fails: it cannot find the test.spark file used as an argument (it is in the repo but was not checked out onto the runner executing the job). Just add the actions/checkout step before executing spark.exe test.spark and you should be fine.
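A minimal sketch of what that job might look like with the checkout step added (the job name comes from the question; the `compile` job name and the `spark` artifact name are assumptions):

```yaml
run-on-windows-and-mac:
  needs: compile                      # assumed name of the job that builds and uploads spark.exe
  runs-on: windows-latest             # or macos-latest
  steps:
    - uses: actions/checkout@v2       # the missing step: checks out the repo so test.spark is present
    - uses: actions/download-artifact@v2
      with:
        name: spark                   # assumed artifact name used by the compile job
    - run: ./spark.exe test.spark     # test.spark can now be found on the runner
```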

Triggering a build on completion of another build when a Pull Request comes in

We have a web application in an Azure DevOps repo and there's a branch policy on the master branch that kicks off a build when a pull request is created. This validates that it compiles and performs code quality checks and the like.
We also have some integration tests (using Mocha and Selenium) that live in another repo. I would like to run the integration tests when a PR against master is created.
As far as I know, I cannot have the same build pull from two different repos (without using extensions, and it seems cleaner to me to have two separate builds anyway). So I thought I would have another build just to run the integration tests: the build that pulls from the webapp repo would have a final step that deploys to an integration-tests environment, and then the second build would get the latest version of the integration tests and run them against that environment. I created a Build Completion trigger on the integration-tests build that is triggered by the completion of the webapp build.
The problem is that when I queue the webapp build manually, it will launch the integration tests build when done. But when the webapp build is queued by an incoming PR, the integration tests build does not get triggered.
Is this a bug in Azure DevOps or am I going about this wrong?
On my side, builds queued by a PR also don't trigger other builds via the Build Completion trigger; I don't know whether it's a bug or by design.
Anyway, there is a workaround: make the final step of the first build trigger the second build, using the Trigger Build task.
You just need to change the branch, because the source branch will be a merge branch from the PR that doesn't exist in the tests repository.
You can also do it without installing extensions, using a PowerShell task and the REST API, as sketched below.
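A rough sketch of that REST API approach as a final step of the webapp build, assuming the integration-tests build definition has ID 42 (a placeholder) and that the pipeline is allowed to use the OAuth token ($(System.AccessToken)):

```yaml
- task: PowerShell@2
  displayName: Queue the integration tests build
  inputs:
    targetType: inline
    script: |
      # Queue the integration-tests build definition (id 42 is a placeholder)
      # on a branch that actually exists in the tests repository.
      $body = @{ definition = @{ id = 42 }; sourceBranch = "refs/heads/master" } | ConvertTo-Json
      Invoke-RestMethod `
        -Uri "$(System.TeamFoundationCollectionUri)$(System.TeamProject)/_apis/build/builds?api-version=5.1" `
        -Method Post -ContentType "application/json" -Body $body `
        -Headers @{ Authorization = "Bearer $(System.AccessToken)" }
```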

Automated build pipeline Salesforce Azure DevOps

I am trying out an automated build process in Azure DevOps for Salesforce. Whenever a change is pushed to the repository, my build is triggered; it works fine and pushes the changes to the related sandbox. Here is the proof of that:
Success Build Process.
The configuration of the build is Build configuration.
The build is working fine as expected. I now want to create a release which will push this change to a different environment, and I don't want this to be automated, hence the option of creating the release. The build path to the ant file in my release is exactly as it should be but I am getting this error. Release Error.
The release configuration is Release configuration
My repository folder structure is: Folder structure, and my build.xml is within the deploy folder.
I don't know what I am doing wrong but the release is always failing and giving me the error which says:
Error: Not found antBuildFile: D:\a\r1\a\deploy\build.xml
Based on the first image (Success Build Process), it seems that you have already deployed your changes to that sandbox. Working with metadata deployments in Salesforce is different from Java and .NET: keep in mind that you already have the "executables"; all those XML files are already the code that you will change in the environment.
The second point is that the release runs on another agent. Build and Release pipelines have their own lifecycles, so the code present in the Build pipeline is not available to the Release until you ship it in the "drop" artifact; see the Publish Build Artifacts task documentation. So use a copy task to put build.xml into the publish folder, and then you'll be able to use it in the Release pipeline.
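A minimal sketch of what those two build steps could look like, assuming the standard Copy Files and Publish Build Artifacts tasks and an artifact named drop:

```yaml
- task: CopyFiles@2
  inputs:
    SourceFolder: 'deploy'                                   # repo folder that contains build.xml
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/deploy'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'                                     # the release downloads this artifact
```

In the release, the Ant build file path then has to point into the downloaded drop artifact rather than at a repository path.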
When executing Ant, go to the /deploy folder and run your command there, or check your Ant version with the `ant -version` command.

Downloading Artifacts Locally from VSTS

I have successfully created a build definition in VSTS for some SharePoint client-side projects that I'm working on. I tried creating a release definition, but I can't seem to find any way to copy/download the artifacts created by my build definition to my local machine. I may be missing something since I'm still quite new to VSTS, but I can't seem to figure it out.
To download build artifacts through a release when a build succeeds, you can set up the release definition as below:
- Add the build artifacts, with the latest version, in the release definition.
- Enable the Continuous deployment trigger for the artifacts.
- Select a private agent running on the local machine you want the artifacts downloaded to.
Now, when a build succeeds, a new release will be triggered and the latest build artifacts will be downloaded to that agent.
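If you would rather use an explicit step than rely on the automatic artifact download in the release, here is a sketch using the Download Build Artifacts task (the artifact name drop and the target folder are assumptions):

```yaml
- task: DownloadBuildArtifacts@0
  inputs:
    buildType: 'current'                          # use the artifacts linked to this release
    downloadType: 'single'
    artifactName: 'drop'                          # assumed artifact name from the build
    downloadPath: '$(System.ArtifactsDirectory)'  # folder on the private agent
```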

How to get Jenkins to report after the build script fails?

I use Rake to build my project, and one of the steps runs the unit, integration and FitNesse tests. If too many of these fail, I fail the Rake script.
That part is working fine.
Unfortunately, after the build fails, Jenkins doesn't publish the HTML reports I generated from the unit, integration and FitNesse tests, making it a tad difficult to track down the failure reason.
Am I missing a configuration step to get the reports published?
Is Jenkins supposed to skip the post-build steps when the build fails?
It seems like it does for most of the plugins I am using.
You have to tell Jenkins which artifacts to archive in a post-build step (there is a check box under the general 'Post-build Actions' heading called 'Archive the artifacts'). Important: the artifact path is determined relative to the workspace directory. Also make sure that the option 'Discard all but the last successful/stable artifact to save disk space' is not checked.
Finally figured it out; one of those "I could have had a V8" moments...
I'm using a Rake file to build, and one of its tasks was failing just before some reporting tasks that need to run in order to push the HTML into the correct area to be published.