I have a GitHub Actions workflow which has two jobs:
1. A job that compiles the code to a .exe file and uploads it as an artifact.
2. A job that runs on Windows/macOS, downloads the .exe artifact and uses it.
However, all of my pipeline runs have failed, either because a job was canceled for some reason or because there was an error. For reference, here is my latest pipeline run: https://github.com/dantheking-crypto/MakeALanguage/actions/runs/332305074
I would appreciate it if someone could point out what I did wrong.
I took a look at your pipeline above and it seems your run-on-windows-and-mac job is missing an actions/checkout step. This is why the spark.exe execution fails: it cannot find the test.spark file used as an argument (the file is in the repo but was never checked out to the runner executing the job). Just add an actions/checkout step before running spark.exe against test.spark and you should be fine.
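For illustration, here is a rough sketch of what that second job could look like with the checkout added. The job name matches your description, but the name of the compile job, the artifact name, the file paths and the action versions are all assumptions:

```yaml
run-on-windows-and-mac:
  needs: build            # assumed name of the compile job
  strategy:
    matrix:
      os: [windows-latest, macos-latest]
  runs-on: ${{ matrix.os }}
  steps:
    # Check out the repository so test.spark exists on this runner
    - uses: actions/checkout@v2
    # Download the compiled executable uploaded by the compile job
    - uses: actions/download-artifact@v2
      with:
        name: spark       # assumed artifact name
    # Run the executable against the test file from the repository
    - run: ./spark.exe test.spark
```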
I have a TFS build process that incorporates asynchronous remote functional test executions. When the tests have finished executing, I want to publish the generated TRX file in the originating build summary and update the build status (if required).
I've been searching for a while now and have so far been unsuccessful in finding exactly what I'm looking for: is it possible to publish the TRX file to the build summary via a PowerShell-scripted REST API call?
I'm afraid there is no REST API available to publish a TRX file to the build summary page.
During pipeline execution, the test tasks consume the TRX file and read the test results to generate the report you see on the build summary page. So even if you managed to upload the TRX file afterwards, it would not be processed, and you would not get a test report on the build summary page.
If your pipeline waits for the test execution to complete, you can use a script to copy the generated TRX file back to the local agent machine and publish it via the Publish Test Results task, as sketched below.
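A minimal sketch of those two steps, assuming a YAML pipeline and assuming the remote test machine exposes its results on a share (the \\testhost\results path is purely hypothetical; substitute however you retrieve the file):

```yaml
steps:
# Copy the TRX produced on the remote test machine back to this agent.
# \\testhost\results is a hypothetical share; substitute your own location.
- powershell: |
    New-Item -ItemType Directory -Force -Path "$(Agent.TempDirectory)\trx" | Out-Null
    Copy-Item -Path "\\testhost\results\*.trx" -Destination "$(Agent.TempDirectory)\trx"
  displayName: 'Copy TRX back to the agent'

# Feed the copied TRX file(s) into the Publish Test Results task
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'VSTest'
    testResultsFiles: '$(Agent.TempDirectory)\trx\**\*.trx'
```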
If the pipeline finishes before the test execution completes, you can create a new pipeline to publish the TRX file as a workaround, but this will end up showing the test results on a different pipeline's build summary page. If that is acceptable to you, you can copy the TRX file back to a local agent machine and trigger the new pipeline via the Build Queue REST API. You can also consider publishing the TRX file to a Git repository and adding that repository to the new pipeline as its Git source; a sketch of such a pipeline follows.
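If you take the Git-repository route, the workaround pipeline could be as small as the sketch below, assuming the repo the TRX files were pushed to is the pipeline's source; the branch name and file pattern are assumptions:

```yaml
# Workaround pipeline: its source is the Git repo the TRX files were pushed to,
# so they are available in the default checkout.
trigger:
- main          # assumed branch name

pool:
  vmImage: 'windows-latest'

steps:
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'VSTest'
    testResultsFiles: '**/*.trx'   # assumed location of the pushed TRX files
```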
I am setting up an automated build process in Azure DevOps for Salesforce. Whenever a change is pushed to the repository, my build is triggered; it works fine and pushes the changes to the related sandbox. Here is the proof:
Success Build Process.
The configuration of the build is shown in Build configuration.
The build works as expected. I now want to create a release that will push this change to a different environment, and I don't want that step to be automated, hence the choice of a release. The path to the Ant build file in my release is exactly as it should be, but I am getting this error: Release Error.
The release configuration is shown in Release configuration.
My repository folder structure is shown in Folder structure, and my build.xml is within the deploy folder.
I don't know what I am doing wrong, but the release always fails with the error:
Error: Not found antBuildFile: D:\a\r1\a\deploy\build.xml
Based on the first image (Success Build Process), it seems that you have already deployed your changes to that sandbox. Working with metadata deployments in Salesforce is different from Java and .NET: keep in mind that you already have the "executables"; all those XML files are already the code that will be changed in the environment.
The second point is that the release runs on another agent. Build and Release pipelines have their own lifecycles, so files that exist in the Build pipeline's workspace are not available to the Release until you publish them in the "drop" artifact (see the Publish Build Artifacts task documentation). So use a copy task to put build.xml into the publish folder, and then you will be able to use it in the Release pipeline, as sketched below.
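Whether your build uses the classic editor or YAML, the two build-side steps look roughly like this in YAML form (the deploy folder name comes from your repository layout; everything else is an assumption about how your build is defined):

```yaml
steps:
# Copy the Ant build file (and anything else the release needs) into the staging folder
- task: CopyFiles@2
  inputs:
    SourceFolder: 'deploy'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/deploy'

# Publish the staging folder as the "drop" artifact the release downloads
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```

In the release, the downloaded copy then lives under $(System.DefaultWorkingDirectory) beneath your artifact source alias and the drop folder, so point the Ant task's build file path there rather than at D:\a\r1\a\deploy\build.xml directly.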
When executing Ant, go to the /deploy folder and run your command there, or check your Ant version with the ant -version command.
I have a GitHub repository which contains a Jenkinsfile (with the job configuration steps). I want to trigger a simple Jenkins Pipeline job (not multibranch) every night to build a jar from this repo and deploy it to Nexus.
The pipeline definition options say "Pipeline script from SCM", but then I don't see any option to point to a specific SCM, i.e. GitHub in my case. I can write the pipeline script in the job itself, but that is not what I want.
How can I achieve this? Please help.
You can add a "Build periodically" build trigger to the Jenkins job.
This will build it on a schedule for you.
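For example, a nightly build could use a schedule like the one below in the "Build periodically" field (the hour is just an illustration; H lets Jenkins pick the exact minute to spread load):

```
H 2 * * *
```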
You will need to install the Git Client Plugin.
Then you will get the following option:
Under it you will be able to enter the location of the Git repo and the credentials.
I have Phabricator and Jenkins integrated. However, when I kick off a build from either Phabricator or Jenkins, there are no results. My goal is for Jenkins to generate .jar files as artifacts after I kick off a build of the Java code. I am doing a parameterized build, and the build executes successfully, but as I understand it Jenkins should produce some artifact (jars/wars) as a result of that build; that is not happening at all, and no artifacts show up in Jenkins. I did try configuring a post-build action to "archive the artifacts", but that causes the builds to fail because there are no artifacts.
In Jenkins, the build output is:
```
Building in workspace /var/lib/jenkins/jobs/DansItem/workspace
[phabricator:ignore-build] No differential ID found.
[phabricator:plugin-provider] 'cobertura' plugin not installed.
[phabricator:non-differential-harbormaster] Sending diffusion result as: SUCCESS
Finished: SUCCESS
```
There is nothing in the workspace for that build. According to the Jenkins documentation, building Java code should result in .jar/.war files, but my build creates nothing.
In Phabricator, there is an Artifacts tab, but it is just a link back to Jenkins:
![Phabricator artifacts tab][1]
I have tried setting the post-build action to archive the artifacts, but whenever I set one, the build fails. I have even tried a single wildcard (*) character, and the job fails saying there are no artifacts found. The build fails regardless of what I set as the files to archive in the post-build action.
![Jenkins Post build actions for "/**/*.jar"][2]
![Jenkins Post build actions for "/**/*.jar"][3]
I use rake to build my project, and one of the steps runs the unit, integration and FitNesse tests. If too many of these fail, I fail the rake script.
That part is working fine.
Unfortunately, after the build fails, Jenkins doesn't publish the HTML reports I generated from the unit, integration and FitNesse tests, making it a tad difficult to track down the failure reason.
Am I missing a configuration step to get the reports published?
Is Jenkins supposed to skip the post-build steps when the build fails?
It seems like it does for most of the plugins I am using.
You have to tell Jenkins which artifacts to archive in a post-build step (there is a check box under the general 'Post-build Actions' heading called 'Archive the artifacts'). Important: the artifact path is determined relative to the workspace directory. Also make sure that the option 'Discard all but the last successful/stable artifact to save disk space' is not checked.
Finally figured it out; one of those "I could have had a V8" moments...
I'm using a rake file to build, and one of its tasks is failing just before some reporting tasks that need to run in order to have the HTML pushed into the correct area to be published.