Building Artifactory fails for Build Stage in Delivery Pipeline - ibm-cloud

I have created a toolchain which downloads the code from the Bitbucket repository and builds the Docker image in IBM Cloud.
After the image is built, the build stage fails while preparing the build artifacts.
Error:
Preparing the build artifacts...
Customer script does not exist for the job, exitting
I have specified the Build archive directory as the folder name. Do I need to write any scripts for archiving?

That particular error occurs when one of our checks -- the existence of /home/pipeline/$TASK_ID/_customer_script.sh -- fails.
Archiving happens automatically, but that file needs to be present because we use it as part of the traceability around how the artifact was created. Is it possible that file is getting removed? (We will also look into removing the check or making it non-fatal, but that will take time.)

This issue appears to be caused by setting a working directory for the job. _customer_script.sh gets dropped into the working directory, but the script Simon is referring to (/opt/IBM/pipeline/bin/ids-buildables-notify.sh) only checks the top-level directory where the code input is (/home/pipeline/$TASK_ID/).
Three options to fix this, assuming you're doing a container registry job:
1. Run cp _customer_script.sh /home/pipeline/$TASK_ID in your script (a sketch follows this list). The ids-buildables-notify.sh script does some grepping for your bx cr build call, so make sure that's still in there.
2. touch /home/pipeline/$TASK_ID/_customer_script.sh and export PIPELINE_IMAGE_URL=<your image url>. If PIPELINE_IMAGE_URL is set, the notify script doesn't bother with being clever, which I prefer.
3. Don't change the working directory.
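For option 1, a rough sketch of what the job script could look like (hypothetical; it reuses the registry environment variables shown in the script below, and assumes the job's working directory is somewhere under /home/pipeline/$TASK_ID):
#!/bin/bash
# Keep the bx cr build call in this script so the grep in ids-buildables-notify.sh still finds it
bx cr build -t $REGISTRY_URL/$REGISTRY_NAMESPACE/$IMAGE_NAME:$BUILD_NUMBER .
# Copy the marker file from the working directory back to the top-level directory the notify script checks
cp _customer_script.sh /home/pipeline/$TASK_ID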
A script which works for me:
#!/bin/bash
echo -e "Build environment variables:"
echo "REGISTRY_URL=${REGISTRY_URL}"
echo "REGISTRY_NAMESPACE=${REGISTRY_NAMESPACE}"
echo "IMAGE_NAME=${IMAGE_NAME}"
echo "BUILD_NUMBER=${BUILD_NUMBER}"
echo -e "Building container image"
set -x
export PIPELINE_IMAGE_URL=$REGISTRY_URL/$REGISTRY_NAMESPACE/$IMAGE_NAME:$BUILD_NUMBER
bx cr build -t $PIPELINE_IMAGE_URL .
set +x
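# Create the marker file that ids-buildables-notify.sh checks for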
touch /home/pipeline/$TASK_ID/_customer_script.sh

Related

SFTP from web service through Cygwin fails

I have a web page running on Apache which uses a mature set of Perl files for monitoring our workplace servers and applications. One of those tests goes through Cygwin's SFTP, lists files there and assesses them.
The problem I have is with SFTP itself. When I run part of the test manually from cmd as
D:\cygwin\bin\bash.exe -c "/usr/bin/sftp -oIdentityFile=[privateKeyPath] -oStrictHostKeyChecking=no -b /cygdrive/d/WD/temp/list_SFTP.sh [user]@[hostname]"
or invoke the very same set of Perl files directly, it works OK (returns the list of files as it should). When exactly the same code is run through the web page, it fails quickly and does not tell me anything. The only thing I have is error code 255 and "Connection closed". No error stream, no verbose output, nothing, no matter what way I have used to capture any error.
To cut a long story short, the culprit was the HOME path.
When run manually, either directly from cmd or through Perl, D:\cygwin\bin\bash.exe -c "env" reports HOME as HOME=/cygdrive/c/Users/[username]/, BUT the same command, when run through the web page, reports HOME=/ i.e. root, apparently losing the home somewhere along the path.
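A quick way to confirm the discrepancy is to dump just the HOME variable through both paths (once from cmd, once via the web service) and compare, for example:
D:\cygwin\bin\bash.exe -c "env | grep '^HOME='"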
With this knowledge the solution is simple: prepend the SFTP command with the proper home path, e.g.
D:\cygwin\bin\bash.exe -c "export HOME=/cygdrive/c/Users/%USERNAME%/ ; /usr/bin/sftp -oIdentityFile=[privateKeyPath] -oStrictHostKeyChecking=no -b /cygdrive/d/WD/temp/list_SFTP.sh [user]@[hostname]"
and you are good to go.

Bamboo Powershell Task fails after first run

I'm completely new to Bamboo, so thank you in advance for the help.
I'm trying to create a Bamboo build that zips files from a Git repo and uploads them to Artifactory. Currently my build contains two tasks: a source code checkout and a simple PowerShell script. The first time I run it, it builds perfectly fine, but without any modifications any consecutive runs fail.
The error I'm getting in the log is the following:
Failing task since return code of [powershell -ExecutionPolicy bypass -Command /bin/sh /opt/bamboo/agent/temp/OR-J8U-JOB1-4-ScriptBuildTask-539645121146088515.ps1] was -1 while expected 0
Replacing the PowerShell script with empty space does not resolve the issue; only removing the script completely allows the build to succeed, but I cannot reinsert a new script or it will fail. I read other online questions suggesting that I "merge the user-level PATH environment information into the system-level PATH", but I cannot find the user-level environment information; my environment variables section is completely empty.
Like Vlad, I found that it was more efficient to implement my PowerShell script with batch.
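For reference, a minimal sketch of what such a batch-based Script task body could look like; this assumes 7-Zip and curl are available on the agent, and ARTIFACTORY_URL / ARTIFACTORY_TOKEN are placeholder variables you would define yourself (the bamboo_* variables are the standard build variables Bamboo exposes to scripts):
rem Hypothetical batch replacement for the PowerShell task: zip the checkout and upload it to Artifactory
cd /d "%bamboo_build_working_directory%"
"C:\Program Files\7-Zip\7z.exe" a package.zip * || exit /b 1
curl -f -T package.zip -H "Authorization: Bearer %ARTIFACTORY_TOKEN%" "%ARTIFACTORY_URL%/my-repo/package-%bamboo_buildNumber%.zip" || exit /b 1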

Run setup creation using bat - install4j

I run the command:
"C:\Program Files\install4j6\bin\install4jc.exe" --license="xxx" 64DeveloperInstallation.install4j -r RADview_Test.exe
(in place of the xxx I put a valid license key)
and I got the response: Updated licensing information.
My goal is to run the setup creation using a .bat file.
You have to execute install4jc twice, once with
--license [license key]
and then with the other parameters for building your project.
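So, as a sketch, the .bat file could contain something like this (reusing the path and project file from the question; replace xxx with your license key):
rem Step 1: register the license key
"C:\Program Files\install4j6\bin\install4jc.exe" --license="xxx"
rem Step 2: build the project with the remaining parameters
"C:\Program Files\install4j6\bin\install4jc.exe" 64DeveloperInstallation.install4j -r RADview_Test.exe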

Post-commit hook failed (exit code 3) with output

I'm trying to call a Jenkins job remotely using a post-commit script. I'm currently committing code through Eclipse Kepler/Subversive/SVNKit Connector.
post-commit script:
if svnlook dirs-changed -r "$REV" "$REPOS" | grep -qEe '^trunk/'; then
  wget --post-data="job=APS-RemoteServerAction&token=SECRET&ACTION=deploy&ASSET_NAME=POST-COMMIT-TEST&DEPLOY_ENV=DEV&REVISION=$REV" "http://my.domain.com:8080/buildByToken/buildWithParameters"
fi
Screenshot of error through Eclipse:
Important notes:
Code does get committed properly, repository browser indicates a new version
The job runs on Jenkins, the history shows that
Every time I commit, I get this error message
I tried adding the flag --quiet, but I got the same exit code.
I'm thinking it's due to wget and posting the values?
Edit #1
I would like to point out that I'm using the Jenkins Build Authorization Token Root Plugin. I switched to a POST instead of a GET (which works) because I will eventually be moving to HTTPS and want to keep the token out of the URL.
I interpret the error message to mean that wget cannot write a file with the name buildWithParameters in its current directory. Use wget -O - to write the output to stdout.
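Applied to the hook above, that would be something like the following (the same command, just writing the response to stdout instead of a file):
wget -O - --post-data="job=APS-RemoteServerAction&token=SECRET&ACTION=deploy&ASSET_NAME=POST-COMMIT-TEST&DEPLOY_ENV=DEV&REVISION=$REV" "http://my.domain.com:8080/buildByToken/buildWithParameters"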
The error is (I think) because it's trying to download the web page to a local directory. You just need to ping the endpoint to make Jenkins build, so I used the --spider (doesn't download), --no-proxy (I was getting cached responses sometimes) and -q (don't output anything, because svn will report it) options.
wget --post-data="job=APS-RemoteServerAction&token=SECRET&ACTION=deploy&ASSET_NAME=POST-COMMIT-TEST&DEPLOY_ENV=DEV&REVISION=$REV" "http://my.domain.com:8080/buildByToken/buildWithParameters" --spider --no-proxy -q

Stop Copy Task immediately if one file was not copied successfully

Just imagine a regular Deploy target which copies a thousand files to a remote network folder using the MSBuild Copy task; I believe this is a pretty common scenario. When the folder is not accessible or there are access-privilege problems, the Copy task obviously cannot copy the files, but it will still try each file anyway. I want to prevent this to speed up the Deploy target for this case and report a Failed status immediately, instead of waiting 30-60 minutes while it works through all the files in the queue...
How do I force the MSBuild Copy task to stop immediately when a file is not copied successfully, instead of trying to copy all the other files?
If this is not possible using the Copy task, perhaps it could be achieved using other facilities?
You can use the Exec task instead, like this:
<Exec Command="xcopy /s &quot;from with spaces&quot; $(WebDeployFolder)\$(WebDeployName)" />
Would it not be better to use robocopy to copy? It has plenty of options for similar things. See the task in the extension pack:
http://www.msbuildextensionpack.com/help/4.0.4.0/index.html
The Options property of the task takes a number of robocopy parameters:
http://technet.microsoft.com/en-us/library/cc733145%28WS.10%29.aspx