Jenkins workflow stashing flattened folders - jenkins-workflow

Is it possible to unstash a stashed group of artifacts into a flattened directory, like you can with Jenkins archiving?
What I would like is to stash 3 folders and then unstash them into a single directory containing the contents of all 3, but I can't do it currently.
Here is my try:
echo("Workflow Starting...");
node {
    def imageNames = ["connector", "registryloader", "salepersistence", "settlement", "standalone", "trackingloader"]
    stage 'Building'
    checkout([$class: 'SubversionSCM', additionalCredentials: [], excludedCommitMessages: '', excludedRegions: '', excludedRevprop: '', excludedUsers: '', filterChangelog: false, ignoreDirPropChanges: false, includedRegions: '', locations: [[credentialsId: '36c9ca9f-de25-4022-b9eb-70ada8e793b8', depthOption: 'infinity', ignoreExternalsOption: true, local: '.', remote: 'http://10.64.111.28/svn/SampleProject/skl-br']], workspaceUpdater: [$class: 'UpdateUpdater']])
    withEnv(["PATH+MAVEN=${tool 'M3'}/bin"]) {
        bat "mvn clean install assembly:assembly versions:resolve-ranges -Dmaven.test.skip=false -DskipTests"
    }
    echo("Archiving")
    archive '**/target/*.tar.gz, conf/desenv/**/*'
    for (int i = 0; i < imageNames.size(); i++) {
        String imageName = imageNames[i]
        echo("Stashing ${imageName}")
        stash excludes: '', includes: '**/sklbr-1.0.0-standalone.tar.gz, **/' + imageName + '/*, **/commonConfigs/*, **/scripts/*', name: imageName
    }
    stage 'Creating Container'
    docker.withServer('tcp://localhost:2375') {
        for (int i = 0; i < imageNames.size(); i++) {
            String imageName = imageNames[i]
            ws('sub-workspace ' + imageName) {
                echo("Creating ${imageName} container")
                //unarchive mapping: ['**/sklbr-1.0.0-standalone.tar.gz' : '.', '**/${imageName}/*' : '.', '**/commonConfigs/*' : '.', '**/scripts/*' : '.']
                unstash imageName
                echo("Unstashing ${imageName}")
                def newApp = docker.build "devops-docker-registry:5000/" + imageName + ":${env.BUILD_NUMBER}-${env.SVN_REVISION}"
                echo("Building container with ${imageName}")
                newApp.push()
                echo("Pushing ${imageName} container")
            }
        }
    }
}

Well, I've found a way using a shell script, but that is not what I was looking for...
Here it is in case you need it:
*Nix
sh 'find /${pwd()} -iname "*" -print0 | xargs -0 echo mv -t a'
Windows
bat 'for /r %f in (*) do #move "%f" .'

JENKINS-29780 may be helpful, but remember you can wrap stash and/or unstash in dir.
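For example (only a sketch; the folder name connector is taken from the question's list): stashing from inside dir(...) stores paths relative to that folder, so unstashing into another directory lands the folder's contents there "flattened":

```groovy
node {
    // Stash with paths relative to the folder, not the workspace root
    dir('connector') {
        stash name: 'connector', includes: '**'
    }
    // Unstash elsewhere; the folder's contents land directly in 'flat'
    dir('flat') {
        unstash 'connector'
    }
}
```

Repeating the unstash for each of the 3 stashes inside the same dir block would merge their contents into one directory.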

Related

Error in email template for Robot framework when using multiple script blocks in pipeline

I'm using Jenkins on Windows 10 with a pipeline script that runs a Robot Framework script and then sends an email with the results.
The template I'm using is this one.
I've tried different setups in the Jenkinsfile (including a scripted version), but basically this works:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    bat "robot -v store:${storeID} -v testType:${testType} -v site:${pageID} -v productName: previews.robot"
                }
            }
        }
    }
    post {
        always {
            script {
                step([
                    $class              : 'RobotPublisher',
                    outputPath          : '.',
                    outputFileName      : 'output.xml',
                    reportFileName      : 'report.html',
                    logFileName         : 'log.html',
                    disableArchiveOutput: false,
                    passThreshold       : 50,
                    unstableThreshold   : 40,
                    otherFiles          : "*.png",
                ])
                def emailBody = '''${SCRIPT, template="robot.template"}'''
                emailext body: emailBody,
                    subject: "[JENKINS BUILD]",
                    recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']]
            }
            archiveArtifacts '*.html, *.xml, *.png'
        }
    }
}
But if I add another stage with some scripting, for example to change the build name, then I get a template error.
Pipeline that generates error:
pipeline {
    agent any
    stages {
        stage('Init') {
            steps {
                script {
                    def name = ""
                    switch (testType) {
                        case "newProduct":
                            def tmpProduct = pageID.split("/")
                            productName = tmpProduct[tmpProduct.size() - 1]
                            name = storeID + ": New Product (" + productName + ")"
                            break
                        case "newHomepage":
                            name = storeID + ": New Homepage"
                            break
                        case "newBlog":
                            name = storeID + ": New Blog Post"
                            break
                        case "cosmetics":
                            name = storeID + ": UI Changes"
                            break
                        default:
                            name = "Test type empty"
                            break
                    }
                    currentBuild.displayName = name
                }
            }
        }
        stage('Build') {
            steps {
                script {
                    bat "robot -v store:${storeID} -v testType:${testType} -v site:${pageID} -v productName: previews.robot"
                }
            }
        }
    }
    post {
        always {
            script {
                step([
                    $class              : 'RobotPublisher',
                    outputPath          : '.',
                    outputFileName      : 'output.xml',
                    reportFileName      : 'report.html',
                    logFileName         : 'log.html',
                    disableArchiveOutput: false,
                    passThreshold       : 50,
                    unstableThreshold   : 40,
                    otherFiles          : "*.png",
                ])
                def emailBody = '''${SCRIPT, template="robot.template"}'''
                emailext body: emailBody,
                    subject: "[JENKINS BUILD]",
                    recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']]
            }
            archiveArtifacts '*.html, *.xml, *.png'
        }
    }
}
The name of the build is changed correctly and the script runs to the end without issues, but in the email and the console output I can see a template error:
Exception raised during template rendering: Cannot get property 'simpleName' on null object
java.lang.NullPointerException: Cannot get property 'simpleName' on null object
Any pointers? I would really like to make the pipeline richer, but I will stay with the basics if I can't solve this problem...
Modify robot.template as below, which solves this problem:
if( action && (action.class.simpleName.equals("RobotBuildAction") ) )

run my test in docker mongo instance using jenkins pipeline

I would like to run my tests against a Docker MongoDB instance using a Jenkins pipeline. I have got it working, kind of. My problem is that the tests are running within the Mongo container; I just want it to load up a container and have my tests connect to the Mongo container. At the moment it downloads Gradle within the container and takes about 5 minutes to run. Hope that makes sense. Here is my Jenkinsfile:
#!/usr/bin/env groovy
pipeline {
    environment {
        SPRING_PROFILES_ACTIVE = "jenkins"
    }
    agent {
        node {
            label "jdk8"
        }
    }
    parameters {
        choice(choices: 'None\nBuild\nMinor\nMajor', description: '', name: 'RELEASE_TYPE')
        string(defaultValue: "refs/heads/master:refs/remotes/origin/master", description: 'gerrit refspec e.g. refs/changes/45/12345/1', name: 'GERRIT_REFSPEC')
        choice(choices: 'master\nFETCH_HEAD', description: 'gerrit branch', name: 'GERRIT_BRANCH')
    }
    stages {
        stage("Test") {
            stages {
                stage("Initialise") {
                    steps {
                        println "Running on ${NODE_NAME}, release type: ${params.RELEASE_TYPE}"
                        println "gerrit refspec: ${params.GERRIT_REFSPEC}, branch: ${params.GERRIT_BRANCH}, event type: ${params.GERRIT_EVENT_TYPE}"
                        checkout scm
                        sh 'git log -n 1'
                    }
                }
                stage("Verify") {
                    agent {
                        dockerfile {
                            filename 'backend/Dockerfile'
                            args '-p 27017:27017'
                            label 'docker-pipeline'
                            dir './maintenance-notifications'
                        }
                    }
                    steps {
                        sh './gradlew :maintenance-notifications:backend:clean'
                        sh './gradlew :maintenance-notifications:backend:check :maintenance-notifications:backend:test'
                    }
                    post {
                        always {
                            junit 'maintenance-notifications/backend/build/test-results/**/*.xml'
                        }
                    }
                }
            }
        }
        stage("Release") {
            when {
                expression {
                    return params.RELEASE_TYPE != '' && params.RELEASE_TYPE != 'None';
                }
            }
            steps {
                script {
                    def gradleProps = readProperties file: "gradle.properties"
                    def isCurrentSnapshot = gradleProps.version.endsWith("-SNAPSHOT")
                    def newVersion = gradleProps.version.replace("-SNAPSHOT", "")
                    def cleanVersion = newVersion.tokenize(".").collect { it.toInteger() }
                    if (params.RELEASE_TYPE == 'Build') {
                        newVersion = "${cleanVersion[0]}.${cleanVersion[1]}.${isCurrentSnapshot ? cleanVersion[2] : cleanVersion[2] + 1}"
                    } else if (params.RELEASE_TYPE == 'Minor') {
                        newVersion = "${cleanVersion[0]}.${cleanVersion[1] + 1}.0"
                    } else if (params.RELEASE_TYPE == 'Major') {
                        newVersion = "${cleanVersion[0] + 1}.0.0"
                    }
                    def newVersionArray = newVersion.tokenize(".").collect { it.toInteger() }
                    def newSnapshot = "${newVersionArray[0]}.${newVersionArray[1]}.${newVersionArray[2] + 1}-SNAPSHOT"
                    println "release version: ${newVersion}, snapshot version: ${newSnapshot}"
                    sh "./gradlew :maintenance-notifications:backend:release -Prelease.useAutomaticVersion=true -Prelease.releaseVersion=${newVersion} -Prelease.newVersion=${newSnapshot}"
                }
            }
        }
    }
}
And here is my Dockerfile:
FROM centos:centos7
ENV container=docker
RUN mkdir -p /usr/java; curl http://configuration/yum/thecloud/artifacts/java/jdk-8u151-linux-x64.tar.gz|tar zxC /usr/java && ln -s /usr/java/jdk1.8.0_151/bin/j* /usr/bin
RUN mkdir -p /usr/mongodb; curl http://configuration/yum/thecloud/artifacts/mongodb/mongodb-linux-x86_64-3.4.10.tgz|tar zxC /usr/mongodb && ln -s /usr/mongodb/mongodb-linux-x86_64-3.4.10/bin/* /usr/bin
ENV JAVA_HOME /usr/java/jdk1.8.0_151/
ENV SPRING_PROFILES_ACTIVE jenkins
RUN yum -y install git.x86_64 && yum clean all
# Set up directory requirements
RUN mkdir -p /data/db /var/log/mongodb /var/run/mongodb
VOLUME ["/data/db", "/var/log/mongodb"]
# Expose port 27017 from the container to the host
EXPOSE 27017
CMD ["--port", "27017", "--pidfilepath", "/var/run/mongodb/mongod.pid"]
# Start mongodb
ENTRYPOINT /usr/bin/mongod
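No answer was posted for this one, but the usual way to keep the tests out of the Mongo container is the Docker Pipeline plugin's sidecar pattern: start MongoDB with docker.image(...).withRun and run Gradle on the agent itself. A sketch only; the mongo:3.4 image tag and the -Dmongo.uri property are assumptions, not taken from the question:

```groovy
node('docker-pipeline') {
    checkout scm
    // Start MongoDB as a throwaway sidecar container; it is stopped and
    // removed automatically when the closure exits
    docker.image('mongo:3.4').withRun('-p 27017:27017') { c ->
        // Gradle runs on the agent and connects to the published port.
        // Assumes the tests read their Mongo URI from a system property.
        sh './gradlew :maintenance-notifications:backend:test -Dmongo.uri=mongodb://localhost:27017'
    }
}
```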

How to manipulate variable in Jenkins pipeline

I've been trying several trial-and-error approaches for where and how to cut the string in the variable and assign it to a new variable to be used by a Jenkins stage. Normally it's just a matter of removing the suffix. The Jenkins pipeline is indicated below:
properties([
    [$class: 'RebuildSettings', autoRebuild: false, rebuildDisabled: false],
    parameters([choice(choices: ['SQA-ENV-CLONE', 'DEV-ENV-CLONE'],
        description: 'Select the ENV', name: 'ENV')])])
pipeline {
    agent any
    stages {
        stage('VALIDATE ENVIRONMENT') {
            def ACTIVE = sh(returnStdout: true, script: "echo $ENV | sed -e 's/-CLONE//g'")
            steps {
                echo 'Checking 1st the ${ACTIVE}'
            }
        }
    }
}
Error I'm getting
Running in Durability level: MAX_SURVIVABILITY
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 6: Not a valid stage section definition: "def ACTIVE = sh(returnStdout: true, script: "echo $ENV | sed -e 's/-CLONE//g'")". Some extra configuration is required. # line 6, column 9.
stage('VALIDATE ENVIRONMENT') {
^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:603)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
at groovy.lang.GroovyShell.parseClass(GroovyShell.java:688)
at groovy.lang.GroovyShell.parse(GroovyShell.java:700)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.doParse(CpsGroovyShell.java:131)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.reparse(CpsGroovyShell.java:125)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.parseScript(CpsFlowExecution.java:560)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.start(CpsFlowExecution.java:521)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:330)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Finished: FAILURE
Let's say I choose "DEV-ENV-CLONE" as the value; I'm expecting a successful build with this output:
Checking 1st the DEV-ENV
You need to move the def ACTIVE = sh(...) into a script step: it is arbitrary Groovy code, and in declarative pipeline that can only be wrapped in a script block.
stage('VALIDATE ENVIRONMENT') {
    steps {
        script {
            ACTIVE = sh(
                returnStdout: true,
                script: "echo $ENV | sed -e 's/-CLONE//g'"
            ).trim()
        }
        echo "Checking 1st the ${ACTIVE}"
    }
}
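Since this is plain string manipulation, you could also skip the shell round-trip entirely (a sketch; same result without sed):

```groovy
stage('VALIDATE ENVIRONMENT') {
    steps {
        script {
            // Plain Groovy: strip the -CLONE suffix, no shell needed
            ACTIVE = params.ENV.replace('-CLONE', '')
        }
        echo "Checking 1st the ${ACTIVE}"
    }
}
```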

How to make a groovy script list files from a git repo?

I need to write a Groovy script that talks to Git: go to a repository, get the list of file names, store it in an array and return it. Then I'll show that in a Jenkins choice parameter.
Is this possible, and if so, how?
If you're going to use a pipeline it will be easier.
You can use the following to get files from a directory recursively:
import groovy.io.FileType

def fileList = []
def dir = new File("your_repo_dir")
dir.eachFileRecurse(FileType.FILES) { file ->
    fileList << file
}
And then in job properties you need to add choice param:
choiceParam(name: 'Repo Files', choices: fileList.join("\n"), description: '')
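Note that fileList above collects java.io.File objects, so the joined choices will render as full paths; if you only want the bare file names in the dropdown, collecting file.name is a small variation on the same sketch:

```groovy
import groovy.io.FileType

def fileList = []
new File("your_repo_dir").eachFileRecurse(FileType.FILES) { file ->
    fileList << file.name   // just the name, not the absolute path
}
```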
Use activeChoiceReactiveParam with public Git repository:
parameters {
    stringParam {
        name('BRANCH_NAME')
        defaultValue('master')
        description('git branch name')
        trim(true)
    }
    activeChoiceReactiveParam('PLAYBOOK') {
        description('Select a playbook')
        filterable()
        choiceType('SINGLE_SELECT')
        groovyScript {
            script("""
                |def fileList = ['/bin/bash', '-c', "git clone --single-branch --branch " + BRANCH_NAME + " https://git.repository.com/scm/project/repo.git > /dev/null 2>&1 ; cd repo ; git ls-tree -r origin/" + BRANCH_NAME + " --name-only"].execute()
                |fileList.waitFor()
                |return ["playbook-default.yml"] + fileList.text.readLines().findAll { it.startsWith("playbook") && it.endsWith(".yml") }
            """.stripMargin())
            fallbackScript('return ["playbook-default.yml"]')
        }
        referencedParameter('BRANCH_NAME')
    }
}
Note: for now, I didn't manage to use credentials (even with SSH). It could be a good improvement.
If you want to get the list of branches from an HTTPS Git URL plus a Jenkins credential, use this activeChoiceReactiveParam:
activeChoiceReactiveParam('BRANCH_NAME') {
    description('Branch from git repo')
    filterable()
    choiceType('SINGLE_SELECT')
    groovyScript {
        script("""
            |credentialsId = '<TO-REPLACE-WITH-YOUR-CREDENTIAL-ID>'
            |def creds = com.cloudbees.plugins.credentials.CredentialsProvider.lookupCredentials(com.cloudbees.plugins.credentials.common.StandardUsernamePasswordCredentials.class, jenkins.model.Jenkins.instance, null, null).find { it.id == credentialsId }
            |def url = 'https://' + creds.username + ':' + java.net.URLEncoder.encode(creds.password.getPlainText()) + '@bitbucket.url.com/scm/project/repo.git'
            |def fileList = ['/bin/bash', '-c', 'rm -rf branch-name > /dev/null 2>&1 ; git clone ' + url + ' branch-name > /dev/null 2>&1 ; cd branch-name ; git for-each-ref --format="%(refname)" --sort -committerdate | sed "s|refs/[a-z]*/||g" | sed "s|origin/||g" '].execute()
            |fileList.waitFor()
            |return fileList.text.readLines()
            |""".stripMargin())
        fallbackScript('return ["branch-name-not-found"]')
    }
}

qbs avr compiling

I am trying to build a simple project with qbs:
import qbs
Project {
    name: "simple"
    Product {
        name: "micro"
        type: "obj"
        Group {
            name: "sources"
            files: ["main.c", "*.h", "*.S"]
            fileTags: ['c']
        }
        Rule {
            inputs: ["c"]
            Artifact {
                fileTags: ['obj']
                filePath: input.fileName + '.o'
            }
            prepare: {
                var args = [];
                args.push("-g")
                args.push("-Os")
                args.push("-w")
                args.push("-fno-exceptions")
                args.push("-ffunction-sections")
                args.push("-fdata-sections")
                args.push("-MMD")
                args.push("-mmcu=atmega328p")
                args.push("-DF_CPU=16000000L")
                args.push("-DARDUINO=152")
                args.push("-IC:/Programs/arduino/hardware/arduino/avr/cores/arduino")
                args.push("-IC:/Programs/arduino/hardware/arduino/avr/variants/standard")
                args.push("-c")
                args.push(input.fileName)
                args.push("-o")
                args.push(input.fileName + ".o")
                var compilerPath = "C:/Programs/arduino/hardware/tools/avr/bin/avr-g++.exe"
                var cmd = new Command(compilerPath, args);
                cmd.description = 'compiling ' + input.fileName;
                cmd.highlight = 'compiler';
                cmd.silent = false;
                console.error(input.baseDir + '/' + input.fileName);
                return cmd;
            }
        }
    }
}
And I get this error:
compiling main.c
C:/Programs/arduino/hardware/tools/avr/bin/avr-g++.exe -g -Os -w -fno-exceptions -ffunction-sections -fdata-sections -MMD "-mmcu=atmega328p" "-DF_CPU=16000000L" "-DARDUINO=152" -IC:/Programs/arduino/hardware/arduino/avr/cores/arduino -IC:/Programs/arduino/hardware/arduino/avr/variants/standard -c main.c -o main.c.o
avr-g++.exe: main.c: No such file or directory
avr-g++.exe: no input files
Process failed with exit code 1.
The following products could not be built for configuration qtc_avr_f84c45e7-release:
micro
What am I doing wrong?
The file main.c is present in the project and in the directory.
If I run this command from the command prompt, I do not get an error.
In short, you need to pass input.filePath after -c and -o, not input.fileName. There's no guarantee that the working directory of the command invoked will be that of your source directory.
You can set the workingDirectory of a Command object, but that is not generally recommended as your commands should be independent of the working directory unless absolutely necessary.
Furthermore, you appear to be duplicating the functionality of the cpp module. Instead, your project should look like this:
import qbs
Project {
    name: "simple"
    Product {
        Depends { name: "cpp" }
        name: "micro"
        type: "obj"
        Group {
            name: "sources"
            files: ["main.c", "*.h", "*.S"]
            fileTags: ['c']
        }
        cpp.debugInformation: true // passes -g
        cpp.optimization: "small" // passes -Os
        cpp.warningLevel: "none" // passes -w
        cpp.enableExceptions: false // passes -fno-exceptions
        cpp.commonCompilerFlags: [
            "-ffunction-sections",
            "-fdata-sections",
            "-MMD",
            "-mmcu=atmega328p"
        ]
        cpp.defines: [
            "F_CPU=16000000L",
            "ARDUINO=152"
        ]
        cpp.includePaths: [
            "C:/Programs/arduino/hardware/arduino/avr/cores/arduino",
            "C:/Programs/arduino/hardware/arduino/avr/variants/standard"
        ]
        cpp.toolchainInstallPath: "C:/Programs/arduino/hardware/tools/avr/bin"
        cpp.cxxCompilerName: "avr-g++.exe"
    }
}
This works:
args.push("-c")
args.push(input.filePath)   // instead of args.push(input.fileName)
args.push("-o")
args.push(output.filePath)  // instead of args.push(input.fileName + ".o")