How to run continuous integration in parallel across multiple Pull Requests?

I am testing the use of Jenkins with the GitHub pull request builder plugin. I have successfully set up a toy project on GitHub and a dev installation of Jenkins so that raising a PR, or pushing changes to a PR branch, triggers a build. Mostly this works as required; a few things don't match our preferred workflow, but the freedom from having to write and maintain our own plugin is a big deal.
I have one potential showstopper. The plugin queues up all pushes in all PRs it sees, and only ever seems to run a single job at a time, even with spare executors available. In the real-world project we may have 10 active PRs, each of which may get a few pushed updates a day in response to QC comments, and the full CI run takes more than 30 minutes. However, we do have enough build executors provisioned to run multiple jobs at the same time.
I cannot see any way to configure the PR request builder to process multiple jobs at once on the same trigger, but I may be missing something basic elsewhere in Jenkins. Is there a way to do this, without needing to customise the plugin?
I have installed Jenkins ver. 1.649 on a new Ubuntu 14.04 server (on a VirtualBox guest) and followed the README of the ghprb plugin (currently version 1.30.5), including setting up a Jenkins "bot" account on GitHub as a collaborator to run all the integration API calls to GitHub.
I was wondering what the behaviour would be if I cloned the job (create new item and "Copy existing item"), and may try that next, but I expect that would just run the same job multiple times for no benefit, rather than interacting smartly with other jobs polling the same pool of PRs.

I found the config setting whilst exploring further for this question.
It is really easy when you know which config item it is, but Jenkins has a lot of configuration to work through, especially when you are exploring the plugins.
The key thing is that the option to serve queued jobs in parallel (available executors allowing) is core Jenkins config, and not part of the Github PR builder.
So, just check the option "Execute concurrent builds if necessary". It can be found at the bottom of the first, untitled section of the job's configuration. It is a really basic Jenkins option that a newbie like me missed amid the mountain of other options.
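For jobs generated from code rather than through the UI, the same switch exists in Job DSL. A minimal hypothetical sketch (the job name is illustrative, not from this answer):

```groovy
// Hypothetical Job DSL sketch: the "Execute concurrent builds if necessary"
// checkbox corresponds to concurrentBuild() on a generated job, so builds
// for different PRs can run in parallel when executors are free.
job('pr-ci-build') {
    concurrentBuild()
}
```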

Maybe it is too late to answer this question, but after a few days of research I figured out a way to create multiple jobs per PR in GitHub.
The code I am showing here applies to GitHub Enterprise, but it works well enough for plain GitHub (or Bitbucket) as well, with a few tweaks to the URL and git commands.
The mainline repository against which the PRs are created needs to have a file; I call it PRJob.groovy, and it contains:
import groovy.json.JsonSlurper

gitUrl = GIT_URL
repoRestUrl = "${GITHUB_WEB_URL}/repos/${project}/${repo}"

def getJSON(url) {
    def conn = (HttpURLConnection) new URL(url).openConnection()
    conn.setRequestProperty("Authorization", "token ${OAUTH_TOKEN}")
    return new JsonSlurper().parse(new InputStreamReader(conn.getInputStream()))
}

def createPipeline(name, description, branch, prId) {
    return pipelineJob(name) {
        delegate.description description
        if (ENABLE_TRIGGERS == 'true') {
            triggers {
                cron 'H H/8 * * *'
                scm 'H/5 * * * *'
            }
        }
        quietPeriod(60)
        environmentVariables {
            env 'BRANCH_NAME', branch
            env 'PULL_REQUEST', prId
            env 'GITHUB_WEB_URL', GITHUB_WEB_URL
            env 'OAUTH_TOKEN', OAUTH_TOKEN
            env 'PROJECT', project
            env 'REPO', repo
        }
        definition {
            cpsScm {
                scriptPath "Jenkinsfile"
                scm {
                    git {
                        remote {
                            credentials "jenkins-ssh-key"
                            delegate.url gitUrl
                            if (prId != "") {
                                refspec "+refs/pull/${prId}/*:refs/remotes/origin/pr/${prId}/*"
                            }
                        }
                        delegate.branch branch
                    }
                }
            }
        }
    }
}

def createPRJobs() {
    def prs = getJSON("${repoRestUrl}/pulls?state=open")
    if (prs.size() == 0) {
        def mergedPrs = getJSON("${repoRestUrl}/pulls?state=closed")
        if (mergedPrs.size() == 0) {
            throw new RuntimeException("No pull-requests found; auth token has likely expired")
        }
    }
    prs.each { pr ->
        def id = pr.get("number")
        def title = pr.get("title")
        def fromRef = pr.get("head")
        def fromBranchName = fromRef.get("ref")
        def prRepo = fromRef.get("repo")
        def repoName = prRepo.get("name")
        def prHref = pr.get("url")
        createPipeline("${repo}-PR-${id}-${fromBranchName}",
                "${prHref} Pull Request ${id}: ${title}", "origin/pr/${id}/head", id)
    }
}

createPRJobs()
This creates one Jenkins job per PR.
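PRJob.groovy is a Job DSL script, so it needs something to execute it. Here is a hedged sketch of a seed job that could run it on a schedule; the job name, repo URL, and interval are illustrative, not from the answer above:

```groovy
// Hypothetical seed job sketch: a freestyle job that checks out the
// mainline repo and runs PRJob.groovy via the Job DSL plugin, regenerating
// one pipeline job per open PR. Names and the schedule are illustrative.
job('pr-job-seed') {
    scm {
        git('git@github.example.com:myorg/myrepo.git')
    }
    triggers {
        cron('H/15 * * * *')   // rescan for new or closed PRs every 15 minutes
    }
    steps {
        dsl {
            external('PRJob.groovy')
            removeAction('DELETE')   // drop generated jobs whose PRs disappear
        }
    }
}
```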
It relies on the project having a Jenkinsfile which can be picked up for running a pipeline job. A sample Jenkinsfile looks like this:
// Jenkinsfile for building and creating jobs
commitId = null
repoRestUrl = "${GITHUB_WEB_URL}/repos/${PROJECT}/${REPO}"

try {
    stage('Install and Tests') {
        runTest("Hello")
    }
    notify_github 'success'
} catch (Exception e) {
    notify_github 'failure'
    print e
    throw e
}

def runTest(String someDummyVariable) {
    node {
        checkout scm
        sh 'git clean -qdf'
        if (env.PULL_REQUEST == "") {
            sh 'git rev-parse --verify HEAD > commit.txt'
        } else {
            // We check out the PR after it is merged with master, but we need to
            // report the result against the commit before the merge
            sh "git rev-parse refs/remotes/origin/pr/${env.PULL_REQUEST}/head^{commit} > commit.txt"
        }
        commitId = readFile 'commit.txt'
        echo commitId
        sh 'rm -f commit.txt'
        // Here goes your code for doing anything
        sh 'echo "Hello World!!!!!"'
    }
}

def http_post(url, rawJson) {
    def conn = (HttpURLConnection) new URL(url).openConnection()
    conn.setRequestProperty("Authorization", "token ${OAUTH_TOKEN}")
    conn.doOutput = true
    conn.requestMethod = "POST"
    conn.setRequestProperty("Content-Type", "application/json")
    def wr = new OutputStreamWriter(conn.getOutputStream())
    wr.write(rawJson)
    wr.close()
    def code = conn.getResponseCode()
    if (code < 200 || code >= 300) {
        println 'Failed to post to ' + url
        def es = conn.getErrorStream()
        if (es != null) {
            println es.getText()
        }
    }
}

def notify_github(state) {
    http_post(
        "${repoRestUrl}/statuses/${commitId}",
        """
        { "state": "${state}",
          "target_url": "${env.BUILD_URL}",
          "description": "Build Pipeline",
          "context": "Build Pipeline"
        }
        """
    )
}
Hope this helps someone.

Related

Running executables with arguments from a Jenkinsfile (using bat - under Windows)

I have read multiple threads and I still cannot figure out how to get my Jenkinsfile to run applications such as NuGet.exe or devenv.com.
Here is what I have so far as a Jenkinsfile:
pipeline {
    agent any
    options {
        timestamps()
        skipStagesAfterUnstable()
    }
    environment {
        solutionTarget = "${env.WORKSPACE}\\src\\MySolution.sln"
    }
    stages {
        stage('Build Solution') {
            steps {
                dir("${env.workspace}") {
                    script {
                        echo "Assumes nuget.exe was downloaded and placed under ${env.NUGET_EXE_PATH}"
                        echo 'Restore NuGet packages'
                        ""%NUGET_EXE_PATH%" restore %solutionTarget%"
                        echo 'Build solution'
                        ""%DEVENV_COM_PATH%" %solutionTarget% /build release|x86"
                    }
                }
            }
        }
    }
}
In this example, I get the following error:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: java.lang.String.mod() is applicable for argument types: (java.lang.String) values: [C:\Program Files (x86)\NuGet\nuget.exe]
Note that a declarative checkout is in place, although it is not visible from the Jenkinsfile.
I have also tried using a function to run those commands, but without success either:
def cmd_exec(command) {
    return bat(returnStdout: true, script: "${command}").trim()
}
Any tip would be highly appreciated.
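There is no accepted answer in this excerpt, but one plausible reading of the error (my interpretation, not from the thread): the quoted command lines are bare Groovy string expressions, so Groovy parses the `%` signs as the String mod operator. A hedged sketch of how the same commands could be issued through the bat step instead, using the variable names from the question:

```groovy
// Hypothetical fix sketch (not from the thread): wrap the command lines in
// the bat step so they are handed to cmd.exe, which expands %VAR% itself.
// Single-quoted Groovy strings avoid any Groovy interpolation of the text.
stage('Build Solution') {
    steps {
        script {
            echo 'Restore NuGet packages'
            bat '"%NUGET_EXE_PATH%" restore "%solutionTarget%"'
            echo 'Build solution'
            bat '"%DEVENV_COM_PATH%" "%solutionTarget%" /build "release|x86"'
        }
    }
}
```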

How to create a tag for specific branch using Jenkinsfile

I have a scenario where I need to build and push a Docker image only when a tag is built from the master branch.
stage('Tag') {
    when {
        expression { sh([returnStdout: true, script: 'echo $TAG_NAME | tr -d \'\n\'']) }
    }
    steps {
        script {
            echo "tag"
        }
    }
}
The above code works, but the when condition is also satisfied if we create a tag on the develop or test branch. Can anyone help me overcome this issue?
If it is of any help, this is an excerpt of a sequential Jenkinsfile I use to tag branches on repos. It is a two-stage process: find the current commit for the branch, then tag that:
steps.withCredentials([steps.usernamePassword(credentialsId: httpCredentials,
        usernameVariable: 'GITHUB_USERNAME', passwordVariable: 'GITHUB_TOKEN')]) {
    // Need to do this as a two stage activity (need commit sha) but *warning* this api assumes ref is a branch
    def json_info = steps.sh(script: "curl -X GET -H \"Authorization: token \$GITHUB_TOKEN\" -H \"Accept: application/vnd.github.v3+json\" ${httpUrlForHost}/repos/${slugForRepo}/branches/${ref}",
            returnStdout: true)
    def branch_info = steps.readJSON text: json_info
    def sha = branch_info?.commit?.sha
    if (!sha) {
        steps.error("Unexpected sha for branch ${ref} (${json_info})")
    }
    def jsonPayload = "{\"ref\":\"refs/tags/${new_tag}\",\"sha\":\"${sha}\"}"
    steps.sh "curl -X POST -H \"Authorization: token \$GITHUB_TOKEN\" -H \"Accept: application/vnd.github.v3+json\" -d '${jsonPayload}' ${httpUrlForHost}/repos/${slugForRepo}/git/refs"
}
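The original question (acting only on tags cut from master) is not quite covered by that excerpt. A hedged sketch of one way to check it in a declarative pipeline, assuming origin/master has been fetched into the workspace:

```groovy
// Hypothetical sketch: run the Tag stage only when building a tag whose
// commit is reachable from origin/master. Assumes origin/master is fetched.
stage('Tag') {
    when {
        allOf {
            buildingTag()
            expression {
                // exit code 0 means HEAD is an ancestor of origin/master
                sh(returnStatus: true,
                   script: 'git merge-base --is-ancestor HEAD origin/master') == 0
            }
        }
    }
    steps {
        echo "Tag ${env.TAG_NAME} was cut from master"
    }
}
```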

Publish Nunit Test Results in Post Always Section

I'm trying to run a pipeline that does some Pester Testing and publish the NUnit results.
New tests were introduced and, for whatever reason, Jenkins no longer publishes the test results and errors out immediately after the PowerShell script. Hence it never gets to the NUnit publish step. I receive this:
ERROR: script returned exit code 128
Finished: FAILURE
I've been trying to include the publish step in the always block of the post section of the Jenkinsfile; however, I'm running into problems with how to make that NUnit test file available.
I've tried establishing an agent and unstashing the file (even though it probably won't be stashed if the PowerShell script cancels the whole pipeline). When I use agent I get the following exception:
java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps
Here is the Jenkinsfile:
pipeline {
    agent none
    environment {
        svcpath = 'D:\\svc\\'
        unitTestFile = 'UnitTests.xml'
    }
    stages {
        stage('Checkout and Stash') {
            agent { label 'Agent1' }
            steps {
                stash name: 'Modules', includes: 'Modules/*/**'
                stash name: 'Tests', includes: 'Tests/*/**'
            }
        }
        stage('Unit Tests') {
            agent { label 'Agent1' }
            steps {
                dir(svcpath + 'Modules\\') { deleteDir() }
                dir(svcpath + 'Tests\\') { deleteDir() }
                dir(svcpath) {
                    unstash name: 'Modules'
                    unstash name: 'Tests'
                }
                dir(svcpath + 'Tests\\') {
                    powershell """
                        \$requiredCoverageThreshold = 0.90
                        \$modules = Get-ChildItem ../Modules/ -File -Recurse -Include *.psm1
                        \$result = Invoke-Pester -CodeCoverage \$modules -PassThru -OutputFile ${unitTestFile} -OutputFormat NUnitXml
                        \$codeCoverage = \$result.CodeCoverage.NumberOfCommandsExecuted / \$result.CodeCoverage.NumberOfCommandsAnalyzed
                        Write-Output \$codeCoverage
                        if (\$codeCoverage -lt \$requiredCoverageThreshold) {
                            Write-Output "Build failed: required code coverage threshold of \$(\$requiredCoverageThreshold * 100)% not met. Current coverage: \$(\$codeCoverage * 100)%."
                            exit 1
                        } else {
                            write-output "Required code coverage threshold of \$(\$requiredCoverageThreshold * 100)% met. Current coverage: \$(\$codeCoverage * 100)%."
                        }
                    """
                    stash name: 'TestResults', includes: unitTestFile
                    nunit testResultsPattern: unitTestFile
                }
            }
            post {
                always {
                    echo 'This will always run'
                    agent { label 'Agent1' }
                    unstash name: 'TestResults'
                    nunit testResultsPattern: unitTestFile
                }
                success {
                    echo 'This will run only if successful'
                }
                failure {
                    echo 'This will run only if failed'
                }
                unstable {
                    echo 'This will run only if the run was marked as unstable'
                }
                changed {
                    echo 'This will run only if the state of the Pipeline has changed'
                    echo 'For example, if the Pipeline was previously failing but is now successful'
                }
            }
        }
    }
}
Any and all input is welcome! Thanks!
The exception you are getting is due to Jenkins' strict pipeline DSL. Documentation of the allowable uses of agent is here.
Currently agent {...} is not allowed in the post section. Maybe this will change in the future. If you require the whole job to run on the node that services label 'Agent1', the only way to do that currently is to:
1. Put agent {label 'Agent1'} immediately under pipeline { to make it global.
2. Remove all other instances of agent {label 'Agent1'} from each stage.
3. Remove the agent {label 'Agent1'} from the post section.
The post section acts more like the traditional scripted DSL than the declarative pipeline DSL, so you have to use node() instead of agent.
I believe I've had this same question myself, and this SO post has the answer and some good context.
This Jenkins issue isn't exactly the same thing but shows the node syntax in the post stage.
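Put together, a minimal sketch of what the post section from the question could look like with node(); the label and file name are taken from the question itself:

```groovy
// Sketch only: scripted node() replaces agent {}, which the declarative
// post section does not accept. Label and stash name come from the question.
post {
    always {
        node('Agent1') {
            unstash name: 'TestResults'
            nunit testResultsPattern: 'UnitTests.xml'
        }
    }
}
```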

Jenkins Github plugin doesn't set status

I'm trying to set a github status from a Jenkins job. Jenkins returns a
[Set GitHub commit status (universal)] SUCCESS on repos [] (sha:9892fbd) with context:ci/jenkins/tests
... but the status isn't set when I query it with the REST API later.
Here's the Groovy code:
def getCommitHash() {
    sh(script: """
        git rev-parse HEAD
    """, returnStdout: true).trim()
}

def setCountTestLocation(String location) {
    url = "https://<internal github>/<org>/<repo>"
    commitHash = getCommitHash()
    print(url)
    print(commitHash)
    step([
        $class: "GitHubCommitStatusSetter",
        reposSource: [$class: "ManuallyEnteredRepositorySource", url: url],
        contextSource: [$class: "ManuallyEnteredCommitContextSource", context: "ci/jenkins/tests"],
        statusBackrefSource: [$class: "ManuallyEnteredBackrefSource", backref: location],
        errorHandlers: [[$class: "ChangingBuildStatusErrorHandler", result: "UNSTABLE"]],
        commitShaSource: [$class: "ManuallyEnteredShaSource", sha: commitHash],
        statusResultSource: [$class: "ConditionalStatusResultSource", results: [[$class: "AnyBuildResult", message: "Tests here!", state: "SUCCESS", location: location]]]
    ])
}
Your repository hasn't been updated because it seems the repos were not properly set.
The plugin still reports success because it completed its run properly, but the repo list is empty, as is evident from your message SUCCESS on repos [].
This issue can occur if you have not set up a "GitHub Server" config under the global Jenkins configs:
Manage Jenkins -> Configure System -> GitHub
You can find more details on how to set up a server configuration under the "Automatic Mode" section of the GitHub Plugin documentation:
https://wiki.jenkins-ci.org/display/JENKINS/GitHub+Plugin#GitHubPlugin-AutomaticMode%28Jenkinsmanageshooksforjobsbyitself%29
After much pain with the same issue and plugin, here is a workaround that does not require the plugin and still solves the issue, using curl. You can add the following to your pipeline:
post {
    success {
        withCredentials([usernamePassword(credentialsId: 'your_credentials_id', usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD')]) {
            sh 'curl -X POST --user $USERNAME:$PASSWORD --data "{\\"state\\": \\"success\\"}" --url $GITHUB_API_URL/statuses/$GIT_COMMIT'
        }
    }
    failure {
        withCredentials([usernamePassword(credentialsId: 'your_credentials_id', usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD')]) {
            sh 'curl -X POST --user $USERNAME:$PASSWORD --data "{\\"state\\": \\"failure\\"}" --url $GITHUB_API_URL/statuses/$GIT_COMMIT'
        }
    }
}
Where the GITHUB_API_URL is usually constructed like so, for example in the environment directive:
environment {
    GITHUB_API_URL = 'https://api.github.com/repos/organization_name/repo_name'
}
The credentialsId can be created and obtained from Jenkins -> Credentials.

How to trigger build for changes in a subdirectory of a Git repo in buildbot

Say you have a repo with this structure:
myrepo/project1
myrepo/project2
How do you configure buildbot so it only triggers build when there's update in myrepo/project1?
The following is a sample config I have that triggers on the whole repo:
step_build = steps.ShellCommand(name='somebuildcommand',
                                command=['some', 'build', 'command'],
                                workdir="build/",
                                description='some build command')

factory = util.BuildFactory()
# check out the source
factory.addStep(steps.Git(repourl='https://github.com/some/myrepo.git', mode='incremental'))
factory.addStep(step_build)

c['builders'] = []
c['builders'].append(
    util.BuilderConfig(name="runtests",
                       workernames=["example-worker"],
                       factory=factory))
OK, I figured this out myself. Basically I needed to configure the scheduler to trigger only on "important" files; example below:
def file_is_important(change):
    if not change.files:
        return False
    for file in change.files:
        if file.startswith('important-dir/'):
            print('detected important changes:', change.files)
            return True
    return False

c['schedulers'] = []
c['schedulers'].append(schedulers.SingleBranchScheduler(
    name="all",
    fileIsImportant=file_is_important,
    change_filter=util.ChangeFilter(branch='master'),
    treeStableTimer=None,
    builderNames=["builder"]))