We want to check the Jenkins home directory into GitHub, ignoring certain files/folders. As we want to have everything defined in code, we are trying to achieve this using the Job DSL and Pipeline DSL (not writing the pipeline script inside the Jenkins GUI but having it read from a workspace file - see below).
My problem right now is that, not being very proficient in either DSL yet, I don't know how to force Git to do an initial clone of the remote repo (and later push) inside the home directory - which is a parent directory of the job's directory.
I tried this pipeline:
node('master') {
    dir('../..') {
        scm {
            git {
                remote {
                    github('company/repo', 'https')
                    credentials('xyz')
                }
            }
        }
    }
}
The job itself is defined like this:
pipelineJob('backup') {
    definition {
        cps {
            script(readFileFromWorkspace('pipelines/backup.groovy'))
            sandbox(true)
        }
    }
}
The job fails with this error message:
ERROR: ‘checkout scm’ is only available when using “Multibranch Pipeline” or “Pipeline script from SCM”
So I guess the pipelineJob('backup') used above does not fit. Should I change this, and if so, how? Or should I take another approach?
Another shot at this was trying to rewrite the pipeline like this:
node {
    dir('../..') {
        git url: 'https://github.com/company/repo.git'
    }
}
But then it won't work because credentials are missing...
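One variant that may supply them (an untested sketch): the plain git step also accepts a credentialsId parameter, so the credentials from the first attempt could be reused:
node {
    dir('../..') {
        // 'xyz' is the credentials ID already stored in Jenkins, as used in the Job DSL attempt above
        git url: 'https://github.com/company/repo.git', credentialsId: 'xyz'
    }
}
As for the first attempt, the error message itself points at one way out: defining the job with cpsScm instead of cps in the Job DSL makes it a "Pipeline script from SCM" job, one of the two job types where checkout scm is available.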
Related
I have built a pipeline which runs a set of SQL scripts to generate results. I would like to be able to export the console output, ideally into a .txt or .xlsx file. Is this possible? For info, I drive the pipeline via GitHub.
Thanks
I tried searching the web but have been unable to find a solution.
Do you want to save the console output to a file and then commit it to GitHub? Check the following sample pipeline.
pipeline {
    agent any
    stages {
        stage('Sample') {
            steps {
                script {
                    echo "Something 1"
                    echo "Something 2"
                    // Read this build's console log via the Jenkins API
                    def consoleLog = Jenkins.getInstance().getItemByFullName(env.JOB_NAME).getBuildByNumber(Integer.parseInt(env.BUILD_NUMBER)).logFile.text
                    // Write the log to a file in the workspace
                    writeFile(file: "Log_${BUILD_NUMBER}.txt", text: consoleLog, encoding: "UTF-8")
                    // Commit and push the log file (assumes the workspace is a Git checkout)
                    sh '''
                        git add *
                        git commit -m "Add console log"
                        git push
                    '''
                }
            }
        }
    }
}
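Note that reading the log through Jenkins.getInstance() accesses the Jenkins API directly, so with the Groovy sandbox enabled the script will be blocked until an administrator approves it under In-process Script Approval. The git push also assumes the workspace is a Git clone with push credentials already configured.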
I tried to run a Flutter build from Jenkins using the following pipeline code:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                bat 'C:\\path_to_doc\\flutter_dev\\flutter\\bin\\flutter.bat build web -t "C:\\path_to_doc\\lib\\src\\main\\main.dart"'
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'C:\\path_to_doc\\build\\web\\index.html', fingerprint: true, followSymlinks: false
        }
    }
}
I got this error in Jenkins:
I tried writing the Flutter build command in a .bat file at the root of my Flutter project and then executing that file from the pipeline, but I got the same error.
What is the correct way to proceed to avoid this error?
Jenkins has a habit of reverting to the initial workspace directory for each separate command. Try setting the directory after your steps { line:
dir('C:\\path_to_doc\\flutter_dev\\flutter\\bin\\') {
    bat 'flutter.bat build web -t "C:\\path_to_doc\\lib\\src\\main\\main.dart"'
}
This will ensure that your script runs in that location, so if your pubspec.yaml is there, the build should be able to find it. In any case, this is a problem with the directory, so if this doesn't work, some manual debugging will be necessary to see what went wrong.
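If the pubspec.yaml actually lives in the project root rather than in the Flutter SDK directory, the same idea applies with the project root instead (a sketch, assuming the project sits directly under C:\path_to_doc):
dir('C:\\path_to_doc') {
    // Invoke the SDK binary by its full path, but run it from the project
    // directory so flutter can find pubspec.yaml
    bat 'C:\\path_to_doc\\flutter_dev\\flutter\\bin\\flutter.bat build web -t "C:\\path_to_doc\\lib\\src\\main\\main.dart"'
}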
I have the following Gradle task:
task copyToDeployDir(dependsOn: preDeploy) << {
    copy {
        from codeDir
        into deployDir
    }
}
This works great for copying from my code dir (usually main or src) into a deployment directory. However, I want to rename a few of these files. Specifically, since I'm using CodeIgniter, I want to rename a couple of the controllers to start with an "install" controller.
Ideally, what I want is to copy over all of the files into deployDir, except any files that are in codeDir/application/controllers, I want to rename to have a suffix of .dist.
I'm not really sure how to go about doing this. I previously tried the renaming techniques from Working with Files in the Gradle documentation, but they don't seem to work with Gradle 2.2.
Try:
task copyToDeployDir(dependsOn: preDeploy, type: Copy) {
    // Copy everything except the controllers as-is
    from(codeDir) {
        exclude '**/application/controllers/*'
    }
    // Copy the controllers separately, appending the .dist suffix
    from(codeDir) {
        include '**/application/controllers/*'
        rename {
            "${it}.dist"
        }
    }
    into deployDir
}
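A detail worth noting: with type: Copy, the from/into/rename configuration must go in the task's configuration closure, not in a doLast action appended with <<. A doLast action runs only after the task's built-in copy action has already executed, so a spec configured there would never be applied; configuring up front also lets Gradle's up-to-date checking see the task's inputs and outputs.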
I would like to set up some automation inside Jenkins that periodically polls the list of repos in our GitHub organization and automatically sets up a Jenkins job for each of those repos based on a job template.
What would be a possible solution to achieve this? Thanks!
You can use the Jenkins Job DSL plugin, which provides a build step that lets jobs create and modify other jobs.
From the Wiki:
The job-dsl-plugin allows the programmatic creation of projects using a DSL. Pushing job creation into a script allows you to automate and standardize your Jenkins installation, unlike anything possible before.
An example would be:
def organization = 'jenkinsci'
repoApi = new URL("https://api.github.com/orgs/${organization}/repos")
repos = new groovy.json.JsonSlurper().parse(repoApi.newReader())
repos.each {
    def repoName = it.name
    job {
        name "${organization}-${repoName}".replaceAll('/', '-')
        scm {
            git("git://github.com/${organization}/${repoName}.git", "master")
        }
    }
}
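A script like this runs from a freestyle "seed" job via the plugin's "Process Job DSLs" build step; putting that seed job on a timer gives you the periodic polling of the organization's repo list that the question asks for.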
Jenkins Pipeline is nowadays the way to go. It defines pipelines using a Jenkinsfile, which you can check into your repos. Best practice is a Jenkinsfile like this:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
As described in the documentation.
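If checking a Jenkinsfile into each repo is an option, the GitHub Branch Source plugin takes this a step further: a "GitHub Organization" folder job scans the whole organization and automatically creates jobs for every repo and branch containing a Jenkinsfile, which is essentially the automation the question describes.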
I have created a plugin for Gradle to deploy to my company's OpenVMS directory structures, which adds deployDev, deployTest, and, when credentials are provided, deployProd tasks to an application build. I have extracted our configuration files for all environments to a separate project so that we can deploy configuration separately from the application.
These custom tasks depend on the application plugin's distZip task and I haven't found a good way to get the appropriate configuration files into the zip based on the called task (e.g. deployDev includes dev config).
I have tried having the deploy* tasks copy configuration files during the configuration phase with:
task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from "build/props"
}

deployDev.configure {
    copyResources('dev')
}

deployTest.configure {
    copyResources('test')
}

project.tasks.findByName('deployProd')?.configure {
    copyResources('prod')
}

def copyResources(String environment) {
    copy {
        from "src/$environment/resources"
        into 'build/props'
    }
}
This doesn't work because the configuration phase executes for all tasks, so even when we execute only deployDev, the test configuration has already been copied over the dev resources.
I also have tried:
file('build/props').mkdirs()

task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from "build/props"
    outputs.upToDateWhen { false }
}

gradle.taskGraph.whenReady { graph ->
    if (graph.hasTask('deployProd')) {
        copyResources('prod')
    } else if (graph.hasTask('deployTest')) {
        copyResources('test')
    } else if (graph.hasTask('deployDev')) {
        copyResources('dev')
    }
}

def copyResources(String environment) {
    copy {
        from "src/$environment/resources"
        into 'build/props'
    }
}
However, with this proposal the distZip task is always UP-TO-DATE. Is there a correct way to do this?
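One direction that sidesteps both problems (an untested sketch; the copy*Resources task names are made up here, while distZip and the deploy* tasks are the ones from the question): model each environment's resource copy as a real Copy task, so Gradle itself tracks inputs and outputs, and wire the selected copy task into the graph instead of copying during the configuration phase:
['dev', 'test', 'prod'].each { environment ->
    // One Copy task per environment; only the one pulled into the task graph will run
    tasks.create(name: "copy${environment.capitalize()}Resources", type: Copy) {
        from "src/$environment/resources"
        into 'build/props'
    }
}

task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from 'build/props'
}

['Dev', 'Test', 'Prod'].each { env ->
    def copyTask = tasks.findByName("copy${env}Resources")
    // Each deploy task (if present) pulls in its own copy task, and the zip
    // must not run before that copy has refreshed build/props
    tasks.findByName("deploy${env}")?.dependsOn copyTask
    distZip.mustRunAfter copyTask
}
Because the Zip task's inputs now change whenever a copy task rewrites build/props, distZip should re-run exactly when the packaged configuration changes instead of always reporting UP-TO-DATE.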