Copy different files to build directory based on executed Gradle task - deployment

I have created a plugin for Gradle to deploy to my company's OpenVMS directory structures, which adds deployDev, deployTest, and, when credentials are provided, deployProd tasks to an application build. I have extracted our configuration files for all environments to a separate project so that we can deploy configuration separately from the application.
These custom tasks depend on the application plugin's distZip task, and I haven't found a good way to get the appropriate configuration files into the zip based on which task was invoked (e.g. deployDev should include the dev config).
I have tried having the deploy* tasks copy configuration files during the configuration phase with:
task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from "build/props"
}

deployDev.configure {
    copyResources('dev')
}

deployTest.configure {
    copyResources('test')
}

project.tasks.findByName('deployProd')?.configure {
    copyResources('prod')
}

def copyResources(String environment) {
    copy {
        from "src/$environment/resources"
        into 'build/props'
    }
}
This doesn't work because the configuration phase runs for every task, so even when we execute deployDev, the test configuration ends up copied over the dev resources.
I have also tried:
file('build/props').mkdirs()

task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from "build/props"
    outputs.upToDateWhen { false }
}

gradle.taskGraph.whenReady { graph ->
    if (graph.hasTask('deployProd')) {
        copyResources('prod')
    } else if (graph.hasTask('deployTest')) {
        copyResources('test')
    } else if (graph.hasTask('deployDev')) {
        copyResources('dev')
    }
}

def copyResources(String environment) {
    copy {
        from "src/$environment/resources"
        into 'build/props'
    }
}
However, with this approach the distZip task is always UP-TO-DATE. Is there a correct way to do this?
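For what it's worth, one pattern that is often suggested for this kind of problem is to move the copy to execution time with one Copy task per environment and wire the ordering explicitly, so that only the copy task pulled into the graph by the chosen deploy* task actually runs. The sketch below assumes the distZip and deploy* tasks from the snippets above and is untested against the plugin:
['dev', 'test', 'prod'].each { env ->
    // one Copy task per environment; only the one reachable from the
    // requested deploy task will actually execute
    def copyTask = task("copy${env.capitalize()}Resources", type: Copy) {
        from "src/$env/resources"
        into 'build/props'
    }
    // deployProd may not exist when no credentials are provided
    project.tasks.findByName("deploy${env.capitalize()}")?.dependsOn copyTask
    // make sure the copy runs before distZip zips up build/props
    distZip.mustRunAfter copyTask
}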

Related

Can't run flutter build from Jenkins

I tried to run a Flutter build from Jenkins using the following pipeline code:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                bat 'C:\\path_to_doc\\flutter_dev\\flutter\\bin\\flutter.bat build web -t "C:\\path_to_doc\\lib\\src\\main\\main.dart"'
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'C:\\path_to_doc\\build\\web\\index.html', fingerprint: true, followSymlinks: false
        }
    }
}
I got this error in Jenkins:
I tried writing the flutter build command into a bat file at the root of my Flutter project and executing that file from the pipeline instead, but I got the same error.
What is the correct way to proceed to avoid this error?
Jenkins has a habit of reverting to the initial workspace directory for each separate command. Try setting the directory after your steps{ line:
dir('C:\\path_to_doc\\flutter_dev\\flutter\\bin\\') {
    bat 'flutter.bat build web -t "C:\\path_to_doc\\lib\\src\\main\\main.dart"'
}
This ensures that the command runs in that location, so if your pubspec.yaml is there, flutter should be able to find it. In any case this is a directory problem, so if that doesn't help, some manual debugging will be needed to see what is going wrong.
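For illustration, a sketch of how the dir step might be folded into the original pipeline (the directory is the one suggested above; point it at wherever your pubspec.yaml actually lives):
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                // run the build from the directory that contains pubspec.yaml
                // (path taken from the answer above; adjust as needed)
                dir('C:\\path_to_doc\\flutter_dev\\flutter\\bin\\') {
                    bat 'flutter.bat build web -t "C:\\path_to_doc\\lib\\src\\main\\main.dart"'
                }
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'C:\\path_to_doc\\build\\web\\index.html', fingerprint: true, followSymlinks: false
        }
    }
}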

How can we check in the Jenkins home directory to GitHub?

We want to check the Jenkins home directory in to GitHub, ignoring some files/folders of course. As we want to have everything defined as code, we are trying to achieve this using the Job DSL and Pipeline DSL (not writing the pipeline script inside the Jenkins GUI, but having it read from a workspace file - see below).
My problem right now is that, not being very proficient in either DSL yet, I don't know how to make git do an initial clone of the remote repo (and later a push) inside the home directory - which is a parent directory of the job's workspace.
I tried this pipeline:
node('master') {
    dir('../..') {
        scm {
            git {
                remote {
                    github('company/repo', 'https')
                    credentials('xyz')
                }
            }
        }
    }
}
The job itself is defined like this:
pipelineJob('backup') {
    definition {
        cps {
            script(readFileFromWorkspace('pipelines/backup.groovy'))
            sandbox(true)
        }
    }
}
The job fails with this error message:
ERROR: ‘checkout scm’ is only available when using “Multibranch Pipeline” or “Pipeline script from SCM”
So I guess the pipelineJob('backup') used above does not fit. Should I change this (and if so, how), or should I take another approach?
Another shot at this was trying to rewrite the pipeline like this:
node {
    dir('../..') {
        git url: 'https://github.com/company/repo.git'
    }
}
But then it won't work because credentials are missing...
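As a side note, the plain git step does accept credentials, so a variant along these lines (credentials ID reused from the earlier snippet, branch assumed) should at least get past the missing-credentials problem; whether checking out straight into the Jenkins home directory is a good idea is a separate question:
node('master') {
    // '../..' points from the job workspace up to JENKINS_HOME, as in the question
    dir('../..') {
        git url: 'https://github.com/company/repo.git',
            credentialsId: 'xyz',
            branch: 'master'
    }
}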

Access tasks' properties via Gradle Tooling API

I'm using the Gradle Tooling API to run functional tests for my own build script.
I would like to access tasks' properties, e.g. the destinationDir of a JavaCompile task, and I don't know how to accomplish this.
A simple example:
Snippet from my build script (I defined a SourceSet 'openjpa'):
compileOpenjpaJava {
    destinationDir = file(getOpenjpaClassesDir())
}

private String getOpenjpaClassesDir() {
    return "build/classes_openjpa"
}
In my functional test I read about a way to access the tasks, but I cannot access the destinationDir property.
GradleProject project = connection.getModel(GradleProject.class);
project.tasks.each { myTask ->
    if ("compileOpenjpaJava" == myTask.name) {
        return myTask.destinationDir.absolutePath // fails with a runtime error like: unknown property 'destinationDir'
    }
}
A similar question w/o answers is here: Gradle tooling api get task outputs
Is it possible at all to access tasks' properties?
Thanks
Jan
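In case it helps, one workaround sketch that avoids the GradleProject model entirely (it only exposes generic task metadata such as name and path): add a small helper task to the build under test that prints the property, then run it through the Tooling API and capture the output. The task name printOpenjpaClassesDir, the output prefix, and the stream handling below are assumptions, not part of the original build:
// in the build script under test: a helper task that prints the property
task printOpenjpaClassesDir {
    doLast {
        println "openjpaClassesDir=" + compileOpenjpaJava.destinationDir.absolutePath
    }
}

// in the functional test: run the helper task and read its standard output
ByteArrayOutputStream out = new ByteArrayOutputStream()
connection.newBuild()
        .forTasks('printOpenjpaClassesDir')
        .setStandardOutput(out)
        .run()
def classesDir = out.toString().readLines()
        .find { it.startsWith('openjpaClassesDir=') }
        ?.substring('openjpaClassesDir='.length())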

How to change a file name based on the directory it is in for packaging

I have the following gradle task:
task copyToDeployDir(dependsOn: preDeploy) << {
    copy {
        from codeDir
        into deployDir
    }
}
This works great for copying from my code dir (usually main or src) into a deployment directory. However, I want to rename a few of these files. Specifically, since I'm using CodeIgniter, I want to rename a couple of the controllers to start with an "install" controller.
Ideally, what I want is to copy over all of the files into deployDir, except any files that are in codeDir/application/controllers, I want to rename to have a suffix of .dist.
I'm not really sure how to go about doing this. I previously tried the rename examples from the Working with Files chapter of the Gradle documentation, but I couldn't get them to work with Gradle 2.2.
Try a Copy task with two from specs - one that excludes the controllers and one that includes only them and renames (note that the copy spec belongs in the configuration block, not in a << doLast block, or the Copy task will have nothing to copy):
task copyToDeployDir(dependsOn: preDeploy, type: Copy) {
    from(codeDir) {
        exclude '**/application/controllers/*'
    }
    from(codeDir) {
        include '**/application/controllers/*'
        rename { "${it}.dist" }
    }
    into deployDir
}

Generate Jenkins jobs for all GitHub repos

I would like to set up some automation inside Jenkins that periodically polls the list of repos in our GitHub organization and automatically sets up a Jenkins job for each of those Git repos, based on a job template.
What would be a possible solution to achieve this? Thanks!
You can use the Jenkins Job DSL plugin, which provides a build step that creates and modifies other jobs. From the Wiki:
The job-dsl-plugin allows the programmatic creation of projects using a DSL. Pushing job creation into a script allows you to automate and standardize your Jenkins installation, unlike anything possible before.
An example would be:
def organization = 'jenkinsci'
repoApi = new URL("https://api.github.com/orgs/${organization}/repos")
repos = new groovy.json.JsonSlurper().parse(repoApi.newReader())
repos.each {
    def repoName = it.name
    job {
        name "${organization}-${repoName}".replaceAll('/', '-')
        scm {
            git("git://github.com/${organization}/${repoName}.git", "master")
        }
    }
}
Jenkins Pipeline is nowadays the way to go. It defines pipelines using a Jenkinsfile, which you can check into your repos. Best practice is a file like this:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
As described in the documentation.