Generate Jenkins jobs for all GitHub repos

I would like to set up some automation inside Jenkins that periodically polls the list of repos in our GitHub organization and automatically sets up a Jenkins job for each of those Git repos based on a job template.
What would be a possible solution to achieve this? Thanks!

You can use the Jenkins Job DSL plugin, which provides a build step that creates and modifies other jobs from a script.
From the Wiki:
The job-dsl-plugin allows the programmatic creation of projects using a DSL. Pushing job creation into a script allows you to automate and standardize your Jenkins installation, unlike anything possible before.
An example would be:
def organization = 'jenkinsci'

repoApi = new URL("https://api.github.com/orgs/${organization}/repos")
repos = new groovy.json.JsonSlurper().parse(repoApi.newReader())

repos.each {
    def repoName = it.name
    job {
        name "${organization}-${repoName}".replaceAll('/', '-')
        scm {
            git("git://github.com/${organization}/${repoName}.git", "master")
        }
    }
}
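To cover the "periodically polls" part, such a DSL script is typically executed from a seed job that runs on a schedule. A minimal sketch, assuming the script above is stored as jobs/github-org.groovy in the seed job's workspace (the job name, file path and schedule below are placeholders, not part of the original answer):

// Hypothetical seed job; 'seed' and 'jobs/github-org.groovy' are placeholders.
freeStyleJob('seed') {
    triggers {
        cron('H/30 * * * *')        // re-run roughly every 30 minutes to pick up new repos
    }
    steps {
        dsl {
            external('jobs/github-org.groovy')
            removeAction('DELETE')  // drop jobs for repos that no longer exist in the org
        }
    }
}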

Jenkins Pipeline is nowadays the way to go.
It defines pipelines using a Jenkinsfile, which you can check into your repos.
Best practice is a Jenkinsfile like this:
pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
As described in the documentation.
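To connect this with the original question, the Job DSL loop from the first answer can generate one pipeline job per repository that simply runs that repo's Jenkinsfile. A rough sketch of the block that would replace job { ... } inside the repos.each loop (the credentials ID and branch name are assumptions):

pipelineJob("${organization}-${repoName}") {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url("https://github.com/${organization}/${repoName}.git")
                        credentials('github-credentials')   // placeholder credentials ID
                    }
                    branch('master')
                }
            }
            scriptPath('Jenkinsfile')   // the Jenkinsfile checked into each repo
        }
    }
}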

Related

Exporting Console Output of Jenkins Pipeline

I have built a pipeline which runs a set of SQL scripts to generate results. I would like to be able to export the console output, ideally into a .txt or .xlsx file. Is this possible? For info, I drive the pipeline via GitHub.
Thanks. I tried searching the web but have been unable to find a solution.
Do you want to save the console output to a file and then commit it to GitHub? Check the following sample pipeline.
pipeline {
    agent any
    stages {
        stage('Sample') {
            steps {
                script {
                    echo "Something 1"
                    echo "Something 2"
                    // Read the console log of the current build
                    def consoleLog = Jenkins.getInstance().getItemByFullName(env.JOB_NAME).getBuildByNumber(Integer.parseInt(env.BUILD_NUMBER)).logFile.text
                    // Write the log to a file in the workspace
                    writeFile(file: "Log_${BUILD_NUMBER}.txt", text: consoleLog, encoding: "UTF-8")
                    sh '''
                        git add *
                        git commit -m "Add console log"
                        git push
                    '''
                }
            }
        }
    }
}
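If committing the log back to the repository is not a hard requirement, a simpler variant (a sketch, not part of the original answer) is to archive the file as a build artifact right after the writeFile step, instead of running the git commands:

// Keeps the captured log attached to the build instead of pushing it to GitHub.
archiveArtifacts artifacts: "Log_${BUILD_NUMBER}.txt", fingerprint: true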

Trigger multiple Azure DevOps pipelines in parallel to create VMs on Azure using the same Terraform module

I have a Terraform module that, for example, creates a VM on Azure, and it works when I trigger the pipeline.
But when I trigger the pipeline twice, it fails to create two VMs. How do I manipulate the Terraform state file? The only way I can think of is to run multiple pipelines on different agents; does that work?
What we have done is create Terraform "common" modules (basically a subdirectory with .tf files), which we source into a Terraform environment multiple times with different parameters. We usually put these into a list and iterate over it with a loop.
In your environment's Terraform:
locals {
  azure_vms = [
    { name = "vm1", size = "Standard_B2s" },
    { name = "vm2", size = "Standard_B4s" }
  ]
}

module "my_azure_vm" {
  source   = "./common/my_azure_vm"
  for_each = { for vm in local.azure_vms : vm.name => vm }

  size = each.value.size
  name = each.value.name
}
In common/my_azure_vm you can define inputs for size and name, then use those to create the VMs with your standard parameters.
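For completeness, the module side could look roughly like this (a sketch of ./common/my_azure_vm/variables.tf; the descriptions and exact types are only illustrative):

variable "name" {
  type        = string
  description = "Name of the virtual machine"
}

variable "size" {
  type        = string
  description = "Azure VM size, e.g. Standard_B2s"
}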

How can we check the Jenkins home directory in to GitHub?

We want to check the Jenkins home directory into GitHub, of course ignoring some files/folders. As we want to have everything defined by code, we try to achieve this using Job DSL & Pipeline DSL (not writing the pipeline script inside the Jenkins GUI but having it read from a workspace file - see below).
My problem right now is that, not being very proficient in either DSL yet, I don't know how to force git to do an initial clone of the remote repo (and later push) inside the home directory - which is a parent directory of the job's directory.
I tried this pipeline:
node('master') {
    dir('../..') {
        scm {
            git {
                remote {
                    github('company/repo', 'https')
                    credentials('xyz')
                }
            }
        }
    }
}
The job itself is defined like this:
pipelineJob('backup') {
    definition {
        cps {
            script(readFileFromWorkspace('pipelines/backup.groovy'))
            sandbox(true)
        }
    }
}
The job fails with this error message:
ERROR: ‘checkout scm’ is only available when using “Multibranch Pipeline” or “Pipeline script from SCM”
So I guess the pipelineJob('backup') used above does not fit. Should I change this, and if so, how? Or should I take another approach?
Another shot at this was trying to rewrite the pipeline like this:
node {
    dir('../..') {
        git url: 'https://github.com/company/repo.git'
    }
}
But then it won't work because credentials are missing...
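For what it's worth, the plain git step also accepts a credentialsId parameter, so one possible variant of that last attempt (a sketch, assuming 'xyz' is the credentials ID from the first snippet) would be:

node {
    dir('../..') {
        // Same as above, but passing the stored credentials to the git step.
        git url: 'https://github.com/company/repo.git', credentialsId: 'xyz'
    }
}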

Jenkins: publish changes in repository to GitHub using pipeline and Groovy

I have a Jenkins organization pipeline job that executes on all repositories that have a "Jenkinsfile" defined. The job clones the repository from GitHub, then runs a PowerShell script that increments the version number in a file. I'm now trying to publish that updated file back to the original repository on GitHub, so that when a developer pulls the changes he gets the latest version number.
I tried using the script (inside the "Jenkinsfile") as suggested in the Jenkins JIRA (https://issues.jenkins-ci.org/browse/JENKINS-28335), but to no avail. Any suggestions will be appreciated. Basically I need to execute "git commit" and "git push" using the same parameters defined for the job.
Just as a reference, here is a previous solution used for a freestyle (not pipeline) job: How to push changes to github after jenkins build completes?.
I actually found a couple of solutions. First, I modified the script from the Jenkins JIRA like this (some objects changed in the workflow pipeline):
import hudson.FilePath
import org.eclipse.jgit.transport.URIish

node {
    env.WORKSPACE = pwd()
    stage 'Checkout'
    checkout scm

    def build = manager.build
    def listener = manager.listener
    def workspace = new FilePath(new File(env.WORKSPACE))
    def environment = build.getEnvironment(listener)

    final def project = build.getParent()
    final def gitScm = project.getTypicalSCM()
    final def gitClient = gitScm.createClient(listener, environment, build, workspace)

    final def gitTagName = "TAG_NAME"
    final def comment = "COMMENT"
    final def remoteURI = new URIish("origin")

    gitClient.tag(gitTagName, comment)
    gitClient.push().tags(true).to(remoteURI).execute()
}
You need to run the script multiple times and approve the code each time in Jenkins (Manage Jenkins -> In-process Script Approval).
Another, much simpler solution (using this one for now):
bat "\"${tool 'Git'}\" config user.email \"ci#virtocommerce.com\""
bat "\"${tool 'Git'}\" config user.name \"Virto Jenkins\""
bat "\"${tool 'Git'}\" commit -am \"Updated version number\""
bat "\"${tool 'Git'}\" push origin HEAD:master -f"
You have to have a Git tool with the name "Git" configured in Jenkins.
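If the push itself needs credentials (rather than relying on a credential helper already set up on the agent), one possible sketch is to wrap it in withCredentials; the 'github-creds' ID and the org/repo path below are placeholders:

withCredentials([usernamePassword(credentialsId: 'github-creds',
                                  usernameVariable: 'GIT_USER',
                                  passwordVariable: 'GIT_PASS')]) {
    // Push with the injected username/password; %...% is expanded by cmd, not Groovy.
    bat "\"${tool 'Git'}\" push https://%GIT_USER%:%GIT_PASS%@github.com/org/repo.git HEAD:master"
}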

Copy different files to build directory based on executed Gradle task

I have created a plugin for Gradle to deploy to my company's OpenVMS directory structures, which adds deployDev, deployTest, and, when credentials are provided, deployProd tasks to an application build. I have extracted our configuration files for all environments to a separate project so that we can deploy configuration separately from the application.
These custom tasks depend on the application plugin's distZip task and I haven't found a good way to get the appropriate configuration files into the zip based on the called task (e.g. deployDev includes dev config).
I have tried having the deploy* tasks copy configuration files during the configuration phase with:
task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from "build/props"
}

deployDev.configure {
    copyResources('dev')
}

deployTest.configure {
    copyResources('test')
}

project.tasks.findByName('deployProd')?.configure {
    copyResources('prod')
}

def copyResources(String environment) {
    copy {
        from "src/$environment/resources"
        into 'build/props'
    }
}
This doesn't work because the configuration phase executes for all tasks, so even when we execute deployDev, the test configuration has already been copied over the dev resources.
I also have tried:
file('build/props').mkdirs()

task distZip(type: Zip, overwrite: true) {
    archiveName = 'config.zip'
    from "build/props"
    outputs.upToDateWhen { false }
}

gradle.taskGraph.whenReady { graph ->
    if (graph.hasTask('deployProd')) {
        copyResources('prod')
    } else if (graph.hasTask('deployTest')) {
        copyResources('test')
    } else if (graph.hasTask('deployDev')) {
        copyResources('dev')
    }
}

def copyResources(String environment) {
    copy {
        from "src/$environment/resources"
        into 'build/props'
    }
}
However, with this proposal the distZip task is always UP-TO-DATE. Is there a correct way to do this?
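Not an answer from the original thread, but one direction sometimes used for this kind of problem is to model the copy as real Copy tasks so Gradle's up-to-date checking sees their inputs. A rough sketch, assuming it is placed after the distZip definition (the task names here are made up):

// One Copy task per environment, wired into the matching deploy task if it exists.
['dev', 'test', 'prod'].each { env ->
    def copyTask = task("copy${env.capitalize()}Resources", type: Copy) {
        from "src/$env/resources"
        into 'build/props'
    }
    project.tasks.findByName("deploy${env.capitalize()}")?.dependsOn copyTask
    // Ensure the copy runs before the zip whenever both are in the task graph.
    distZip.mustRunAfter copyTask
}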