How to set up 2 different Jenkins jobs linked with 2 different repos in one Jenkins installation?

I have Job1, which is linked to a GitHub repo; when I push code it builds in its own workspace (space1).
I want to add a second job (Job2) that will be linked to a different GitHub repo and will build the code in a different workspace (space2).
Note: 2 different jobs, building different code from different repos (both master branches), in different workspaces.
Is this possible with vanilla Jenkins, or will I need an extra plugin?
I have researched Pipeline (link1, link2) a little, but I'm still trying to figure out whether it covers my use case.
EDIT:
I have set up the communication between the second job and GitHub, but the build needs an SSH key to succeed, and Jenkins provides only one slot for configuring the SSH key.
I have also added a second workspace.

Related

Azure DevOps build from dynamic repo name

Does anybody know if it is possible to pass in a repo name / base the build on a dynamic repo name? This would allow us to share the same build definition across different branches, cutting down on definitions when creating a feature branch, etc.
When using a TFVC repo we would store the different releases in the same repo but under different paths. We could reuse the same build definition across different releases/feature branches by altering the source path, such as $/product/$(release)/......
It appears Git likes to have the repo hard-coded into the build (hence the dropdown; there is no way to plug in a variable).
While the question is targeted at on-prem Azure DevOps, if this is possible in the hosted environment it would be helpful to know.
I recommend using YAML build templates. By default these check out "self" and are stored in the repo, so they work on forks, branches, etc. Each branch can contain tweaks to the build process as well.
With the 'old' UI-based builds this isn't possible.
What you are looking for is actually two things:
templates - these allow you to reuse a definition across different pipelines
triggers - these allow you to trigger a pipeline when a commit happens on different branches (see the sketch below)
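A minimal sketch of how those two pieces can combine, assuming a hypothetical shared-templates repo (SharedProject/pipeline-templates) and template file name (build-steps.yml); this scaffolds an azure-pipelines.yml and is not the definitive syntax for any particular setup:

#!/usr/bin/env bash
# Hypothetical sketch: write an azure-pipelines.yml that triggers on several
# branches and pulls its steps from a shared template. The repo and file
# names (SharedProject/pipeline-templates, build-steps.yml) are made up.
cat > azure-pipelines.yml <<'EOF'
trigger:
  branches:
    include:
      - main
      - release/*

resources:
  repositories:
    - repository: templates
      type: git
      name: SharedProject/pipeline-templates

jobs:
  - job: build
    steps:
      - template: build-steps.yml@templates
EOF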
Looks like Task Groups solved the need (mostly). I was hoping to have one build definition that could be shared across multiple branches; while this appears to be possible in the hosted model, on-prem is different.
I am able to clone a build (or use templates) to have an entry point into the repo/branch to get the sources, then pass off the work to a common task group. If I need to modify the build process for multiple branches, I just modify the task group.

Single CI config for multiple repositories

Seeking advice about the following problem.
We have a stack of microservices written in NodeJS and running on a Kubernetes cluster. We have a separate GitHub repository for each of them and currently use CircleCI for our CI/CD process. As of now we have about 25-30 repos, but their number will increase. The problem we face is that we need a CircleCI config YAML in each repository; if we need to change something globally in our CI/CD pipeline, we have to update it in each repository, which is obviously a pretty painful process, and CircleCI doesn't support having one config file for multiple repos.
I believe our situation/setup in terms of multiple repos is not unique. Does anybody have experience with, or ideas about, CI tools that support the described scenario of having one config file for multiple repos?
Below are 2 approaches that I considered when I had to deal with a similar situation. You'd need to define for yourself what you want to optimize for and make a decision based on that.
Optimizing for flexibility and isolation. In this scenario, instead of making all repos use the same config file, you keep the file in each repo and automate how you manage it.
For example: you'd create a CLI tool or a script to automate copying the CircleCI file and committing it to the appropriate repos whenever a change needs to happen (see the sketch after the PROS/CONS below).
PROS: isolation - all repos have their own configuration; if you ever have a Golang microservice or a different config in one of your NodeJS services, modifying the CI pipeline won't be an issue
CONS: a bit of extra work to write automation around managing this config separately
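A minimal sketch of that automation, assuming the service repos are already cloned under a common directory and a canonical config lives alongside them; all paths and names are made up:

#!/usr/bin/env bash
# Hypothetical sketch of approach #1: push a canonical CircleCI config
# out to every service repo. The /repos layout is made up.
CANONICAL=/repos/ci-config/config.yml
for dir in /repos/services/*/; do
    cd "$dir" || continue
    git pull
    cp "$CANONICAL" .circleci/config.yml
    git add .circleci/config.yml
    git commit -m "Sync shared CI config" && git push
done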
Optimizing for easier maintainability. Figure out how to share a single pipeline configuration across your repos.
For example: use git submodules to keep the circle.yml file, or use a separate npm package containing the circle.yml file (a sketch of the submodule variant follows below). Another alternative is to use a CI tool that supports templating: define a pipeline template and re-use it for each individual pipeline (one of the CI tools that supports this is TeamCity).
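A minimal sketch of the submodule variant, assuming a hypothetical shared-config repo; here the file is copied into .circleci/ since CI tools differ in how they treat symlinks:

#!/usr/bin/env bash
# Hypothetical sketch of approach #2: track the shared config as a git
# submodule and refresh each repo's working copy from it. URL is made up.
git submodule add https://github.com/example-org/shared-ci-config ci-shared
mkdir -p .circleci
cp ci-shared/config.yml .circleci/config.yml
git add .gitmodules ci-shared .circleci/config.yml
git commit -m "Source CI config from shared submodule"
# Later, to pick up upstream config changes:
git submodule update --remote ci-shared
cp ci-shared/config.yml .circleci/config.yml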
I personally picked approach #1 in a similar situation. IMHO, this is a price one has to pay when one decides to go with microservices, so as not to end up with a platform that is really a distributed monolith :) Also, I really like it when all repos are descriptive and self-contained, and CI pipeline-as-code is one of the ways to help achieve that.
In my mind you have 2 options: you could have a single CI job/config that can deploy any single service or multiple services (if all the services are the same), or, if every service is different, you need a separate job/config for each. If it's somewhere in the middle, it's a question of whether you want a single job that has a bunch of if/then statements, e.g. "if repo = user then do this special thing." The if/then approach worked fine for me up to a point, but eventually there were too many special cases and it was easier to just go with a unique config for each service.
I solved the issue of it "being hard to make a 1-line change across 30 git repos" by having a git superuser. Basically, normal users can only merge using PRs, but the superuser can commit directly. Since I'm only changing things like config files, there are rarely merge conflicts or broken test cases, so it works. Here's some sample code:
#!/usr/bin/env bash
# Walk every clone under /temp and push the same one-line change to each.
for dir in /temp/*/
do
    cd "$dir" || continue
    git pull
    # Edit the file in place and stage it so the commit has something to record.
    sed -i 's/Nick/John/g' report.txt
    git add report.txt
    git commit -m "CI change" && git push
    cd ..
done

Building multiple Gradle projects in Jenkins with AWS CodePipeline

I have a Gradle project that consists of a master project and 2 others that are included using the includeFlat directive. Each of these 3 projects has its own repo on GitHub. To build, I check out all 3 projects into a common top folder, then cd into the master project and run gradle build. And it works great!
Now I need to deploy the resulting app to AWS EB (Elastic Beanstalk), which also works great when I produce the artifact locally and then deploy it manually. I want to automate the process, so I'm trying to set it up using CodePipeline + Jenkins as described in this document, adjusted for Gradle.
The problem is that if I specify 3 Sources in the pipeline, I end up with my projects extracted on top of each other, creating a mess in the Jenkins workspace. I need to somehow configure each project to be output to its own directory within the Jenkins workspace, and I just don't see a way to do it (at least in the UI).
Then, of course, even if I achieve what I want, I somehow need to cd into the master directory to run gradle build, and again I'm not sure how to do that.
P.S. Great suggestions from @Phil, but unfortunately it seems that CodePipeline does not currently support Git submodules or subtrees.
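One workaround sketch: keep a single Source and fetch the sibling projects from a shell step instead, so each lands in its own directory. The repo URLs and project names below are made up:

#!/usr/bin/env bash
# Hypothetical Jenkins "Execute shell" step: check the three projects out
# side by side in the workspace, then build from the master project.
cd "$WORKSPACE"
for repo in master-project sub-project-a sub-project-b; do
    if [ -d "$repo/.git" ]; then
        git -C "$repo" pull
    else
        git clone "https://github.com/example-org/$repo.git" "$repo"
    fi
done
cd master-project
gradle build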
I would start a common build when changes happen on any of the 3 repos, with, say, a 5-minute delay, so you get a single build even if changes are introduced to more than one repo.
I can't see a good way to deal with deployment other than using eb deploy... the old way... Install the AWS tools on your Jenkins machine, create a deployment job triggered on a successful build, and put a bash script doing the deployment there. Please share more details about your deployment; that way I can help with the deployment script.
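A minimal sketch of such a deployment step, assuming the EB CLI is installed; the application, environment, platform, and region values (my-app, my-env, etc.) are placeholders:

#!/usr/bin/env bash
# Hypothetical Jenkins deployment step using the Elastic Beanstalk CLI.
set -e
cd "$WORKSPACE/master-project"
# One-time setup per workspace; app name, platform, and region are made up.
eb init my-app --platform java --region us-east-1
# Deploy the freshly built artifact, labeled with the Jenkins build number.
eb deploy my-env --label "jenkins-build-$BUILD_NUMBER"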

Is it possible to apply several pipelines to one GitHub repository?

I would like to ask a question about Jenkins:
Is it possible to apply several pipelines to one GitHub repository?
If yes, can someone help me with a tutorial, documentation, or anything else?
I've tried to do this with the Blue Ocean plugin but it seems impossible...
I would like to apply 6 pipelines to one GitHub repository, triggered by events, so this question is essential to me!
When you create a job in Jenkins you bind it to a repo.
If you create two jobs and bind them both to the same repo, they will both be triggered when you push to your repo. Then you attach your pipelines to these jobs.
You can create pipeline jobs pointing at different build scripts (i.e., filenames other than /Jenkinsfile) in the repo, or at different branches, just fine. I do it all the time.
However, be aware that any SCM polling you do on the repo/branch could prevent other jobs from identifying changes, depending on how you are triggering them.
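A minimal sketch of that layout, with made-up script names; each Jenkins pipeline job's "Script Path" setting would point at one of these files instead of the default /Jenkinsfile:

#!/usr/bin/env bash
# Hypothetical sketch: keep several pipeline scripts in one repo so several
# Jenkins jobs can each point at their own. Names and contents are made up.
mkdir -p pipelines
for name in build deploy nightly; do
    cat > "pipelines/${name}.Jenkinsfile" <<EOF
pipeline {
    agent any
    stages {
        stage('${name}') {
            steps { sh './ci/${name}.sh' }   // hypothetical per-task script
        }
    }
}
EOF
done
git add pipelines
git commit -m "One pipeline script per Jenkins job"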

One job with different steps for different branches

I am currently in a weird situation. We have one repository with multiple branches (a CI branch and an official branch). Our current job structure has two jobs, one for each branch, for each platform we support. Our official build has different steps than our development build, all of which can be run using a script. My question is: can we set up one job per platform that will look at both branches and run different steps according to which branch changed? I am aware of the environment variables that get set by the GitHub plugin. Is this something that would need to be utilized?
Thanks!
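For reference, a minimal sketch of what such a branch-aware build script could look like, assuming the GIT_BRANCH environment variable set by the Jenkins Git plugin and hypothetical per-branch scripts:

#!/usr/bin/env bash
# Hypothetical single entry point: one job per platform runs this script,
# which picks steps based on the branch that triggered the build.
# GIT_BRANCH (e.g. "origin/official") is set by the Jenkins Git plugin.
case "$GIT_BRANCH" in
    */official)
        ./scripts/official-build.sh   # made-up official-build steps
        ;;
    *)
        ./scripts/dev-build.sh        # made-up CI/development steps
        ;;
esac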