I have a Groovy repository which contains my Jenkins pipeline's Groovy code.
Currently, I am making changes in an IDE, committing them to the repository, going to the Jenkins instance, manually triggering a Jenkins job, and checking whether all of the changes are working. This takes a lot of time.
Is there a way to do all of this from the IDE itself?
I would suggest treating your pipeline code like any other code. What you are doing now could be called "manual integration testing": you make your code changes and then check how that code integrates with other components (shell commands, Jenkins plugins, etc.) on Jenkins. This development loop is long and inefficient. So my suggestion is to write simple unit tests using this framework:
https://github.com/jenkinsci/JenkinsPipelineUnit
That way you can test your pipelines on your own machine, without any interaction with Jenkins.
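A minimal sketch of such a test, assuming JUnit 4 and a scripted Jenkinsfile at the repository root (the file path and the sh stub are illustrative, not taken from your setup):

import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class JenkinsfileTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp() // registers stubs for common steps (node, stage, echo, ...)
        // Override the sh step so shell commands are only recorded, never executed
        helper.registerAllowedMethod('sh', [String]) { String cmd ->
            println "stubbed sh: $cmd"
        }
    }

    @Test
    void pipelineRunsWithoutErrors() {
        loadScript('Jenkinsfile') // loading executes the pipeline's top-level code
        printCallStack()          // dumps the recorded step invocations for inspection
    }
}

Running this from the IDE's test runner gives you feedback in seconds instead of a full commit-and-trigger round trip.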
If you think that's not the right approach for you, I would suggest combining this plugin for running jobs directly from IntelliJ: https://github.com/programisci/jenkins-control-plugin/
with IntelliJ's Git integration for committing your changes to the repository.
For executing from the IDE, one option is to build some automation around the Jenkins CLI. You should be able to see the CLI commands at http://your-jenkins-url/cli.
java -jar jenkins-cli.jar -s https://jenkins.physiq.zone/ replay-pipeline JOB [-n (--number) BUILD#] [-s (--script) SCRIPT]
Replay a Pipeline build with edited script taken from standard input
JOB : Name of the job to replay.
-n (--number) BUILD# : Build to replay, if not the last.
-s (--script) SCRIPT : Name of script to edit, such as Script3, if not the main Jenkinsfile.
For example, in IntelliJ you could use a Run Configuration that:
Downloads the CLI JAR
Executes it, passing the path to the local file and the desired parameters
You can also write a script, Gradle build, or something else that wires into the IDE to pull the CLI JAR and execute a job with your local pipeline code.
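As a rough sketch of that idea, a Gradle task could invoke the CLI and feed it your local Jenkinsfile on standard input. The URL, the job name, and a jenkins-cli.jar already downloaded next to the build script (from <your-jenkins-url>/jnlpJars/jenkins-cli.jar) are all assumptions here:

// build.gradle: replay a Pipeline job with the local, uncommitted Jenkinsfile
task replayPipeline(type: JavaExec) {
    classpath = files('jenkins-cli.jar')  // assumed to be downloaded already
    mainClass = 'hudson.cli.CLI'          // the CLI JAR's entry point (Gradle 6.4+)
    // add '-auth', 'user:apitoken' if your instance requires authentication
    args '-s', 'https://your-jenkins-url/', 'replay-pipeline', 'my-pipeline'
    doFirst {
        // replay-pipeline reads the edited script from standard input
        standardInput = file('Jenkinsfile').newInputStream()
    }
}

Run it with gradle replayPipeline, or wire it into an IntelliJ Run Configuration, and the working-tree Jenkinsfile is replayed on Jenkins without a commit.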
For testing, you may want to use https://github.com/jenkinsci/JenkinsPipelineUnit, as already brought up. There is also a Gradle plugin that I maintain at https://github.com/mkobit/jenkins-pipeline-shared-libraries-gradle-plugin, which uses the previously mentioned library for unit testing and the jenkinsci/jenkins-test-harness for integration testing.
Related
I have a Gradle project that consists of a master project and 2 others that are included using the includeFlat directive. Each of these 3 projects has its own repo on GitHub. To build it, I check out all 3 projects into a common top folder, then cd into the master project and run gradle build. And it works great!
Now I need to deploy the resulting app to AWS EB (Elastic Beanstalk), which also works great when I produce the artifact locally and then deploy it manually. I want to automate the process, so I'm trying to set it up using CodePipeline + Jenkins as described in this document, adjusted for Gradle.
The problem is that if I specify 3 sources in the pipeline, I end up with my projects extracted on top of each other, creating a mess in the Jenkins workspace. I need to somehow configure each project to be checked out to its own directory within the Jenkins workspace, and I just don't see a way to do it (at least in the UI).
Then, of course, even if I achieve what I want, I need to somehow cd into the master directory to run gradle build, and again I'm not sure how to do that.
P.S. Great suggestions from @Phil, but unfortunately it seems that CodePipeline does not currently support Git submodules or subtrees.
I would start a common build whenever changes happen in any of the 3 repos, with a delay of, say, 5 minutes, so that a single build runs even if changes are introduced to more than one repo.
I can't see a better way to deal with deployment than using eb deploy, the old way. Install the AWS tools on your Jenkins machine, create a deployment job triggered on a successful build, and put a bash script doing the deployment there. Please add more details about your deployment; that way I can help with the deployment script.
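As a starting point, the deployment step could be as small as this sketch, shown here in scripted Pipeline form. It assumes the AWS EB CLI is installed on the agent, that eb init has already been run in the application directory, and that 'my-app-env' stands in for your real environment name:

// Scripted Pipeline sketch for the deployment job
node {
    stage('Deploy to Elastic Beanstalk') {
        // fetch or copy the previously built artifact into the workspace here
        sh 'eb deploy my-app-env' // placeholder environment name
    }
}

The same eb deploy call works equally well as a plain shell build step in a freestyle job.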
Is there an example of how to do testing against the Jenkins Workflow Groovy DSL?
Something similar to the example for the Jenkins Job DSL.
What I've done is create a complete dev-test environment, using a docker-compose file that includes Jenkins, GitLab, and Archiva. I push to a "jenkins-test" origin and run the workflow in the safe test environment.
Here's my docker-compose in case someone is interested as a starting point, or as a simple test env:
https://github.com/portenez/dry-dock
It's not fully automated, but it's a good start.
No. Running a Workflow script requires Jenkins to actually be running (since most of what it does is interact directly with Jenkins features like slaves and test results), so the only way to test it is to have a test Jenkins server and run it there. By far the most convenient ways to do that in a fully automated way are:
Use JenkinsRule in the Jenkins test harness, like plugins do in their test sources (see the sketch after this list). Example
Use the acceptance-test-harness project as a dependency to create integration tests driven via Selenium. Example
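For the first option, a minimal JenkinsRule-based test looks roughly like this; it assumes JUnit 4 plus the workflow-job and workflow-cps test dependencies on the classpath, and the job name and script are placeholders:

import org.jenkinsci.plugins.workflow.cps.CpsFlowDefinition
import org.jenkinsci.plugins.workflow.job.WorkflowJob
import org.junit.Rule
import org.junit.Test
import org.jvnet.hudson.test.JenkinsRule

class WorkflowSmokeTest {

    // Boots a throwaway in-memory Jenkins for each test
    @Rule
    public JenkinsRule j = new JenkinsRule()

    @Test
    void runsSimpleWorkflow() {
        WorkflowJob job = j.createProject(WorkflowJob, 'demo')
        job.definition = new CpsFlowDefinition("node { echo 'hello from test' }", true)
        // schedules a build and fails the test unless it finishes with SUCCESS
        j.assertBuildStatusSuccess(job.scheduleBuild2(0))
    }
}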
Currently all our regression tests are configured in a Jenkins job. We want that, once the regression tests are completed, a plan is triggered on our Bamboo server and the test results are also recorded in Bamboo using the TestNG parser. Is that possible?
P.S.: I have already looked at the Bamboo REST API but cannot seem to find a solution. Any suggestions will be highly appreciated. Thanks.
It hasn't been updated for a while and I'm not using it currently, so I can't confirm it still works as desired (the download statistics suggest it is still being used, though). But given there's not much to it, you should be able to achieve the first part of your use case with the Bamboo Notifier, which allows you to trigger a Bamboo build upon successful completion of a Jenkins job.
The second part should be covered by the Bamboo TestNG Parser task, though of course you'll need to push your existing test result files to Bamboo by some means, possibly by using the SCP task in Bamboo.
I am using Jenkins for automated builds.
My issue is that I want to download sources from SVN, run the build steps, and then, after the build steps have run, take the latest sources from SVN once again.
Is there any plugin that satisfies this requirement?
Consider setting up two jobs (A and B) with a shared workspace (job > configure > Advanced Project Options; click the Advanced... button, check the custom workspace option, and define a location). Once job A is finished it triggers job B, and job B then performs an svn update plus whatever else you need. In order to avoid parallel execution of A and B, check Block build when upstream project is building and Block build when downstream project is building.
Maybe not a plugin, but you can always run manual SVN commands as part of the build step
Add a new build step of type "Execute shell" (if on Linux) or "Execute Windows batch command" (if on Windows).
Inside, write SVN commands appropriate to your OS, for example:
svn up checkout_folder
Note that the path will be relative to Jenkins' workspace.
I have set up a Jenkins job to build a project. I'm using the email-ext plugin to send out build notifications, with the intent of showing who did what and the paths to the files changed. But unfortunately I'm not getting anything. I believe the reason is that under "Source Code Management" I'm setting it to "None". The shell script that I'm using to drive the build is responsible for checking out a copy of the code based on a CVS tag and running Maven to do the build. In email-ext I'm using the following syntax:
${CHANGES_SINCE_LAST_SUCCESS, reverse=true, showPaths=true,
format="\n====\nChanges for Build # %n\n%c\n",
changesFormat="\n[%r] %d %a %m %p\n"}
Same thing with CHANGES: ${CHANGES, showPaths=true}
Is there a way of getting CHANGES and CHANGES_SINCE_LAST_SUCCESS to work if None option is used under Source Code Management?
Thanks for your help folks.
The email-ext plugin gets that info from Jenkins. As Jenkins has access to that info only via its SCM plugins, the answer is no: you can't do it without specifying the SCM option.
There are two things you can do:
(1) Do it by hand, which with CVS, if I remember correctly, means having a working copy checked out anyway.
(2) Use the SCM checkout/update option, but store the working copy on the side without using it in the build. You'll use twice as much disk space, but nowadays disk space is not a problem.
By the way, why are you using CVS? SVN, Git, and Mercurial are all free.