Installing Jenkins plugin (mercurial) in Docker shows in plugins folder but not in Jenkins itself

Problem: I can't seem to successfully install the Mercurial plugin in Jenkins using the Dockerfile and plugins.txt combination.
What I've done so far:
I have a Dockerfile that's loading Jenkins. It has the following lines:
FROM jenkins:1.642.1
COPY plugins.txt /usr/share/jenkins/plugins.txt
RUN /usr/local/bin/plugins.sh /usr/share/jenkins/plugins.txt
My plugins.txt has this line:
mercurial:1.54
When I build the image and run the container, everything seems to work, there are no errors or complaints. But the Mercurial plugin isn't marked as installed when I go to Manage Plugins, and if I try to make a build, Mercurial isn't an option under Source Code Management.
I've tried going to:
<jenkins ip address>:8080/reload
As well as the "Reload Configuration from Disk" option in Manage Jenkins. Mercurial still isn't visibly installed after either of these.
I've also done this on the command line:
docker exec -i -t container bash
ls /var/jenkins_home/plugins/
And at this point I'm totally confused, because there's mercurial, mercurial.jpi and mercurial.jpi.pinned right there in the list. Does anyone have any ideas on this? I would like to have Mercurial installed on Jenkins as soon as it's loaded from the Dockerfile without having to do it manually...
Also, I tried doing this with git-changelog as well to see if another plugin would work better, and had the same result.

As you can see on the Mercurial Plugin wiki page, the plugin currently has four mandatory dependencies, and one optional:
credentials
matrix-project
multiple-scms (optional)
ssh-credentials
scm-api
The plugin installation mechanism that you're using with the Jenkins Docker image does not automatically install dependent plugins for you, as mentioned in the documentation for the jenkins image:
All plugins need to be listed as there is no transitive dependency resolution.
Therefore you need to additionally list those plugins, and any of their transitive dependencies, in your plugins.txt file.
At the moment, the simplest way to get the full list is to start your container (potentially without plugins.txt) and install the Mercurial plugin via the Plugin Manager, which will pull in all of its dependencies. You can then see which plugins were installed by listing $JENKINS_HOME/plugins.
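Concretely, a plugins.txt covering the mandatory dependencies could be written like this. This is a sketch: the dependency version numbers below are placeholders I've made up, not taken from the question — pin whatever versions match your Jenkins core; only mercurial:1.54 comes from the original plugins.txt.

```shell
# Generate a plugins.txt that lists the Mercurial plugin together with its
# mandatory dependencies. Dependency versions are illustrative placeholders --
# replace them with the versions compatible with your Jenkins core.
cat > plugins.txt <<'EOF'
credentials:1.24
matrix-project:1.6
ssh-credentials:1.11
scm-api:1.0
mercurial:1.54
EOF
```

With this file COPYed into the image as in the Dockerfile above, plugins.sh installs each listed plugin, so Mercurial's dependencies are no longer missing at startup.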

Related

SBT always downloads the packages/scala libraries on Docker, docker-compose

I have recently installed SBT on a Docker Ubuntu machine to get started with Scala. When I started Docker initially, it grabbed all the Java and sbt JARs from remote locations (https://repo.scala-sbt.org/scalasbt/debian/sbt-0.13.17.deb).
But whenever I run the sbt command, it starts downloading the sbt JARs again. Is there a way of maintaining a global cache whereby artifacts are only downloaded once and not every time I connect to the Docker container?
My solution to this was a multi stage build.
Have a "base" Docker image.
Copy in only build.sbt, projects.sbt, and the file that sets the sbt version from your project.
That defines the required dependencies. The last line in that base image is "sbt update", i.e. fetch them. That base image has the dependencies in it and is reusable. Just remember to rebuild it when you change library versions etc.
In the "build" image, copy over the project and proceed as normal. Make sure sbt is resolving from maven-local, and it should use the "cache" which is already in place from the base image.
I’d be interested to hear other answers, but that’s my solution… YMMV :-).
That works for me on a cloud / Kube pipeline.
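The two-stage layout described above might look roughly like this as a Dockerfile. This is a sketch: the base image name/tag, the WORKDIR, and the exact file names (projects.sbt, project/build.properties) are assumptions — substitute whatever sbt-capable image and project layout you actually use.

```dockerfile
# Stage 1: the "base" image. Copy in only the files that define the
# dependencies, then run `sbt update` so the resolved artifacts are baked
# into a reusable layer. The image name/tag is an example, not a recommendation.
FROM sbtscala/scala-sbt:eclipse-temurin-17.0.4_1.7.1_2.13.8 AS base
WORKDIR /app
COPY build.sbt projects.sbt ./
COPY project/build.properties project/
RUN sbt update

# Stage 2: the "build" image. Starting FROM the base stage means the
# dependency cache is already in place; now copy the full project and build.
FROM base AS build
COPY . .
RUN sbt compile
```

Because the first stage only sees the dependency-defining files, its `sbt update` layer is cached until those files change, so routine source edits don't trigger a re-download.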

Deployable JAR file from JB Plugin Repo does not contain my files, but the plugin runs correctly locally

Background
I am working on a simple plugin, and have already deployed to the Plugin Repository once before (successfully).
Since my last successful deployment, I found that I had a lot of issues with the IDE. After completely upgrading, and modifying my plugin's directory structure, I have been able to get the plugin to Run again.
Issue
tl;dr - I have an updated plugin in the JetBrain's Plugin Repository that does not work as intended, and I cannot update it correctly!
When I run the plugin, a second instance of the IDE comes up with my plugin working correctly. I edit my code and run the plugin again - the plugin runs smoothly and the updates are applied!!
With all of this, I decided to deploy my updated plugin to the Repository again. Once that was done, I decided to download the plugin and try it out myself; just to make sure things worked.
The issue is that nothing can be found in the plugin file! Just the updated plugin.xml file and the Manifest.mf file. The total size of the archive is around 500 bytes. I know a correct archive would have more files in it, and in my case the file size should be around 6 KB (based on my first successful archive file).
So how can my local IDE instance find the files correctly, but the deployment feature cannot? How does the deployment feature actually work? I get the feeling I have the structure wrong, even though the new IDE instance works perfectly.
Plugin
GitHub
JetBrain's Plugin Repository
When you install the plugin, the version is shown as v1.1; however, that is not actually the case. One of the easiest ways to determine the actual version of the plugin is the Folded Text foreground color.
v1.0 - RED
v1.1 - YELLOW
Deployment
Preparing Plugin Module for Deployment + resulting plugin.jar file
Contents of plugin.jar
It seems possible that because of the restructuring an old ChroMATERIAL.xml file was left somewhere in the build output. Somehow this could end up in the plugin jar. An invocation of Build > Rebuild Project should fix this problem.
There could also be problems in the project or module configuration, but the project files are not included in the GitHub repository, so that cannot be checked.

Build KafkaOffsetMonitor tool manually

I am trying to monitor Kafka using the KafkaOffsetMonitor tool. It works fine when I use the already-built jar available on its GitHub page. Now I want to make some changes to this tool, but I don't know how to build it manually. I have downloaded the zip file from the GitHub page. Now how should I build it?
PS: Steps would be helpful
Below are the steps which may help you:
1. Check out the source code to your local machine, or unzip it if you downloaded the zip file.
2. Go to the folder you extracted or checked out.
3. Run the command below:
mvn clean package -U -DskipTests
Note: make sure you have Maven installed on your machine.
Make your changes in the source if you want to modify the tool, and build it using the steps above.
I know that I'm coming at this pretty late, but I ran into the same problem. Basically, you need a Java JDK, Scala, and sbt installed first. You didn't post which OS you're dealing with, so it would be hard for me to give you steps for that. I use Gentoo Linux, where you can install it by running emerge -av sbt
Once sbt is installed, just clone the KafkaOffsetMonitor GitHub repository, change to the top-level directory, and run the following sbt command: sbt assembly
The jar you are looking for will be in: ./target/scala-<scala_version>/KafkaOffsetMonitor-assembly-<kafkaoffsetmonitor_version>.jar
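The sbt route above can be collected into a small shell function. A sketch under assumptions: the repository URL is the commonly published quantifind one (the answer doesn't give a URL), and git and sbt are assumed to be on the PATH.

```shell
# build_kom: clone the KafkaOffsetMonitor repository and build the fat jar
# with `sbt assembly`, then print the path of the resulting jar.
# Assumptions: git and sbt are installed; the repo URL is the commonly
# published one -- adjust if you work from a fork or a downloaded zip.
build_kom() {
  git clone https://github.com/quantifind/KafkaOffsetMonitor.git &&
  cd KafkaOffsetMonitor &&
  sbt assembly &&
  ls target/scala-*/KafkaOffsetMonitor-assembly-*.jar
}
```

Run build_kom in an empty working directory; the glob at the end avoids hard-coding the Scala and tool version numbers in the jar path.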

Google App Engine, Maven and Eclipse development setup

I'll try to keep this short. I have Eclipse with the M2E (Maven to Eclipse) plugin installed. I have a GAE (Google App Engine) project I'm working on. Everything is working OK apart from one really annoying thing: I have to stop/start the devserver every time I make a change.
If you have any experience with this setup then you might be able to answer this simple question?
I start the development server with "mvn appengine:devserver" on the command line. Now I would expect that if I made changes to a *.jsp, for example, those changes would automatically be picked up by the devserver. Is this what happens for you?
I have noticed that if I make changes to *.jsp files under my target folder, then the devserver sees those changes and updates as I would expect. I think my problem lies with Eclipse not copying changes to the target folder, but I'm not sure if it is even supposed to.
Does anyone have any suggestions on how I should investigate this? I've run out of ideas :-/
I thank you in advance for any comments you may have.
P.S. I know I can run "mvn package" to update files, but this is slow and the devserver runs out of memory after I do it twice.
This can be a little painful, depending on how you want to work and which version of Eclipse you're using.
Install the m2e-wtp plugin if you haven't. It's the secret sauce that makes appengine projects work in Eclipse. Note this isn't m2e - it's another plugin.
Install the GPE - the Google Plugin for Eclipse - if you haven't
Make sure your project is being managed by m2e as a maven project.
Go into your project properties - enable it as an appengine project using the GPE (listed under 'Google'). Don't forget to tick HRD while you're here.
Go to your project build path (Properties -> Java Build Path).
Ensure on the source tab that your src/main/resources doesn't have an ** exclusion.
Ensure on the libraries tab you have the three libraries 'JDK', 'Google Appengine' and 'Maven Dependencies' and nothing else
Ensure on the order and export tab that the appengine dependencies are above the maven dependencies.
It sounds pretty ridiculous - I'm not really sure why it's still so painful - but that is a good recipe for success. Once that's done, this should allow you to run in debug from Eclipse itself, with hot-loading of code, JSPs, CSS, scripts etc. I've had this work in Helios, Indigo and Juno.
You can read more about the m2e-wtp setup instructions here. They refer to GWT, but it's the same for appengine (I'm not sure why the emphasis on using GWT on GAE) because it's actually about the correct setup of GPE and Maven.
You will also find that you may need to repeat some parts of step 5 pretty frequently - if your app isn't loading properly take a quick look to ensure that your resources haven't been excluded. This happens when you update your project configuration using the m2e plugin.
The m2e-wtp plugin updates the target folder as resources are modified - so this should also resolve your issues running from the command line, but I can't vouch for that - I prefer to run straight out of Eclipse.
I had the same problem as you; however, I resolved it another way. I use the FileSync plugin (which can be found in the marketplace).
With this plugin you configure an input directory (webapp) and an output directory (target).
Any change made to the webapp will be copied to the target.
Hope this helps too.
You can use rsync like this:
rsync -r --existing src/main/webapp/ target/ROOT
where "ROOT" is the project build finalName.
The point below worked for me:
Ensure on the order and export tab that the appengine dependencies are above the maven dependencies.

Debugging a Jenkins Plugin but using existing Jenkins

I need to extend an existing plugin of our company for Jenkins.
The thing is, with the new and clean version that Eclipse creates for me (using Maven with the goal hpi:run), I always get the problem that the plugin isn't embedded at all - even though it is listed under "Manage Plugins", and the exact same version works fine if you deploy it as an hpi file to a fresh Jenkins installation.
So I found this: hpi:run -DhudsonHome=C:\Jenkins which supposedly should do the trick.
But it doesn't. It still uses a temporary folder next to src called "work".
When I took a look at the console output, I noticed that the HUDSON_HOME environment variable wasn't set, so I set it, and now it uses the existing Jenkins directory.
However, that is not an optimal solution because I need to use several Jenkins servers on a development machine. Is there any way I could get the -DhudsonHome parameter to work?
Thanks.
Best regards.
In Eclipse, under the run configuration you have made, look at the JRE tab. In the box for VM arguments you can add -DJENKINS_HOME=.