I want to make a custom buildpack on Bluemix, and as part of it I am trying to add my own jar file as a javaagent. I used to work with Tomcat, where I just added the extra agent to the catalina.sh script.
On Bluemix, these are the steps I took:
I created a new project and uploaded my code.
I cloned the default java buildpack to my own git repository.
In the repository I added the .jar file to the /lib/java_buildpack folder.
Now comes the step I have trouble with. I located the
java_opts.add_javaagent(@droplet.sandbox + 'javaagent.jar')
function call, which according to the comments should do exactly what I am looking for.
The issue is that when I check that function, I see it calls the following function:
def qualify_path(path, root = @droplet_root)
  "$PWD/#{path.relative_path_from(root)}"
end
I can't figure out where this @droplet_root location is; if I could find it, I could upload my jar file there.
I tried adding the relative path like this:
java_opts << "java_buildpack/myAgent.jar"
But it didn't work.
Any suggestions on how this might be achieved? Where should I place the file, or is there another way?
Forking the buildpack is one way to achieve this. You can implement it as a "framework" component in the Java buildpack. Here are a few samples you can refer to which also add an agent jar:
https://github.com/cloudfoundry/java-buildpack/blob/master/lib/java_buildpack/framework/new_relic_agent.rb
https://github.com/cloudfoundry/java-buildpack/blob/master/lib/java_buildpack/framework/jrebel_agent.rb
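For orientation, below is a minimal sketch of such a framework component, modeled on the samples above. The class name MyAgent, the file location, and the always-on supports? are assumptions, not part of the stock buildpack:

# lib/java_buildpack/framework/my_agent.rb (hypothetical location/name)
require 'java_buildpack/component/versioned_dependency_component'
require 'java_buildpack/framework'

module JavaBuildpack
  module Framework

    # Hypothetical component that downloads a custom agent jar and
    # registers it as a -javaagent, following the NewRelicAgent pattern.
    class MyAgent < JavaBuildpack::Component::VersionedDependencyComponent

      # Staging: download the agent jar into this component's sandbox.
      def compile
        download_jar
      end

      # Release: add -javaagent:<sandbox>/<jar> to the JVM arguments.
      def release
        @droplet.java_opts.add_javaagent(@droplet.sandbox + jar_name)
      end

      protected

      # Real components decide here (usually from their config file or a
      # bound service) whether to enable themselves; always on in this sketch.
      def supports?
        true
      end
    end
  end
end

You would also have to register the component in config/components.yml and give it its own config file (e.g. config/my_agent.yml) with a version and repository URI so that download_jar knows what to fetch.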
Another slightly hacky way is to simply add the agent jar to your application package and then add a Java option to enable the agent, using the JAVA_OPTS environment variable. That requires you to find out the path where the agent jar ends up in the running application container; you can browse to it using "cf files". This approach depends on the internal structure of the droplet, so it may break if the buildpack changes the droplet layout.
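For example, if the jar sits at the root of your application package (the app name and path here are placeholders; check the actual location with "cf files" first):

cf set-env my-app JAVA_OPTS "-javaagent:/home/vcap/app/myAgent.jar"
cf restage my-app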
Background
I am working on a simple plugin and have already deployed it to the Plugin Repository once before (successfully).
Since my last successful deployment I have had a lot of issues with the IDE. After completely upgrading it and modifying my plugin's directory structure, I have been able to get the plugin to run again.
Issue
tl;dr - I have an updated plugin in the JetBrains Plugin Repository that does not work as intended, and I cannot update it correctly!
When I run the plugin, a second instance of the IDE comes up with my plugin working correctly. I edit my code and run the plugin again - the plugin runs smoothly and the updates are applied!!
With all of this, I decided to deploy my updated plugin to the Repository again. Once that was done, I downloaded the plugin and tried it out myself, just to make sure things worked.
The issue is that nothing can be found in the plugin file! It contains just the updated plugin.xml file and the Manifest.mf file. The total size of the archive is around 500 bytes. I know a correct archive would have more files in it, and in my case the file size should be around 6 KB (based on my first successful archive file).
So how can my local IDE instance find the files correctly while the deployment feature cannot? How does the deployment feature actually work? I get the feeling I have the structure wrong, even though the new IDE instance works perfectly.
Plugin
GitHub
JetBrains Plugin Repository
When you install the plugin, the version is shown as v1.1; in reality, however, that is not true. One of the easiest ways to determine the actual version of the plugin is the Folded Text foreground color:
v1.0 - RED
v1.1 - YELLOW
Deployment
Preparing Plugin Module for Deployment + resulting plugin.jar file
Contents of plugin.jar
It seems possible that, because of the restructuring, an old ChroMATERIAL.xml file was left somewhere in the build output and somehow ended up in the plugin jar. An invocation of Build > Rebuild Project should fix this problem.
There could also be problems in the project or module configuration, but the project files are not included in the GitHub repository, so that cannot be checked.
I'm trying to use the TeamCity deployer plugin to send my build result (a war file) via SSH to another computer on the network.
My problem is how to configure the deployer to find my built war file.
I used %teamcity.build.workingDir%**/*.war in the Artifacts path setting, but it cannot find any file there.
The log shows that it tries to find my file here: /home/teamcity/TeamCity/buildAgent/work/c4bca27d2b00a6fe**/*.war.
The path is correct, but it's not working...
The TeamCity documentation for Accessing Build Artifacts is not clear and does not show what I should use in the settings dialog.
Update:
I tried to use **/mywar-1.0.war and **/build/libs/mywar-1.0.war; both work, but now the problem is that it deploys the file together with its subdirectories, like this: dest/build/libs/mywar-1.0.war,
but I need dest/mywar-1.0.war,
so I still don't know how to configure it...
The TeamCity deployer plugin uses a pattern as the Artifacts path to find the files to deploy. In most cases, as @Vlad said, using **/*.fileType or **/filename.type is enough,
for example: **/*.war or **/myprojectfile.war.
But sometimes your output files are in a subdirectory tree, and using a pattern causes the deployer to recreate those subdirectories on the destination.
In this case I just need the war file without its subdirectories, so the right way is to use the complete path to the file.
For example:
my war file is in the build/libs/ folder after the build process,
so using build/libs/mywar.war as the Artifacts path will deploy the war to the destination without its subdirectories.
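If you still need a wildcard but want a flat destination, the Artifacts path may also accept the same "source => target" mapping syntax as TeamCity artifact paths (whether it does depends on the deployer plugin version, so verify against yours):

build/libs/*.war => .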
Artifacts are resolved under the checkout directory, as the documentation says, so just **/*.war is enough.
I want to be able to compile my project once and pass it through multiple build steps on a CI server. But SBT puts files in a staging area like the one below.
/home/vagrant/.sbt/0.13/staging/
This means the project is not stand-alone, and every CI step is going to compile it again.
How can I tell SBT to keep things simple and stand-alone and to make sure everything it needs is inside the project directory?
FYI, the staging area is used for the target files when the source folder is not read/write. Making the source folder read/write should fix this.
If you pass -Dsbt.global.staging=./.staging to sbt when starting it up, the staging directory will be .staging in the project's directory.
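For example (a sketch of the invocation; combine the property with whatever tasks your CI steps normally run):

sbt -Dsbt.global.staging=./.staging clean compile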
I figured that out by looking at the sbt source and patching that together with how Paul P's sbt runner passes the value for the sbt boot path.
If that doesn't accomplish what you want, then you might be able to make something work with a custom resolver. The sbt Build Loaders page talks about creating a custom resolver that lets you specify in more detail where dependencies are written. If my solution doesn't get you what you want, you'd probably need to do something like that.
I need to extend an existing company plugin for Jenkins.
The thing is, with the new and clean version that Eclipse creates for me (using Maven with the goal hpi:run), I always get the problem that the plugin isn't embedded at all, even though it is listed under "Manage Plugins", and the exact same version works fine if you deploy it as an .hpi file to a fresh Jenkins installation.
So I found this: hpi:run -DhudsonHome=C:\Jenkins, which supposedly should do the trick.
But it doesn't. It still uses a temporary folder called "work" next to src.
When I took a look at the console output, I noticed that the HUDSON_HOME environment variable wasn't set, so I set it, and now it uses the existing Jenkins directory.
However, that is not an optimal solution, because I need to use several Jenkins servers on one development machine. Is there any way I could get the -DhudsonHome parameter to work?
Thanks.
Best regards.
In Eclipse, under the run configuration you have made, look at the JRE tab. In the box for VM arguments you can add -DJENKINS_HOME=<path to the Jenkins home you want to use>.
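In principle the same property can also be passed when launching from the command line, since Maven's -D sets a system property in the JVM that runs the plugin (the path is a placeholder; this assumes hpi:run picks the property up the same way it does in the Eclipse run configuration above):

mvn hpi:run -DJENKINS_HOME=C:\jenkins-home-1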
I'm evaluating Maven 3 at work. I have to deploy several example projects to a server (not a repository), but that's not the problem.
In my current example project I'm trying to upload only the "jar-with-dependencies",
and exactly that is my problem.
It all works fine, except that the main artifact AND the jar-with-dependencies (created by the assembly plugin) are both uploaded.
How do I prevent Maven, or rather the deploy phase, from uploading the main jar and have it upload only a given or specified file (in this case, the assembly file "jar-with-dependencies")?
Referring to the question Only create executable jar-with-dependencies in Maven, I can't just change the packaging setting to pom, because that would also prevent the assembly plugin from adding my classes to the JAR file; it would then only create a JAR file with the files of the dependencies.
I hope I'm clear about my problem, and you can help me ;)
If you are just looking for how to attach an additional file to be deployed, you can take a look here:
http://mojo.codehaus.org/build-helper-maven-plugin/attach-artifact-mojo.html
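A minimal sketch of that approach, assuming the assembly plugin has already written the jar-with-dependencies file to the target directory (the execution id, phase binding, and file name are illustrative):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>attach-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>attach-artifact</goal>
      </goals>
      <configuration>
        <artifacts>
          <artifact>
            <file>${project.build.directory}/${project.build.finalName}-jar-with-dependencies.jar</file>
            <type>jar</type>
            <classifier>jar-with-dependencies</classifier>
          </artifact>
        </artifacts>
      </configuration>
    </execution>
  </executions>
</plugin>

Note that this attaches the file in addition to the main artifact; on its own it does not stop the main jar from being deployed.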
Maybe this helps. If not, describe your needs in more detail.
There seems to be no way to configure the deploy plugin to filter out some of a project's artifacts and selectively deploy the others. Faced with a similar problem, we solved it with the ease-maven-plugin. It fit well into our release process but might not be the right choice for everyone, as it mandates a two-step approach: during the build you make a list of all artifacts and filter out those that you want deployed; in a second step, you run mvn deploy on a separate project (or a separate profile) to which the list of artifacts is attached as the only artifacts, which then get deployed. See the examples in the source code of the ease-maven-plugin to better understand how it works.
The original version is not able to filter out specific artifacts of a project. I have forked the project and added patches that make this possible.