Publishing an sbt scala project to Artifactory with the env vars - scala

What I have is this.
I have defined in sbt my publish task, which publishes to Artifactory. I'm running it in a Jenkins job as a shell build step.
What I would like to do is include all the environment information that the Jenkins Artifactory plugin includes when it deploys. I don't know whether I should add it in sbt, or whether there is some way of configuring the plugin to use sbt but do the publishing itself.

Currently there is no integration between sbt and the Artifactory Jenkins plugin; what you can do is use the Generic deployment feature of the plugin. As part of the deployment, the plugin will upload a build-info JSON object with some of the system and environment variables.
Because the plugin doesn't record the actual build process, all the run-time goodies will be missed, but you can inject them by using buildInfo.property.* - all properties starting with this prefix will be added to the root properties of the build-info.
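In a Jenkins job, one way to feed such properties to the plugin is to write them out from the shell build step. Everything below is illustrative: the file name and the keys after the buildInfo.property. prefix are made-up assumptions, not an official contract.

```shell
# Hypothetical Jenkins "Execute shell" step. The keys after the
# buildInfo.property. prefix are invented; pick whatever run-time
# details your sbt build knows about.
cat > buildinfo.properties <<'EOF'
buildInfo.property.sbtVersion=1.9.7
buildInfo.property.publishedBy=sbt-publish-task
EOF

# Show what will be handed to the Artifactory plugin
cat buildinfo.properties
```

How the plugin picks these up depends on your Jenkins setup (e.g. an env-injection step); check the Artifactory plugin documentation for the exact wiring.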

Related

SBT: Auto versioning of artifacts

We have a Scala project which we are building via a CI tool (TeamCity/Jenkins). We are looking for the ability to set the build version of the artifact from the build job itself and not depend on the entry in the build.sbt file. For reference, for Java Maven builds we can use the versions:set goal, which sets the artifact version irrespective of what we have in pom.xml. We are looking for something similar for an SBT build as well.
I'd recommend taking a look at these sbt plugins:
https://github.com/dwijnand/sbt-dynver
https://github.com/sbt/sbt-git
Our team uses sbt-dynver to derive the version from Git, because it is easier. I'd recommend building the version on top of Git tag information rather than CI tool (TeamCity/Jenkins) information such as the build number, because with a build number you can build the same version twice, for instance.
Also, consider additionally using https://github.com/sbt/sbt-buildinfo, so you can expose the build version through an API or print it to the output to quickly identify the currently deployed app version.
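As a sketch, wiring both suggestions together could look like this. The plugin versions are illustrative (check each README for current ones), and the buildInfoPackage name is an assumption:

```scala
// project/plugins.sbt -- plugin versions here are illustrative
addSbtPlugin("com.dwijnand" % "sbt-dynver"    % "4.1.1")
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.11.0")
```

```scala
// build.sbt -- expose the dynver-derived version through generated code
enablePlugins(BuildInfoPlugin)
buildInfoKeys    := Seq[BuildInfoKey](name, version)
buildInfoPackage := "myapp.build" // assumed package name
```

With this, `myapp.build.BuildInfo.version` can be printed at startup or returned from a health endpoint.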
This isn't really the right forum for this kind of recommendation, but you could start by looking at sbt-git, which will set version numbers based on Git tags.
One option you have is to add this to your build.sbt:
version := sys.env.getOrElse("ARTIFACT_VERSION", "0.0.0-SNAPSHOT")
Then set the version you want in the environment variable ARTIFACT_VERSION.
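The fallback behavior of that setting can be illustrated outside sbt; resolveVersion here is just a stand-in for the build.sbt expression:

```scala
// Mirrors the build.sbt expression: prefer ARTIFACT_VERSION,
// otherwise fall back to a snapshot placeholder version.
object VersionDemo {
  def resolveVersion(env: Map[String, String]): String =
    env.getOrElse("ARTIFACT_VERSION", "0.0.0-SNAPSHOT")

  def main(args: Array[String]): Unit = {
    println(resolveVersion(Map.empty))                          // 0.0.0-SNAPSHOT
    println(resolveVersion(Map("ARTIFACT_VERSION" -> "1.4.2"))) // 1.4.2
  }
}
```

In the Jenkins job you would export ARTIFACT_VERSION in the shell build step before invoking sbt publish.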

Publishing / Resolving Custom SBT Plugins with Nexus

I've created my first SBT (1.x) AutoPlugin to add some settings and behavior to projects that use the plugin.
When I publish it locally, everything resolves and works correctly for the projects using the plugin.
However, when I publish the plugin to our private Nexus repository, it fails to resolve for any projects attempting to use it.
I realize that when sbt plugins are published locally, the path is different from that of 'regular' sbt projects, but they still resolve correctly for projects which use them.
Do I need to publish sbt plugins to a different location within Nexus than our other Scala / SBT-based projects?
And / or, do I need to set up a new resolver for Nexus-hosted SBT plugins?
I know similar questions have been asked previously, but being new to both Nexus and plugin creation, I haven't been able to figure out exactly what I need to do to get the plugin to resolve correctly when publishing to Nexus, rather than simply doing a publishLocal and then adding it to the plugins.sbt file of the projects meant to use it.
Any assistance would be very much appreciated!

Access Jenkins model instances from Gradle

I'd like to develop a Gradle plugin that performs some operations on the Jenkins model object.
(such as automatically fingerprinting the compile dependencies and the published ivy artifact)
Do you know if it is currently possible to retrieve the instances of Jenkins' AbstractBuild and other classes from a Gradle plugin?
I'm not sure how you'd achieve this with a Gradle plugin, as you need to be running inside the Jenkins JVM to access the model object. You might find it easier to use the REST API if it exposes the data you need. Otherwise you'd need to attach to the Jenkins JVM process remotely.
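If the REST API route works for you, the JSON endpoints follow a predictable layout. A minimal Scala sketch of building the standard URL (the host and job name are made up; fetching the URL would need a running Jenkins):

```scala
import java.net.URI

// Jenkins exposes most model data for a build at
// <base>/job/<name>/<number>/api/json
object JenkinsApi {
  def buildInfoUrl(baseUrl: String, job: String, build: Int): URI =
    URI.create(s"$baseUrl/job/$job/$build/api/json")

  def main(args: Array[String]): Unit =
    println(buildInfoUrl("https://jenkins.example.com", "my-app", 42))
}
```

From a Gradle build you could fetch that URL and parse the JSON, which avoids needing to run inside the Jenkins JVM at all.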

Automating build tasks using eclipse / maven m2e

I am about to use Maven to automate my builds. Unfortunately, I am not able to get all the features I want, even after reading several tutorials :(
I would be glad if somebody could explain a way I can achieve all my goals!
I want to automate 3 specific build tasks with several actions for a project from within eclipse, using m2e:
Build snapshot
compile
define current project version + date as version
build jar file
copy jar file into the local repository in the project path itself (${project}/builds/)
Debug snapshot
build snapshot as mentioned above
copy jar file to plugins folder of a local test server
build another project the current project depends on, copy its jar file to the plugins folder as well
launch server / connect to eclipse debugger (I know how to do that, the previous steps are the important ones)
Create release
compile
define current project version as version
build jar file
copy jar file into the local repository in the project path itself
create javadoc
copy source files and javadoc to an archive folder
increase the project version (for example v6 -> v7)
As mentioned I don't need a perfect solution, just a way to realize this ;)
(Annotation: Chaining multiple launch configurations is not a problem.)
Edit:
Which sections of the pom.xml do I have to modify to realize these steps and how can I invoke them using an eclipse launch configuration?
Hi, based on your requirements I can say the following:
Build Snapshots
Building a SNAPSHOT is usually the convention during development cycle.
1.1 Just use the conventions.
1.2 Date as version
This is a bad idea, because Maven has conventions for what a version looks like (1.0-SNAPSHOT, 1.2.3-SNAPSHOT, etc.)
1.3 Build jar file
Usually done by the jar life cycle (mvn package)
1.4 The local repository lives on your home drive in ${HOME}/.m2/repository for all your projects. Technically you can do what you like, but it's against the Maven conventions. The question is why you need such a thing.
2.1 Usual procedure
2.2 Usually a deployment is not a job for Maven, but you can do such things with the cargo-maven-plugin (integration testing).
2.3 If you have dependencies between projects, you need a CI solution like Jenkins to do such things; otherwise you have to do it manually. But that is different from a multi-module build.
2.4 Integration testing is a different story. It depends on what exactly you'd like to do.
3. (steps 1-7)
The maven-release-plugin will handle all of these except copying the jar into the project path itself, which is against the conventions. For that purpose you need a repository manager.
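For reference, a typical release with the maven-release-plugin is driven by two goals; the goal names are the standard ones, but this assumes your pom has a working scm section and deploy target configured:

```shell
# Standard maven-release-plugin invocation; assumes <scm> is configured.
mvn release:prepare   # sets the release version, commits, and tags the SCM
mvn release:perform   # checks out the tag, builds, and deploys the artifacts
```

release:prepare also bumps the pom to the next development version, which covers the "increase the project version" step.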
I can recommend reading these books: http://www.sonatype.com/Support/Books

Using TeamCity, how to manually trigger a DEPLOY against a previously built and tested build run?

Using TeamCity 6.5, I am trying to figure out how to setup a manual deployment for a specific build run if it's possible.
What I would like to be able to do is to take an already built and tested TeamCity run (only the artifacts needs to be deployed - this is not a web application or site) and call an MSBuild step to publish the artifacts to somewhere else.
You can do what you want by setting up an Artifact Dependency between the configuration where you want to do the manual deployment and the one where you have the built artifacts.
Once you have set up the artifact dependency, click the "Run custom build" ellipsis next to the Run button for the configuration. In the Artifact dependencies section you will see which configuration this one depends on, along with a drop-down list from which you can choose the particular build of the other configuration to take the artifacts from. Click Run from there to run your custom build.
See here for more details: http://confluence.jetbrains.net/display/TCD65/Triggering+a+Custom+Build
You might be thinking about this a bit backwards. What you probably want is a build configuration that takes the previously known successful build (in TC terms it has a snapshot dependency) and then runs a different build targeted at dropping the artifacts somewhere. Pretty easily done by switching the output directories in MSBuild.
The most "integrated" way I could think to do it would be to add a dependency to your deployment configuration that depends on the latest pinned build of the dependent configuration. Then you just unpin any newer builds in the dependent configuration, pin the one you want, and run the deploy. This is a bit kludgy and might not work very well if you depend on pinned builds for anything else in the dependent configuration.
The other built in way to do with would be to add an artifact dependency using a specific build number. The drawback of this method is that any time you want to deploy a different build, you will need to be able to edit the artifact dependency build number by hand and then hit run.