Combining Gradle with Ivy and sources for Eclipse

We use Gradle to build our projects and to manage our dependencies. We want to reuse our legacy Ivy repository, which is filesystem-based. Our convention was to have several artifacts per module, one of them being a source artifact.
The normal jars are perfectly managed by the Gradle Eclipse integration, but the source artifacts are not shown. Are there any conventions for this kind of setup, e.g. the naming of the source artifacts?

artifactName-sources.jar should work, as should defining a configuration named sources (and assigning the source artifacts to that configuration) in the ivy.xml.
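For reference, a minimal Gradle sketch of how such a legacy filesystem Ivy repository might be declared so that the -sources.jar classifier can be resolved and attached in Eclipse; the repository path and layout pattern are assumptions, not the actual conventions of the repository in question:
apply plugin: 'java'
apply plugin: 'eclipse'

repositories {
    ivy {
        // hypothetical location and layout of the legacy filesystem repository
        url 'file:///path/to/legacy-ivy-repo'
        patternLayout {
            ivy '[organisation]/[module]/[revision]/ivy-[revision].xml'
            // the optional (-[classifier]) part lets Gradle find artifactName-1.0-sources.jar
            artifact '[organisation]/[module]/[revision]/[artifact]-[revision](-[classifier]).[ext]'
        }
    }
}

eclipse {
    classpath {
        downloadSources = true   // ask the Eclipse integration to attach source jars
    }
}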

Related

How is the bin package of Confluent Platform / Kafka built?

I see that the tarball at http://packages.confluent.io/archive/5.1/confluent-5.1.0-2.11.tar.gz contains the dependency jars arranged in a different folder structure, with the dependent jar files distributed by category under the share/java/ folder.
However, when I clone the Confluent Kafka v5.1 git repository and build the project, all the dependent jar files end up in the lib/ folder instead.
Is this because of a different Gradle definition? Where can I obtain the Gradle file used for publishing http://packages.confluent.io/archive/5.1/confluent-5.1.0-2.11.tar.gz?
Kafka is built with Gradle.
Confluent primarily builds with Maven, and the build scripts are located on private Jenkins servers, with most of the build artifacts uploaded to S3.
The bin/ package for individual projects is just copied as-is (which can be done with a simple Gradle copy task), or sometimes using the maven-assembly-plugin (as shown here with the Schema Registry).
The lib/ and share/java folders are assembled similarly, and could be done with the maven-dependency-plugin.
After each individual repo is built, it is mostly up to shell scripts to move the folders around and re-package everything.
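As an illustration only, here is a hypothetical Gradle sketch of the kind of copy tasks that could produce such a layout; the paths, task names and layout are assumptions, not Confluent's actual build definition:
// assumes the java plugin is applied; copies launcher scripts into dist/bin and
// runtime dependencies into dist/share/java/<project>, mimicking the tarball layout
tasks.register('packageBin', Copy) {
    from 'bin'                              // assumed location of the launcher scripts
    into layout.buildDirectory.dir('dist/bin')
}

tasks.register('packageShareJava', Copy) {
    from configurations.runtimeClasspath    // all resolved dependency jars
    from jar                                // plus the project's own jar
    into layout.buildDirectory.dir("dist/share/java/${project.name}")
}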

Is it possible to publish the build project itself of a project using sbt?

I have a downstream project which would like to reference values defined inside the build files of an upstream project. I was thinking that if there were an easy way to publish a jar with the source files of the build project itself, then I could publish the build project of the upstream project, and the build project of the downstream repo could depend on the build project of the upstream repo. Is it possible to do this? Is it reasonable? Are there other, potentially better solutions?
To be clear, because I can see how the above might be confusing (and pardon me if I am using incorrect terminology): I am referring to the recursive nature of sbt builds and the fact that the build definition for a project is a project in itself, and that is what I would like to publish, not the source files of the project itself.
I'm familiar with writing plugins in sbt and with the sbt buildinfo plugin. I'm hoping there's another way.

Simulink Project dependency management and dependency resolution

What is the best practice for managing dependencies within a Simulink Project when the project is worked on across a team and the project has dependencies on different models and libraries?
A parallel example would be building an application using Gradle and declaring the dependencies of a project, including the required version numbers. Gradle will resolve and download the versions that are required to build the project.
e.g. the following declares a dependency on version 2.1 of library and version 1.0 upwards of some-library, so that the latest available 1.x version (1.0, 1.1, 1.2...) will be downloaded and used.
dependencies {
    compile("com.example:library:2.1")
    compile("com.example:some-library:1.+")
}
The documentation for Simulink (and also here covering manifests) seems to talk about models within a project having version numbers. It doesn't seem to mention libraries that are imported into the project. Models that are only used within a single project could all be contained in the overall project, but what happens if there are (for example) generic S-Functions defined within a separate project or library (or library defined within a project) that are applicable across multiple projects? This requirement is all with the aim of helping to support an automatic build process triggered by a Continuous Integration server, such as Jenkins.
I'm interested in a workflow that will easily support dependency management and automatic dependency resolution with a Github Flow git branching policy.
I've spent a lot of time on this problem. In the end I didn't find an appropriate solution online, but I'd like to share the workflow we are using now, which fulfills our needs.
In short: We created our own dependency management by using git submodules.
Assumption: In fact, this is more a version management of persistent dependencies than a way of dynamically adding new or removing old packages or libraries. That also works, but requires the git submodules to be added to or removed from the main git repository.
Objectives:
Consistent setup for everyone who works on the project.
Traceability of dependencies.
Continuous Integration with less effort.
How we do it (Example):
We have Project A and Project B which shall be used in Project C.
All three projects are under git version control and still under development.
We have set up additional release repositories for Project A and Project B, e.g. located on a network drive.
In Project C we add the release repositories of Project A and Project B as git submodules.
We have set up some kind of auto-deployment to push only the relevant files into these release repositories. For example, if we want to make changes to Project B accessible to Project C, we only create a version tag in Project B's repository and it gets pushed to its release repository.
In Project C we update our git submodules and can check out a new submodule version (if needed).
Advantages:
Since git stores the checked out version (commit) of git submodules in the main project, we can ensure that everyone works with the same files.
Changing the commit of a submodule is traceable in the main project.
The relation between the main project and the dependencies is always consistent.
Continuous Integration should work "out of the box". We are using GitLab and GitLab Runner and only had to set up our runner to fetch submodules recursively (in case of nested submodules).
I think this approach works as long as the repositories don't get too big, since you fetch not only the version you need but also the whole version history.

How do I let Jenkins and m2eclipse share the same Maven repository?

How do I configure m2eclipse (the Maven plugin for Eclipse) to use a centralized Maven repository that is also used by Jenkins?
The default user settings location in m2eclipse is something like "/home/user/.m2".
How can we do something like "ssh user@192.168.1.200:/var/lib/jenkins/.m2"?
A neat and easy way to do it is to use a repository manager. Sonatype's Nexus seems to be the most popular, but there are others (e.g. JFrog Artifactory and Apache Archiva). They run as HTTP servers, and you can change your Maven configuration (both locally and for Jenkins) to use it as a mirror for any Maven repository (e.g. the Central Maven repo), or use it to host your own repositories.
There is no need to do that. Your POM files list dependencies and they list repositories. Maven will then resolve your dependencies against all known repositories (the listed ones and the "built-in" ones, like Maven Central).
Maven will do this in m2eclipse, when running a Maven build. Maven will also do that when running the build on Jenkins. So if both machines can connect to all the repositories listed in your POM files, both will retrieve the same artifacts and both will do the same builds.
You should really not try to share the local copy of the artifacts. That would be as bad as you and me trying to share our Maven artifacts over a network share. Maven is designed to find and manage those artifacts, and with this question you are trying to do its job for it.

Tool for managing/hosting own p2 repositories?

Our company uses Maven. We use the Nexus repository manager in order to store our snapshots and releases.
Currently, we are developing a product based on Eclipse. We use Tycho to do that.
The problem is the following: in our Eclipse-based product we have many features. Our idea is to build each feature (or group of features) separately and put them in internal p2 repositories. When one feature requires another feature, we point the target platform to the necessary internal p2 repository.
Currently, we build the application with Tycho. We make our features "deployable", so Tycho produces a p2 site in target. We push that p2 site to our server and then run the Eclipse FeaturesAndBundlesPublisher, which merges the recently-built feature into a p2 repository. As a result, we have an internal p2 repository containing all the versions of the required feature.
We find that this process is too cumbersome. Is there a tool like Nexus, which would be more convenient?
UPD: There is a discussion on the Tycho Users list.
With the Unzip Repository Nexus Plugin, you can use Nexus for exchanging binary artifacts between Tycho builds.
Tycho project A publishes its artifacts like a normal Maven project: The project is built with mvn clean deploy, which uploads the project's artifacts into your deploy Maven repository on the Nexus. The only special requirement is that the project builds a p2 repository. The recommended way to do this is an eclipse-repository module, but a "deployable feature" should also work in most cases.
On your Nexus, you only need the following one-time configuration: For the deploy Maven repository (or a "Repository Group" which includes that repository), you need to add a virtual repository of type "Unzip Repository". This virtual repository shows zip artifacts from the deploy repository in unpacked form.
Example: If the p2 repository zip of project A is in the deploy Maven repository at http://nexus.corp/nexus/repositories/build.milestones/corp/example/project-a/project-a-repo/1.0.0/project-a-repo-1.0.0.zip, it will be available in standard p2 repository format in the Unzip Repository at http://nexus.corp/nexus/repositories/build.milestones.unzip/corp/example/project-a/project-a-repo/1.0.0/project-a-repo-1.0.0.zip-unzip/.
Tycho project B can reference the artifacts from project A by adding the latter URL to its target platform, e.g. in a target definition file.
In the above example, project B references a release version of project A. The same approach also works for snapshots, because the Unzip Repository has support for "symbolic" versions, like 1.1.0-SNAPSHOT for the last deployed 1.1.0-SNAPSHOT, or even just SNAPSHOT for the overall highest version. Using these symbolic versions, project B can then, for example in its own CI build, reference the CI build results of project A by adding the resulting (stable!) p2 repository URLs to its target platform.
Disclaimer: The Unzip Repository Nexus Plugin is part of the Tycho project, of which I'm a committer.
Maybe this is a bit late, but I am currently working on an open source (EPL) repository manager which supports the workflow of deploying to a repository with Maven and Tycho, and consuming it as a p2 repository.
It is also possible to deploy bundles created by Maven (not Maven Tycho), and the p2 metadata will be generated automatically.
The project is called "Package Drone" and hosted on github. There is also a short introduction video.