To keep artifacts separate, keep the origin of packages distinct, and keep my development environments clean, I use separate settings.xml files for groups of projects and invoke Maven like this:
mvn -s $PROJECT_ROOT/mvn_settings.xml compile
How can I configure sbt in a similar way? My workplace provides an internally hosted JFrog repository with the sbt and Ivy plugins enabled. I have searched with various keywords but couldn't find matching documentation.
I use IntelliJ IDEA CE with the Scala plugin, in case that is relevant.
Edit 1: I want to be able to control where my artifacts are stored, their origin and their association with individual projects.
Edit 2: Consider two settings.xml's
For my random project with minimal libs from maven central: https://pastebin.com/nLc1PGa3
My company's projects in one big bin: https://pastebin.com/R6a4jGQC Each group pulls from its own sources and keeps its artifacts in its own folder. I can also move projects around independently without worrying that some dependency link will break something unrelated.
First, grouping things by settings.xml in Maven is not the best way to go. It is better to use a repository manager, which can route requests to particular repositories and keep repositories separated by their specific purpose. (I have used a single, unchanged settings.xml for years; only the configuration in my repository manager handles this, which makes life easier, also on CI systems.)
Based on the sbt docs, you can configure proxy repositories in the ~/.sbt/repositories file like this:
[repositories]
local
my-ivy-proxy-releases: http://repo.company.com/ivy-releases/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
my-maven-proxy-releases: http://repo.company.com/maven-releases/
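For sbt to actually prefer these proxy entries over the resolvers declared in individual builds, the docs also describe passing an override flag to the launcher. A minimal invocation, assuming the repositories file above is in place (put the flag into SBT_OPTS if your launcher script does not forward -D options):
sbt -Dsbt.override.build.repos=true compile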
That said, sooner or later I would suggest following the same practice @khmarbaise describes in any of the projects you are building.
There seems to be little point in keeping separate folders per repository when the artifacts share the same group and artifact, and even when they differ, Maven/sbt already support building different projects against differently specified dependencies.
Similar to Maven, build settings for build.sbt are concatenated. They are appended in this order:
- Settings from Build.settings and Project.settings in your .scala files.
- Your user-global settings; for example, in ~/.sbt/build.sbt you can define settings affecting all your projects.
- Settings injected by plugins (see "using plugins", coming up next).
- Settings from .sbt files in the project.
- Build definition projects (i.e. projects inside project/) have settings from global plugins (~/.sbt/plugins) added; "Using plugins" explains this more.

Later settings override earlier ones. The entire list of settings forms the build definition.
So you can override this in your user-global build.sbt to point at a specific repository path, for example using
"Local Maven" at Path.userHome.asFile.toURI.toURL + ".m2/repository"
You should be able to use a Configuration object to do this.
For example in an .sbt file:
val MyConfig = Configurations.config("my-config").extend(Compile)
(resolvers in MyConfig) := Seq(???)
Then when you use the sbt shell you can call
my-config/compile
And it will use the settings you've declared only for that scope.
Then you can declare as many configurations as needed.
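For instance, a rough sketch of what that could look like in a build.sbt, assuming sbt 1.x (the configuration name and the repository URL are placeholders, not anything your repository actually requires):
// build.sbt (sketch)
lazy val MyConfig = config("myconfig").extend(Compile)

lazy val root = (project in file("."))
  .configs(MyConfig)
  .settings(
    inConfig(MyConfig)(Defaults.compileSettings),
    resolvers in MyConfig := Seq(
      "Company Artifactory" at "https://repo.example.com/artifactory/repo/"
    )
  )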
I think you are looking to publish artifacts to your company's internal JFrog repository.
The relevant JFrog documentation can be found here - https://www.jfrog.com/confluence/display/RTF/SBT+Repositories#SBTRepositories-DeployingArtifacts
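The gist of that page is to point publishTo at the target repository and supply credentials. A minimal sketch, where the URL, resolver name, and credentials file path are assumptions you would replace with your company's values:
// build.sbt (sketch)
publishTo := Some("Artifactory Realm" at "https://repo.example.com/artifactory/libs-release-local")
credentials += Credentials(Path.userHome / ".sbt" / ".credentials")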
I was looking at the sbt command-line help and found that I can set the path to the local Ivy repository with the -ivy option.
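Mirroring the mvn -s pattern from the question, a per-project invocation could then look something like this (the path is just an illustration):
sbt -ivy $PROJECT_ROOT/.ivy2 compile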
Related
We have some Maven modules shared between several teams, with the mandate to share the source code even though our projects use different dependencies and resources. To accomplish this, we have our modules set up as recommended in Using Maven When You Can't Use the Conventions under "Producing Multiple Unique JARs from a Single Source Directory." Specifically, we have a shared parent module containing the src directory but whose pom declares <packaging>pom</packaging> and only builds the two submodules. Each submodule inherits from this parent and refers to the shared src directory using this:
<build>
<sourceDirectory>../src/main/java</sourceDirectory>
</build>
The two submodules have different artifact ids, allowing dependent modules and projects to specify which version and dependency set they need. It also upholds the Maven principle of "one module, one output."
This all works great in Maven-land: compilation, installation, deployments, etc. What doesn't work well is Eclipse integration. Some things work fine: building the modules, deploying to our Maven repo, pulling in dependencies to build our project. But things such as code completion and jumping to class/method definitions do not work at all. It's as though Eclipse doesn't recognize the source at all.
If we just check out a module from SVN, Eclipse doesn't know about the classes but instead uses jars from the repo. If we then import the modules as Maven modules, they show up in package explorer and the project build path. However, all references to those classes and methods are now flagged as errors by Eclipse. And we still do not have code completion or navigation.
So my questions are these: How can we get Eclipse to recognize the code and do its normal code navigation while still satisfying our varying project requirements? Am I missing some simple Eclipse configuration? Do we need to rework our Maven module structure, and if so, how?
Some additional context: The different dependencies for the projects are rather large, including different major versions for things such as Weblogic and Spring. The Weblogic versions will converge some time next year, but the other dependencies will be slower (and some resource files will likely always remain distinct). So for the near- to mid-future, we have to account for different dependencies between the projects.
We are using profiles to allow our Jenkins server to build both submodules while allowing individual developers to build only the submodule their project needs. Using profiles to manage the dependencies is problematic because we lose transitivity of dependencies.
Update (12/8/15)
I was eventually able to make Eclipse recognize the source directory by using "Link Source..." in the "Configure Build Path..." dialog. Adding a source folder would not let me reference the module's parent directory, which derailed me for a while, but Link Source let me assign an arbitrary directory to use.
It's not ideal, but it seems to be working. We can now jump to definitions with F3, and errors are highlighted correctly. It's good enough that I don't feel bad recommending it to the other team. I wish Eclipse would automatically allow a parent source directory to be referenced, but at least the manual intervention works.
I have a setup with 13 different Eclipse projects (mostly Scala and Java). The projects all depend on each other in various ways. The project is starting to get big, so we want to move to a build tool, and I wanted to try sbt.
First question: Is there any way to export the build files from Eclipse? I have everything working in Eclipse, so it feels like an "export build.sbt" should be possible.
Second question: I have not found any easy way to declare the inter-project dependencies in an sbt file. Some sites say I should publish all projects to a local Maven repo and then depend on the published artifacts, but that requirement seems a little extreme.
I got my answers from a friendly person on the #sbt IRC channel.
For the first question: no, there seems to be nothing like that at the moment.
For the second question: I should create a multi-project build and define dependencies between projects that way (following the guide at http://www.scala-sbt.org/release/docs/Getting-Started/Multi-Project.html).
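For what it's worth, a minimal sketch of that guide's approach, with placeholder project names and directories, looks like this:
// build.sbt (sketch)
lazy val core = (project in file("core"))

lazy val util = (project in file("util"))
  .dependsOn(core)

lazy val app = (project in file("app"))
  .dependsOn(core, util)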
We want to use a company-internal Ivy/Maven repository (Artifactory) to speed up resolving and downloading jar files, and also to exchange binary jars between different teams in our organization.
I know we can force SBT to go through the proxy by setting up ~/.sbt/repositories with
[repositories]
local
my-ivy-proxy-releases: http://repo.alpinenow.com/artifactory/repo/, [organization]/[module]/(scala_[scalaVersion]/)(sbt_[sbtVersion]/)[revision]/[type]s/[artifact](-[classifier]).[ext]
my-maven-proxy-releases: http://repo.alpinenow.com/artifactory/repo/
and then launch SBT with -Dsbt.override.build.repos=true. This method works for me.
However, it's kind of cumbersome to ask all the developers to set this up. We're wondering if we can override the default resolvers completely in Build.scala and plugin.sbt without extra configuration.
So far, I've tried the following ways without success.
1) In both Build.scala and plugin.sbt, I added
resolvers := "Local Repo" at "http://repo.alpinenow.com/artifactory/repo/",
externalResolvers := Seq(Resolver.url("Local Repo", url("http://repo.alpinenow.com/artifactory/repo"))(Resolver.ivyStylePatterns)),
but it still downloads the jars from typesafe and maven1.
2) I then decided to put a repositories file into the project folder and tried to set the Java options directly inside plugin.sbt and Build.scala with
System.setProperty("-Dsbt.override.build.repos", "true"),
System.setProperty("-Dsbt.repository.config", "project/repositories"),
but it still doesn't work. I'm curious when SBT actually reads the Java options for resolvers, since it obviously happens before plugin.sbt and Build.scala are loaded.
Any idea?
Thanks.
DB Tsai
Project level
According to the documentation we should be using externalResolvers:
https://www.scala-sbt.org/release/docs/Library-Dependencies.html#Overriding+default+resolvers
externalResolvers := Seq(
"Local Repo" at "http://repo.alpinenow.com/artifactory/repo/",
// some more internal Nexus repositories
)
Plugin level
You'll also have to do this in your project folder for plugins, for example in project/resolvers.sbt.
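A rough sketch of such a file, reusing the repository URL from the question, might be:
// project/resolvers.sbt (sketch) - applies the same override to the build definition, i.e. plugin resolution
externalResolvers := Seq(
  "Local Repo" at "http://repo.alpinenow.com/artifactory/repo/"
)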
Global SBT level
And if you also want SBT itself to resolve from a specific repo, you'll need to do as described here:
https://www.scala-sbt.org/1.x/docs/Proxy-Repositories.html
If you start from the sbt-extras shell script as a replacement for the default launcher script, I guess you could easily modify it to set up the repositories file and add -Dsbt.override.build.repos=true. Then all you need to do is ensure your developers use that script.
I always add the SBT build to my repo in SVN/Git, right next to the code. Then I have no such problems.
It costs about 1 MB of space, so it is quite cheap and solves a lot of problems: all developers use an identical build tool. Even if I later set up Continuous Integration or a more advanced Continuous Delivery process, all the SBT configs are already in place in my SCM. I get one source of truth :)
I have moved to Maven recently, and while it works fine for artifacts that are kept up to date in some repository, it is not obvious what to do for non-Maven ones.
I have something very simple to achieve (in principle), but so far I have been unable to express it:
I need to compile my code with a jar that can be found here:
https://hudson.eclipse.org/hudson/view/WTP/job/cbi-wtp-wst.xsl.psychopath/ws/sourceediting/plugins/org.eclipse.wst.xml.xpath2.processor/target/
What do I have to put in my pom.xml to make Maven download the .jar plus the Java sources and the Javadoc, and possibly also the other dependencies (namely IBM ICU, Xerces, JavaCup) that are mentioned in the supplied MANIFEST?
I have read lots of documents, including those about a plugin called Tycho, but nothing helpful for this simple task.
Thanks for your help.
Maven only works well if all artifacts needed for a build are available in the local repository or a configured remote repository. So you have to do the following:
- Find out whether the Eclipse plugins are deployed in a Maven2-style repository, and what the URL of that repository is.
- Then find out which version of that plugin (artifact) you need.
- Maven lets you configure what is copied locally: the jar file, and the sources and API docs if you want them.
- Maven is then responsible for downloading all the artifacts needed by the plugin you want to use.
After looking at the contents of the URL you gave us (especially the file p2content.xml), it looks like there should be a repository. I searched for the maven repository for org.eclipse.wst.xml.xpath2 and found the URL http://maven.eclipse.org/nexus/content/repositories/testing/org/eclipse/wst/org.eclipse.wst.xml.xpath2/1.1.0/org.eclipse.wst.xml.xpath2-1.1.0.pom
So the repository you are searching for is located at http://maven.eclipse.org/nexus. Just open it and search, for example, for xpath2; Nexus, the repository software used there, will show you the available artifacts. Depending on what was deployed to that repository, it may contain only the library, or it may have sources and JavaDoc bundled with it. For the example above (xpath2), there seems to be only the POM itself and the library (the jar). If you take junit as an example, you will find all versions and variants, even with sources.jar and javadoc.jar.
After you have found the needed artifact, you can include it in the dependency section of your POM. And you have to add http://maven.eclipse.org/nexus as a remote repository in the configuration of your Maven installation.
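For illustration, the relevant pom.xml additions might look roughly like this; the repository id is made up and the coordinates are inferred from the URL above, so double-check them against what Nexus shows you:
<repositories>
  <repository>
    <id>eclipse-nexus</id>
    <url>http://maven.eclipse.org/nexus/content/repositories/testing/</url>
  </repository>
</repositories>
<dependencies>
  <dependency>
    <groupId>org.eclipse.wst</groupId>
    <artifactId>org.eclipse.wst.xml.xpath2</artifactId>
    <version>1.1.0</version>
  </dependency>
</dependencies>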
The question and its answer Get source JARs from Maven repository explain how to fetch sources and JavaDoc (if they are available).
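In practice this usually comes down to the maven-dependency-plugin, e.g.:
mvn dependency:sources
mvn dependency:resolve -Dclassifier=javadoc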
You need a Maven repository that contains these artifacts (I don't know whether Eclipse hosts a repository for its projects). You can also manually deploy the artifacts to a local repository on your computer.
When sharing a project with team members through version control, it is customary to include the .project in the source under version control. This makes sure that others on the team get all the dependencies and resources for the project. But the .project uses full/rooted paths to the resource, and not all members of a team will be working in the same environment. Even if all the members are on the same platform, the paths can often be in the user's home directory.
For the .classpath file, we can get around this problem by using build path variables. Each member defines the path to the location of the dependent libraries on their system, and the .classpath only refers to the variable, as in the snippet below.
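For example, a variable-based entry in .classpath could look like this, where TEAM_LIBS is a hypothetical classpath variable that each developer points at their own library location:
<classpathentry kind="var" path="TEAM_LIBS/commons-lang.jar"/>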
This is a particular concern for Grails projects: when we add a plugin, it updates the .project accordingly.
IMO resources themselves should not be part of the project at all. There is an excellent plugin called m2eclipse that simplifies such tasks using Maven; it will immensely simplify your dependency management. All you'd have to keep in your version control system, besides your source code, is the project configuration (pom.xml); all the dependencies are downloaded and cached automatically no matter what environment a developer works in. There are a lot more advantages to this approach - just read up on it :)
UPDATE: Just noticed the "grails" tag on your question. If you're using Groovy, Maven can be replaced with Gradle. STS is probably the best Eclipse build to use if you're coding in Groovy. The next version of STS will have Gradle support.
General Approach
As others have mentioned, you should not keep the IDE files in VCS, you should keep an IDE-agnostic description of the project in VCS and generate the IDE-specific project files from them.
Java-Maven Example
Keep the pom.xml file(s) in VCS and generate the Eclipse files by running mvn eclipse:eclipse
Grails Example
A Grails project is described by application.properties and grails-app/conf/BuildConfig.groovy. These files are present in every Grails application. You can generate the Eclipse project descriptions from them by running:
grails integrate-with --eclipse
This command also supports other tools, such as IntelliJ and TextMate.
I don't think it's standard practice to include the project file. I personally tell my VCS to ignore all IDE files and just use VCS for the source. At the root level I include a README telling others how to configure the project (e.g. jars go in lib).
The resource links feature you are referring to also supports path variables. These are defined under Preferences -> General -> Workspace -> Linked Resources.
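As an illustration (and only as far as I understand Eclipse's .project format), a linked folder going through such a path variable could look roughly like this, where SHARED_ROOT is a hypothetical path variable and type 2 means a folder:
<linkedResources>
  <link>
    <name>shared-src</name>
    <type>2</type>
    <locationURI>SHARED_ROOT/src/main/java</locationURI>
  </link>
</linkedResources>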
You could try keeping the project files in a shared Dropbox with an agreed upon path for each developer.