read application.conf from build.sbt - scala

I found a lot of similar questions, but some of them are unanswered and others don't work for me.
My task is to read application.conf from build.sbt, to keep all configuration in a single place. But my build.sbt, with the code
import com.typesafe.config._
val conf = com.typesafe.config.ConfigFactory.parseFile(new File("conf/application.conf")).resolve()
version := conf.getString("app.version")
doesn't compile, failing with the error
error: object typesafe is not a member of package com
How can I fix this problem?

To make this work, it is important to know that sbt builds are recursive: when you run sbt, you actually start a project whose root folder is the project folder within your root project.
Given that, to fix your problem you need to add typesafe-config as a library dependency for your sbt build as well. To do that, add a build.sbt in the project folder of your root project and declare the dependency there as you normally would (i.e. libraryDependencies += ...), as shown below.
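For example, a minimal sketch (the version number is illustrative):
// project/build.sbt -- dependencies of the build definition itself
libraryDependencies += "com.typesafe" % "config" % "1.3.0"
With that in place, the ConfigFactory.parseFile code in the root build.sbt compiles, because the build definition itself now has typesafe-config on its classpath.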

Use this in your code
import com.typesafe.config.{Config, ConfigFactory}
private val config = ConfigFactory.load()
and in your build.sbt file
libraryDependencies += "com.typesafe" % "config" % "1.3.0"

Related

How to use sbt-assembly in sbt-plugin?

I am writing an sbt plugin to abstract away some boilerplate.
Let's call it sbt-redux.
Then there is one more plugin, sbt-assembly.
In this quest, my plugin (sbt-redux) needs to know where a project that uses sbt-redux will create its uber jar via sbt-assembly, and what the name of the jar will be.
I tried adding sbt-assembly to the plugins of sbt-redux, but for obvious reasons that does not make it available in my src folder; it is only in scope for build.sbt.
I tried using .dependsOn(assembly), but still no luck.
So, how can I use other plugins from src?
P.S. Please let me know if the question is not clear.
I found a solution and it is working for me.
If you want to read assembly's settings, you have to make sure your plugins depend on it. In your build.sbt file, you can add:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.7").
And then in your AutoPlugin implementation, you should override requires this way:
override def requires = super.requires && sbtassembly.AssemblyPlugin
After that, you'll have access to assembly settings and tasks.
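To illustrate, here is a minimal sketch of what the plugin could look like; the reduxPrintJar key is hypothetical, while assembly and assemblyOutputPath come from sbt-assembly:
import sbt._
import sbt.Keys._
import sbtassembly.AssemblyPlugin
import sbtassembly.AssemblyPlugin.autoImport._

object SbtRedux extends AutoPlugin {
  // Depending on AssemblyPlugin makes its keys available and loads it first
  override def requires = super.requires && AssemblyPlugin

  object autoImport {
    val reduxPrintJar = taskKey[Unit]("Prints where sbt-assembly will write the uber jar")
  }
  import autoImport._

  override def projectSettings: Seq[Setting[_]] = Seq(
    reduxPrintJar := {
      // sbt-assembly's output location for the uber jar
      val jar = (assemblyOutputPath in assembly).value
      streams.value.log.info(s"Uber jar: $jar")
    }
  )
}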
Thanks to gpoirier.
To expand on the correct answer from @arglee, when you have a multi-module project, the addSbtPlugin line needs to be part of the .settings entries for the module that needs the dependency, like:
lazy val mod = (project in file("mod"))
  .settings(
    ...,
    addSbtPlugin( ... ),
    libraryDependencies ++= Seq( ... ),
    ...
  )

scala create and use local library

I am trying to create a local library which contains a class
myproject.scala:
object test {
def info(message: String): Unit = println(s"INFO: $message")
}
build.sbt:
name := "MyProject"
version := "0.1"
organization := "MyCorp"
scalaVersion := "2.11.0"
sbtVersion := "0.13"
I ran sbt clean compile publishLocal and I see the jar in my local ivy2 directory. What I'm unsure about is how to now use that library in another project.
I added libraryDependencies += "MyCorp" % "myproject_2.11" % "0.1" to the second project's build.sbt, and I see it on the classpath when I print it out in the REPL. The problem is that when I try
import MyCorp.myproject
I get a "not found" error. I'm sure I'm missing something simple, but it's driving me nuts.
I ran sbt clean compile and I see the jar in my local ivy2 directory.
That's weird. sbt clean compile does not publish the artifact to the local repository. (Have you copied it there manually?) That should be done with the publishLocal command, after which the artifact should become available at {path_to_.ivy2}/local/MyCorp/MyProject/0.1/jars/MyProject.jar.
Now in your second project, it can be added as
libraryDependencies += "MyCorp" % "MyProject" % "0.1"
// or in libraryDependencies ++= Seq(...)
Please notice that the _2.11 suffix you used in the name depends on how the first project was built, i.e. whether its build was cross-built per Scala version. If it was, the suffix will usually be present in the artifact's .jar file name. It is preferable to avoid hard-coding the suffix in the library dependency declaration; rather, use %%, which appends it automatically.
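For example, assuming the first project was built with Scala 2.11, these two declarations resolve to the same artifact:
libraryDependencies += "MyCorp" % "myproject_2.11" % "0.1" // suffix written out by hand
libraryDependencies += "MyCorp" %% "myproject" % "0.1"     // preferred: sbt appends _2.11 itself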
After checking that, also try restarting the sbt CLI (or running reload), because unfortunately changes in build.sbt are sometimes not picked up on the fly.
Update
I assume it's mycorp.myproject.test, but I tried every possible combination. @Brian
Following the comments, I think there must still be something misconfigured in the project and/or missing from the description.
Assuming there is a file {path/to/project}/src/main/scala/mycorp/myproject/Test.scala, with the following contents:
package mycorp.myproject
object Test {
def info(message: String): Unit = println(s"INFO: $message")
}
When the artifact is published, the .jar file should contain the folders mycorp/myproject with Test.class and Test$.class files.
After adding the .jar to the dependencies of the second project, importing Test into another class should look like:
package mycorp.myproject2
import mycorp.myproject.Test
object AnotherTest extends App {
Test.info("hello")
}
I hope this helps.
End-of-update

sbt assembly, including my jar

I want to build a 'fat' jar of my code. I mostly understand how to do this, but all the examples I have found assume the jar is not local, and I am not sure how to include in my assembled jar another jar that I built myself and that the Scala code uses. In what folder does the jar I want to include have to reside?
Normally when I run my current code as a test using spark-shell it looks like this:
spark-shell --jars magellan_2.11-1.0.6-SNAPSHOT.jar -i st_magellan_abby2.scala
(the jar file is right in the same path as the .scala file)
So now I want to write a build.sbt file that does the same thing and includes that SNAPSHOT.jar file:
name := "PSGApp"
version := "1.0"
scalaVersion := "2.11.8"
resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
// "provided" means don't include it, it's already there (on the cluster?)
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided"
  // add magellan here somehow?
)
So where would I put the jar in the sbt project folder structure so it gets picked up when I run sbt assembly? Is that the main/resources folder, which the reference manual says is where 'files to include in the main jar' go?
What would I put in the libraryDependencies here so it knows to add that specific jar and not go out into the internet to get it?
One last thing: I was also doing some imports in my test code that don't seem to fly now that I have put this code in an object with a def main attached to it.
I had things like import sqlContext.implicits._ placed right in the code above where they were about to be used, like so:
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._
import org.apache.spark.sql.functions.udf
val distance = udf { (a: Point, b: Point) =>
  a.withinCircle(b, .001f) // current radius set to .0001
}
I am not sure: can I just keep these imports inside def main, or do I have to move them somewhere else? (Still learning Scala and wrangling the scoping, I guess.)
One way is to build your fat jar using the assembly plugin (https://github.com/sbt/sbt-assembly) locally and then publishLocal to store the resulting jar in your local ivy2 cache.
This makes it available for inclusion in your other project, based on the build.sbt settings in that project, e.g.:
name := "My Project"
organization := "org.me"
version := "0.1-SNAPSHOT"
It will then be locally available as "org.me" %% "my-project" % "0.1-SNAPSHOT".
sbt searches the local cache before trying to download from an external repo.
However, this is considered bad practice, because only the final project should ever be a fat jar; you should never include one as a dependency (many headaches).
There is no reason to make the magellan project a fat jar if the library is included in PSGApp. Just publishLocal it, without assembly.
Another way is to make the projects depend on each other as code, not as a library:
lazy val projMagellan = RootProject("../magellan")
lazy val projPSGApp = project.in(file(".")).dependsOn(projMagellan)
This makes compilation in projPSGApp trigger compilation in projMagellan.
It depends on your use case, though.
Just don't get into a situation where you have to manage your .jar files manually.
The other question:
import sqlContext.implicits._ should always be in the scope where DataFrame operations are needed, so you shouldn't move that import up with the other ones in the header.
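A sketch of what that looks like in practice (the surrounding main is illustrative):
def main(args: Array[String]): Unit = {
  val conf = new org.apache.spark.SparkConf().setAppName("PSGApp")
  val sc = new org.apache.spark.SparkContext(conf)
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  // The implicits belong to this particular sqlContext instance,
  // so the import has to live in a scope where the instance exists.
  import sqlContext.implicits._
  // ... DataFrame code that relies on the implicits goes here
}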
Update
Based on the discussion in the comments, my advice would be:
Get the magellan repo
git clone git@github.com:harsha2010/magellan.git
Create a branch to work on, e.g.
git checkout -b new-stuff
Change the code you want
Then update the version number, e.g.
version := "1.0.7-SNAPSHOT"
Publish locally
sbt publishLocal
You'll see something like (after a while):
[info] published ivy to /Users/tomlous/.ivy2/local/harsha2010/magellan_2.11/1.0.7-SNAPSHOT/ivys/ivy.xml
Go to your other project
Change build.sbt to include
"harsha2010" %% "magellan" % "1.0.7-SNAPSHOT" in your libraryDependencies
Now you have a good (temp) reference to your library.
Your PSGApp should be built as a fat jar assembly to pass to Spark:
sbt clean assembly
This will pull in the custom-built jar.
If the change in the magellan project is useful for the rest of the world, you should push your changes and create a pull request, so that in the future you can just include the latest build of this library.

sbt: set the base-directory of a remote RootProject

Disclaimer: I am new to sbt and Scala so I might be missing obvious things.
My objective here is to use the Scala compiler as a library from my main project. I was initially doing that by manually placing the Scala jars in a libs directory in my project and then including that dir in my classpath; note that at the time I wasn't using sbt. Now I want to use sbt, and also download the Scala sources from GitHub, build the Scala jars, and then build my project. I start by creating two directories, myProject and myProject/project. I then create the following four files:
The sbt version file:
// File 1: project/build.properties
sbt.version=0.13.17
The plugins file (not relevant to this question):
// File 2: project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-buildinfo" % "0.7.0")
The build.sbt file:
// File 3: build.sbt
lazy val root = (project in file(".")).
settings(
inThisBuild(List(
organization := "me",
scalaVersion := "2.11.12",
version := "0.1.0-SNAPSHOT"
)),
name := "a name"
).dependsOn(ScalaDep)
lazy val ScalaDep = RootProject(uri("https://github.com/scala/scala.git"))
My source file:
// File 4: Test.scala
import scala.tools.nsc.MainClass
object Test extends App {
println("Hello World !")
}
If I run sbt inside myProject, then sbt will download the Scala sources from GitHub and try to compile them. The problem is that the base directory is still myProject. This means that if the Scala sbt source files refer to something in the Scala base directory, they won't find it. For example, the scala/project/VersionUtil.scala file tries to open the scala/versions.properties file that lies in the Scala base directory.
Question: how can I get sbt to download a GitHub repo and then build it using that project's base directory instead of mine (by which I mean the base directory of myProject in the above example)?
Hope that makes sense.
I would really appreciate any feedback on this.
Thanks in advance!
In the Scala ecosystem you usually depend on binary artifacts (libraries) that are published in Maven or Ivy repositories. Virtually all Scala projects publish binaries, including the compiler. So all you have to do is add the line below to your project settings:
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
dependsOn is used for dependencies between sub-projects in the same build.
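For contrast, a minimal sketch of what dependsOn is meant for, i.e. two sub-projects in one build (the names are illustrative):
lazy val core = project.in(file("core"))
lazy val app  = project.in(file("app")).dependsOn(core) // app can use core's classes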
For browsing sources you could use an IDE. IntelliJ IDEA can readily import sbt projects and download/attach sources for library dependencies. Eclipse has an sbt plugin that does the same; Ensime too, etc. Or just git clone the repository.

Setting a module project in Scala as an sbt project?

I'd like to know how to convert a regular Scala project into an sbt project. I've tried manually creating an sbt file in the root directory, correctly implemented, but IntelliJ still doesn't recognize this as an sbt project, i.e. it won't show me the "SBT" option under "View -> Tool Windows".
How should I go about this? What I'm actually attempting to do is create an empty project with multiple (independent) modules.
From what I've gathered, there seems to be no way to add a module directly with sbt support. Am I right?
Thanks
Here is an example of a multi-project build. The root project "aggregates" them all in case you want to compile them all together or package them all together, etc. The "coreLibrary" project depends on the code of "coreA" and "coreB".
import sbt.Keys._
import sbt._
name := "MultiProject"
lazy val root = project.in(file(".")).aggregate(coreA, coreB, coreLibrary)
lazy val coreA = Project("CoreA", file("core-a")).settings(
organization := "org.me",
version := "0.1-SNAPSHOT"
)
lazy val coreB = Project("CoreB", file("core-b")).settings(
organization := "org.me",
libraryDependencies += "org.apache.kafka" %% "kafka" % "0.8.2-beta",
version := "0.3-SNAPSHOT"
)
lazy val coreLibrary = Project("UberCore", file("core-main")).dependsOn(coreA, coreB).settings(
organization := "org.me",
version := "0.2-SNAPSHOT"
)
You can (for example) compile each project from the command line:
>sbt CoreB/compile
Or you can do this interactively:
>sbt
>project CoreB
>compile
I recommend using a single multi-module sbt project. sbt is a great build tool for Scala; you can do a lot of things with it, including checking out one module from the repository and building it:
sbt
projects
project <helloProject>
Actually, this feature allows multiple people to work on the same project in parallel. Please take a look at this: http://www.scala-sbt.org/0.13.5/docs/Getting-Started/Multi-Project.html.