could not resolve dependency for scalabuff-runtime in sbt - scala

I am trying to use a generated Scala class in an sbt-managed project (using the ScalaBuff tool to generate the Scala class from a proto file: https://github.com/SandroGrzicic/ScalaBuff). I now try to add the dependency to the sbt config:
addSbtPlugin("net.sandrogrzicic" %% "scalabuff-runtime" % "1.3.6")
But sbt reports that the dependency could not be resolved.
Has anyone run into a similar issue before?

Just looking at the readme, scalabuff-runtime isn't the plugin; it's the runtime dependency. It looks like you actually need:
addSbtPlugin("com.github.sbt" % "sbt-scalabuff" % "0.2")
libraryDependencies += "net.sandrogrzicic" %% "scalabuff-runtime" % "1.3.6"

Related

Scala library available in both compile and test configuration

I have a library that I wish to expose in both the unit tests in Scala and the code itself.
In sbt, I added my library dependency with configuration "test", and then it's available for tests but I cannot use it in the main code. If I leave the configuration off or add "compile", it's not available to be imported in the unit tests.
libraryDependencies ++= Seq(
  "org.scalacheck" %% "scalacheck" % "1.14.0",
  "org.scalatest" %% "scalatest" % "3.0.6" % "test",
  "org.scalactic" %% "scalactic" % "3.0.6" % "test")
The main problem is that I expose an abstract class that I want to use all over the place in other code: abstract class UnitSpec extends FlatSpec with Matchers with ScalaCheckDrivenPropertyChecks, which is also used in the tests of the library. If I add "test" to the ScalaCheck dependency, it cannot be found in the main code of the library. If I leave it as is, I cannot import org.scalatestplus.scalacheck.ScalaCheckDrivenPropertyChecks. This used to work fine with 3.0.5 and GeneratorDrivenPropertyChecks, but that has been deprecated.
Is there a way to achieve what I want? I tried "test->compile" but that also doesn't do what I had hoped...
You can combine configurations. In order to have a library in both compile and test, you just add both configurations.
// wrong: libraryDependencies += "<organization>" %% "<module>" % "<version>" % "compile->compile" % "test->compile"
The syntax roughly means: a configuration of your project depends on (->) a configuration of the library dependency.
Update
You can also add the dependency twice with different configurations.
libraryDependencies += "<organization>" %% "<module>" % "<version>",
libraryDependencies += "<organization>" %% "<module>" % "<version>" % "test"
Update 2
I think the syntax in the first example is not what I meant to provide.
libraryDependencies += "<organization>" %% "<module>" % "<version>" % "compile->compile;test->compile"
At least that is what I use in my libraryDependencies.
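For the dependencies in the question, a sketch using the asker's ScalaCheck coordinates would look like this:
libraryDependencies += "org.scalacheck" %% "scalacheck" % "1.14.0" % "compile->compile;test->compile"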
So you need a trait from the ScalaTest JAR in non-test code. I am not sure why it worked before, but it would make sense to me to simply remove % "test" from the scalatest dependency. That will make it available in compile, and everything from compile is available in test too.
And for Scalactic, I think the main use case for it as a separate dependency is when you need it in compile but only use ScalaTest in test (or don't use it at all). If both are needed only for tests (or both for compile), ScalaTest will bring Scalactic in with it.
I tried "test->compile" but that also doesn't do what I had hoped...
"test->compile" is the same as "test":
A configuration without a mapping (no "->") is mapped to "default" or "compile". The -> is only needed when mapping to a different configuration than those.
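Putting that together, a minimal sketch of the fixed dependency list from the question (the Scalactic line can be dropped, since ScalaTest pulls it in transitively):
libraryDependencies ++= Seq(
  "org.scalacheck" %% "scalacheck" % "1.14.0",
  "org.scalatest" %% "scalatest" % "3.0.6") // no "test" scope, so UnitSpec compiles in main sources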

Unable to find the correct SBT dependency

Today is my first day with Finch.
I am unable to find the right set of sbt dependencies for Finch and Finagle.
I have tried all the dependencies shown in Image 2.
You are using Scala 2.12 but your dependencies are for Scala 2.11.
This is the correct way to write what you need:
libraryDependencies += "com.github.finagle" %% "finch-core" % "0.13.0"

Scala SBT elasticsearch-hadoop unresolved dependency

When adding the dependency libraryDependencies += "org.elasticsearch" % "elasticsearch-hadoop" % "5.1.1" and refreshing the project, I get many unresolved dependencies (cascading, org.pentaho, ...).
However, if I add another dependency, like libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0", it works and I can use the library in my Scala files.
So, is the problem coming from elasticsearch-hadoop? I'm using sbt 0.13.13, but I also tried with 0.13.8.
I took the dependency from https://mvnrepository.com/artifact/org.elasticsearch/elasticsearch-hadoop/5.1.1. I know that for some dependencies you need to add the repository as well (resolvers += ...), but this one doesn't seem to need a repo.
Add the following in your build.sbt file:
resolvers += "conjars.org" at "http://conjars.org/repo"
You can also update your .sbt file:
name:="HelloSparkApp"
version:="1.0"
scalaVersion:="2.10.4"
libraryDependencies+="org.apache.spark"%%"spark-core"%"1.5.2"
And execute the below commands from the project directory
sbt clean
sbt package
sbt eclipse
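Note that sbt eclipse is not a built-in command; it assumes the sbteclipse plugin is installed, e.g. in project/plugins.sbt (the version shown here is just an example):
// project/plugins.sbt -- provides the "eclipse" task used above
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.2.4")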

Not found object error when importing external library in intellij

Here is my sbt file myproject/build.sbt
version := "1.0"
scalaVersion := "2.12.1"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.4.16",
"io.circe" %% "circe-core" % "0.6.1",
"io.circe" %% "circe-generic" % "0.6.1",
"io.circe" %% "circe-parser" % "0.6.1"
)
Here is my scala file myproject/src/test.scala
package mytest

import akka._

object test {
  def main(args: Array[String]) {
    print(2)
  }
}
I verified that my external libraries contain akka, but IntelliJ keeps saying:
Error:(7, 8) not found: object akka
import akka._
I am using IntelliJ Community Edition 2016.3 with the latest Scala plugin (which should include the latest sbt support).
Can someone give me a hint on how to resolve this?
To fix the problem, you have to place your Scala source file into the src/main/scala directory. Otherwise IntelliJ/sbt can't recognize it as a file related to the project, so it can't associate the project dependencies with it.
By default Scala source files can be placed either in the root directory of your project, or in src/main/scala (for main sources, there is also src/test/scala for tests).
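For the project in the question, the expected layout would be:
myproject/
  build.sbt
  src/
    main/
      scala/
        test.scala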
If you want to use some other directories to store your Scala source files, you can configure it this way in your build.sbt:
sourceDirectories in Compile += baseDirectory.value / "src"
I had a similar problem, and it had nothing to do with the directory structure in my case. IntelliJ asks you to refresh when you add a new dependency in build.sbt. I also manually refreshed it from the sbt shell and still got the same error.
In the end I closed the project and re-opened it, and it was fixed.

How to add Java dependencies to a Scala project's sbt file

I have a Spark Streaming Scala project which uses the Apache NiFi receiver. The project runs fine under Eclipse/Scala IDE, and now I want to package it for deployment.
When I add it as
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
sbt assumes it's a Scala library and tries to resolve it with a Scala version suffix.
How do I add the NiFi receiver and all its dependencies to the project's sbt file?
Also, is it possible to point dependencies to local directories instead of having sbt resolve them?
Thanks in advance.
Here is my sbt file contents:
name := "NiFi Spark Test"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
libraryDependencies += "org.apache.nifi" %% "nifi-spark-receiver" % "0.3.0"
A double % adds the Scala version as a suffix to the Maven artifact name. This is required because different Scala compiler versions produce incompatible bytecode. If you would like to use a Java library from Maven, you should use a single % character:
libraryDependencies += "org.apache.nifi" % "nifi-spark-receiver" % "0.3.0"
I also found that I can put libraries the project depends on into the lib folder and they will be picked up during assembly.
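That works because sbt treats lib/ as the default unmanaged dependency directory. To point at a different local directory, a sketch (the directory name here is just an example):
// jars in this folder are put on the classpath without resolution
unmanagedBase := baseDirectory.value / "custom_lib"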