Subproject dependencies in SBT - scala

I am having a strange problem with SBT subprojects which I think is dependency related. Here's my setup:
I have an SBT project with two subprojects A and B.
A contains a class MyA and its companion object.
B depends on A.
B contains an object MyB which has a main method.
When I try to execute MyB from the SBT prompt, I get a NoSuchMethodError on MyA. It is not a ClassNotFoundException, so perhaps the MyA class is on the classpath but the MyA companion object is not.
As a sanity check, I dropped the B subproject and moved its source into the A source tree. When I run MyB from the SBT prompt, it works as expected.
Has anyone run into this, or am I doing something obviously wrong?
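For reference, a minimal sketch of the two sources involved (the method and field names here are just illustrative, not my real code):

// a/src/main/scala/MyA.scala
class MyA(val value: Int)

object MyA {
  def create(value: Int): MyA = new MyA(value)
}

// b/src/main/scala/MyB.scala
object MyB {
  def main(args: Array[String]): Unit =
    println(MyA.create(42).value) // this is the kind of call that fails with NoSuchMethodError
}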
Here is my project configuration:
class MyProject(info: ProjectInfo) extends ParentProject(info) {
  lazy val a = project("a", "a", new AProject(_))
  lazy val b = project("b", "b", new BProject(_), a)

  object Dependencies {
    lazy val scalaTest = "org.scalatest" % "scalatest_2.9.0" % "1.4.1" % "test"
  }

  class AProject(info: ProjectInfo) extends DefaultProject(info) with AutoCompilerPlugins {
    val scalaTest = Dependencies.scalaTest
    val continuationsPlugin = compilerPlugin("org.scala-lang.plugins" % "continuations" % "2.9.0")
    override def compileOptions = super.compileOptions ++ compileOptions("-P:continuations:enable") ++ compileOptions("-unchecked")
  }

  class BProject(info: ProjectInfo) extends DefaultProject(info)
}

It turns out the problem was that the continuations plugin was not enabled on project B. Here's my working configuration:
class MyProject(info: ProjectInfo) extends ParentProject(info) {
  lazy val a = project("a", "a", new AProject(_))
  lazy val b = project("b", "b", new BProject(_), a)

  object Dependencies {
    lazy val scalaTest = "org.scalatest" % "scalatest_2.9.0" % "1.4.1" % "test"
  }

  class AProject(info: ProjectInfo) extends DefaultProject(info) with AutoCompilerPlugins {
    val scalaTest = Dependencies.scalaTest
    val continuationsPlugin = compilerPlugin("org.scala-lang.plugins" % "continuations" % "2.9.0")
    override def compileOptions = super.compileOptions ++ compileOptions("-P:continuations:enable") ++ compileOptions("-unchecked")
  }

  class BProject(info: ProjectInfo) extends DefaultProject(info) with AutoCompilerPlugins {
    override def compileOptions = super.compileOptions ++ compileOptions("-P:continuations:enable") ++ compileOptions("-unchecked")
  }
}

Related

Unit test LazyLogging using Mockito

I have a class that extends the LazyLogging trait:
class TaskProcessor()
  extends Processor
  with LazyLogging {

  def a1() = {
    logger.info("Test logging")
  }
}
Now I want to test whether my logging works, so I followed the example in Unit test logger messages using specs2 + scalalogging and wrote my test as follows:
"TaskProcessor" should "test logging" in {
val mockLogger = mock[Logger]
val testable = new TaskProcessor {
override val logger: Logger = mockLogger
}
verify(mockLogger).info("Test logging")
}
I get the following error
Error:(32, 20) overriding lazy value logger in trait LazyLogging of type com.typesafe.scalalogging.Logger;
value logger must be declared lazy to override a concrete lazy value
override val logger: Logger = mockLogger
To resolve this, I changed the statement
override val logger: Logger = mockLogger
to
override lazy val logger: Logger = mockLogger
Now I get the following error:
Cannot mock/spy class com.typesafe.scalalogging.Logger
Mockito cannot mock/spy following:
- final classes
- anonymous classes
- primitive types
org.mockito.exceptions.base.MockitoException:
Cannot mock/spy class com.typesafe.scalalogging.Logger
Mockito cannot mock/spy following:
- final classes
- anonymous classes
- primitive types
at org.scalatest.mockito.MockitoSugar.mock(MockitoSugar.scala:73)
at org.scalatest.mockito.MockitoSugar.mock$(MockitoSugar.scala:72)
My dependencies are as follows:
"org.scalatest" %% "scalatest" % "3.0.5" % "test",
"org.mockito" % "mockito-all" % "1.10.19" % Test,
"com.typesafe.scala-logging" %% "scala-logging" % "3.9.2",
Can anyone please guide me as to how I can mock the logger and do the testing?
The problem is that the com.typesafe.scalalogging.Logger class cannot be mocked because it is final, but we can still mock the underlying org.slf4j.Logger:
import org.scalatest.mockito.MockitoSugar
import org.slf4j.{Logger => UnderlyingLogger}
import com.typesafe.scalalogging.Logger
import org.scalatest.{FlatSpec, Matchers}
import org.mockito.Mockito._

class TaskProcessorSpec extends FlatSpec with Matchers with MockitoSugar {

  "TaskProcessor" should "test logging" in {
    val mockLogger = mock[UnderlyingLogger]
    // scala-logging only delegates to the underlying logger when the level is enabled
    when(mockLogger.isInfoEnabled).thenReturn(true)

    val testable = new TaskProcessor {
      override lazy val logger = Logger(mockLogger)
    }
    testable.a1()

    verify(mockLogger).info("Test logging")
  }
}

SBT AutoPlugin dependencies

I've created an SBT plugin placed in the project folder. This plugin extends sbt.AutoPlugin and adds a custom task.
Something like this:
object MyCustomTask extends AutoPlugin {
  ...
  lazy val myCustomTask = Def.task {
    runner.value.run("my.support.project.classpath.Utility")
  }
}
and I have this build.sbt
lazy val support = (project in file("support"))
  .settings(libraryDependencies ++= Seq(
    "com.h2database" % "h2" % "1.4.197"
  ))

lazy val root = (project in file("root"))
  .settings(...)
  .dependsOn(support) // <- how can I remove this?
  .enablePlugins(MyCustomTask)
I don't want to make a dependency between the root project and the support project, because that way root inherits all the dependencies from support that it doesn't need (like the h2database dependency), but if I remove the dependsOn(support) the task defined in MyCustomTask can't find my.support.project.classpath.Utility.
How can I move that dependency into the MyCustomTask plugin definition?
Dependencies can be added to the plugin by overriding its projectSettings field, like the following:
object MyCustomTask extends AutoPlugin {
  ...
  lazy val myCustomTask = Def.task {
    runner.value.run("my.support.project.classpath.Utility")
  }

  override val projectSettings: Seq[Def.Setting[_]] = Seq(
    libraryDependencies += "com.h2database" % "h2" % "1.4.197"
  )
}
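Assuming everything the task needs ends up contributed through the plugin's projectSettings, the root descriptor would reduce to roughly this sketch (other settings elided):

lazy val root = (project in file("root"))
  .settings(...)
  .enablePlugins(MyCustomTask) // the plugin's projectSettings add the h2 dependency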

Play Framework 2.5 Test ApplicationLifecycle Guice Specs2 setup

I am trying to run some functional tests with play2-reactivemongo. I will try to be as concrete as possible, but if something is missing please let me know.
My dependencies are here
libraryDependencies ++= Seq(
  cache,
  "org.reactivemongo" %% "play2-reactivemongo" % "0.12.0",
  "com.mohiva" %% "play-silhouette" % "4.0.0",
  "com.mohiva" %% "play-silhouette-testkit" % "4.0.0" % "test",
  specs2 % Test
)
In MongoUserDao.scala
import play.modules.reactivemongo._
import play.modules.reactivemongo.json._
import reactivemongo.play.json.collection.JSONCollection

class MongoUserDao @Inject() (val reactiveMongoApi: ReactiveMongoApi) extends UserDao {
  val usersFuture = reactiveMongoApi.database.map(_.collection[JSONCollection]("users"))
  ...
}
In DaoSpecResources.scala
trait DaoSpecResources {
  val timeout = DurationInt(10).seconds

  val fakeApp = new GuiceApplicationBuilder()
    .in(Mode.Test)
    .configure(
      "play.modules.enabled" -> List("play.modules.reactivemongo.ReactiveMongoModule"),
      "mongodb.uri" -> "mongodb://localhost:27017/test"
    )
    .build

  val reactiveMongoApi = fakeApp.injector.instanceOf[ReactiveMongoApi]
  ...
}
When I try to run the test I get this error
[error] cannot create an instance for class daos.UserDaoSpec
[error] caused by com.google.inject.CreationException: Unable to create injector, see the following errors:
[error]
[error] 1) No implementation for play.api.inject.ApplicationLifecycle was bound.
[error] while locating play.api.inject.ApplicationLifecycle
[error] for parameter 1 at services.ApplicationTimer.<init>(ApplicationTimer.scala:24)
[error] at Module.configure(Module.scala:23) (via modules: com.google.inject.util.Modules$OverrideModule -> Module)
[error]
[error] 1 error
This is caused by app/services/ApplicationTimer.scala, which depends on ApplicationLifecycle, but you haven't bound any implementation to ApplicationLifecycle. ApplicationTimer is a demo included in every new Play project. You should probably remove it completely if you don't need it, otherwise at least disable it if running in test mode. See disabling modules and overriding modules.
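For example, a minimal sketch of disabling it when building a test application (this assumes the demo timer is bound in the Play seed's root Module class; adjust the name to whatever module binds ApplicationTimer in your app):

import play.api.Mode
import play.api.inject.guice.GuiceApplicationBuilder

val appWithoutTimer = new GuiceApplicationBuilder()
  .in(Mode.Test)
  .disable[Module] // drop the module that binds services.ApplicationTimer
  .build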
Solution
However, since DefaultReactiveMongoApi also depends on ApplicationLifecycle, you'll need to provide a binding to an implementation of ApplicationLifecycle. The easiest way:
import play.api.inject.{ ApplicationLifecycle, DefaultApplicationLifecycle }
import play.api.inject.bind

trait DaoSpecResources {
  val timeout = DurationInt(10).seconds

  val fakeApp = new GuiceApplicationBuilder()
    .in(Mode.Test)
    .configure(
      "play.modules.enabled" -> List("play.modules.reactivemongo.ReactiveMongoModule"),
      "mongodb.uri" -> "mongodb://localhost:27017/test"
    )
    .bindings(bind[ApplicationLifecycle].to[DefaultApplicationLifecycle])
    .build

  val reactiveMongoApi = fakeApp.injector.instanceOf[ReactiveMongoApi]
  val lifecycle = fakeApp.injector.instanceOf[DefaultApplicationLifecycle]

  def stopApp = lifecycle.stop()
}
(added 5 lines: imports, bindings, lifecycle and stopApp)
Then, in your test spec, add step(stopApp) at the end, like so:
class FooSpec extends PlaySpecification with DaoSpecResources {
  // Your examples...
  step(stopApp)
}
Alternative solution
Personally, I'd create a specialized trait extending Specification or PlaySpecification which would set up and tear down everything automatically, like in this example from specs2 documentation.
trait PlayWithMongoSpecification extends PlaySpecification {
  val timeout = DurationInt(10).seconds

  val fakeApp = new GuiceApplicationBuilder()
    .in(Mode.Test)
    .configure(
      "play.modules.enabled" -> List("play.modules.reactivemongo.ReactiveMongoModule"),
      "mongodb.uri" -> "mongodb://localhost:27017/test"
    )
    .bindings(bind[ApplicationLifecycle].to[DefaultApplicationLifecycle])
    .build

  val reactiveMongoApi = fakeApp.injector.instanceOf[ReactiveMongoApi]
  val lifecycle = fakeApp.injector.instanceOf[DefaultApplicationLifecycle]

  def stopApp = lifecycle.stop()

  override def map(fs: => Fragments) = fs ^ step(stopApp)
}

class FooSpec extends PlayWithMongoSpecification {
  // Your examples...
}
You may consider making reactiveMongoApi a lazy val.
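That is, in the trait above, something like:

lazy val reactiveMongoApi = fakeApp.injector.instanceOf[ReactiveMongoApi] // resolved only on first use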

Expect a specific instance for mock in Scalamock

Why can I not tell a mock that it should expect an instance of a class without explicitly giving the type? Here is what I mean:
val myClass = new MyClass(...)
val traitMock = mock[MyTrait]
(traitMock.mymethod _).expects(myClass).returning(12.3)
does not work, while
val myClass: MyClass = new MyClass(...)
val traitMock = mock[MyTrait]
(traitMock.mymethod _).expects(myClass).returning(12.3)
does work. How come the type cannot be inferred?
The testing part of my build.sbt is:
libraryDependencies ++= Seq(
  "org.scalatest" %% "scalatest" % "3.0.0" % "test"
    exclude("org.scala-lang", "scala-reflect")
    exclude("org.scala-lang.modules", "scala-xml")
)

libraryDependencies += "org.scalamock" %% "scalamock-scalatest-support" % "3.3.0" % "test"
Since I was asked for MyClass (it is SpacePoint here):
trait SpacePoint {
  val location: SpaceLocation
}

val sp = new SpacePoint {
  override val location: SpaceLocation = new SpaceLocation(DenseVector(1.0, 1.0))
}
So it actually works for me. Let me mention that type inference in the code
val myClass = new MyClass(...)
has nothing to do with ScalaMock but is guaranteed by Scala itself. Below is a working sample with library versions and the sources of the classes.
Testing libraries:
"org.scalatest" %% "scalatest" % "2.2.4" % "test",
"org.scalamock" %% "scalamock-scalatest-support" % "3.2.2" % "test"
Source code of classes:
class MyClass(val something: String)

trait MyTrait {
  def mymethod(smth: MyClass): Double
}
Source code of test:
import org.scalamock.scalatest.MockFactory
import org.scalatest.{Matchers, WordSpec}

class ScalamockTest extends WordSpec with MockFactory with Matchers {

  "ScalaMock" should {
    "infers type" in {
      val myClass = new MyClass("Hello")
      val traitMock = mock[MyTrait]
      (traitMock.mymethod _).expects(myClass).returning(12.3)
      traitMock.mymethod(myClass) shouldBe 12.3
    }
  }
}
Hope it helps. I'll be happy to update the answer once you provide more details.

How do you share a custom task in an SBT multi-project

I have a project set up as an SBT multi-build. It looks like this:
- project
    Dependencies.scala
- core
    build.sbt
- server
    build.sbt
build.sbt
I want to use Dependencies.scala as a container for version numbers of libraries that are shared between the sub-projects.
sealed trait Dependencies {
  val commonsIo = "2.4"
}

object DependencyVersions extends Dependencies
In the root build.sbt I added a Setting that is given to each sub-project.
lazy val dependencies = settingKey[Dependencies]("versions")

val defaultSettings = Defaults.coreDefaultSettings ++ Seq(
  dependencies := DependencyVersions)

def projectFolder(name: String, theSettings: Seq[Def.Setting[_]] = Nil) =
  Project(name, file(name), settings = theSettings)

lazy val core = projectFolder("core", defaultSettings)
I can't access the dependencies setting in core/build.sbt.
"commons-io" % "commons-io" % dependencies.value.commonsIo, <-- doesn't work
How can I get this to work?
You can define common settings (dependencies) in an object Common extends AutoPlugin (in project/Common.scala), and then use .enablePlugins(Common) on the sub-project descriptor (see it in Anorm).
Thanks @cchantep, got it working now using the AutoPlugin below:
import sbt._

sealed trait Dependencies {
  val commonsIo = "2.4"
}

object DependencyVersions extends Dependencies

object DependencyVersionsPlugin extends AutoPlugin {
  override def trigger = allRequirements

  object autoImport {
    lazy val dependencies = settingKey[Dependencies]("Bundles dependency versions")
  }
  import autoImport._

  override def projectSettings = Seq(
    dependencies := DependencyVersions
  )
}
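Because the plugin declares trigger = allRequirements (and no requires), it is enabled for every sub-project automatically, and the dependencies key from autoImport is in scope in each build.sbt. So core/build.sbt can now reference the shared versions directly, for example:

libraryDependencies += "commons-io" % "commons-io" % dependencies.value.commonsIo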