How to integrate subcut dependent modules? - scala

Based on the available documentation this task seems straightforward, yet I've been banging my head against the wall for a couple of days and still can't make a simple inter-module dependency work...
Here's a reduced example:
trait Bla {
  def m: String
}

class BlaImpl(implicit val bindingModule: BindingModule) extends Bla with Injectable {
  val s = inject[String]('bla)
  def m = "got " + s
}

object Program extends App with Injectable {
  implicit val bindingModule =
    new NewBindingModule({ implicit module ⇒ module.bind[Bla] toSingle { new BlaImpl } }) ~
    new NewBindingModule(_.bind[String] idBy 'bla toSingle "bla!")

  val bla = inject[Bla]
  assert(bla.m == "got bla!")
}
Running this code fails with the following error when it tries to build the BlaImpl instance:
org.scala_tools.subcut.inject.BindingException: No binding for key BindingKey(java.lang.String,Some(bla))
Debugging shows that the binding module handed to BlaImpl's constructor doesn't contain the 'bla String in its bindings, and that Program.bindingModule.bindings has all bindings (including the needed String).
I've seen another similar question, but it refers only to composition, not to dependencies crossing module borders.
What am I doing wrong?

Unfortunately it will not work. Even if you merge two modules together with subcut, this does not mean that they will see each other's dependencies.
If you want to achieve the desired result, you need to pass the BindingModule explicitly to the BlaImpl constructor. Something like this:
val anotherModule = new NewBindingModule(_.bind[String] idBy 'bla toSingle "bla!")

implicit val bindingModule =
  new NewBindingModule({ implicit module => module.bind[Bla] toSingle { new BlaImpl()(anotherModule) } })
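For completeness, here is a sketch (untested, same subcut API as above) of how that explicit pass could look when applied to the whole Program, while still merging the modules so the top-level inject[Bla] works:
object Program extends App with Injectable {
  // module holding the String binding; passed explicitly to BlaImpl below
  val stringModule = new NewBindingModule(_.bind[String] idBy 'bla toSingle "bla!")

  implicit val bindingModule =
    new NewBindingModule({ implicit module =>
      module.bind[Bla] toSingle { new BlaImpl()(stringModule) }
    }) ~ stringModule

  val bla = inject[Bla]
  assert(bla.m == "got bla!")
}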
A module merge will not do the trick in this case. I actually discussed this problem with subcut in this answer. I even created example code that compares subcut and Scaldi and demonstrates how they both solve this problem (or don't solve it, in subcut's case):
https://gist.github.com/OlegIlyenko/5623423
If I were to rewrite your example with Scaldi, it would look something like this:
import scaldi.{Injector, Injectable, Module, DynamicModule}

trait Bla {
  def m: String
}

class BlaImpl(implicit inj: Injector) extends Bla with Injectable {
  val s = inject[String]('bla)
  def m = "got " + s
}

object Program extends App with Injectable {
  implicit val appModule =
    new Module { bind [Bla] to new BlaImpl } ::
    DynamicModule(_.binding identifiedBy 'bla to "bla!")

  val bla = inject [Bla]
  assert(bla.m == "got bla!")
}
As you can see, in Scaldi modules can see each other's dependencies if they are composed with :: or ++.
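To make the cross-module resolution explicit, here is a sketch (same Scaldi imports as above, untested) with the two bindings in separate, named modules:
val blaModule    = new Module { bind [Bla] to new BlaImpl }
val configModule = DynamicModule(_.binding identifiedBy 'bla to "bla!")

// BlaImpl still finds the 'bla String, even though it is bound in the other module
implicit val appModule = blaModule :: configModule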

Related

How to match a Case Class containing a Parameter with Generic Type

I have an interesting problem matching a case class in Scala.
I am using Akka, and I have functionality that I will use in every actor in my system, so I created a base class for my actors and I try to match the command there.
My command looks like the following:
sealed trait ReportCommand extends ProcessCommand

final case class onReport(key: Key, replyTo: ActorRef[ResponseBase[State]]) extends ReportCommand
Since the base class is constructed so that it can be used from different actors, onReport is delivered to the base actor as a generic parameter, to be used in a pattern match via a TypeCase:
abstract class BaseActor[E: ClassTag, R <: ReportBase[STATE], COMMAND](signal: TypeCase[R]) {
  private val report = signal

  def base[B <: E: ClassTag](cmd: E, state: STATE)(f: B => ReplyEffect[COMMAND, STATE]): ReplyEffect[COMMAND, STATE] =
    cmd match {
      case report(report) =>
        Effect.reply(report.replyTo)(new ResponseBase[STATE] {
          override def state: STATE = state
        })
    }
}
First, in case you think this construct will not work: it does. I have another command (which I didn't include here) that does not have a generic parameter in the command class, and the snippet above is able to match it.
Now, when I first tried this code, Shapeless complained that there is no Typeable mapping for ActorRef for the TypeCase, so after researching on the internet I found I had to do the following:
implicit def mapActorRef[T: ClassTag]: Typeable[ActorRef[T]] =
  new Typeable[ActorRef[T]] {
    private val typT = Typeable[T]

    override def cast(t: Any): Option[ActorRef[T]] = {
      if (t == null) None
      else if (t.isInstanceOf[ActorRef[_]]) {
        val o = t.asInstanceOf[ActorRef[_]]
        for {
          _ <- typT.cast(myClassOf)
        } yield o.asInstanceOf[ActorRef[T]]
      } else None
    }
  }

def myClassOf[T: ClassTag] = implicitly[ClassTag[T]].runtimeClass
implicit def responseBaseIsTypeable[S: Typeable]: Typeable[ResponseBase[S]] =
  new Typeable[ResponseBase[S]] {
    private val typS = Typeable[S]

    override def cast(t: Any): Option[ResponseBase[S]] = {
      if (t == null) None
      else if (t.isInstanceOf[ResponseBase[_]]) {
        val o = t.asInstanceOf[ResponseBase[_]]
        for {
          _ <- typS.cast(o.state)
        } yield o.asInstanceOf[ResponseBase[S]]
      } else None
    }
  }
After these changes I don't receive any exceptions from Shapeless, but case report(report) does not match, and I have no idea how to get Scala to explain why it decides not to match. During my debugging session I observed the following.
I am using Akka's ask pattern to communicate with this actor:
val future: Future[BaseActor.ResponseBase[Actor.State]] =
  actorRef.ask[BaseActor.ResponseBase[Actor.State]](ref => Actor.onReport(key, ref))
Now, if I inspect the cmd object that BaseActor receives, I see that Akka's ask pattern changes the ActorRef in the onReport command to an ActorRefAdapter. Of course ActorRefAdapter is a subclass of ActorRef, but I am not sure that the implicit I defined for mapping ActorRef to Typeable/TypeCase can deal with that, and I can't figure out a way to make the implicit aware of the subtypes.
Unfortunately ActorRefAdapter is private to the package akka.actor.typed.internal.adapter, so I can't define an extra mapping for ActorRefAdapter.
So, can anybody see why Scala is not matching with my Shapeless TypeCase configuration and give me some tips?
Thanks for any answers.
Your instance Typeable[ActorRef[T]] is incorrect.
Why did you decide to pass myClassOf into typT.cast? That can't be meaningful.
I guess you used something like "No default Typeable for parametrized type" using Shapeless 2.1.0-RC2
If your goal is to make case report(replyTo) match, then you can define
implicit def mapActorRef[T: Typeable]: Typeable[ActorRef[T]] =
  new Typeable[ActorRef[T]] {
    private val typT = Typeable[T]

    override def cast(t: Any): Option[ActorRef[T]] = {
      if (t == null) None
      else util.Try(t.asInstanceOf[ActorRef[T]]).toOption
    }

    override def describe: String = s"ActorRef[${typT.describe}]"
  }
The problem is that this instance is also bad: now case report(replyTo) matches too much.
val actorTestKit = ActorTestKit()
val replyToRef = actorTestKit.spawn(ReplyToActor(), "replyTo")

import BaseActor._ // importing implicits
import shapeless.syntax.typeable._

val future: Future[BaseActor.ResponseBase[Actor.State]] =
  replyToRef.cast[ActorRef[Int]].get.ask[BaseActor.ResponseBase[Actor.State]](ref => 1)(5.seconds, system.scheduler)

Await.result(future, 10.seconds) // ClassCastException
A lawful instance of the type class Typeable cannot be defined for every type.
Providing instances for (concrete instantiations of) polymorphic types (where well defined) is pretty much the whole point of Typeable, both here and in Haskell.
The key phrase in the above is "where well defined". It's well defined in the case of non-empty container-like things. It's clearly not well defined for function values.
https://github.com/milessabin/shapeless/issues/69
ResponseBase is a non-empty container-like thing. But ActorRef is like a function T => Unit, so there shouldn't be a Typeable for it:
trait ActorRef[-T] extends ... {
  def tell(msg: T): Unit
  ...
}
You should reconsider your approach.

Checking if a Scala List is of class List [duplicate]

I'm trying to incorporate ScalaTest into my Java project, replacing all JUnit tests with ScalaTest. At one point, I want to check whether Guice's Injector injects the correct types. In Java, I have a test like this:
public class InjectorBehaviour {
    @Test
    public void shouldInjectCorrectTypes() {
        Injector injector = Guice.createInjector(new ModuleImpl());
        House house = injector.getInstance(House.class);
        assertTrue(house.door() instanceof WoodenDoor);
        assertTrue(house.window() instanceof BambooWindow);
        assertTrue(house.roof() instanceof SlateRoof);
    }
}
But I have a problem doing the same with ScalaTest:
class InjectorSpec extends Spec {
  describe("An injector") {
    it("should inject the correct types") {
      val injector = Guice.createInjector(new ModuleImpl)
      val house = injector.getInstance(classOf[House])
      assert(house.door instanceof WoodenDoor)
      assert(house.window instanceof BambooWindow)
      assert(house.roof instanceof SlateRoof)
    }
  }
}
It complains that the value instanceof is not a member of Door/Window/Roof. Can't I use instanceof that way in Scala?
Scala is not Java. Scala simply does not have an instanceof operator; instead it has a parametric method called isInstanceOf[Type].
You might also enjoy watching a ScalaTest Crash Course.
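For illustration, the question's assertions rewritten with isInstanceOf (same types as above):
assert(house.door.isInstanceOf[WoodenDoor])
assert(house.window.isInstanceOf[BambooWindow])
assert(house.roof.isInstanceOf[SlateRoof])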
With ScalaTest 2.2.x (maybe even earlier) you can use:
anInstance mustBe a[SomeClass]
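Applied to the question's example (this assumes the spec mixes in ScalaTest's must-style matchers):
house.door mustBe a [WoodenDoor]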
If you want to be less JUnit-esque and if you want to use ScalaTest's matchers, you can write your own property matcher that matches for type (bar type erasure).
I found this thread to be quite useful: http://groups.google.com/group/scalatest-users/browse_thread/thread/52b75133a5c70786/1440504527566dea?#1440504527566dea
You can then write assertions like:
house.door should be (anInstanceOf[WoodenDoor])
instead of
assert(house.door instanceof WoodenDoor)
The current answers about isInstanceOf[Type] and the JUnit advice are good, but I want to add one thing (for people who got to this page in a non-JUnit-related capacity). In many cases Scala pattern matching will suit your needs. I would recommend it in those cases because it gives you the typecast for free and leaves less room for error.
Example:
val foo: OuterType = blah
foo match {
  case subFoo: SubType =>
    subFoo.thingSubTypeDoes // no need to cast, use the match variable
  case subFoo =>
    // fallthrough code
}
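For example, a minimal sketch applying this to the question's door check (fail is ScalaTest's standard assertion failure):
house.door match {
  case _: WoodenDoor => () // the door has the expected type
  case other         => fail("expected a WoodenDoor, got " + other)
}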
Consolidating Guillaume's ScalaTest discussion reference (and another discussion linked to by James Moore) into two methods, updated for ScalaTest 2.x and Scala 2.10 (to use ClassTag rather than manifest):
import org.scalatest.matchers._
import scala.reflect._

def ofType[T: ClassTag] = BeMatcher { obj: Any =>
  val cls = classTag[T].runtimeClass
  MatchResult(
    obj.getClass == cls,
    obj.toString + " was not an instance of " + cls.toString,
    obj.toString + " was an instance of " + cls.toString
  )
}

def anInstanceOf[T: ClassTag] = BeMatcher { obj: Any =>
  val cls = classTag[T].runtimeClass
  MatchResult(
    cls.isAssignableFrom(obj.getClass),
    obj.getClass.toString + " was not assignable from " + cls.toString,
    obj.getClass.toString + " was assignable from " + cls.toString
  )
}
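With these two matchers in scope, usage might look like this (a sketch reusing the question's types; ofType checks the exact runtime class, anInstanceOf also accepts subclasses):
house.door should be (anInstanceOf[WoodenDoor]) // accepts WoodenDoor or any subclass
house.door should be (ofType[WoodenDoor])       // requires the exact runtime class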
I use Scala 2.11.8 to do the assertion with collections. The newer syntax is as follows:
val scores: Map[String, Int] = Map("Alice" -> 10, "Bob" -> 3, "Cindy" -> 8)
scores shouldBe a[Map[_, _]]

Pattern matching ParseResult in unit test

I'm stepping through my first Scala project, and looking at parser combinators in particular. I'm having trouble getting a simple unit test scenario to work, and trying to understand what I'm missing.
I'm stuck on pattern matching a ParseResult into the case classes Success, Failure and Error. I can't get Scala to resolve the case classes. There are a few examples of this around, but they all seem to use them inside something that extends one of the parser classes. For example, the tests on github are inside the same package, and the example here is inside a class extending a parser.
The test I'm trying to write looks like:
package test.parsertests

import parser.InputParser // my sut
import scala.util.parsing.combinator._
import org.scalatest.FunSuite

class SetSuite extends FunSuite {

  val sut = new InputParser()

  test("Parsing a valid command") {
    val result = sut.applyParser(sut.commandParser, "SOME VALID INPUT")
    result match {
      case Success(x, _)   => println("Result: " + x.toString) // <-- not found: value Success
      case Failure(msg, _) => println("Failure: " + msg)       // similar
      case Error(msg, _)   => println("Error: " + msg)         // similar
    }
  }
}
And the method I'm calling is designed to let me exercise each of my parsers on my SUT:
package parser

import scala.util.parsing.combinator._
import scala.util.parsing.combinator.syntactical._

class InputParser extends StandardTokenParsers {

  def commandParser: Parser[Command] =
    ("Command " ~> coord ~ coord ~ direction) ^^ { case x ~ y ~ d => new Command(x, y, d) }

  def applyParser[T](p: Parser[T], c: String): ParseResult[T] = {
    val tokens = new lexical.Scanner(c)
    phrase(p)(tokens)
  }
}
The fundamental issue is getting the case classes resolved in my test scope. Based on the source for the Parsers class, how can I get them defined? Can I resolve this with some additional import statements, or are they only accessible via inheritance? I've tried all the combinations that should resolve this issue, but I'm obviously missing something here.
Right-hand column FTW! I stumbled on an answer in this related question. The problem was that the case classes are nested inside Parsers, so they have to be referenced through the parser instance.
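One way to apply that to the test above (a minimal sketch): reference the path-dependent Success/Failure/Error through the parser instance, e.g.
result match {
  case sut.Success(x, _)   => println("Result: " + x.toString)
  case sut.Failure(msg, _) => println("Failure: " + msg)
  case sut.Error(msg, _)   => println("Error: " + msg)
}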

Objects being "turned" to null randomly

Hey guys,
I'm working on a project in Scala and I've encountered a very weird problem. This is part of the code:
class Entity(val id: String) {
  override def toString = id
}

class RequirementType
case class DependsEither(t: List[(Entity, Int)]) extends RequirementType
case class Depends(t: (Entity, Int)) extends RequirementType

class BuildableEntity(override val id: String,
                      val mineralCost: Int,
                      val gasCost: Int,
                      val buildTime: Int,
                      val buildCount: Int,
                      val supplyCount: Int,
                      val req: List[RequirementType],
                      val onBuildStart: GameState => GameState,
                      val onBuildFinish: GameState => GameState
                     ) extends Entity(id)

class SimpleBuilding(id: String,
                     mineralCost: Int,
                     gasCost: Int,
                     buildTime: Int,
                     req: List[RequirementType]
                    ) extends BuildableEntity(id, mineralCost, gasCost, buildTime, 1, 0, req ::: List(ConsumesOnStart((Drone, 1))), { s => s }, { x => x })

object SpawningPool extends SimpleBuilding("spawningPool", 200, 0, 65, List(DependsEither(List((Hatchery, 1), (Lair, 1), (Hive, 1)))))
object Lair extends SimpleBuilding("lair", 150, 100, 80, List(ConsumesOnFinish(Hatchery, 1), Depends(SpawningPool, 1)))
object InfestationPit extends SimpleBuilding("infestationPit", 100, 100, 50, List(DependsEither(List((Lair, 1), (Hive, 1)))))
Now, when I call println(Lair.req), it sometimes prints as
List(ConsumesOnFinish((hatchery,1)), Depends((null,2)), ConsumesOnStart((drone,1)))
and sometimes as
List(ConsumesOnFinish((hatchery,1)), Depends((spawningPool,2)), ConsumesOnStart((drone,1)))
Please, if anyone has any idea what could be going wrong, I would love you forever. I have no clue why it acts the way it does. I have more extensions of SimpleBuilding, but they seem to be working properly.
EDIT:
I should also mention that the outcome changes between runs after compilation. I mean that when I run the unit tests it sometimes appears as null and sometimes as the proper instance.
This is indeed a case of circular dependency and initialization. Here is a shorter version of your issue:
class X(val x: List[X])

object A extends X(List(B))
object B extends X(List(A))

object Main {
  def main(args: Array[String]) {
    println("A.x: " + A.x)
    println("B.x: " + B.x)
  }
}
This will print this:
$ scala -cp classes Main
A.x: List(B$#143c8b3)
B.x: List(null)
You can use by-name parameters to allow object construction to finish before the value is used:
class X(x0: => List[X]) {
  lazy val x = x0
}

object A extends X(List(B))
object B extends X(List(A))
The fix works on the small test case:
$ scala -cp classes Main
A.x: List(B$#1feca64)
B.x: List(A$#6d084b)
Based on this you may want to change req:List[RequirementType] to req0: => List[RequirementType] and add a lazy val req = req0.
If that works for you, we should retitle the question to mention object initialization and circular dependencies. Note this is very similar to this question/answer.
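For illustration, a hedged sketch of that change applied to BuildableEntity (other constructor parameters elided; SimpleBuilding would need the same by-name treatment for its req parameter so the list isn't forced too early):
class BuildableEntity(override val id: String,
                      /* ...other fields as before... */
                      req0: => List[RequirementType]) extends Entity(id) {
  // forced only on first access, after the referenced objects have been initialized
  lazy val req: List[RequirementType] = req0
}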
Lair uses SpawningPool in its constructor and vice versa, but at construction time the other object doesn't exist yet.
You've got recursive definitions in constructors, and although I believe that is supported, it looks like something's going wrong. Can you try lazy vals instead and see if the problem goes away? That is,
object X extends C("this",that,1) { /* code */ }
becomes
lazy val X = new C("this",that,1) { /* code */ }

Syntactic sugar for compile-time object creation in Scala

Let's say I have
trait fooTrait[T] {
  def fooFn(x: T, y: T): T
}
I want to enable users to quickly declare new instances of fooTrait with their own defined bodies for fooFn. Ideally, I'd want something like
val myFoo : fooTrait[T] = newFoo((x:T, y:T) => x+y)
to work. However, I can't just do
def newFoo[T](f: (T, T) => T) = new fooTrait[T] { def fooFn(x: T, y: T): T = f(x, y) }
because this uses closures, and so results in different objects when the program is run multiple times. What I really need is to be able to get the classOf of the object returned by newFoo and then have that be constructable on a different machine. What do I do?
If you're interested in the use case, I'm trying to write a Scala wrapper for Hadoop that allows you to execute
IO("Data") --> ((x: Int, y: Int) => (x, x+y)) --> IO("Out")
The thing in the middle needs to be turned into a class that implements a particular interface and can then be instantiated on different machines (executing the same jar file) from just the class name.
Note that Scala does the right thing with the syntactic sugar that converts (x:Int) => x+5 to an instance of Function1. My question is whether I can replicate this without hacking the Scala internals. If this was lisp (as I'm used to), this would be a trivial compile-time macro ... :sniff:
Here's a version that matches the syntax you list in the question and serializes/executes the anonymous function. Note that this serializes the state of the Function2 object so that the serialized version can be restored on another machine. The class name alone is insufficient, as illustrated below the solution.
You should write your own encode/decode functions, if only to include your own Base64 implementation (so as not to rely on Sun's internal classes).
object SHadoopImports {
  import java.io._

  implicit def functionToFooString[T](f: (T, T) => T) = {
    val baos = new ByteArrayOutputStream()
    val oo = new ObjectOutputStream(baos)
    oo.writeObject(f)
    new sun.misc.BASE64Encoder().encode(baos.toByteArray())
  }

  implicit def stringToFun(s: String) = {
    val decoder = new sun.misc.BASE64Decoder()
    val bais = new ByteArrayInputStream(decoder.decodeBuffer(s))
    val oi = new ObjectInputStream(bais)
    val f = oi.readObject()
    new {
      def fun[T](x: T, y: T): T = f.asInstanceOf[Function2[T, T, T]](x, y)
    }
  }
}
// I don't really know what this is supposed to do,
// just supporting the given syntax
case class IO(src: String) {
  import SHadoopImports._

  def -->(s: String) = new {
    def -->(to: IO) = {
      val IO(snk) = to
      println("From: " + src)
      println("Applying (4,5): " + s.fun(4, 5))
      println("To: " + snk)
    }
  }
}
object App extends Application {
  import SHadoopImports._

  IO("MySource") --> ((x: Int, y: Int) => x + y) --> IO("MySink")
  println
  IO("Here") --> ((x: Int, y: Int) => x * y + y) --> IO("There")
}
/*
From: MySource
Applying (4,5): 9
To: MySink
From: Here
Applying (4,5): 25
To: There
*/
To convince yourself that the class name is insufficient for using the function on another machine, consider the code below, which creates 100 different functions. Count the classes on the filesystem and compare.
object App extends Application {
  import SHadoopImports._

  for (i <- 1 to 100) {
    IO(i + ": source") --> ((x: Int, y: Int) => (x * i) + y) --> IO("sink")
  }
}
Quick suggestion: why don't you try to create an implicit def transforming a FunctionN object into the trait expected by the --> method?
I do hope you won't have to use any macro for this!
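For what it's worth, a minimal sketch of that suggestion for the question's fooTrait (note it still captures a closure, so it doesn't by itself solve the instantiate-by-class-name requirement raised in the question):
implicit def function2ToFooTrait[T](f: (T, T) => T): fooTrait[T] =
  new fooTrait[T] {
    def fooFn(x: T, y: T): T = f(x, y)
  }

// with the implicit in scope, a plain function literal can be passed
// wherever a fooTrait[Int] is expected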