Is there anything standardized within the Scala library to support the disposable resource pattern?
I mean something similar to that supported by C# and .NET just to mention one.
For example does official Scala library provide something like this:
trait Disposable { def dispose() }
class Resource extends Disposable
using (new Resource) { r =>
}
Note: I'm aware of the article «Scala finally block closing/flushing resource», but it doesn't seem to be integrated into the standard library.
Starting Scala 2.13, the standard library provides a dedicated resource management utility: Using.
You will just have to provide an implicit definition of how to release the resource, using the Releasable trait:
import scala.util.Using
import scala.util.Using.Releasable
case class Resource(field: String)
implicit val releasable: Releasable[Resource] = resource => println(s"closing $resource")
Using(Resource("hello world")) { resource => resource.field.toInt }
// closing Resource(hello world)
// res0: scala.util.Try[Int] = Failure(java.lang.NumberFormatException: For input string: "hello world")
Note that you can place the implicit releasable in the companion object of Resource for clarity.
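For example (the same Resource and Releasable as above, just with the instance moved into the companion object):
import scala.util.Using.Releasable

case class Resource(field: String)

object Resource {
  implicit val releasable: Releasable[Resource] = resource => println(s"closing $resource")
}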
Note that you can also use Java's AutoCloseable instead of Using.Releasable, so any Java or Scala object implementing AutoCloseable (such as scala.io.Source or java.io.PrintWriter) can be used with Using directly:
import scala.util.Using
case class Resource(field: String) extends AutoCloseable {
  def close(): Unit = println(s"closing $this")
}
Using(Resource("hello world")) { resource => resource.field.toInt }
// closing Resource(hello world)
// res0: scala.util.Try[Int] = Failure(java.lang.NumberFormatException: For input string: "hello world")
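For instance, since scala.io.Source already implements AutoCloseable, it works with Using out of the box (the file name below is just an example):
import scala.util.Using
import scala.io.Source

// the Source is closed automatically once the block completes or throws
val lines = Using(Source.fromFile("data.txt")) { source =>
  source.getLines().toList
}
// lines: scala.util.Try[List[String]]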
At this time you will need to look to the Scala ARM library for a common implementation, though, as you mentioned, it is a separate library.
For more information:
This answer at functional try & catch w/ Scala links to the Loan Pattern on the Scala wiki, which has code samples. (I am not re-posting the link because it is subject to change.)
Using a variable in finally block has a few answers showing ways you could write your own.
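For reference, a hand-rolled loan pattern along the lines of those answers is only a few lines (a sketch, not the wiki's exact code):
// minimal loan pattern: the resource is always closed, whether or not the body throws
def using[R <: AutoCloseable, A](resource: R)(body: R => A): A =
  try body(resource)
  finally resource.close()

// usage: using(new java.io.FileInputStream("data.txt")) { in => in.read() }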
Related
No instance of play.api.libs.json.Format is available for akka.actor.typed.ActorRef[org.knoldus.eventSourcing.UserState.Confirmation] in the implicit scope (Hint: if declared in the same file, make sure it's declared before)
[error] implicit val userCommand: Format[AddUserCommand] = Json.format
I am getting this error even though I have defined an implicit Json Format instance for AddUserCommand.
Here is my code:
trait UserCommand extends CommandSerializable

object AddUserCommand {
  implicit val format: Format[AddUserCommand] = Json.format[AddUserCommand]
}

final case class AddUserCommand(user: User, reply: ActorRef[Confirmation]) extends UserCommand
Can anyone please help me with this error and how to solve it?
As Gael noted, you need to provide a Format for ActorRef[Confirmation]. The complication around this is that the natural serialization, using the ActorRefResolver requires that an ExtendedActorSystem be present, which means that the usual approaches to defining a Format in a companion object won't quite work.
Note that because of the way Lagom does dependency injection, this approach doesn't really work in Lagom: commands in Lagom basically can't use Play JSON.
import akka.actor.ExtendedActorSystem
import akka.actor.typed.{ActorRef, ActorRefResolver}
import akka.actor.typed.scaladsl.adapter.ClassicActorSystemOps
import play.api.libs.json._

class PlayJsonActorRefFormat(system: ExtendedActorSystem) {
  def reads[A] = new Reads[ActorRef[A]] {
    def reads(jsv: JsValue): JsResult[ActorRef[A]] =
      jsv match {
        case JsString(s) => JsSuccess(ActorRefResolver(system.toTyped).resolveActorRef(s))
        case _ => JsError(Seq(JsPath() -> Seq(JsonValidationError(Seq("ActorRefs are strings"))))) // hopefully parenthesized that right...
      }
  }

  def writes[A] = new Writes[ActorRef[A]] {
    def writes(a: ActorRef[A]): JsValue = JsString(ActorRefResolver(system.toTyped).toSerializationFormat(a))
  }

  def format[A] = Format[ActorRef[A]](reads, writes)
}
You can then define a format for AddUserCommand as
object AddUserCommand {
  def format(arf: PlayJsonActorRefFormat): Format[AddUserCommand] = {
    implicit def arfmt[A]: Format[ActorRef[A]] = arf.format
    Json.format[AddUserCommand]
  }
}
Since you're presumably using JSON to serialize the messages sent around a cluster (otherwise, the ActorRef shouldn't be leaking out like this), you would then construct an instance of the format in your Akka Serializer implementation.
(NB: I've only done this with Circe, not Play JSON, but the basic approach is common)
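For completeness, here is a rough sketch of what that serializer wiring might look like (the class name, identifier, and manifest handling are hypothetical, not taken from the question's code base):
import akka.actor.ExtendedActorSystem
import akka.serialization.SerializerWithStringManifest
import play.api.libs.json.{Format, Json}

// Hypothetical serializer: Akka passes the ExtendedActorSystem to serializers
// that declare it as a constructor parameter, which is exactly what
// PlayJsonActorRefFormat needs.
class AddUserCommandSerializer(system: ExtendedActorSystem) extends SerializerWithStringManifest {
  private val actorRefFormat = new PlayJsonActorRefFormat(system)
  private implicit val format: Format[AddUserCommand] = AddUserCommand.format(actorRefFormat)

  override def identifier: Int = 20001 // any unique, stable id
  override def manifest(o: AnyRef): String = o.getClass.getName

  override def toBinary(o: AnyRef): Array[Byte] = o match {
    case cmd: AddUserCommand => Json.toBytes(Json.toJson(cmd))
  }

  override def fromBinary(bytes: Array[Byte], manifest: String): AnyRef =
    Json.parse(bytes).as[AddUserCommand]
}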
The error says that it cannot construct a Format for AddUserCommand because there's no Format for ActorRef[Confirmation].
When using Json.format[X], all the members of the case class X must have a Format defined.
In your case, you probably don't want to define a formatter for this case class (serializing an ActorRef doesn't make much sense) but rather build another case class with data only.
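For example, a hypothetical data-only variant (field names invented here, and this assumes a Format[User] is available) derives its Format with no extra machinery:
import play.api.libs.json.{Format, Json}

// no ActorRef member, so the macro-derived Format works directly
final case class AddUserData(user: User)

object AddUserData {
  implicit val format: Format[AddUserData] = Json.format[AddUserData]
}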
Edit: See Levi's answer on how to provide a formatter for ActorRef if you really want to send out there the actor reference.
I am trying ZIO.
I do not understand why Live is added as a trait and then an object is provided, like:
object Live extends Live
This pattern is found in different places, for example zio.console.Console.
Is there a reason, or are there cases where this makes sense?
What you see in ZIO is the usage of a pattern called Selfless Trait.
To implement the selfless trait pattern you simply provide a companion object for a trait that itself mixes in the trait.
trait Greeting {
  def greet() { println("hi there") }
}

object Greeting extends Greeting
Then the user of the library has the choice to either mix in Greeting:
object MixinExample extends Application with Greeting {
  greet()
}
or to import the members of the Greeting companion object, like this:
import Greeting._
object ImportExample extends Application {
  greet()
}
Just as an addition to Krzysztof Atłasik's answer.
As mentioned in jq170727's comment, you can find these two cases here:
introduce-a-database-module
Object:
In the worst case, if we are pressed for time and need to ship code today, maybe we choose to provide the production database wherever we call inviteFriends.
inviteFriends(userId).provide(DatabaseLive)
In this case, instead of using the DefaultRuntime that ships with ZIO, we can define our own Runtime, which provides the production database module:
val myRuntime = Runtime(DatabaseLive, PlatformLive)
Trait:
When you need to mix multiple modules into one Runtime:
val myRuntime =
  Runtime(
    new DatabaseLive
      with SocialLive
      with EmailLive,
    PlatformLive
  )
I have a 3rd party library with package foo.bar
I normally use it as:
import foo.bar.{Baz => MyBaz}
object MyObject {
  val x = MyBaz.getX // some method defined in Baz
}
The new version of the library has renamed the package from foo.bar to newfoo.newbar. I now have another version of my code with a slight change, as follows:
import newfoo.newbar.{Baz => MyBaz}
object MyObject {
  val x = MyBaz.getX // some method defined in Baz
}
Notice that only the first import is different.
Is there any way I can keep the same version of my code and still switch between different versions of the 3rd party library as and when needed?
I need something like conditional imports, or an alternative way.
The other answer is on the right track but doesn't really get you all the way there. The most common way to do this kind of thing in Scala is to provide a base compatibility trait that has different implementations for each version. In my little abstracted library, for example, I have the following MacrosCompat for Scala 2.10:
package io.travisbrown.abstracted.internal

import scala.reflect.ClassTag

private[abstracted] trait MacrosCompat {
  type Context = scala.reflect.macros.Context

  def resultType(c: Context)(tpe: c.Type)(implicit
    tag: ClassTag[c.universe.MethodType]
  ): c.Type = {
    import c.universe.MethodType

    tpe match {
      case MethodType(_, res) => resultType(c)(res)
      case other => other
    }
  }
}
And this one for 2.11:
package io.travisbrown.abstracted.internal

import scala.reflect.ClassTag

private[abstracted] trait MacrosCompat {
  type Context = scala.reflect.macros.whitebox.Context

  def resultType(c: Context)(tpe: c.Type): c.Type = tpe.finalResultType
}
And then my classes, traits, and objects that use the macro reflection API can just extend MacrosCompat and they'll get the appropriate Context and an implementation of resultType for the version we're currently building (this is necessary because of changes to the macros API between 2.10 and 2.11).
(This isn't originally my idea or pattern, but I'm not sure who to attribute it to. Probably Eugene Burmako?)
If you're using SBT, there's special support for version-specific source trees—you can have a src/main/scala for your shared code and e.g. src/main/scala-2.10 and src/main/scala-2.11 directories for version-specific code, and SBT will take care of the rest.
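Applied to the question's Baz situation, the same trick might look roughly like this (the names are hypothetical, and this assumes Baz is an object); each version-specific source directory contains its own copy of the compatibility trait, and the shared code only ever refers to MyBaz:
// e.g. the copy in the source tree built against the old library version;
// the copy in the other tree would alias newfoo.newbar.Baz instead
package mylib

private[mylib] trait BazCompat {
  val MyBaz = foo.bar.Baz
}

// shared code, identical across versions:
// object MyObject extends BazCompat {
//   val x = MyBaz.getX
// }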
You can try to use type aliases:
package myfoo

object mybar {
  type MyBaz = newfoo.newbar.Baz
  // val MyBaz = newfoo.newbar.Baz // if Baz is a case class/object, then it needs to be aliased twice - as a type and as a value
}
And then you may simply import myfoo.mybar._ and replace the object mybar to switch to a different version of the library.
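Assuming Baz is an object (so the value alias from the comment above is uncommented as well), the calling code stays exactly as before:
import myfoo.mybar._

object MyObject {
  val x = MyBaz.getX // some method defined in Baz
}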
I'm learning Spray and Akka, and I'm learning them through Typesafe's templates; this one is complex, to say the least:
http://typesafe.com/activator/template/akka-spray-websocket
I now understand that the weird structure this template has is there to separate routing logic from business logic, and it's done amazingly well. However, although I know the purpose of this structure, I don't know what the function of this small piece is or why it's necessary:
They have a file called MainActors.scala:
trait MainActors {
  this: AbstractSystem =>

  lazy val find = system.actorOf(Props[FindActor], "find")
  lazy val hide = system.actorOf(Props[HideActor], "hide")
}
Then the template concatenates all the routes in a file called ReactiveApi.scala:
trait AbstractSystem {
  implicit def system: ActorSystem
}

trait ReactiveApi extends RouteConcatenation with StaticRoute with AbstractSystem {
  this: MainActors =>

  val rootService = system.actorOf(Props(classOf[RootService], routes))

  lazy val routes = logRequest(showReq _) {
    new FindService(find).route ~
    new HideService(hide).route ~
    staticRoute
  }

  private def showReq(req: HttpRequest) = LogEntry(req.uri, InfoLevel)
}
Actually, my question is simple: what is the purpose of the AbstractSystem trait? How is it used, and why?
This trait is also passed into the actual actor:
class FindService(find: ActorRef)(implicit system: ActorSystem) extends Directives {
  lazy val route = ...
}
Also, if it is not too inconvenient, what is the functionality of logRequest() and showReq()?
For Spray: why do I have to pass an actor (ActorRef) into FindService? I don't see any specific methods being invoked on it inside.
This is a very simple example of using abstract defs in order to do the cake pattern (very simplified though). The goal is to say "hey, I need this thing", and the implementor must then provide the actor system to you – by implementing the def system. The point, of course, is to make this def available to MainActors.
As for the self type reference, you can refer to ktoso/types-of-types#self-type-annotation to find out more about it.
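A minimal sketch of how the template's entry point could satisfy that requirement (the object and system names here are illustrative, not the template's exact code):
import akka.actor.ActorSystem

// the concrete entry point implements the abstract `system` required by AbstractSystem,
// so MainActors and ReactiveApi can both use it
object Main extends App with MainActors with ReactiveApi {
  implicit lazy val system: ActorSystem = ActorSystem("my-system")
}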
I have a number of use cases for this, all around the idea of interop between existing Java libraries and new Scala code. The use case I've selected is the easiest, I think.
Use Case:
I'm working on providing a JUnit Runner for some Scala tests (so that I can get my lovely red/green bar in Eclipse).
The runner needs to have a constructor with a Java class as a parameter. So in Scala I can do the following:
class MyRunner(val clazz: Class[Any]) extends Runner {
  def getDescription(): Description
  def run(notifier: RunNotifier)
}
When I use either
@RunWith(classOf[MyRunner])
object MyTestObject
or
@RunWith(classOf[MyRunner])
class MyTestClass
then the runner is indeed instantiated correctly and is passed a suitable class object.
Unfortunately, what I want to do now is to "get hold of" the object MyTestObject, or create a MyTestClass, both of which are Scala entities. I would prefer to use Scala reflection, but I also want to use the standard JUnit jar.
What I have done
The following Stack Overflow questions were educational, but not quite the same problem. They were the nearest questions I could find:
How to create a TypeTag manually?
Any way to obtain a Java class from a Scala (2.10) type tag or symbol?
Using Scala reflection with Java reflection
The discussion on Environments, Universes and Mirrors in http://docs.scala-lang.org/overviews/reflection/environment-universes-mirrors.html was good, and the similar documents on other parts of Scala reflection also helped. Mostly, though, it is about Scala reflection itself.
I browsed the Scaladocs, but my knowledge of Scala reflection wasn't enough (yet) to let me get what I wanted out of them.
Edit:
As asked, here is the code of the class that is being created by reflection:
@RunWith(classOf[MyRunner])
object Hello2 extends App {
  println("starting")
  val x = "xxx"
}
So the interesting thing is that the solution proposed below, using the field called MODULE$, doesn't print anything, and the value of x is null.
This solution works fine if you want to use plain old Java reflection. Not sure if you can use Scala reflection given that all you will have is a Class[_] to work with:
object ReflectTest {

  def main(args: Array[String]): Unit = {
    val fooObj = instantiate(MyTestObject.getClass())
    println(fooObj.foo)

    val fooClass = instantiate(classOf[MyTestClass])
    println(fooClass.foo)
  }

  def instantiate(clazz: Class[_]): Foo = {
    // A Scala `object` is compiled to a class with a static MODULE$ field
    // holding the singleton instance; a plain class has no such field.
    val declaredFields = clazz.getDeclaredFields().toList
    val obj = declaredFields.find(field => field.getName() == "MODULE$") match {
      case Some(modField) => modField.get(clazz)
      case None => clazz.newInstance()
    }
    obj.asInstanceOf[Foo]
  }
}

trait Foo {
  def foo: String
}

object MyTestObject extends Foo {
  def foo = "bar"
}

class MyTestClass extends Foo {
  def foo = "baz"
}