Scala self-aware trait

I've made a Logging trait which encapsulates the details of a logging implementation. It's also nice and lazy, so it is efficient, especially when a particular log level is not active.
/**
 * A SLF4J based logging trait
 */
trait Log {
  import org.slf4j.Logger
  import org.slf4j.LoggerFactory

  val loggedClazz: Class[_]

  lazy val logger: Logger = LoggerFactory.getLogger(loggedClazz.getClass)

  def logDebug(codeblock: => String) = {
    if (logger.isDebugEnabled) {
      logger.debug(codeblock)
    }
  }

  def logError(codeblock: => String) = {
    if (logger.isErrorEnabled) {
      logger.error(codeblock)
    }
  }

  def logInfo(codeblock: => String) = {
    if (logger.isInfoEnabled) {
      logger.info(codeblock)
    }
  }

  def logWarn(codeblock: => String) = {
    if (logger.isWarnEnabled) {
      logger.warn(codeblock)
    }
  }
}
However, it requires the class into which this trait is mixed in to implement the following:
object MyServer extends Log {
  val loggedClazz = MyServer.getClass
}
My question is: is it possible to somehow enable the trait to know which class it has been mixed into, removing the need for this?
val loggedClazz = MyServer.getClass
SOLUTION: Following the provided feedback, I rewrote the class in the following manner.
/**
 * A SLF4J based logging trait
 */
trait Log {
  import org.slf4j.Logger
  import org.slf4j.LoggerFactory

  lazy val logger: Logger = LoggerFactory.getLogger(getClass)

  def logDebug(codeblock: => String) = {
    if (logger.isDebugEnabled) {
      logger.debug(codeblock)
    }
  }

  def logError(codeblock: => String) = {
    if (logger.isErrorEnabled) {
      logger.error(codeblock)
    }
  }

  def logInfo(codeblock: => String) = {
    if (logger.isInfoEnabled) {
      logger.info(codeblock)
    }
  }

  def logWarn(codeblock: => String) = {
    if (logger.isWarnEnabled) {
      logger.warn(codeblock)
    }
  }
}
Totally simple. When you do it right, first time ;)
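For reference, a minimal usage sketch (the start and expensiveDump names are just illustrative): mixing in Log now picks up the concrete class automatically, because getClass is evaluated on the object the trait is mixed into.
object MyServer extends Log {
  def start(): Unit = {
    logInfo("Starting up")        // the logger is named after MyServer's runtime class
    logDebug(expensiveDump())     // by-name argument: only evaluated if debug logging is enabled
  }
  private def expensiveDump(): String = "..."
}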

You could replace val loggedClazz: Class[_] with val loggedClazz = getClass.
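For instance, the trait could then look like this (a sketch of that suggestion; note the logger is built from loggedClazz directly, not from loggedClazz.getClass):
trait Log {
  import org.slf4j.{Logger, LoggerFactory}

  val loggedClazz: Class[_] = getClass                         // runtime class of whatever mixes this in
  lazy val logger: Logger = LoggerFactory.getLogger(loggedClazz)

  // logDebug / logError / logInfo / logWarn as before
}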

Your current code won't work as expected: the returned logger will always be for java.lang.Class, because you're calling getClass on a Class[_] object (which yields a Class[Class[_]]).
Use this instead:
lazy val logger: Logger = LoggerFactory.getLogger(getClass)
You may also want to have a look at SLF4S, a thin wrapper around SLF4J, which is very similar to what you're doing.

Related

scala: How to obtain class name through complex polymorphism with compile time macros?

When attempting to get the name of a class via a WeakTypeTag reference in a macro implementation, I can't seem to get the proper info when multiple layers of polymorphism are involved.
For example, if I have the following setup:
object MacroSupport {
  def get_name_impl[A: c.WeakTypeTag](c: blackbox.Context): c.Expr[String] = {
    val nameOfA: String = weakTypeOf[A].toString
    ...
  }

  def getName[A] = macro get_name_impl[A]
}

abstract class GenericInterface[T] {
  def getName: String = MacroSupport.getName[T]
}

case class ContainerA(
  someValue: String
)

class FunctionalClass extends GenericInterface[ContainerA] {
  val containerName: String = getName
}
What I hope to achieve is having any number of FunctionalClass-style classes, each with its own container class, which can report the name of that container; this is used for some meta configuration. Basically MacroSupport and GenericInterface will exist in a library I'm writing, while the FunctionalClass and container levels will be written by others using the library.
The issue I'm having is that, due to the pass-through type parameter in GenericInterface, FunctionalClass.containerName == "t", and attempts to access Type declarations yield nothing. How can I get the type information from the FunctionalClass declaration down to the MacroSupport level?
Try materialization of a type class:
https://docs.scala-lang.org/overviews/macros/implicits.html#implicit-materializers
That way the macro expands where the concrete type argument is known (at the FunctionalClass definition), rather than once inside GenericInterface, where T is still abstract.
import scala.reflect.macros.blackbox
import scala.language.experimental.macros

object MacroSupport {
  def getName[A](implicit gn: GetName[A]): String = gn()

  trait GetName[A] {
    def apply(): String
  }

  object GetName {
    implicit def materializeGetName[A]: GetName[A] = macro materializeGetNameImpl[A]

    def materializeGetNameImpl[A: c.WeakTypeTag](c: blackbox.Context): c.Expr[GetName[A]] = {
      import c.universe._
      c.Expr[GetName[A]] {
        q"""
          new MacroSupport.GetName[${weakTypeOf[A]}] {
            override def apply(): _root_.java.lang.String = ${weakTypeOf[A].toString}
          }
        """
      }
    }
  }
}

import MacroSupport.GetName

abstract class GenericInterface[T: GetName] {
  def getName: String = MacroSupport.getName[T]
}

case class ContainerA(
  someValue: String
)

class FunctionalClass extends GenericInterface[ContainerA] {
  val containerName: String = getName
}

(new FunctionalClass).containerName // ContainerA
By the way, shapeless.Typeable does the job. Typeable[A].describe is like our MacroSupport.getName[A].
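For example, a sketch assuming shapeless is on the classpath:
import shapeless.Typeable

abstract class GenericInterface[T: Typeable] {
  def getName: String = Typeable[T].describe
}

case class ContainerA(someValue: String)

class FunctionalClass extends GenericInterface[ContainerA] {
  val containerName: String = getName   // reports the container type's name, e.g. "ContainerA"
}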

How to create generic JSON serializer/deserializer in Scala?

I wanted to serialize and deserialize some case classes and realized I was repeating code. Unfortunately I cannot figure out a way to keep things DRY. Hoping someone can provide some assistance. Below I will provide a sample problem that is not DRY.
Sample Problem
import org.json4s.jackson.Serialization
import java.time.ZonedDateTime
import java.time.format.DateTimeFormatter
import org.json4s.JsonAST.JString
import org.json4s.{CustomSerializer, DefaultFormats}

case class Bar(bar: String, date: ZonedDateTime)
case class Foo(foo: String)

trait JsonParser {
  private case object ZDTSerializer extends CustomSerializer[ZonedDateTime](_ => (
    { case JString(s) => ZonedDateTime.parse(s) },
    { case zdt: ZonedDateTime => JString(zdt.format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX"))) }
  ))

  implicit val formats = DefaultFormats + ZDTSerializer
}

object BarParser extends JsonParser {
  def deserialize(jsonBar: String): Bar = {
    Serialization.read[Bar](jsonBar)
  }

  def serialize(bar: Bar): String = {
    Serialization.write[Bar](bar)
  }
}

object FooParser extends JsonParser {
  def deserialize(jsonFoo: String): Foo = {
    Serialization.read[Foo](jsonFoo)
  }

  def serialize(foo: Foo): String = {
    Serialization.write[Foo](foo)
  }
}

object Main {
  def main(args: Array[String]): Unit = {
    val foo = Foo("foo")
    println(FooParser.serialize(foo)) // {"foo":"foo"}
    println(FooParser.deserialize(FooParser.serialize(foo))) // Foo(foo)
  }
}
Above it is clear that the logic to serialize and deserialize is repeated. This is one of the things I've tried (which doesn't compile).
Attempt to Solve
case class Bar(product: String, date: ZonedDateTime)
case class Foo(title: String)

abstract class GenericJsonParser[T] {
  private case object ZDTSerializer extends CustomSerializer[ZonedDateTime](_ => (
    { case JString(s) => ZonedDateTime.parse(s) },
    { case zdt: ZonedDateTime => JString(zdt.format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX"))) }
  ))

  implicit val formats = DefaultFormats + ZDTSerializer

  def deserialize(json: String): T = {
    Serialization.read[T](json) // No Manifest available for T
  }

  def serialize(thing: T): String = {
    Serialization.write[T](thing) // type arguments [A] conform to the bounds of none of the overloaded alternatives ...
  }
}

object BarJsonParser extends GenericJsonParser[Bar]
object FooParser extends GenericJsonParser[Foo]
Any guidance would be appreciated.
I think you can use Json.format[ACaseClass], for example:
import play.api.libs.json.{Format, Json}
case class ACaseClass(value: String, anotherValue: Int)
implicit val formatACaseClass = Json.format[ACaseClass]
I guess for Serialization.read and write you still have to pass an implicit value; the runtime needs to know how to read/write your object.
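If you want to stay with json4s rather than switch to Play JSON, here is a minimal sketch of the generic parser (my own assumption, not part of the answer above): give T a Manifest context bound, which Serialization.read requires, and an AnyRef upper bound, which Serialization.write requires.
import org.json4s.jackson.Serialization
import org.json4s.{CustomSerializer, DefaultFormats, Formats}
import org.json4s.JsonAST.JString
import java.time.ZonedDateTime
import java.time.format.DateTimeFormatter

abstract class GenericJsonParser[T <: AnyRef : Manifest] {
  private case object ZDTSerializer extends CustomSerializer[ZonedDateTime](_ => (
    { case JString(s) => ZonedDateTime.parse(s) },
    { case zdt: ZonedDateTime => JString(zdt.format(DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX"))) }
  ))
  implicit val formats: Formats = DefaultFormats + ZDTSerializer

  def deserialize(json: String): T = Serialization.read[T](json)   // Manifest[T] is now in scope
  def serialize(thing: T): String = Serialization.write[T](thing)  // T <: AnyRef satisfies write's bound
}

object BarJsonParser extends GenericJsonParser[Bar]
object FooJsonParser extends GenericJsonParser[Foo]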

Scala, typeclass and "could not find implicit value"

I am facing a weird problem with the type class below: for some reason the implicit object ContentUploader is not resolved on the call to the upload method of DemoActor.
import akka.actor.Actor
import java.io.File
import org.slf4j.LoggerFactory

class DemoActor extends Actor {
  import DemoActor.UploaderImpl._

  override def receive = {
    case (x: DemoActor.Content) =>
      DemoActor.upload(x)
  }
}

object DemoActor {
  val LOG = LoggerFactory.getLogger("DemoActor")

  sealed trait UploadData {
    val data: Array[File]
  }

  case class Content(data: Array[File]) extends UploadData

  case class UploadResult(url: String, contentType: String, size: Long)

  trait S3Uploader[T <: UploadData] {
    def uploadToS3(filez: Array[File]): Iterable[UploadResult]
  }

  object UploaderImpl {
    val LOG = LoggerFactory.getLogger("Uploader")

    private def contentType(name: String): String = {
      "application/octet-stream"
    }

    private def doUpload(filez: Array[File], bucketName: String) = {
      LOG.debug("Uploading: {} to {}", filez, bucketName)
      filez.flatMap {
        case f =>
          try {
            val key = f.getName
            val mime = contentType(f.getName)
            Some(UploadResult("http://" + bucketName + ".s3.amazonaws.com/" + key, mime, f.length()))
          } catch {
            case e =>
              LOG.error("Can not upload", e)
              None
          }
      }
    }

    implicit object ContentUploader extends S3Uploader[Content] {
      lazy val bucketName = "resources.aws.bucketname"
      lazy val awsSecret = "resources.aws.secret.key"
      lazy val awsAccess = "resources.aws.access.key"

      override def uploadToS3(filez: Array[File]) = doUpload(filez, bucketName)
    }
  }

  def upload[T <: UploadData](src: T)(implicit uploader: S3Uploader[T]) = uploader.uploadToS3(src.data)
}
What have I missed here?
UPD
If I move the definition of the DemoActor class inside the DemoActor object, like
import akka.actor.Actor
import java.io.File
import org.slf4j.LoggerFactory

object DemoActor {
  val LOG = LoggerFactory.getLogger("DemoActor")

  sealed trait UploadData {
    val data: Array[File]
  }

  case class Content(data: Array[File]) extends UploadData

  case class UploadResult(url: String, contentType: String, size: Long)

  trait S3Uploader[UploadData] {
    def uploadToS3(filez: Array[File]): Iterable[UploadResult]
  }

  object UploaderImpl {
    val LOG = LoggerFactory.getLogger("Uploader")

    private def contentType(name: String): String = {
      "application/octet-stream"
    }

    private def doUpload(filez: Array[File], bucketName: String) = {
      LOG.debug("Uploading: {} to {}", filez, bucketName)
      filez.flatMap {
        case f =>
          try {
            val key = f.getName
            val mime = contentType(f.getName)
            Some(UploadResult("http://" + bucketName + ".s3.amazonaws.com/" + key, mime, f.length()))
          } catch {
            case e =>
              LOG.error("Can not upload", e)
              None
          }
      }
    }

    implicit object ContentUploader extends S3Uploader[DemoActor.Content] {
      lazy val bucketName = "resources.aws.bucketname"
      lazy val awsSecret = "resources.aws.secret.key"
      lazy val awsAccess = "resources.aws.access.key"

      override def uploadToS3(filez: Array[File]) = doUpload(filez, bucketName)
    }
  }

  def upload[T <: UploadData](src: T)(implicit uploader: S3Uploader[T]) = uploader.uploadToS3(src.data)

  class DemoActor extends Actor {
    import DemoActor.UploaderImpl._

    override def receive = {
      case (x: DemoActor.Content) =>
        DemoActor.upload(x)
    }
  }
}
then everything works well. Are there some issues with namespacing?
It is not finding it because implicit forward references must be explicitly typed to be considered, and this one isn't.
If this is confusing, maybe two ways of fixing it might make it clear. First, you can declare the type of the implicit. Remove the implicit from the object, and declare a val pointing to it:
implicit val contentUploader: S3Uploader[DemoActor.Content] = ContentUploader
The second way is moving the class DemoActor declaration to the end of the file, so it stays after the object DemoActor declaration.
The reason it works like this is that the compiler must search for the implicit before the rest of the file is fully typed, so it doesn't know, at that time, that object ContentUploader satisfies the search.
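Concretely, the first fix could look like this (a sketch; ContentUploader loses its implicit modifier and an explicitly typed implicit val is added alongside it, still inside object DemoActor):
object UploaderImpl {
  // LOG, contentType and doUpload unchanged from the question

  object ContentUploader extends S3Uploader[Content] {
    lazy val bucketName = "resources.aws.bucketname"
    override def uploadToS3(filez: Array[File]) = doUpload(filez, bucketName)
  }

  // The explicit type annotation is what lets the earlier DemoActor class see this implicit.
  implicit val contentUploader: S3Uploader[Content] = ContentUploader
}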

Scala implicit type class dependency injection

I'd like some help sorting out this scenario. I have an Akka actor where I want to inject a dependency, in this case RemoteFetcher, which I would also like to mock in my tests. Like so:
main/src/scala/mypackage/Services.scala
package mypackage

import RemoteFetcherFileSystem._

trait RemoteFetcher {
  def fetch( path:String ): Future[Stream[String]]
}

class MyRemoteResourceActor extends Actor with ActorLogging {
  def fetchRemote( path:String ) = implicitly[RemoteFetcher].fetch( path )

  def receive = {
    case FetchRemoteResource( path ) => fetchRemote( path ).map( _.foreach( sender ! _ ) )
  }
}
For this to work I have an implicit object that I import into the file above. It would look something like this:
implicit object RemoteFetcherFileSystem extends RemoteFetcher {
  def fetchRemote( path:String ) = Future[Stream[String]] { ... reading from file system ... }
}
Now in my tests I have TestActor from the akka-testkit. Here I want to instead import my mock dependency:
implicit object RemoteFetcherMock extends RemoteFetcher {
  def fetchRemote( path:String ) = Future[Stream[String]] { ... mock implementation ... }
}
My problem is that to compile Services.scala I need to import the implicit object. But how do I go about shadowing/overriding this in my test files? The reason I'm not using implicit arguments is that I want to avoid having to modify all my actors' constructor arguments.
I've been looking around and reading up on the type class dependency injection pattern, and I can get it to work according to the tutorials, but I can't get it to work when I want to test and override as in my example.
I'm not sure how to do it with implicits, but typically one could inject instead like so:
trait RemoteFetcherComponent {
  def remoteFetcher: RemoteFetcher

  trait RemoteFetcher {
    def fetch(path: String): Future[Stream[String]]
  }
}

trait RemoteFetcherFileSystemComponent extends RemoteFetcherComponent {
  val remoteFetcher = RemoteFetcherFileSystem

  object RemoteFetcherFileSystem extends RemoteFetcher {
    def fetch(path: String): Future[Stream[String]] = ???
  }
}

class MyRemoteResourceActor extends Actor with ActorLogging with RemoteFetcherFileSystemComponent {
  def fetchRemote(path: String) = remoteFetcher.fetch(path)

  def receive = {
    case FetchRemoteResource(path) => fetchRemote(path).map( _.foreach(sender ! _))
  }
}
val myRemoteResourceActor = new MyRemoteResourceActor()
And then a test value would be defined like so:
trait RemoteFetcherMockComponent extends RemoteFetcherComponent {
  def remoteFetcher = RemoteFetcherMock

  object RemoteFetcherMock extends RemoteFetcher {
    def fetch(path: String): Future[Stream[String]] = ???
  }
}

val myMockedResourceActor = new MyRemoteResourceActor with RemoteFetcherMockComponent {
  override val remoteFetcher = super[RemoteFetcherMockComponent].remoteFetcher
}
The reason you are having an issue with implicits is that the way you're using them is no different from simply writing def fetchRemote(path: String) = RemoteFetcherFileSystem.fetch(path). With the import, you've hard-coded the implementation, rather than allowing it to be injected later.
You could also change the implicitly to an implicit parameter:
trait RemoteFetcher {
  def fetch(path: String): Future[Stream[String]]
}

object RemoteFetcher {
  implicit val fetcher = RemoteFetcherFileSystem
}

class MyRemoteResourceActor extends Actor with ActorLogging {
  def fetchRemote(path: String)(implicit remoteFetcher: RemoteFetcher) = remoteFetcher.fetch(path)

  def receive = {
    case FetchRemoteResource(path) => fetchRemote(path).map( _.foreach(sender ! _))
  }
}
Then you could override the implicit that is resolved in the companion object of RemoteFetcher by simply importing RemoteFetcherMock.
See this post for more information about implicit parameter resolution precedence rules.

Is it possible to pass "this" as implicit parameter in Scala?

Suppose I want to wrap code that can throw exceptions with a try-catch block that logs the exception and continues. Something like:
loggingExceptions {
  // something dangerous
}
Ideally, I would like logging to use the Logger defined on the calling object, if any (and to get a compile-time error if there is none). I'd love to define something like this:
def loggingExceptions[L <: { def logger: Logger }](work: => Unit)(implicit objectWithLogger: L): Unit = {
  try {
    work
  } catch {
    case t: Exception => objectWithLogger.logger.error(t.getMessage)
  }
}
where objectWithLogger would somehow "magically" expand to "this" in client code. Is this (or a similar thing) possible?
It can in fact be done just as you want. The other answerers surrendered too quickly. No white flags!
package object foo {
  type HasLogger = { def logger: Logger }

  implicit def mkLog(x: HasLogger) = new {
    def loggingExceptions(body: => Unit): Unit =
      try body
      catch { case ex: Exception => println(ex) }
  }
}

package foo {
  case class Logger(name: String) { }

  // Doesn't compile:
  // class A {
  //   def f = this.loggingExceptions(println("hi"))
  // }
  // 1124.scala:14: error: value loggingExceptions is not a member of foo.A
  //   def f = this.loggingExceptions(println("hi"))
  //                ^
  // one error found

  // Does compile
  class B {
    def logger = Logger("B")

    def f = this.loggingExceptions(println("hi"))
    def g = this.loggingExceptions(throw new Exception)
  }
}

object Test {
  def main(args: Array[String]): Unit = {
    val b = new foo.B
    b.f
    b.g
  }
}
// output
//
// % scala Test
// hi
// java.lang.Exception
Debilski's answer will work, but I'm not sure I see a good reason to use a structural type (i.e. { def logger: Logger }) here. Doing so will incur extra runtime overhead whenever logger is invoked, since the implementation of structural types relies on reflection. The loggingExceptions method is closely tied to logging, so I would just make it part of a Logging trait:
trait Logging {
  def logger: Logger

  final def loggingExceptions(body: => Unit) =
    try body catch { case e: Exception => logger.error(e.getMessage) }
}

trait ConcreteLogging extends Logging {
  val logger = // ...
}

object MyObject extends SomeClass with ConcreteLogging {
  def main {
    // ...
    loggingExceptions {
      // ...
    }
  }
}
You could add a trait to all classes which want to use def loggingExceptions, and in this trait add a self-type which expects def logger: Logger to be available.
trait LoggingExceptions {
  this: { def logger: Logger } =>

  def loggingExceptions(work: => Unit) {
    try { work }
    catch { case t: Exception => logger.error(t.getMessage) }
  }
}

object MyObjectWithLogging extends OtherClass with LoggingExceptions {
  def logger: Logger = // ...

  def main {
    // ...
    loggingExceptions { // ...
    }
  }
}