In Scala cats 2.x, how to create a generic zero functor (given its type class)?

I'm looking for a way to initialise a zero functor z: F[_], given the type class Functor[F[_]]. The zero functor should ensure that composing any other function with it yields a fixed point. I've tried the following code:
import cats.Functor
import cats.syntax.functor._
import cats.instances.option._
import cats.instances.list._
def zeroFunctor[F[_]: Functor]: F[Nothing] = {
  val functorInstance = implicitly[Functor[F]]
  functorInstance.map(functorInstance.empty)(_ => ??? : Nothing)
}
val optZero = zeroFunctor[Option] // None
val listZero = zeroFunctor[List] // Nil
But it fails with the following error:
[Error] /home/peng/git/spookystuff/parent/core/src/main/scala/com/tribbloids/spookystuff/utils/SpookyViews_Imp0.scala:45:41: value empty is not a member of cats.Functor[F]
So what is the proper way to create it?
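A minimal sketch of one possible fix (my addition, not from the original post): Functor alone provides no way to construct an F value, but cats' MonoidK does, via its empty:

import cats.MonoidK

// MonoidK[F] supplies an identity element F[A] for any A; Functor has no such operation.
// (On cats versions before 2.2 you may also need the cats.instances imports above.)
def zeroFunctor[F[_]: MonoidK]: F[Nothing] =
  MonoidK[F].empty[Nothing]

val optZero = zeroFunctor[Option] // None
val listZero = zeroFunctor[List]  // Nil

Mapping any function over such an empty structure is a no-op, which gives the fixed-point behaviour the question asks for.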

Related

How to apply sequence function to List of ValidatedNel in cats?

I have the following code:

sealed trait DomainValidation {
  def errorMessage: String
}

type ValidationResult[A] = ValidatedNel[DomainValidation, A]

val ai: ValidationResult[String] = "big".validNel
val bi: ValidationResult[String] = "leboski".validNel
val l = List(ai, bi)
I want to convert l to ValidationResult[List[String]]. I came across the sequence functionality, but I am unable to use cats' sequence, as some implicit has to be in scope that knows how to handle ValidationResult[A], and I am unable to figure out what exactly is needed. I wrote the following:
object helper {
  implicit class hello[A](l: List[ValidationResult[A]]) {
    def mysequence: ValidationResult[List[A]] = {
      val m = l.collect { case Invalid(a) => Invalid(a) }
      if (m.isEmpty) l.map { case Valid(a) => a }.validNel
      else /* merge the NonEmptyLists */
    }
  }
}
I am able to do l.mysequence. But how do I use cats' sequence?
PS: I am a Scala beginner and having a hard time learning :). Forgive any incorrect terminology.
The following should work as expected on Scala 2.12:
import cats.data.ValidatedNel, cats.syntax.validated._

// Your code:
sealed trait DomainValidation {
  def errorMessage: String
}

type ValidationResult[A] = ValidatedNel[DomainValidation, A]

val ai: ValidationResult[String] = "big".validNel
val bi: ValidationResult[String] = "leboski".validNel
val l = List(ai, bi)
And then:
scala> import cats.instances.list._, cats.syntax.traverse._
import cats.instances.list._
import cats.syntax.traverse._
scala> l.sequence
res0: ValidationResult[List[String]] = Valid(List(big, leboski))
You don't show your code or explain what's not working, so it's hard to diagnose your issue, but it's likely to be one of the following problems:
You're on Scala 2.11, where .sequence requires you to enable -Ypartial-unification in your compiler options. If you're using sbt, you can do this by adding scalacOptions += "-Ypartial-unification" to your build.sbt (assuming you're on 2.11.9+).
You've omitted one of the necessary imports. You need at least the Traverse instance for List and the syntax for Traverse. The example code above includes the two imports you need, or you can just import cats.implicits._ and make your life a little easier.
If it's not one of these two things, you'll probably need to include more detail in your question for us to be able to help.
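As an aside (my addition, not part of the original answer): sequence is just traverse(identity), so once the imports are in scope both spellings work:

import cats.implicits._

// sequence is traverse(identity); passing the type alias explicitly
// sidesteps partial-unification issues on older compilers:
val s1: ValidationResult[List[String]] = l.sequence
val s2: ValidationResult[List[String]] = l.traverse[ValidationResult, String](identity)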

Scala shapeless implicit resolution StackOverflowError with ArgonautShapeless

I'm using ArgonautShapeless to define some JSON codecs.
When I provide the type for my codec, I get a StackOverflowError, but if I leave the type off, it works. How can I provide the type?
My understanding of the problem is that the implicit lookup in def of[A: DecodeJson] = implicitly[DecodeJson[A]] finds my definition on the same line, implicit def fooCodec: DecodeJson[Foo], and is therefore recursive, so it breaks.
Is there some other way that will allow me to provide the type? Ideally I want to have one object in my project where I define all of the codecs, and they may depend on each other.
import $ivy.`com.github.alexarchambault::argonaut-shapeless_6.2:1.2.0-M4`
import argonaut._, Argonaut._

case class Foo(a: Int)

object SomeCodecs {
  import ArgonautShapeless._
  // this doesn't work
  implicit def fooCodec: DecodeJson[Foo] = DecodeJson.of[Foo]
}

import SomeCodecs._

"""{"a":1}""".decode[Foo]
java.lang.StackOverflowError
  ammonite.$sess.cmd3$SomeCodecs$.fooCodec(cmd3.sc:3)
It works if I leave the type off.
object SomeCodecs {
  import ArgonautShapeless._
  // this works
  implicit def fooCodec = DecodeJson.of[Foo]
}

import SomeCodecs._

"""{"a":1}""".decode[Foo]
res4: Either[Either[String, (String, CursorHistory)], Foo] = Right(Foo(1))
Thanks
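The question is left unanswered here; one common workaround for self-referential implicits of this kind (a sketch of my own, assuming shapeless 2.3.x is on the classpath, not from the original thread) is shapeless's cachedImplicit, a macro that shadows the definition being written so the implicit search cannot pick it up:

import shapeless.cachedImplicit

object SomeCodecs {
  import ArgonautShapeless._
  // The macro expands with `fooCodec` shadowed by a non-implicit dummy,
  // so DecodeJson[Foo] is derived instead of resolving to itself.
  implicit val fooCodec: DecodeJson[Foo] = cachedImplicit
}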

How to implement a trait with a generic case class that creates a dataset in Scala

I want to create a Scala trait that should be implemented with a case class T. The trait simply loads data and transforms it into a Spark Dataset of type T. I get the error that no encoder can be found, which I think is because Scala does not know that T should be a case class. How can I tell the compiler that? I've seen somewhere that I should mention Product, but there is no such class defined. Feel free to suggest other ways to do this!
I have the following code, but it does not compile, failing with:
42: error: Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing sqlContext.implicits._
[INFO] .as[T]
I'm using Spark 1.6.1
Code:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{Dataset, SQLContext}

/**
 * A trait that moves data on Hadoop with Spark based on the location and the granularity of the data.
 */
trait Agent[T] {

  /**
   * Load a DataFrame from the location and convert it into a Dataset
   * @return Dataset[T]
   */
  protected def load(): Dataset[T] = {
    // Read in the data
    SparkContextKeeper.sqlContext.read
      .format("com.databricks.spark.csv")
      .load("/myfolder/" + location + "/2016/10/01/")
      .as[T]
  }
}
Your code is missing three things:
Indeed, you must let the compiler know that T is a subclass of Product (the superclass of all Scala case classes and tuples)
The compiler also requires the TypeTag and ClassTag of the actual case class. These are used implicitly by Spark to overcome type erasure
An import of sqlContext.implicits._
Unfortunately, you can't add context bounds to a trait's type parameters (they desugar to implicit constructor parameters, which traits don't have), so the simplest workaround would be to use an abstract class instead:
import scala.reflect.runtime.universe.TypeTag
import scala.reflect.ClassTag

abstract class Agent[T <: Product : ClassTag : TypeTag] {
  protected def load(): Dataset[T] = {
    val sqlContext: SQLContext = SparkContextKeeper.sqlContext
    import sqlContext.implicits._
    sqlContext.read // same as before...
  }
}
Obviously, this isn't equivalent to using a trait, and might suggest that this design isn't the best fit for the job. Another alternative is placing load in an object and moving the type parameter to the method:
object Agent {
  protected def load[T <: Product : ClassTag : TypeTag](): Dataset[T] = {
    // same...
  }
}
Which one is preferable is mostly up to where and how you're going to call load and what you're planning to do with the result.
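For illustration, a hypothetical call site for the abstract-class version (Person is a made-up case class, not from the original question):

case class Person(name: String, age: Int)

// Person is a case class, so it satisfies T <: Product and has
// ClassTag/TypeTag instances; concrete agents just fix the type parameter.
class PersonAgent extends Agent[Person] {
  def people: Dataset[Person] = load()
}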
You need to take two actions:
Add import sparkSession.implicits._ to your imports
Make your trait trait Agent[T <: Product]

Scala returns no annotations for a field

I have this:
class Max(val value: Int) extends StaticAnnotation {}

class Child() extends Parent {
  @Max(5) val myMember = register("myMember")
}

abstract class Parent {
  def register(fieldName: String) = {
    val cls = getClass
    import scala.reflect.runtime.universe._
    val mirror = runtimeMirror(cls.getClassLoader)
    val clsSymbol = mirror.staticClass(cls.getCanonicalName)
    val fieldSymbol = clsSymbol.typeSignature.member(TermName(fieldName))
    println(s"${fieldSymbol.fullName} " + fieldSymbol.annotations.size)
  }
}
This does not work: somehow it returns 0 annotations. If instead I put the annotation on the class, then I can read it fine. Why?
I discovered that the previous line:

clsSymbol.typeSignature.member(TermName(fieldName))

was returning the symbol of the auto-generated getter for the val (which of course does not have any annotations), instead of the symbol of the val itself. If instead I do:

clsSymbol.toType.decl(TermName(s"${fieldName} "))

that seems to work fine. For a reason I do not know, writing a space at the end of the TermName makes it return the field symbol with the annotations.
Adding a bit of additional information to your answer to demonstrate and illustrate the problem:
scala> import scala.annotation.StaticAnnotation
import scala.annotation.StaticAnnotation
scala> import scala.reflect.runtime.universe._
import scala.reflect.runtime.universe._
scala> class Max(val value : Int) extends StaticAnnotation
defined class Max
scala> class Child {
     |   @Max(5) val myMember = 2
     | }
defined class Child
scala> val cls = classOf[Child]
cls: Class[Child] = class Child
scala> val mirror = runtimeMirror(cls.getClassLoader)
mirror: reflect.runtime.universe.Mirror = JavaMirror with... (I truncated this part which was super long and not useful)
scala> mirror.classSymbol(cls).selfType.decls
res0: reflect.runtime.universe.MemberScope = SynchronizedOps(constructor Child, value myMember, value myMember)
scala> println(mirror.classSymbol(cls).selfType.decls)
Scope{
def <init>: <?>;
val myMember: <?>;
private[this] val myMember: scala.Int
}
scala> mirror.classSymbol(cls).selfType.decls.map(_.annotations)
res2: Iterable[List[reflect.runtime.universe.Annotation]] = List(List(), List(), List(Max(5)))
scala> mirror.classSymbol(cls).selfType.decls.map(_.isMethod)
res4: Iterable[Boolean] = List(true, true, false)
scala> mirror.classSymbol(cls).selfType.decls.map(_.asTerm.name)
res15: Iterable[reflect.runtime.universe.TermName] = List(<init>, myMember, myMember )
Only one of them includes the annotation, and we can see that the last one, which is the actual attribute you defined rather than the synthetic getter the compiler generated automatically, has a space at the end of its name! I really wonder who thought it was a good idea to do such a horrible thing, but it seems to be the reality. I am no Scala expert, but this whole API seems very complex and impractical to work with. It probably suffers from the complexity of Scala as a language itself, which, under an appearance of simplicity and "magic" features, actually has some very complex mechanisms.
To me, a better API would offer one method to get def declarations and another for val and var declarations. Also, the names should certainly not be disambiguated by a completely unexpected space at the end!
PS: Martin Odersky explained this design choice on the following thread: https://contributors.scala-lang.org/t/design-choice-reflection-valdef-and-synthetic-getter/565
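Building on the transcript above, here is a sketch of my own (not from the thread) that avoids hard-coding the trailing space by selecting the non-method term symbol:

import scala.reflect.runtime.universe._

// Returns the annotations on the underlying val, not the synthetic getter.
def fieldAnnotations[T: TypeTag](fieldName: String): List[Annotation] =
  typeOf[T].decls
    .filter(s => s.isTerm && !s.isMethod)                 // drop <init> and the getter
    .find(_.name.decodedName.toString.trim == fieldName)  // the local val's name ends in a space
    .map(_.annotations)
    .getOrElse(Nil)

fieldAnnotations[Child]("myMember") // List(Max(5))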

Problems with Salat methods in MongoDB: implicit view & not enough arguments

I'm new to Salat, Casbah, and MongoDB. I've been trying to write a simple method to get all users from the db:
import DAL.Instances.User.{UserDAO, User}
import com.novus.salat._
import com.novus.salat.global._
import com.novus.salat.annotations._
import com.novus.salat.dao._
import com.mongodb.casbah.Imports._
import com.mongodb.casbah.MongoConnection

object UserRepository {
  def getAllUsers() = {
    val userList = UserDAO.find()
    userList.isEmpty match {
      case true => throw new Exception("None users in your db")
      case false => userList
    }
  }
}
and I faced two errors:
Error:(29, 31) No implicit view available from Unit => com.mongodb.DBObject.
    val userList = UserDAO.find()
                           ^
Error:(29, 31) not enough arguments for method find: (implicit evidence$2: Unit => com.mongodb.DBObject)com.novus.salat.dao.SalatMongoCursor[DAL.Instances.User.User].
Unspecified value parameter evidence$2.
    val userList = UserDAO.find()
                           ^
Here is my User code:
object User {
  case class User(_id: ObjectId = new ObjectId, name: String, age: Int)

  object UserDAO extends SalatDAO[User, ObjectId](collection = MongoConnection()("fulltestdb")("user"))
}
I'm not sure what version of Salat you are using, but the signature of find gives a clue as to what the issue is:
def find[A <% DBObject](ref: A): SalatMongoCursor[ObjectType]
You need to call find with a parameter that has a view bound so that this parameter may be viewed as a DBObject. This means that an implicit conversion from A => DBObject is expected to be in scope.
In your case you aren't passing any parameter. This is being treated as Unit and so the compiler tries to find an implicit conversion from Unit => DBObject. This can't be found so compilation fails.
To fix this, your best bet is to pass in an empty DBObject; you can achieve this with MongoDBObject.empty from Casbah. You could add an implicit conversion from Unit => MongoDBObject, but I'd probably lean towards making it explicit where possible.
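A minimal sketch of that fix applied to the code above (assuming the same UserDAO):

def getAllUsers() = {
  // MongoDBObject.empty satisfies the A <% DBObject view bound
  // and matches every document in the collection.
  val userList = UserDAO.find(MongoDBObject.empty)
  userList.isEmpty match {
    case true  => throw new Exception("None users in your db")
    case false => userList
  }
}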