I know there are other questions on Stack Overflow about this, but none of them works for me.
I'm trying to map simple inheritance in a Slick projection.
I've tried hundreds of combinations and can't make this compile. I ended up with the following code and the error below.
I've simplified the case classes; the real ones carry much more data. Without inheritance, my other layers (controllers, services, and interface) would have to deal with extra complexity, because in this case the model represents real objects that genuinely are inherited; there is no better representation than inheritance for these classes. In the service and controller layers I make the JSON representation of the classes exactly what I need, so I can send and consume a JSON API that represents my model. The only thing in the way is persisting this representation in a relational database: my relational model can persist these entities with single-table inheritance, but translating rows back to objects is a pain.
I'm using Scala 2.10.3 + sbt 0.13.1.
abstract class Pessoa2(val nome:String, val tipo:String)
case class PessoaFisica2(override val nome:String, val cpf:String) extends Pessoa2(nome,"F")
case class PessoaJuridica2(override val nome:String, val cnpj:String) extends Pessoa2(nome, "J")
class PessoaTable(tag: Tag) extends Table[Pessoa2](tag, "PESSOAS") {
  // def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
  def nome = column[String]("nome")
  def tipo = column[String]("tipo")
  def cpf = column[String]("CPF", O.Nullable)
  def cnpj = column[String]("CNPJ", O.Nullable)
  def * = (nome, tipo, cpf.?, cnpj.?) <> ({
    case (nome: String, "F", cpf: Option[String], _) => new PessoaFisica2(nome, cpf.get): Pessoa2
    case (nome: String, "J", _, cnpj: Option[String]) => new PessoaJuridica2(nome, cnpj.get): Pessoa2
  }, {
    case PessoaFisica2(nome, Some(cpf)) => Some((nome, "F", cpf, ""))
    case PessoaJuridica2(nome, Some(cnpj)) => Some((nome, "J", "", cnpj))
  })
}
This ends with the error:
The argument types of an anonymous function must be fully known. (SLS 8.5)
[error] Expected type was: ? => ?
[error] def * = (nome, tipo, cpf.?, cnpj.?) <> ({
[error] ^
[error] /Users/giovanni/Projetos/atende/clientes/app/model/Pessoas.scala:158: type mismatch;
[error] found : Any
[error] required: String
[error] case (nome,"F",cpf,_) => new PessoaFisica2(nome, cpf):Pessoa2
[error] ^
[error] /Users/giovanni/Projetos/atende/clientes/app/model/Pessoas.scala:158: type mismatch;
[error] found : Any
[error] required: String
[error] case (nome,"F",cpf,_) => new PessoaFisica2(nome, cpf):Pessoa2
[error] ^
[error] /Users/giovanni/Projetos/atende/clientes/app/model/Pessoas.scala:159: type mismatch;
[error] found : Any
[error] required: String
[error] case (nome,"J",_,cnpj) => new PessoaJuridica2(nome, cnpj):Pessoa2
[error] ^
[error] /Users/giovanni/Projetos/atende/clientes/app/model/Pessoas.scala:159: type mismatch;
[error] found : Any
[error] required: String
[error] case (nome,"J",_,cnpj) => new PessoaJuridica2(nome, cnpj):Pessoa2
[error] ^
[error] /Users/giovanni/Projetos/atende/clientes/app/model/Pessoas.scala:160: missing parameter type for expanded function
[error] The argument types of an anonymous function must be fully known. (SLS 8.5)
[error] Expected type was: ? => Option[?]
[error] },{
[error] ^
[error] /Users/giovanni/Projetos/atende/clientes/app/model/Pessoas.scala:157: No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported type in a Query (e.g. scala List).
[error] Required level: scala.slick.lifted.ShapeLevel.Flat
[error] Source type: (scala.slick.lifted.Column[String], scala.slick.lifted.Column[String], scala.slick.lifted.Column[Option[String]], scala.slick.lifted.Column[Option[String]])
[error] Unpacked type: (String, String, String, String)
[error] Packed type: Any
[error] def * = (nome, tipo, cpf.?, cnpj.?) <> ({
[error] ^
[error] 7 errors found
Try this:
abstract class Pessoa2(val nome:String, val tipo:String)
case class PessoaFisica2(override val nome:String, val cpf:String) extends Pessoa2(nome,"F")
case class PessoaJuridica2(override val nome:String, val cnpj:String) extends Pessoa2(nome, "J")
class PessoaTable(tag: Tag) extends Table[Pessoa2](tag, "PESSOAS") {
  // def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
  def nome = column[String]("nome")
  def tipo = column[String]("tipo")
  def cpf = column[String]("CPF", O.Nullable)
  def cnpj = column[String]("CNPJ", O.Nullable)
  def * = (nome, tipo, cpf.?, cnpj.?) <> ({ t: (String, String, Option[String], Option[String]) => t match {
    case (nome: String, "F", cpf: Option[String], _) => new PessoaFisica2(nome, cpf.get): Pessoa2
    case (nome: String, "J", _, cnpj: Option[String]) => new PessoaJuridica2(nome, cnpj.get): Pessoa2
  }}, { k: Pessoa2 => k match {
    case PessoaFisica2(nome, cpf) => Some((nome, "F", Some(cpf), Some(""))): Option[(String, String, Option[String], Option[String])]
    case PessoaJuridica2(nome, cnpj) => Some((nome, "J", Some(""), Some(cnpj))): Option[(String, String, Option[String], Option[String])]
  }})
}
It should compile that way
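For completeness, here is a hedged usage sketch (assuming Slick 2.x with, say, the H2 driver import and an already-configured db: Database, none of which is in the original post). Queries yield the abstract Pessoa2, and the projection rebuilds the concrete subclass from the tipo discriminator:
import scala.slick.driver.H2Driver.simple._

val pessoas = TableQuery[PessoaTable]

db.withSession { implicit session =>
  // the bidirectional mapping in * handles both directions
  pessoas += PessoaFisica2("Maria", "123.456.789-00")
  val fisicas: List[Pessoa2] = pessoas.filter(_.tipo === "F").list
}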
Related
I have the following code, which does not compile:
final class DbSystemEnvironment[F[_] : MonadError[F, Throwable]] private (env: Environment[F])
    extends DbSetting[F] {
  override def read(url: String, user: String, pw: String): F[DbParameter] =
    (for {
      a <- OptionT(env.get(url))
      b <- OptionT(env.get(user))
      c <- OptionT(env.get(pw))
    } yield DbParameter(a, b, c))
      .value
      .flatMap {
        case Some(v) => v.pure[F]
        case None => DbSettingError.raiseError[F, DbParameter]
      }
}
The compiler complains:
[error] db/DbSystemEnvironment.scala:10:38: cats.MonadError[F,Throwable] does not take type parameters
[error] final class DbSystemEnvironment[F[_] : MonadError[F, Throwable]] private(env: Environment[F])
[error] ^
[error] db/DbSystemEnvironment.scala:16:9: Could not find an instance of Functor for F
[error] c <- OptionT(env.get(pw))
[error] ^
[error] db/DbSystemEnvironment.scala:20:27: value pure is not a member of Any
[error] case Some(v) => v.pure[F]
[error] ^
[error] db/DbSystemEnvironment.scala:21:37: value raiseError is not a member of object io.databaker.db.DbSettingError
[error] case None => DbSettingError.raiseError[F, DbParameter]
It seems that I am not using MonadError correctly.
The rest of the code:
final case class DbParameter(url: String, user: String, pw: String)

trait Environment[F[_]] {
  def get(v: String): F[Option[String]]
}

object Environment {
  def apply[F[_]](implicit ev: Environment[F]): ev.type = ev

  def impl[F[_]: Sync]: Environment[F] = new Environment[F] {
    override def get(v: String): F[Option[String]] =
      Sync[F].delay(sys.env.get(v))
  }
}
How can I get the code to compile?
The issue here is the constraint syntax. For a type constructor with a single parameter (like Monad), you can write class Foo[F[_]: Monad]. When you need to "partially apply" a type constructor with multiple parameters, like MonadError, the situation is slightly different.
If you're using kind-projector, you can write the following:
class DbSystemEnvironment[F[_]: MonadError[*[_], Throwable]]
This is non-standard syntax, though, and it isn't currently included in Dotty's partial -Ykind-projector compatibility support. I'd recommend just desugaring the implicit parameter list:
class DbSystemEnvironment[F[_]](implicit F: MonadError[F, Throwable])
This does exactly what you want, doesn't require an extra compiler plugin, and is much more future-proof.
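Putting it together, a minimal sketch of the desugared version (the imports and the assumption that DbSettingError extends Throwable are mine; the original post implies the latter by raising it through MonadError[F, Throwable]):
import cats.MonadError
import cats.data.OptionT
import cats.syntax.all._

final class DbSystemEnvironment[F[_]] private (env: Environment[F])(
    implicit F: MonadError[F, Throwable]
) extends DbSetting[F] {
  override def read(url: String, user: String, pw: String): F[DbParameter] =
    (for {
      a <- OptionT(env.get(url))
      b <- OptionT(env.get(user))
      c <- OptionT(env.get(pw))
    } yield DbParameter(a, b, c)).value.flatMap {
      case Some(v) => v.pure[F]                    // lift the success into F
      case None    => F.raiseError(DbSettingError) // fail through MonadError
    }
}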
I have the following code snippet:
def determineProducerType(keySerializer: KkSerializer)(valueSerializer: KkSerializer)(props: Properties)
    : Eval[KafkaProducer[java.lang.Object, java.lang.Object]] = (keySerializer, valueSerializer) match {
  case (KkStringSeDe, KkStringSeDe) => Later(new KafkaProducer[String, String](props))
  case (KkStringSeDe, KkByteArraySeDe) => Later(new KafkaProducer[String, Byte](props))
  case (KkStringSeDe, KkIntegerSeDe) => Later(new KafkaProducer[String, Integer](props))
  case (KkStringSeDe, KkLongSeDe) => Later(new KafkaProducer[String, Long](props))
}
The compiler complains:
[info] Compiling 2 Scala sources to /home/developer/Desktop/scala/PureProducer/target/scala-2.12/classes ...
[error] /home/developer/Desktop/scala/PureProducer/src/main/scala/TheProducer.scala:113:48: type mismatch;
[error] found : org.apache.kafka.clients.producer.KafkaProducer[String,String]
[error] required: org.apache.kafka.clients.producer.KafkaProducer[Object,Object]
[error] Note: String <: Object, but Java-defined class KafkaProducer is invariant in type K.
[error] You may wish to investigate a wildcard type such as `_ <: Object`. (SLS 3.2.10)
[error] Note: String <: Object, but Java-defined class KafkaProducer is invariant in type V.
[error] You may wish to investigate a wildcard type such as `_ <: Object`. (SLS 3.2.10)
[error] case (KkStringSeDe, KkStringSeDe) => Later(new KafkaProducer[String, String](props))
[error] ^
[error] /home/developer/Desktop/scala/PureProducer/src/main/scala/TheProducer.scala:114:51: type mismatch;
[error] found : org.apache.kafka.clients.producer.KafkaProducer[String,Byte]
[error] required: org.apache.kafka.clients.producer.KafkaProducer[Object,Object]
[error] case (KkStringSeDe, KkByteArraySeDe) => Later(new KafkaProducer[String, Byte](props))
[error] ^
[error] /home/developer/Desktop/scala/PureProducer/src/main/scala/TheProducer.scala:115:49: type mismatch;
[error] found : org.apache.kafka.clients.producer.KafkaProducer[String,Integer]
[error] required: org.apache.kafka.clients.producer.KafkaProducer[Object,Object]
[error] Note: String <: Object, but Java-defined class KafkaProducer is invariant in type K.
[error] You may wish to investigate a wildcard type such as `_ <: Object`. (SLS 3.2.10)
[error] Note: Integer <: Object, but Java-defined class KafkaProducer is invariant in type V.
[error] You may wish to investigate a wildcard type such as `_ <: Object`. (SLS 3.2.10)
[error] case (KkStringSeDe, KkIntegerSeDe) => Later(new KafkaProducer[String, Integer](props))
[error] ^
[error] /home/developer/Desktop/scala/PureProducer/src/main/scala/TheProducer.scala:116:46: type mismatch;
[error] found : org.apache.kafka.clients.producer.KafkaProducer[String,Long]
[error] required: org.apache.kafka.clients.producer.KafkaProducer[Object,Object]
[error] case (KkStringSeDe, KkLongSeDe) => Later(new KafkaProducer[String, Long](props))
[error] ^
[error] four errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 2 s, completed Nov 12, 2017 10:39:14 AM
What I am trying to do is the following. I defined a sum type:
sealed trait KkSerializer
case object KkStringSeDe extends KkSerializer
case object KkByteArraySeDe extends KkSerializer
case object KkIntegerSeDe extends KkSerializer
case object KkLongSeDe extends KkSerializer
When the match hits the appropriate case, it should return a producer of the corresponding type.
Creating an instance of KafkaProducer looks like this:
val producer = new KafkaProducer[String, String](props)
How can I solve this?
I think in this case you can just use path-dependent types to get what you want:
sealed trait KkSerializer { type Out }

case object KkStringSeDe extends KkSerializer {
  type Out = String
}

case object KkByteArraySeDe extends KkSerializer {
  type Out = Byte
}

def determineProducerType(k: KkSerializer)(v: KkSerializer)(props: Properties): Eval[KafkaProducer[k.Out, v.Out]] =
  Later(new KafkaProducer[k.Out, v.Out](props))
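A hedged usage sketch: because the arguments are singleton objects, their Out members are concrete and the result type is inferred precisely (the empty Properties is a placeholder; real serializer configuration is assumed to happen elsewhere):
import java.util.Properties
import cats.Eval
import org.apache.kafka.clients.producer.KafkaProducer

val props = new Properties() // placeholder; real config assumed elsewhere
val p: Eval[KafkaProducer[String, Byte]] =
  determineProducerType(KkStringSeDe)(KkByteArraySeDe)(props)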
Could you please tell me what is wrong with this Scala code?
package com.user.common
class Notification(message: String, next: Option[Notification]) {
  def write(): String = {
    message
  }

  def getAll(): Stream[Notification] = {
    next match {
      case Some(n) => Stream.cons(n, n.getAll())
      case None => Stream.empty
    }
  }
}
case class Email(msg: String)
extends Notification(msg, None)
case class SMS(msg: String)
extends Notification(msg, Option(Email))
case class VoiceRecording(msg: String)
extends Notification(msg, Option(SMS))
The errors from the compiler are below.
[error] /common/Test.scala:15: type mismatch;
[error] found : Some[A]
[error] required: Option[com.user.common.Notification]
[error] case Some(n) => Stream.cons(n, n.getAll())
[error] ^
[error] /common/Test.scala:15: type mismatch;
[error] found : A
[error] required: com.user.common.Notification
[error] case Some(n) => Stream.cons(n, n.getAll())
[error] ^
[error] /common/Test.scala:15: value getAll is not a member of type parameter A
[error] case Some(n) => Stream.cons(n, n.getAll())
[error] ^
[error] /common/Test.scala:25: type mismatch;
[error] found : com.user.common.Email.type
[error] required: com.user.common.Notification
[error] extends Notification(msg, Option(Email))
[error] ^
[error] /common/Test.scala:28: type mismatch;
[error] found : com.user.common.SMS.type
[error] required: com.user.common.Notification
[error] extends Notification(msg, Option(SMS))
[error] ^
[error] 5 errors found
[error] (compile:compileIncremental) Compilation failed
I could not understand the problem, and I have no idea how to restructure the code. My basic idea is for each case class to keep a reference to the next one and to iterate over them until I reach None, from the top-level case class down to the lowest one.
case class SMS(msg: String)
  extends Notification(msg, Option(Email))
case class VoiceRecording(msg: String)
  extends Notification(msg, Option(SMS))
In the second parameter, you are passing an Option of the companion object, whereas an Option of an instance of the class is expected.
Maybe what you want is
case class SMS(msg: String)
  extends Notification(msg, Option(Email(msg)))
case class VoiceRecording(msg: String)
  extends Notification(msg, Option(SMS(msg)))
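With instances in the chain, the getAll traversal from the question works as intended. A small hedged sketch:
val recording = VoiceRecording("hi")
// getAll() streams the rest of the chain: SMS("hi"), then Email("hi")
recording.getAll().foreach(n => println(n.write()))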
I'm following the Scala Slick beginner guide, trying to create a simple schema, and I can't seem to find the column type even after importing the stuff listed at the beginning of the documentation.
import slick.driver.H2Driver.api._
import scala.concurrent.ExecutionContext.Implicits.global
/**
* Created by chris on 9/7/16.
*/
class BlockHeaderTable(tag: Tag) extends Table[BlockHeader](tag, "block_headers") {
  def version: column[UInt32]
  def previousBlockHash: column[DoubleSha256Digest]
  def merkleRootHash: column[DoubleSha256Digest]
  def time: column[UInt32]
  def nBits: column[UInt32]
  def nonce: column[UInt32]
}
and here is the error I am getting:
chris@chris-870Z5E-880Z5E-680Z5E:~/dev/bitcoins-spv-node$ sbt compile
[info] Loading project definition from /home/chris/dev/bitcoins-spv-node/project
[info] Set current project to bitcoins-spv-node (in build file:/home/chris/dev/bitcoins-spv-node/)
[info] Compiling 1 Scala source to /home/chris/dev/bitcoins-spv-node/target/scala-2.11/classes...
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:14: not found: type column
[error]   def version: column[UInt32]
[error]                ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:16: not found: type column
[error]   def previousBlockHash: column[DoubleSha256Digest]
[error]                          ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:18: not found: type column
[error]   def merkleRootHash: column[DoubleSha256Digest]
[error]                       ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:20: not found: type column
[error]   def time: column[UInt32]
[error]             ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:22: not found: type column
[error]   def nBits: column[UInt32]
[error]              ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:24: not found: type column
[error]   def nonce: column[UInt32]
[error]              ^
[error] 6 errors found
[error] (compile:compileIncremental) Compilation failed
The type of the columns is not column but Rep. column is actually a function that tells Slick which column to use:
class BlockHeaderTable(tag: Tag) extends Table[BlockHeader](tag,"block_headers") {
def version: Rep[UInt32] = column[UInt32]("version")
def previousBlockHash: Rep[DoubleSha256Digest] = column[DoubleSha256Digest]("previous_block_hash")
...
}
Also, I'm not sure which types you are using, but they are not supported out of the box by Slick (see here). You will need to write custom type mappers. For example, a UInt32 mapper:
implicit val UInt32Mapper = MappedColumnType.base[UInt32, Long](
  u => u.toLong, // convert UInt32 to Long here
  l => UInt32(l) // and Long to UInt32 here
)
Slick does not understand custom types, only standard JDBC types like Timestamp, Long, String, Char, Boolean, etc. In order to work with custom types, you have to provide a Slick mapping from the custom type to a JDBC type.
Provide Slick mappings for UInt32 and DoubleSha256Digest.
For example: DateTime is a custom type that Slick does not understand, but Slick does understand java.sql.Timestamp, so we provide a Slick mapping that tells Slick how to deal with DateTime:
implicit def jodaTimeMapping: BaseColumnType[DateTime] = MappedColumnType.base[DateTime, Timestamp](
  dateTime => new Timestamp(dateTime.getMillis),
  timeStamp => new DateTime(timeStamp.getTime))
Complete example:
case class Foo(str: String) //Foo needs Slick Mapping for below code to compile
implicit def fooMapping: BaseColumnType[Foo] = MappedColumnType.base[Foo, String](
  foo => foo.str, // Foo -> String when writing to the database
  str => Foo(str) // String -> Foo when reading back
)
case class Person(name: String, foo: Foo)
class Persons(tag: Tag) extends Table[Person](tag, "persons") {
  def name = column[String]("name")
  def foo = column[Foo]("foo") // use Foo directly because an implicit mapping is available in scope
  def * = (name, foo) <> (Person.tupled, Person.unapply)
}
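A hedged usage sketch: with fooMapping in implicit scope, a Foo column also behaves like any built-in column type in queries:
val persons = TableQuery[Persons]
// the mapping converts Foo("bar") to its String form in the generated SQL
val byFoo = persons.filter(_.foo === Foo("bar"))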
I have a Java interface like the following:
interface Bazz {
  void bar(String msg, Object... args);
}
I want to implement that interface (the implementation just happens to be in Scala) and use SLF4J to log the args parameter (for the sake of this question, using another Scala logging library is not an option).
And here is the (contrived) Scala code:
object Foo extends Bazz {
  private val log = LoggerFactory.getLogger(Main.getClass)

  def bar(args: AnyRef*) {
    log.info("Processing params: {}, {}, {}", args: _*)
    // ... do stuff with args...
  }
}

object Main extends App {
  val arr: Seq[String] = Array("A", "B", "C")
  val anyRef: AnyRef = arr
  Foo.bar(arr)
}
Running Main, I get the following output:
22:49:54.658 [run-main-0] INFO: sample.Main$ Processing params: WrappedArray(A, B, C), {}, {}
That's no good because the :_* is exploding the args, but the first element is an array. For reasons I won't go into here, I actually need to pass the elements of that array as the single Object[] parameter to SLF4J's Logger.info(String, Object[]) method.
Here's my first attempt:
def bar(args: AnyRef*) {
  val flatArgs = args.flatMap {
    case s: Seq[_] => s
    case x => Seq(x)
  }
  log.info("Processing params: {}, {}, {}", flatArgs: _*)
  // ... do stuff with args...
}
This does not compile, failing with the following error:
[error] Main.scala:18: overloaded method value info with alternatives:
[error] (x$1: org.slf4j.Marker,x$2: String,x$3: <repeated...>[Object])Unit <and>
[error] (x$1: org.slf4j.Marker,x$2: String)Unit <and>
[error] (x$1: String,x$2: Throwable)Unit <and>
[error] (x$1: String,x$2: <repeated...>[Object])Unit <and>
[error] (x$1: String,x$2: Any)Unit
[error] cannot be applied to (String, Any)
[error] log.info("Processing params: {}, {}, {}", flatArgs: _*)
[error] ^
[error] one error found
How can I change this code to make it compile and also print Processing params: A, B, C?
A problem here is that Int is an AnyVal, which means it is not an AnyRef/Object. Also, an Array is not a TraversableOnce.
Now, in the foo method, we pattern match on the varargs:
def foo(msg: String, varargs: AnyRef*) {
  varargs.toList match {
    case (h: TraversableOnce[_]) :: Nil => log.info(msg, h.toSeq.asInstanceOf[Seq[AnyRef]]: _*)
    case (h: Array[_]) :: Nil => log.info(msg, h.toSeq.asInstanceOf[Seq[AnyRef]]: _*)
    case _ => log.info(msg, varargs: _*)
  }
}
There are 3 cases we deal with:
case (h:TraversableOnce[_]) :: Nil => log.info(msg, h.toSeq.asInstanceOf[Seq[AnyRef]]:_*)
TraversableOnce is one of the base traits of the Scala collections; every collection I know of extends it (Array is not a collection; it doesn't extend it). It can contain either AnyVal or AnyRef, but the actual items at runtime will be wrapped primitives (java.lang.Integer, etc.), so we can downcast from Seq[Any] to Seq[AnyRef] here and we should be okay.
case (h:Array[_]) :: Nil => log.info(msg, h.toSeq.asInstanceOf[Seq[AnyRef]]:_*)
Just like we did with the TraversableOnce, turn it into a Seq and then downcast. The transformation into Seq will wrap any primitives for us.
case _ => log.info(msg,varargs:_*)
The general case, in which varargs could be empty or contain more than one entry.
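A hedged sketch of the three shapes at a call site:
foo("Processing params: {}, {}, {}", List("A", "B", "C"))  // TraversableOnce case
foo("Processing params: {}, {}, {}", Array("A", "B", "C")) // Array case
foo("Processing params: {} and {}", "A", "B")              // general case: two separate varargs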
Here's the answer to my re-stated (now obvious) problem. Apologies for not having a clear question earlier:
def bar(args: AnyRef*) {
  val flatArgs = args.flatMap {
    case s: Seq[_] => s
    case x => Seq(x)
  }.toArray.asInstanceOf[Array[Object]]
  log.info("Processing params: {}, {}, {}", flatArgs: _*)
}
Note: without the asInstanceOf[Array[Object]] the following occurs:
[error] type mismatch;
[error] found : Array[Any]
[error] required: Array[Object]
[error] Note: Any >: Object, but class Array is invariant in type T.
[error] You may wish to investigate a wildcard type such as `_ >: Object`. (SLS 3.2.10)
[error] }.toArray
[error] ^
[error] one error found
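With that flattening in place, the call from Main produces the desired output. A quick hedged check:
Foo.bar(Seq("A", "B", "C"))
// logs: Processing params: A, B, C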
Simpler variation on ggovan's answer, doing the match before calling foo:
val arr: Seq[Int] = Array(1, 2, 3)
val anyRef: AnyRef = arr
val msg = "Another message: {}, {}, {}"

anyRef match {
  case seq: Seq[_] => foo(msg, seq.asInstanceOf[Seq[AnyRef]]: _*)
  case x => foo(msg, x)
}
}