Cannot make Slick 3.2 mapped table example work - Scala

I'm just trying to make the "User" example work (http://slick.lightbend.com/doc/3.2.0/schemas.html#mapped-tables), but it doesn't compile.
As I'm targeting MySQL, I added the following imports:
import slick.jdbc.MySQLProfile.Table
import slick.jdbc._
import slick.lifted._
That didn't compile either; I got a lot of errors like
Error:(16, 23) could not find implicit value for parameter tt: slick.ast.TypedType[Int]
def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
After looking for implicits, I added with MySQLProfile.ImplicitColumnTypes to the Users class extending Table:
class Users(tag: Tag) extends Table[User](tag, "users") with MySQLProfile.ImplicitColumnTypes
Now I'm stuck with
Error:(19, 15) value ? is not a member of slick.lifted.Rep[Int]
def * = (id.?, first, last) <> (User.tupled, User.unapply _)
<> is not found either.
You may notice User.unapply _ instead of User.unapply as stated in the doc; the compiler was complaining with plain User.unapply.
What am I doing wrong? Why is the doc so unclear?

The code imports slick.jdbc.MySQLProfile.Table, but it needs to bring in the whole api instead:
import slick.jdbc.MySQLProfile.api._
That will give you the implicits you were looking for, and the code should compile.
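With that import in place, the manual's example compiles as-is; a minimal version:

import slick.jdbc.MySQLProfile.api._

case class User(id: Option[Int], first: String, last: String)

class Users(tag: Tag) extends Table[User](tag, "users") {
  def id    = column[Int]("id", O.PrimaryKey, O.AutoInc)
  def first = column[String]("first")
  def last  = column[String]("last")
  // id.? lifts the column to Rep[Option[Int]] to match User.id
  def * = (id.?, first, last) <> (User.tupled, User.unapply)
}

val users = TableQuery[Users]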
BTW: the Slick examples are compiled
Incidentally, the Slick manual examples are compiled. This means you can get at the code to see if there are extra details in there you need.
For example, for the page you linked to, if you scroll to the top there's an "Edit this page on github" link. Clicking that takes you to the source and in there you'll find a reference to the Scala source:
.. includecode:: code/LiftedEmbedding.scala#mappedtable
...and that file is also in GitHub: LiftedEmbedding.scala
A bit long-winded, but it's sometimes useful to know that the examples are compiled and that you can find them.
The details of how that happens are just about to change to a different system, but the principles should remain the same. The details (filenames, import syntax) above will be different.

Related

Lightbend examples syntax error

I just wonder if I have messed something up or if it's just the unavoidable pain of using Scala. I wanted to test out Slick, so I decided to run the activator-play-slick-angularjs example from Lightbend. Unfortunately, I get syntax errors when using
lazy protected val empTableQuery: TableQuery[EmployeeTable] = TableQuery[EmployeeTable]
in any possible way. In the filtering examples, the type the Scala plugin requires me to use is Any, e.g.
def delete(id: Int): Future[Int] = db.run { empTableQuery.filter(_.id === id).delete }
The _.id part yields a syntax error. I bet I am just missing something, because I can't imagine a single developer willing to work in 2017 without syntax assistance from an IDE.
In case somebody meets this problem in the future: in this example EmployeeTable is defined with private[EmployeeTable], which makes it invisible to the EmployeeRepository class. Just drop the private[EmployeeTable] part of the definition to make everything work smoothly.
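As a sketch of the fix (the profile, case class, and column names here are illustrative assumptions, not the exact activator code):

import slick.jdbc.H2Profile.api._

case class Employee(id: Int, name: String)

class EmployeeTable(tag: Tag) extends Table[Employee](tag, "employee") {
  // Was: private[EmployeeTable] def id = ... which hides the column from
  // EmployeeRepository, so empTableQuery.filter(_.id === id) cannot resolve id.
  def id   = column[Int]("id", O.PrimaryKey, O.AutoInc)
  def name = column[String]("name")
  def *    = (id, name) <> (Employee.tupled, Employee.unapply)
}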

How to get the full class name of a dynamically created class in Scala

I have a situation where I have to get the fully qualified name of a class I generate dynamically in Scala. Here's what I have so far.
import scala.reflect.runtime.universe
import scala.tools.reflect.ToolBox
val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
val generatedClass = "class Foo { def addOne(i: Int) = i + 1 }"
tb.compile(tb.parse(generatedClass))
val fooClass: String = ???
Clearly this is just a toy example, but I don't know how to get the fully qualified name of Foo. I tried sticking a package declaration into the code, but that threw an error when calling tb.compile.
Does anyone know how to get the fully qualified class name or (even better) to specify the package that Foo gets compiled under?
Thanks
EDIT
After using the proposed solution I was able to get the class name. However, the next step is to register this class so I can act on it later. Specifically, I'm trying to make use of UDTRegistration within Apache Spark to handle my own custom UserDefinedTypes. This strategy works fine when I manually create all the types; however, I want to use them to extend other types I may not know about.
After reading this, it seems like what I'm trying to do might not be possible with code compiled at runtime using reflection. Maybe a better solution is to use Scala macros, but I'm very new to that area.
You may use define instead of compile to generate the new class and get its package:
val cls = tb.define(tb.parse(generatedClass).asInstanceOf[universe.ImplDef])
println(cls.fullName) //__wrapper$1$d1de39015284494799acd2875643f78e.Foo
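For completeness, the full snippet with the ??? filled in (the wrapper package name is generated fresh per toolbox, so read it from the returned symbol rather than hard-coding it):

import scala.reflect.runtime.universe
import scala.tools.reflect.ToolBox

val tb = universe.runtimeMirror(getClass.getClassLoader).mkToolBox()
val generatedClass = "class Foo { def addOne(i: Int) = i + 1 }"

// define compiles the class and returns its Symbol
val cls = tb.define(tb.parse(generatedClass).asInstanceOf[universe.ImplDef])
val fooClass: String = cls.fullName // e.g. __wrapper$1$...Foo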

Spark toDF cannot resolve symbol after importing sqlContext implicits

I'm working on writing some unit tests for my Scala Spark application
In order to do so I need to create different dataframes in my tests. So I wrote a very short DFsBuilder code that basically allows me to add new rows and eventually create the DF. The code is:
class DFsBuilder[T](private val sqlContext: SQLContext, private val columnNames: Array[String]) {
  var rows = new ListBuffer[T]()

  def add(row: T): DFsBuilder[T] = {
    rows += row
    this
  }

  def build(): DataFrame = {
    import sqlContext.implicits._
    rows.toList.toDF(columnNames: _*) // UPDATE: added :_* because it was accidentally removed in the original question
  }
}
However, the toDF call doesn't compile; I get a cannot resolve symbol toDF error.
I wrote this builder with generics since I need to create different kinds of DFs (different numbers of columns and different column types). The way I would like to use it is to define some case class in the unit test and use it with the builder.
I know this issue somehow relates to the fact that I'm using generics (probably some kind of type erasure issue), but I can't quite put my finger on what exactly the problem is.
And so my questions are:
Can anyone show me where the problem is, and hopefully how to fix it?
If this issue cannot be solved this way, could someone perhaps offer another elegant way to create dataframes? (I prefer not to pollute my unit tests with creation code.)
I obviously googled this issue first, but only found examples where people forgot to import sqlContext.implicits, or something about a case class being out of scope, which is probably not the same issue I'm having.
Thanks in advance
If you look at the signatures of toDF and of SQLImplicits.localSeqToDataFrameHolder (the implicit function used here), you'll spot two issues:
Type T must be a subclass of Product (the supertype of all case classes and tuples), and you must provide an implicit TypeTag for it. To fix this, change the declaration of your class to:
class DFsBuilder[T <: Product : TypeTag](...) { ... }
The columnNames argument is not of type Array; it's a "repeated parameter" (like Java's "varargs"; see section 4.6.2 here), so you have to expand the array into arguments:
rows.toList.toDF(columnNames: _*)
With these two changes, your code compiles (and works).
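Putting both fixes together, the whole builder might look like this (a sketch, assuming the SQLContext-era Spark API used in the question):

import scala.collection.mutable.ListBuffer
import scala.reflect.runtime.universe.TypeTag
import org.apache.spark.sql.{DataFrame, SQLContext}

// T is bounded by Product and carries a TypeTag, which is what
// sqlContext.implicits needs to derive the DataFrame schema
class DFsBuilder[T <: Product : TypeTag](private val sqlContext: SQLContext,
                                         private val columnNames: Array[String]) {
  private val rows = new ListBuffer[T]()

  def add(row: T): DFsBuilder[T] = {
    rows += row
    this
  }

  def build(): DataFrame = {
    import sqlContext.implicits._
    rows.toList.toDF(columnNames: _*) // expand the array as repeated parameters
  }
}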

How can I use Monocle's built in law implementations to test my own lenses?

I noticed that Monocle has implementations of lens laws that are used to test the library's internals. They seem to be nicely generalized and modularized. I tried to use them to test my own lenses, but I am lost in the jungle of dependencies. Has anybody tried to do this, and could you post an example? The documentation does not seem to talk about laws at all. Thank you.
To elaborate, here is what I am trying to do (fumbling; I'm not sure this is the intended way to use the API):
it should "pass the LensLaws" in check {
forAll {(c: (String,Int), a: String) =>
new monocle.law.LensLaws(l).setGet(c,a) } }
where l is the Monocle lens visible in scope. I am receiving the following error message:
No implicit view available from monocle.internal.IsEq[String] => org.scalacheck.Prop
As far as I can see, the setGet law constructs an IsEq object, and I was not able to find out how to turn it into a Prop (or Boolean).
I can also see that the framework uses a function checkAll to test all the LensLaws at once, but I could not get this to work in my own code either. Any help appreciated.
The following works for me:
import org.scalatest.FunSuite
import org.typelevel.discipline.scalatest.Discipline

class LensesSuite extends FunSuite with Discipline {
  import scalaz._
  import Scalaz._

  checkAll("Lens l", monocle.law.discipline.LensTests(l))
}
It turns out the main problem was my relatively thin knowledge of ScalaTest. checkAll is a checker provided by org.typelevel.discipline.scalatest.Discipline, which only works in FunSuite (not in the FlatSpec I was using). It took me ages to figure that out...
I still have no idea how to elegantly use this RuleSet (LensTests) in another spec style. It seems strange that Monocle's choice of RuleSet would force a particular spec style on the project using these tests.
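One workaround (an untested sketch; the suite name is illustrative) relies on the fact that a discipline RuleSet exposes its laws as a plain ScalaCheck Properties via .all, so each law can be registered by hand in whatever spec style you prefer:

import org.scalatest.FlatSpec
import org.scalatest.prop.Checkers

class LensesSpec extends FlatSpec with Checkers {
  behavior of "Lens l"
  // RuleSet.all is an org.scalacheck.Properties; register each law as its own test
  for ((name, prop) <- monocle.law.discipline.LensTests(l).all.properties)
    it should name in { check(prop) }
}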

How do I find the package for the time conversion implicits in Scala?

On the official API page I searched for "time" but couldn't find the class. I looked at all the methods on RichLong and RichInt, but couldn't find the conversion methods either.
I'm specifically talking about the methods that convert between int/long to some kind of rich object:
2 hours + 12 seconds
Note I'm not asking what the package is; I want to know how to find it.
Those aren't in the standard Scala API. Are you using the scala-time wrapper for JodaTime? If so, that would tell you where to look. In general, if you know which import enables the capability, it helps a lot when trying to find documentation!
If you know a method name, you can click on the little letters at the top of the left panel in ScalaDoc, just below the search text field; this brings up a list of everything in the docs with that name, including methods (and tells you how to find it).
If a class doesn't have a method itself, you can use Scala to tell you what class it's getting converted to:
def whoHasSize[A](a: A)(implicit ev: A => { def size: Int }) = ev(a).getClass.getName
scala> whoHasSize("fish")
res1: java.lang.String = scala.collection.immutable.StringOps
So here you see that the string "fish" does not itself have a size method; instead, that's granted by the scala.collection.immutable.StringOps class. You could do the same thing to find out about seconds.
Finally, stew's answer is probably what you were really looking for!
You can use the -Xlog-implicits flag to have the compiler show you where it is finding the implicit conversions.
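For example, in an sbt build you can enable it project-wide:

// build.sbt: log every implicit search the compiler attempts
scalacOptions += "-Xlog-implicits"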