Trouble with SqlParser using Anorm - Scala

I'm trying to parse the result of a SQL query into a PushMessage (which is a class, not a case class - I don't know if that matters). Following the Anorm documentation, I have:
implicit val parser: RowParser[PushMessage] = Macro.namedParser[PushMessage]
val result = db.withConnection { implicit connection: Connection =>
  SQL"select * from PUSH_MESSAGES where VENDOR_ID=$requestedVendorId;".as(parser.*)
}
However, IntelliJ tells me that Macro.namedParser[PushMessage] returns Any, not a RowParser[PushMessage]. I tried removing the type annotation, but then I couldn't run the parser using the .as(parser.*) syntax.
How do I get this to return a RowParser?
Thanks in advance,

I guess you are using an Anorm version before 2.5.1 (April 2016), when the macros were updated to use a whitebox context. In that case, your IDE cannot properly infer the return type.
Note that Anorm 2.5.2 has just been released.
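If upgrading is not an option, a hand-written parser avoids the macro (and the IDE inference issue) entirely. Below is a minimal sketch; the column names (ID, VENDOR_ID, BODY) and the PushMessage constructor are assumptions for illustration, so adjust them to your actual schema and class:

import anorm.{RowParser, ~}
import anorm.SqlParser.{long, str}

// Hand-written RowParser: read each column explicitly, then build the instance.
// Column names and the PushMessage constructor below are hypothetical.
val pushMessageParser: RowParser[PushMessage] =
  long("ID") ~ long("VENDOR_ID") ~ str("BODY") map {
    case id ~ vendorId ~ body => new PushMessage(id, vendorId, body)
  }

It can then be used exactly like the macro-generated one, e.g. .as(pushMessageParser.*).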

Related

Error after updating Spark to 2.4

I recently updated my Spark version from 2.2 to 2.4.0 and started getting an error in this block (which was working fine with version 2.2):
object Crud_mod {
  def f(df: DataFrame,
        options: JDBCOptions,
        conditions: List[String]) {
    val url = options.url
    val tables = options.table
    val dialect = JdbcDialects_mod.get(url)
error: value table is not a member of org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions
[ERROR] val tables = options.table
So I took a look inside the Spark sources, and the value table seems to exist in the JDBCOptions class.
What am I missing, please?
Your source link points to a constructor that accepts table as an argument, but I can't find a table value in the class itself.
However, there is a tableOrQuery method (here) that can be used for your needs, I think.
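For illustration, here is the block from the question with that substitution; it is only a sketch (JdbcDialects_mod is your own helper from the original snippet), assuming tableOrQuery is the intended replacement:

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions

object Crud_mod {
  def f(df: DataFrame,
        options: JDBCOptions,
        conditions: List[String]): Unit = {
    val url = options.url
    // In Spark 2.4 the former `table` member is gone; `tableOrQuery` returns
    // the table name (or the query) that was passed in the JDBC options.
    val tables = options.tableOrQuery
    val dialect = JdbcDialects_mod.get(url)
    // ...
  }
}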

Nested JSON in Scala, Circe, Slick

I have nested JSON in my database and have figured out the corresponding case classes. I am using circe, Slick and Akka HTTP in my web API application.
My case classes are:
case class Sen(
  sentences: Array[File]
)

case class File(
  content: String
)
I have written GetResult instances for the same nesting, but I am having problems with the array in the case class.
implicit lazy val getFile = GetResult(r => Array[File](r.<<))
implicit lazy val SenObj = GetResult(r => Sen(getFile(r)))
Can anyone tell me how to solve this?
Following is the error I get while compiling:
Error:diverging implicit expansion for type slick.jdbc.GetResult[T]
starting with method createGetTuple22 in object GetResult
implicit lazy val getFile = GetResult(r => Array[File](r.<<))
Your definition of getFile is manually constructing an Array, and specifically you're asking for an Array[File]. There's no GetResult[File], meaning that r.<< won't be able to convert a column value into a File.
Can anyone tell me how to solve this?
You'll at least need a GetResult[File] defined.
However, it's not clear from the question how the JSON part is intended to work:
Perhaps you have a column containing text which your application treats as JSON. If that's the case, I suggest doing JSON array conversion outside of your Slick code layer. That will keep the mapping to and from the database straightforward.
Or perhaps you have a JSON type in your database and you're using database-specific operations. In that case, I guess it will depend on what control you have there, and it probably does make sense to try to do JSON-array operations at the Slick layer. (That's the case for Postgres, for example, via the slick-pg library.)
But that's a different question.
As a general note, I suggest always being explicit about the types of GetResult you are defining:
implicit lazy val getFile: GetResult[Array[File]] =
  GetResult(r => Array[File](r.<<))

implicit lazy val SenObj: GetResult[Sen] =
  GetResult(r => Sen(getFile(r)))
...to be clear about what instances you have available. I find that helps in debugging these situations.
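For the first case (a text column holding a JSON array), here is a minimal sketch of what the instances could look like, assuming the sentences are stored in a single JSON text column and using circe's generic derivation; the error handling is illustrative only:

import io.circe.generic.auto._
import io.circe.parser.decode
import slick.jdbc.GetResult

// Read the column as plain text, then decode the JSON array with circe.
implicit lazy val getFiles: GetResult[Array[File]] =
  GetResult { r =>
    decode[Array[File]](r.nextString()) match {
      case Right(files) => files
      case Left(_)      => Array.empty[File] // or fail loudly, depending on your needs
    }
  }

implicit lazy val SenObj: GetResult[Sen] =
  GetResult(r => Sen(getFiles(r)))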

Scala Slick: MappedColumnType cannot find implicit value for BaseColumnType[String]

I'm trying to set up database columns in Slick with non-primitive objects. I've spent the past day researching MappedColumnType for mapping custom objects to columns, and as far as I can tell I'm implementing them as people recommend. Unfortunately, the following code produces an error:
implicit val localDateMapper = MappedColumnType.base[LocalDate, String](
  // map date to String
  d => d.toString,
  // map String to date
  s => LocalDate.parse(s)
)
And here is the error:
could not find implicit value for evidence parameter of type slick.driver.H2Driver.BaseColumnType[String]
I've seen multiple examples where people map custom objects to and from Strings. I figure there must be something I'm missing?
For reference, I'm using Play Slick 1.1.1 and Scala 2.11.6. The former supports Slick 3.1.
You can import a BaseColumnType[String] with:
import slick.driver.H2Driver.api.stringColumnType
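Putting it together, a minimal sketch assuming java.time.LocalDate and the H2 profile from the question; importing the whole api object (or just stringColumnType) brings the implicit into scope:

import java.time.LocalDate
import slick.driver.H2Driver.api._ // provides the implicit BaseColumnType[String]

// With stringColumnType in scope, the evidence parameter is found and this compiles.
implicit val localDateMapper: BaseColumnType[LocalDate] =
  MappedColumnType.base[LocalDate, String](
    d => d.toString,        // LocalDate -> ISO-8601 String for storage
    s => LocalDate.parse(s) // String -> LocalDate when reading back
  )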

Avro4S : Could not find implicit value for parameter builder

I am using avro4s (https://github.com/sksamuel/avro4s) and wrote this code:
implicit val schema = AvroSchema[SalesRecord]
val output = AvroOutputStream[SalesRecord](new File(outputLocation))
output.write(salesList)
output.flush
output.close
But I get a compile-time error:
could not find implicit value for parameter builder: shapeless.Lazy[....]
Not enough arguments for method apply
There was a bug in 1.2.x with private vals in a case class which caused the error you've seen here. That's fixed in 1.3.0 and should solve your problem.
(If it's not private vals, you'd need to post your SalesRecord object for us to take a look at, and I'll update this answer with a solution.)
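For reference, a minimal sketch that mirrors the code in the question on avro4s 1.3.x; the SalesRecord definition here is hypothetical, the key point being that its fields are plain public constructor parameters rather than private vals:

import java.io.File
import com.sksamuel.avro4s.{AvroOutputStream, AvroSchema}

// Hypothetical record: ordinary public case class fields, no private vals.
case class SalesRecord(id: Long, amount: Double, region: String)

val salesList = Seq(SalesRecord(1L, 9.99, "EU"), SalesRecord(2L, 4.50, "US"))
val outputLocation = "sales.avro"

implicit val schema = AvroSchema[SalesRecord]
val output = AvroOutputStream[SalesRecord](new File(outputLocation))
output.write(salesList)
output.flush()
output.close()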

Is ??? a valid symbol or operator in Scala?

I have seen the symbol
???
used in Scala code. However, I don't know whether it is meant to be pseudocode or actual Scala code, but my Eclipse IDE for Scala doesn't flag it, and the Eclipse worksheet actually evaluates it.
I haven't been able to find anything via a Google search.
Any help will be appreciated.
Thanks
Yes, this is a valid identifier.
Since Scala 2.10, there is a ??? method in Predef which simply throws a NotImplementedError.
def ??? : Nothing = throw new NotImplementedError
This is intended to be used for quickly sketching the skeleton of some code, leaving the implementations of methods for later, for example:
class Foo[A](a: A) {
  def flatMap[B](f: A => Foo[B]): Foo[B] = ???
}
Because it has a type of Nothing (which is a subtype of every other type), it will type-check in place of any value, allowing you to compile the incomplete code without errors. It's often seen in exercises, where the solution needs to be written in place of ???.
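To see the runtime behaviour, here is a tiny sketch (the method name is made up): the code compiles because Nothing conforms to Int, but calling the unimplemented method throws.

def parse(input: String): Int = ??? // compiles fine: Nothing conforms to Int

parse("42") // throws scala.NotImplementedError: an implementation is missing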
To search for method names that are ASCII or Unicode symbols:
SO search: https://stackoverflow.com/search?q=[scala]+%22%3F%3F%3F%22
finds this thread: Scala and Python's pass
Scalex covers Scala 2.9.1 and Scalaz 6.0: http://scalex.org/?q=%3C%3A%3C