Error gridfs with reactiveMongo 0.11.9 scala and playframework - mongodb

I'm trying to follow this example https://github.com/sgodbillon/reactivemongo-demo-app and then implement it in my project.
However, I've run into several difficulties, because I'm using "org.reactivemongo" %% "play2-reactivemongo" % "0.11.9" rather than the version used in the tutorial. I solved some of the problems with different imports, but this one has me completely blocked:
type arguments [play.api.libs.json.JsObject,Articles.this.JSONReadFile] do not conform to method find's type parameter bounds [S,T <: reactivemongo.api.gridfs.ReadFile[reactivemongo.play.json.JSONSerializationPack.type, _]](line 97)
result <- maybeArticle.map { article =>
97    gridFS.find[JsObject, JSONReadFile](
98      Json.obj("article" -> article.id.get)).collect[List]().map { files =>
99        val filesWithId = files.map { file =>
Any help would be appreciated!
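The bound in the error says that the second type parameter of find must be a ReadFile built on reactivemongo.play.json.JSONSerializationPack. A likely cause is that the JSONReadFile in scope comes from an older package and is parameterized by a different serialization pack. A minimal sketch of a conforming alias, inferred from the error message itself (not verified against an actual 0.11.9 project; the JsValue id type is an assumption):

```scala
import play.api.libs.json.{ JsObject, JsValue, Json }
import reactivemongo.api.gridfs.ReadFile
import reactivemongo.play.json._ // implicits for the play-json serialization pack

trait GridFSTypes {
  // An alias that satisfies the bound T <: ReadFile[JSONSerializationPack.type, _]
  // from the compiler error; using JsValue as the id type is an assumption.
  type JSONReadFile = ReadFile[JSONSerializationPack.type, JsValue]

  // With such an alias in scope, the call on line 97 should type-check:
  // gridFS.find[JsObject, JSONReadFile](Json.obj("article" -> article.id.get))
}
```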

Related

for comprehension with tuple, withFilter is not a member error

The following code snippet
import util.control.TailCalls._
for {
  (num, ch) <- done((3, '3'))
} yield num
fails to compile with error message:
value withFilter is not a member of util.control.TailCalls.TailRec[(Int, Char)]
I am using Scala 2.12.7. How to avoid this error? (IntelliJ Idea 18.3.1 with Scala plugin v2018.3.4 does not show error.)
To avoid the call to withFilter while keeping the current syntax, you can use a compiler plugin that desugars for comprehensions differently. One option is better-monadic-for.
Adding this to the build.sbt file should make the code in the question compile:
addCompilerPlugin("com.olegpy" %% "better-monadic-for" % "0.3.0-M4")
(Though it has other, usually positive, effects too; please check its documentation.)
Another option is implementing withFilter with an extension method, for example like this (and having it in scope at usage site):
implicit class TailCallsExtension[A](t: TailRec[A]) {
  // Both branches yield the same result; this method exists only to satisfy
  // the compiler's withFilter requirement for pattern bindings.
  def withFilter(pred: A => Boolean): TailRec[A] =
    t.flatMap(a => if (pred(a)) t else done(a))
}
Seemingly there is no filtering in the code, but a pattern match on the left-hand side of <- in a Scala for comprehension is translated into a call to withFilter. TailCalls does not support withFilter, so this will not compile. The following rewrite compiles though:
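Concretely, the compiler translates the tuple pattern roughly like this (a sketch of the desugaring, not the exact trees scalac emits):

```scala
import util.control.TailCalls._

// The original for comprehension:
//   for { (num, ch) <- done((3, '3')) } yield num
// desugars approximately to:
done((3, '3'))
  .withFilter { case (num, ch) => true; case _ => false } // fails: TailRec has no withFilter
  .map { case (num, ch) => num }
```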
import util.control.TailCalls._
done((3, '3')).map { case (num, ch) => num }

Understanding methods Vs functions in scala

I am learning the difference between methods and functions. I am following this link
http://jim-mcbeath.blogspot.co.uk/2009/05/scala-functions-vs-methods.html
The article says if you compile the following code:
class test {
  def m1(x: Int) = x + 3
  val f1 = (x: Int) => x + 3
}
We should get two files
1. test.class
2. test$$anonfun$1.class
But I do not get the second file. Secondly, the example says that if we execute the following command in the REPL, we will get the output below:
scala> val f1 = (x:Int) => x+3
f1: (Int) => Int = <function>
But I get only this
scala> val f1 = (x:Int) => x+3
f1: Int => Int = $$Lambda$1549/1290654769@6d5254f3
Is it because we are using a different version? Please help.
Scala 2.11 and earlier versions behave as shown in the blog post.
The behavior changed in Scala 2.12. Scala now uses the lambda support that was added to version 8 of the JVM, so it doesn't need to emit the extra .class file. As a result, the .jar files produced by 2.12 are usually a lot smaller.
As a side effect of this, Scala can't override toString anymore, so you see the standard JVM toString output for lambdas.
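To see both behaviours side by side, here is a minimal sketch (the class and value names are mine):

```scala
class Test {
  // A method: compiled to a plain JVM method on Test in all Scala versions.
  def m1(x: Int): Int = x + 3

  // A function value: on 2.11 and earlier this produced an extra anonfun
  // class file; on 2.12+ it is compiled with the JVM's invokedynamic /
  // LambdaMetafactory support, so no separate .class file is emitted.
  val f1: Int => Int = (x: Int) => x + 3

  // Eta-expansion turns the method into a function value on demand.
  val f2: Int => Int = m1 _
}
```

Either way, they behave identically at the call site; only the compiled representation (and hence the toString you see in the REPL) differs between versions.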

Converting tutorial to a running program in IntelliJ IDEA

I am trying to convert the tutorial here http://atnos-org.github.io/eff/org.atnos.site.Introduction.html into a running Scala program inside IntelliJ-IDEA. The code runs in the command line REPL, but not in the IDE.
I have simply copied and pasted all the code into one file and added an object Intro extends App.
Here is the code:
class Tutorial {
}

object Intro extends App {
  import cats._
  import cats.data._
  import org.atnos.eff._

  type ReaderInt[A] = Reader[Int, A]
  type WriterString[A] = Writer[String, A]
  type Stack = Fx.fx3[WriterString, ReaderInt, Eval]

  import org.atnos.eff.all._
  import org.atnos.eff.syntax.all._

  // useful type aliases showing that the ReaderInt and the WriterString effects are "members" of R
  // note that R could have more effects
  type _readerInt[R] = ReaderInt |= R
  type _writerString[R] = WriterString |= R

  def program[R: _readerInt : _writerString : _eval]: Eff[R, Int] = for {
    // get the configuration
    n <- ask[R, Int]
    // log the current configuration value
    _ <- tell("the required power is " + n)
    // compute the nth power of 2
    a <- delay(math.pow(2, n.toDouble).toInt)
    // log the result
    _ <- tell("the result is " + a)
  } yield a

  println(program[Stack].runReader(6).runWriter.runEval.run)
}
The compiler error is Cannot resolve symbol run in the last line.
Here is my build.sbt file, following the instructions for the library:
name := "Tutorial"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.typelevel" %% "cats" % "0.9.0"
libraryDependencies += "org.atnos" %% "eff" % "3.1.0"
// to write types like Reader[String, ?]
addCompilerPlugin("org.spire-math" %% "kind-projector" % "0.9.3")
// to get types like Reader[String, ?] (with more than one type parameter) correctly inferred
// this plugin is not necessary with Scala 2.12
addCompilerPlugin("com.milessabin" % "si2712fix-plugin_2.11.8" % "1.2.0")
Edmund Noble answered my question here: https://github.com/atnos-org/eff/issues/80#issuecomment-287667353
The reason IntelliJ cannot figure out your code in particular appears to be that IntelliJ cannot simulate implicit-directed type inference. The Member implicits have a type member, Out, which represents the remaining effect stack. When the IDE cannot figure it out, it substitutes a fresh type variable, so the run ops constructor cannot be called: the type inferred by IntelliJ is Eff[m.Out, A] rather than Eff[NoFx, A].
FIX: I was able to separately compile the file and then run it, even though the error is still highlighted in the IDE.
Unless there is some feature in IntelliJ IDEA that I haven't enabled, this looks like a limitation and/or bug in IntelliJ IDEA.

Support for Scala Enumeration by net.liftweb.json

I am using the liftweb JSON converter and got it working, by including the dependency in build.sbt like this:
"net.liftweb" %% "lift-json" % "2.6.2"
This all worked before I added Enumerations.
I can see here that Enumerations are supported, and you should do something like this:
// Scala enums
implicit val formats = net.liftweb.json.DefaultFormats + new EnumSerializer(MyEnum)
But the problem is in my environment the net.liftweb.json.ext package is not recognized. This is the package where EnumSerializer lives.
There is a separate extensions lib that you would need to include. Adding an extra line something like:
"net.liftweb" %% "lift-json-ext" % "2.6.2"
should do the trick.
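With the extra dependency in place, a minimal end-to-end sketch might look like this (the enumeration, case class, and field names are made up for illustration):

```scala
import net.liftweb.json._
import net.liftweb.json.ext.EnumSerializer

object EnumJsonDemo extends App {
  // A plain Scala enumeration (hypothetical example).
  object MyEnum extends Enumeration {
    val Red, Green, Blue = Value
  }

  case class Paint(color: MyEnum.Value)

  // Register the serializer for this enumeration.
  implicit val formats: Formats = DefaultFormats + new EnumSerializer(MyEnum)

  // Round-trip a value through JSON.
  val json = Serialization.write(Paint(MyEnum.Red))
  val back = Serialization.read[Paint](json)
  assert(back.color == MyEnum.Red)
}
```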
I had an enumeration that was generated from a gRPC proto, and in that case the EnumSerializer didn't work for me. Instead, I created a custom serializer, which worked great.
case object GrpcTimeUnitSerializer extends CustomSerializer[TimeUnit](format => (
  {
    case JString(tu) => TimeUnit.fromName(tu.toUpperCase).get
    case JNull => throw new GrpcServiceException(Status.INTERNAL.withDescription("Not allowed null value for the type TimeUnit."))
  },
  {
    case tu: TimeUnit => JString(tu.toString)
  }
))
And here is the DefaultFormats definition:
implicit val formats: Formats = DefaultFormats + GrpcTimeUnitSerializer

Avoiding recompilation with quasiquotes

Quasiquotes simplify many things when writing macros in Scala. However, I noticed that macros containing quasiquotes can be recompiled every time a compilation is triggered in SBT, even though neither the macro implementation nor any of its call sites have changed and need recompilation.
This doesn't seem to happen if the code in the quasiquotes is fairly simple; it seems to happen only if there's a dependency on another class. I noticed that rewriting everything with reify seems to solve the recompilation problem, but I can't manage to rewrite the last part without quasiquotes...
My macro avoids reflection on startup by creating wrapper functions during compilation.
I have the following classes:
object ExportedFunction {
  def apply[R: Manifest](f: Function0[R], fd: FunctionDescription): ExportedExcelFunction =
    new ExcelFunction0[R] {
      def apply: R = f()
      val functionDescription = fd
    }

  def apply[T1: Manifest, R: Manifest](f: Function1[T1, R], fd: FunctionDescription): ExportedExcelFunction =
    new ExcelFunction1[T1, R] {
      def apply(t1: T1): R = f(t1)
      val functionDescription = fd
    }

  // ... and so on... until Function17...
}
I then analyze an object and export any member function using the described interface like so:
def export(registrar: FunctionRegistrar,
           root: Object,
           <...more args...>) = macro exportImpl

def exportImpl(c: Context)(registrar: c.Expr[FunctionRegistrar],
                           root: c.Expr[Object],
                           <...>): c.Expr[Any] = {
  import c.universe._
  <... the following is simplified ...>
  root.typeSignature.members.flatMap {
    case x if x.isMethod =>
      val method = x.asMethod
      val callee = c.Expr(method)
      val desc = q"""FunctionDescription(<...result from reflective lookup...>)"""
      val export = q"ExportedFunction($callee _, $desc)"
      q"$registrar.+=({$export})"
I can rewrite the first and last line with reify but I don't manage to rewrite the second line, my best shot is with quasiquotes:
val export = reify {
  ...
  ExportedFunction(c.Expr(q"""$callee _"""), desc)
  ...
}.tree
But this results in:
overloaded method value apply with alternatives... cannot be applied to (c.Expr[Nothing], c.universe.Expr[FunctionDescription])
I think the compiler is missing the implicits, or maybe this code will only work for a function with a fixed number of arguments, since it needs to know at macro compile time how many arguments the method has? However it works if everything is written using quasiquotes...
From the description of the SBT problem I can assume that you're using macro paradise for 2.10.x and are facing https://github.com/scalamacros/paradise/issues/11. I was planning to address this issue this week, so the fix should arrive really soon. In the meanwhile you could use a workaround described on the issue page.
As for the reify problem, not all quasiquotes can be rewritten using reify. Limitations such as the one you have faced here were a very strong motivator towards shifting our focus to quasiquotes in Scala 2.11.
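To make that limitation concrete: reify can only fill holes with statically typed Exprs via .splice, while quasiquotes splice raw Trees, so a shape whose type is only known at macro runtime (like $callee _ for a method of arbitrary arity) has no reify equivalent. A rough sketch, with illustrative names and the 2.10-era macro API assumed (import c.universe._ in scope inside a macro implementation):

```scala
// quasiquote: splices an arbitrary untyped Tree
val callee: Tree = q"someMethod"            // hypothetical method reference
val viaQuote: Tree = q"println($callee _)"  // eta-expansion spliced as a raw tree

// reify: each hole must be a typed Expr, spliced with .splice; there is no
// Expr type covering "a function of unknown arity", which is why the
// eta-expansion above cannot be expressed this way.
val msg: c.Expr[String] = c.Expr[String](Literal(Constant("hello")))
val viaReify: Tree = reify { println(msg.splice) }.tree
```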
For the record, these SBT settings (upgrade to newer version) fixed it:
...
settings = Seq(
  libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value,
  libraryDependencies += "org.scalamacros" % "quasiquotes" % "2.0.0-M3" cross CrossVersion.full,
  autoCompilerPlugins := true,
  addCompilerPlugin("org.scalamacros" % "paradise" % "2.0.0-M3" cross CrossVersion.full)
)