Missing Cats Functor[Future] instance in Scala

I am trying to use OptionT to combine methods returning Future[Option[T]] in a for-comprehension:
import cats.data._
import cats.implicits._
import cats.instances.future._
for {
  data <- OptionT(repo.getData(id))
  ...
}
The compiler error I am getting is:
could not find implicit value for parameter F: cats.Functor[scala.concurrent.Future]
This recent example suggests that this is (or was?) possible, as do the docs in the pull request that added OptionT, and the cats Functor docs. What am I missing here?
Thank you

By importing cats.implicits._ you are already importing both cats.syntax.AllSyntax and cats.instances.AllInstances, so the extra cats.instances.future._ import duplicates the Future instances and makes the implicit search ambiguous. Try using just these imports:
import cats.data._
import cats.implicits._
or (according to your needs):
import cats.data._
import cats.instances.future._
or more specifically:
import cats.data._
import cats.instances.future.catsStdInstancesForFuture
You may also need:
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
Note: in a production environment you will of course want to provide a real ExecutionContext implicitly rather than the global one.
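For context, here is a dependency-free sketch of the plumbing that OptionT hides behind a for-comprehension, with hypothetical getData/getMore stubs standing in for real repository methods:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

object FutureOptionDemo extends App {
  // Hypothetical stand-ins for repo methods returning Future[Option[T]]
  def getData(id: Int): Future[Option[Int]] = Future.successful(Some(id * 10))
  def getMore(x: Int): Future[Option[Int]] = Future.successful(Some(x + 1))

  // This is what OptionT automates: each step only runs if the previous
  // one produced Some(...), and None short-circuits the whole chain
  val combined: Future[Option[Int]] =
    getData(1).flatMap {
      case Some(x) => getMore(x)
      case None    => Future.successful(None)
    }

  println(Await.result(combined, 5.seconds)) // prints Some(11)
}
```

Note that flatMap here (like OptionT's map/flatMap through Functor[Future]) ultimately needs the implicit ExecutionContext, which is why the global import matters.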

The following imports work for me (also mentioned in the accepted answer):
import cats.data.OptionT
import cats.instances.future._ // or import cats.implicits._
// as implicits include FutureInstances
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
The dependencies also turned out to be important: I was using org.typelevel:cats:0.9.0 alongside cats-core 1.1.0, which caused the error Symbol 'type cats.kernel.instances.EqInstances' is missing from the classpath. I had to remove the older cats 0.9.0 and use the latest cats-core and cats-kernel:
libraryDependencies ++= Seq(
  "org.typelevel" %% "cats-core" % "1.1.0",
  "org.typelevel" %% "cats-kernel" % "1.2.0",
  "org.scalatest" %% "scalatest" % "3.0.4" % Test
)

It seems the causes of this issue can vary; I ran into it because I had two implicit ExecutionContexts in scope, so the compiler could not pick one to derive Functor[Future]:
import scala.concurrent.ExecutionContext.Implicits.global
// ^ first implicit ExecutionContext

class MyClass {
  def myMethod(e: EitherT[Future, _, _])(implicit ec: ExecutionContext) = {
    // second implicit ExecutionContext
    e.flatMap(...) // <- Functor unavailable
  }
}
In my case I resolved the issue by simply not passing ec into myMethod, though removing the global ExecutionContext import would also have worked.
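The fix can be sketched with the standard library alone. In this hypothetical example, ec is taken as a plain (non-implicit) parameter, so the global ExecutionContext remains the single implicit candidate and map/flatMap resolve unambiguously:

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global // the only implicit EC in scope

object SingleEcDemo extends App {
  // ec is an ordinary parameter, not an implicit one, so it does not
  // compete with the global EC during implicit resolution
  def myMethod(f: Future[Int], ec: ExecutionContext): Future[Int] =
    f.map(_ + 1)(ec) // pass ec explicitly where it is actually needed

  println(Await.result(myMethod(Future.successful(41), global), 5.seconds)) // prints 42
}
```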

I had the same problem. Cats does have the instance; the real issue is the missing implicit ExecutionContext, which you can fix with the following import:
import scala.concurrent.ExecutionContext.Implicits.global
I realised this when I was trying to provide an instance for it myself:
implicit val ev: Functor[Future] = new Functor[Future] {
  override def map[A, B](fa: Future[A])(f: A => B): Future[B] = fa.map(f)
}
It's at this point that the compiler complains about the missing implicit ExecutionContext, since Future#map requires one.
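The same dependency is visible with plain Futures, no cats involved: Future#map itself takes an implicit ExecutionContext, which is exactly what the hand-written Functor[Future] above needs. A minimal stdlib-only illustration:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

object NeedsEcDemo extends App {
  // Without this import, fa.map(_ + 1) below fails to compile with
  // "Cannot find an implicit ExecutionContext ..."
  import scala.concurrent.ExecutionContext.Implicits.global

  val fa = Future.successful(20)
  println(Await.result(fa.map(_ + 1), 5.seconds)) // prints 21
}
```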

Related

Method map in class cats.data.Nested not recognized

I have a problem when following the Scala Cats library tutorial: the map method applied to the Nested class is highlighted in red, and the compiler doesn't recognize it. Here is my main class code:
import cats._
import cats.data._
import cats.implicits._
import cats.syntax.functor._
import cats.Functor
import cats.instances.list._
import cats.instances.option._
object Main extends App {
  val list = List(Some(1), Some(2), None, Some(4))
  val nested: Nested[List, Option, Int] = Nested(list)
  // here is the problem
  nested.map(_ + 1)
}
Here is my build.sbt file:
name := "dailySBT3"
version := "0.1"
scalaVersion := "2.12.5"
scalacOptions += "-Ypartial-unification"
libraryDependencies += "org.typelevel" %% "cats-core" % "1.1.0"
The problem is that you are importing the instances and syntax twice. The following works for me with no problems:
import cats._
import cats.data._
import cats.implicits._
object Main extends App {
  val list = List(Some(1), Some(2), None, Some(4))
  val nested: Nested[List, Option, Int] = Nested(list)
  nested.map(_ + 1)
}
You could also do the same thing as above but get rid of the cats.implicits._ import instead.
When in doubt, check out the cats import guide.

Failed assertion with automated initialisation of database in Cassandra

I'm completely new to Cassandra. I tried to initialize a database in Cassandra using phantom-dsl, and received this error message:
*** RUN ABORTED ***
java.lang.AssertionError: assertion failed: no symbol could be loaded from class com.datastax.driver.core.Cluster in package core with name Cluster and classloader sun.misc.Launcher$AppClassLoader@279f2327
at scala.reflect.runtime.JavaMirrors$JavaMirror.scala$reflect$runtime$JavaMirrors$JavaMirror$$classToScala1(JavaMirrors.scala:1021)
at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToScala$1.apply(JavaMirrors.scala:980)
at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$classToScala$1.apply(JavaMirrors.scala:980)
at scala.reflect.runtime.JavaMirrors$JavaMirror$$anonfun$toScala$1.apply(JavaMirrors.scala:97)
at scala.reflect.runtime.TwoWayCaches$TwoWayCache$$anonfun$toScala$1.apply(TwoWayCaches.scala:39)
at scala.reflect.runtime.Gil$class.gilSynchronized(Gil.scala:19)
at scala.reflect.runtime.JavaUniverse.gilSynchronized(JavaUniverse.scala:16)
at scala.reflect.runtime.TwoWayCaches$TwoWayCache.toScala(TwoWayCaches.scala:34)
at scala.reflect.runtime.JavaMirrors$JavaMirror.toScala(JavaMirrors.scala:95)
at scala.reflect.runtime.JavaMirrors$JavaMirror.classToScala(JavaMirrors.scala:980)
I am not really sure whether it is an issue with the Connector in phantom-dsl or the ClusterBuilder in the DataStax driver.
Connector.scala
package com.neruti.db
import com.neruti.db.models._
import com.websudos.phantom.database.Database
import com.websudos.phantom.connectors.ContactPoints
import com.websudos.phantom.dsl.KeySpaceDef
object Connector {
  val host = Seq("localhost")
  val port = 9160
  val keySpace: String = "nrt_entities"
  // val inet = InetAddress.getByName

  lazy val connector = ContactPoints(host, port).withClusterBuilder(
    _.withCredentials("cassandra", "cassandra")
  ).keySpace(keySpace)
}
CassandraSpec.scala
package com.neruti.db
import com.neruti.User
import com.neruti.db.models._
import com.neruti.db.databases._
import com.neruti.db.services._
import com.neruti.db.Connector._
import java.util.UUID
import com.datastax.driver.core.ResultSet
import org.scalatest._
import org.scalatest.{BeforeAndAfterAll,FlatSpec,Matchers,ShouldMatchers}
import org.scalatest.concurrent.ScalaFutures
import org.scalamock.scalatest.MockFactory
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
abstract class BaseCassandraSpec extends FlatSpec
  with BeforeAndAfterAll
  with Inspectors
  with Matchers
  with OptionValues
  with ScalaFutures

class CassandraTest extends BaseCassandraSpec
  with ProductionDatabase
  with UserService
  with Connector.connector.Connector {

  val user = User(
    Some("foobar"),
    Some("foo@foobar.com"),
    Some(UUID.randomUUID())
  )

  override protected def beforeAll(): Unit = {
    Await.result(database.userModel.create(user), 10.seconds)
  }
}
It looks like there may be multiple issues here:
The latest version of phantom is 2.1.3, and I'd strongly recommend using that, especially if you are just starting out. The migration guide is here in case you need it.
The entire reflection mechanism has been replaced in the latest version, so that error should magically go away. With respect to testing and generating objects, I would also look into including com.outworkers.util.testing, which is freely available on Maven Central:
libraryDependencies ++= Seq(
  //..,
  "com.outworkers" %% "phantom-dsl" % "2.1.3",
  "com.outworkers" %% "util-testing" % "0.30.1" % Test
)
This will offer you automated case class generation:
import com.outworkers.util.testing._
val sample = gen[User]

Could not access type Unmarshaller in value akka.http.javadsl.unmarshalling

I am trying to write a simple HTTP client using the Akka HTTP client API. To that end I have written the following code:
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model._
import akka.http.scaladsl.unmarshalling._
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}
import scala.concurrent.duration._
import scala.concurrent.{Await}
import akka.http.scaladsl.server.Directives
import akka.http.scaladsl.marshallers.sprayjson.SprayJsonSupport
import spray.json._
final case class Post(postId: Int, id: Int, name: String, email: String, body: String)

trait JsonSupport extends SprayJsonSupport with DefaultJsonProtocol {
  implicit val postFormat = jsonFormat5(Post.apply)
}

class AkkaHttpClient extends App with Directives with JsonSupport {
  implicit val system = ActorSystem("my-Actor")
  implicit val actorMaterializer = ActorMaterializer()
  implicit val executionContext = system.dispatcher

  val httpClient = Http().outgoingConnection(host = "http://jsonplaceholder.typicode.com/")

  val flow = Source.single(HttpRequest(uri = Uri("/comments/1")))
    .via(httpClient)
    .mapAsync(1)(r => Unmarshal(r.entity).to[Post])
    .runWith(Sink.head)

  val results = Await.result(flow, 15 seconds)
  println(results)
}
My build.sbt file looks like this:
name := "Akka-Http-Client"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-http-experimental" % "2.4.9-RC1",
  "com.typesafe.akka" %% "akka-http-spray-json-experimental" % "2.4.9-RC1"
)
When I try to compile my code I get these errors:
Error:scalac: missing or invalid dependency detected while loading class file 'Unmarshaller.class'.
Could not access type Unmarshaller in value akka.http.javadsl.unmarshalling,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'Unmarshaller.class' was compiled against an incompatible version of akka.http.javadsl.unmarshalling.
I was having the same problem on 2.4.9-RC1; falling back to 2.4.8 solves it. Or you could use the workaround described here: https://github.com/akka/akka/issues/21105
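For reference, pinning both artifacts back to 2.4.8 in build.sbt would look like this (a sketch; the key point is keeping the two akka-http artifacts on the same version so they don't pull in incompatible class files):

```scala
libraryDependencies ++= Seq(
  // both artifacts on the same, known-good version
  "com.typesafe.akka" %% "akka-http-experimental" % "2.4.8",
  "com.typesafe.akka" %% "akka-http-spray-json-experimental" % "2.4.8"
)
```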

Developing SBT plugin with Dispatch 0.11.0 results in Scala compiler's mysterious errors

I'm new to Scala and Dispatch, and I can't seem to get a basic POST request working. I'm actually building an sbt plugin that uploads files to a third-party service.
Here is my build.sbt file:
sbtPlugin := true
name := "sbt-sweet-plugin"
organization := "com.mattwalters"
version := "0.0.1-SNAPSHOT"
libraryDependencies += "net.databinder.dispatch" %% "dispatch-core" % "0.11.0"
And here is the plugin's SweetPlugin.scala:
package com.mattwalters
import sbt._
import Keys._
import dispatch._
object SweetPlugin extends Plugin {

  import SweetKeys._

  object SweetKeys {
    lazy val sweetApiToken =
      SettingKey[String]("Required. Find yours at https://example.com/account/#api")
    lazy val someOtherToken =
      SettingKey[String]("Required. Find yours at https://example.com/some/other/token/")
    lazy val sweetFile =
      SettingKey[String]("Required. File data")
    lazy val sweetNotes =
      SettingKey[String]("Required. Release notes")
    lazy val sweetUpload =
      TaskKey[Unit]("sweetUpload", "A task to upload the specified sweet file.")
  }

  override lazy val settings = Seq(
    sweetNotes := "some default notes",
    // define the upload task
    sweetUpload <<= (
      sweetApiToken,
      someOtherToken,
      sweetFile,
      sweetNotes
    ) map { (sweetApiToken, someOtherToken, sweetFile, sweetNotes) =>
      // define http stuff here; Req is immutable, so addParameter calls
      // must be chained rather than discarded as statements
      val request = :/("www.example.com") / "some" / "random" / "endpoint"
      val post = request.POST
        .addParameter("api_token", sweetApiToken)
        .addParameter("some_other_token", someOtherToken)
        .addParameter("file", io.Source.fromFile(sweetFile).mkString)
        .addParameter("notes", sweetNotes)
      val responseFuture = Http(post OK as.String)
      val response = responseFuture()
      println(response) // see if we can get something at all....
    }
  )
}
The dispatch documentation shows:
import dispatch._, Defaults._
but I get
reference to Defaults is ambiguous;
[error] it is imported twice in the same scope by
Removing , Defaults._ makes this error go away.
I also tried the recommendation from this post:
import dispatch._
Import dispatch.Default._
But alas I get:
object Default is not a member of package dispatch
[error] import dispatch.Default._
I also tried the advice from Passing implicit ExecutionContext to contained objects/called methods:
import concurrent._
import concurrent.duration._
But I still get:
Cannot find an implicit ExecutionContext, either require one yourself or import ExecutionContext.Implicits.global
So I'm back to square one. I'm new to Scala, so any advice at all on the code above is appreciated.
Since the recommended sbt console run works fine, it looks like one of your other imports also has a Defaults module. Collecting implicit values in a module like this, to be supplied as function parameters, is a standard approach in Scala (another common naming convention is Implicits).
The other problem is that there is a typo/out-of-date problem in the import statement you got from Google Groups - it's Defaults, plural.
In summary - the best solution is to explicitly let Scala know which module you want to use:
import dispatch._
import dispatch.Defaults._
In the general case, only if the library's docs don't say otherwise: the last error message is pretty common to concurrent programming in Scala. To quote the relevant part:
either require one yourself or import ExecutionContext.Implicits.global
So, unless you want to roll your own ExecutionContext, just import scala.concurrent.ExecutionContext.Implicits.global.
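The two options named in that error message can be shown side by side in a small stdlib-only sketch (fetchLength is a hypothetical example method): "require one yourself" means declaring an implicit ExecutionContext parameter, and the caller then chooses which one to import or pass.

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

object RequireEcDemo extends App {
  // "require one yourself": declare an implicit EC parameter instead of
  // importing the global one at the definition site
  def fetchLength(body: String)(implicit ec: ExecutionContext): Future[Int] =
    Future(body.length)

  // the caller decides which ExecutionContext to supply
  import scala.concurrent.ExecutionContext.Implicits.global
  println(Await.result(fetchLength("hello"), 5.seconds)) // prints 5
}
```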

value resolveOne is not a member of akka.actor.ActorSelection

I get the above error message from here:
implicit val askTimeout = Timeout(60 seconds)
val workerFuture = workerContext actorSelection(payload.classname) resolveOne()
val worker = Await.result(workerFuture, 10 seconds)
worker ask Landau(List("1", "2", "3"))
specifically from the second line. The imports I made are:
import akka.actor._
import akka.util.Timeout
import akka.pattern.{ ask, pipe }
import scala.concurrent.duration._
import scala.concurrent.Await
import java.util.concurrent.TimeUnit
The Akka version is 2.2.1 and Scala is 2.10.2; I'm using sbt 0.13 to build it all. I cannot really understand what's wrong, since resolveOne definitely comes from that package.
EDIT: I printed all the methods of the class with
ActorSelection.getClass.getMethods.map(_.getName).foreach { p => println(p)}
and this is the result:
apply
toScala
wait
wait
wait
equals
toString
hashCode
getClass
notify
notifyAll
I had the same problem and changed my Scala and Akka versions as described in the link below. Here is the relevant part of my build.sbt for simplicity:
scalaVersion := "2.10.4"
resolvers += "Akka Snapshot Repository" at "http://repo.akka.io/snapshots/"
libraryDependencies ++= Seq("com.typesafe.akka" %% "akka-actor" % "2.4-SNAPSHOT")
link: http://doc.akka.io/docs/akka/snapshot/intro/getting-started.html