value resolveOne is not a member of akka.actor.ActorSelection - scala

I get the above error message from here:
implicit val askTimeout = Timeout(60 seconds)
val workerFuture = workerContext actorSelection(payload.classname) resolveOne()
val worker = Await.result(workerFuture, 10 seconds)
worker ask Landau(List("1", "2", "3"))
specifically from the second line. The imports I made are:
import akka.actor._
import akka.util.Timeout
import akka.pattern.{ ask, pipe }
import scala.concurrent.duration._
import scala.concurrent.Await
import java.util.concurrent.TimeUnit
The Akka version is 2.2.1 and Scala is 2.10.2; I'm using sbt 0.13 to build it all.
I cannot really understand what's wrong, since resolveOne definitely comes from that package.
EDIT: I printed all the methods of the class with
ActorSelection.getClass.getMethods.map(_.getName).foreach { p => println(p) }
and this is the result (see the note after the list):
apply
toScala
wait
wait
wait
equals
toString
hashCode
getClass
notify
notifyAll
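Note: ActorSelection.getClass reflects on the ActorSelection companion object's class, which is why only apply, toScala, and the java.lang.Object methods show up. A minimal sketch of how to list the instance methods of the ActorSelection type itself, where resolveOne lives from Akka 2.2 onward:

import akka.actor.ActorSelection

// Reflect on the ActorSelection type, not on its companion object
classOf[ActorSelection].getMethods
  .map(_.getName)
  .distinct
  .sorted
  .foreach(println)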

I had the same problem and fixed it by changing my Scala and Akka versions as described in the link below.
Here is the relevant part of my build.sbt:
scalaVersion := "2.10.4"
resolvers += "Akka Snapshot Repository" at "http://repo.akka.io/snapshots/"
libraryDependencies ++= Seq("com.typesafe.akka" %% "akka-actor" % "2.4-SNAPSHOT")
link: http://doc.akka.io/docs/akka/snapshot/intro/getting-started.html
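resolveOne() was added to ActorSelection in Akka 2.2, so the original error likely means an older akka-actor jar was still on the classpath. If you would rather not build against a snapshot, a minimal sketch that pins a stable release instead (the exact version numbers are assumptions; anything from the 2.2.x line onward has resolveOne):

scalaVersion := "2.10.4"

// Any akka-actor release from 2.2.0 onward provides ActorSelection.resolveOne()
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.2.3"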

Related

unable to use WS.url() in Play app tutorial

I am doing a tutorial for the Play framework with Scala. I am quite early into the tutorial and I am having a problem with ws. In my class, WS is not recognized, although the tutorial says to use WS.url("url-here") and to import play.api.libs._, both of which I have done. I have also tried using ws.url("url-here"); there ws is recognized, but then I get a "can't resolve symbol 'url'". Here is my build.sbt:
name := """play_hello"""
organization := "x.x"
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.12.3"
libraryDependencies ++= Seq(
"org.scalatestplus.play" %% "scalatestplus-play" % "3.1.2" % Test,
"com.ning" % "async-http-client" % "1.9.29",
guice,
ws
)
And here is the code for my class:
package controllers

import javax.inject.Inject
import com.ning.http.client.oauth.{ConsumerKey, RequestToken}
import play.api.Play.current
import play.api.libs._
import play.api.mvc._
import scala.concurrent.Future

class Application @Inject()(cc: ControllerComponents) extends AbstractController(cc) {

  def tweets = Action.async {
    credentials.map { case (consumerKey, requestToken) =>
      WS.url("http://stream.twitter.com")
      Future.successful {
        Ok
      }
    } getOrElse {
      Future.successful {
        InternalServerError("Twitter credentials are missing!")
      }
    }
  }

  def credentials: Option[(ConsumerKey, RequestToken)] = for {
    apiKey <- current.configuration.getString("twitter.apiKey")
    apiSecret <- current.configuration.getString("twitter.apiSecret")
    token <- current.configuration.getString("twitter.token")
    tokenSecret <- current.configuration.getString("twitter.tokenSecret")
  } yield (
    new ConsumerKey(apiKey, apiSecret),
    new RequestToken(token, tokenSecret)
  )
}
I figure that this is most likely some kind of dependency conflict. Here is a screenshot of the ws-related libraries in the project structure. I would appreciate any help in finding a solution to this. Thank you.
The solution was to add ws: WSClient to the parameters of the Application class constructor. Apparently the standalone WS object has been removed in more recent versions of the ws library.
class Application @Inject()(cc: ControllerComponents, ws: WSClient) extends AbstractController(cc)
Now I can use:
ws.url("https://stream.twitter.com/1.1/statuses/filter.json")
Also, according to the documentation on the Play website, if you for some reason cannot use an injected WSClient, you can create an instance of one yourself and use that.
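For completeness, a minimal sketch of the repaired controller, assuming a Play 2.6-style setup: play.api.libs.ws.WSClient is the import the injected client needs, and an implicit ExecutionContext is required for mapping over the resulting Future:

package controllers

import javax.inject.Inject
import play.api.libs.ws.WSClient
import play.api.mvc._
import scala.concurrent.ExecutionContext

class Application @Inject()(cc: ControllerComponents, ws: WSClient)
                           (implicit ec: ExecutionContext) extends AbstractController(cc) {

  // The injected ws replaces the removed WS singleton
  def tweets = Action.async {
    ws.url("https://stream.twitter.com/1.1/statuses/filter.json")
      .get()
      .map(_ => Ok)
  }
}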

How to import play in scala repl

How can I import Play in the Scala REPL?
scala> import play.api.libs.json._
<console>:11: error: not found: value play
import play.api.libs.json._
1) Set up the simple build tool (sbt). It's easy: download it from http://www.scala-sbt.org/download.html; installation instructions are at http://www.scala-sbt.org/0.13/docs/Installing-sbt-on-Windows.html.
2) Create an empty folder with a build.sbt containing the following:
//your-test-project/build.sbt
scalaVersion := "2.11.8"
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies += "com.typesafe.play" %% "play" % "2.5.12"
3) Then simply run sbt console in the root of the folder, which will download Play and make it available to your console.
$ ls -l ~/.ivy2/cache/com.typesafe.play/play_2.11/jars/
total 15392
-rw-r--r-- 1 as18 185223974 4107407 Jan 22 15:59 play_2.11-2.5.12.jar
Then you are good to go.
scala> import play.api.libs.json._
import play.api.libs.json._
scala> val json: JsValue = Json.parse("""{ "compiler" : "scala", "ratings" : 5 }""")
json: play.api.libs.json.JsValue = {"compiler":"scala","ratings":5}
scala> val compiler = ( json \ "compiler" )
compiler: play.api.libs.json.JsLookupResult = JsDefined("scala")
Also, you can provide the jar directly if you already have it, as below:
scala -cp ~/.ivy2/cache/com.typesafe.play/play_2.11/jars/play_2.11-2.5.12.jar
scala> import play.api.libs._
import play.api.libs._
Things are much simpler with the Ammonite REPL:
load.ivy("com.typesafe.play" %% "play" % "2.5.12")
import whatever.you.need
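In newer Ammonite versions the load.ivy call has been replaced by $ivy magic imports; a sketch with the same coordinates:

import $ivy.`com.typesafe.play::play:2.5.12`
import play.api.libs.json._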
The package is not found because it is not in the class path of the REPL.
If you know the location of Play Framework's JAR on your computer, you can add it to the class path when starting the REPL:
> scala -cp path/to/play.jar
You can also add this directly from inside a REPL session:
:require play.jar
Note that you will still need to import your classes as before.

No RowReaderFactory can be found for this type error when trying to map Cassandra row to case object using spark-cassandra-connector

I am trying to get a simple example working that maps rows from Cassandra to a Scala case class using Apache Spark 1.1.1, Cassandra 2.0.11, and the spark-cassandra-connector (v1.1.0). I have reviewed the documentation at the spark-cassandra-connector GitHub page, planetcassandra.org, and DataStax, and generally searched around, but have not found anyone else encountering this issue. So here goes...
I am building a tiny Spark application using sbt (0.13.5), Scala 2.10.4, and Spark 1.1.1 against Cassandra 2.0.11. Modelled on the example from the spark-cassandra-connector docs, the following two lines produce an error in my IDE and fail to compile:
case class SubHuman(id:String, firstname:String, lastname:String, isGoodPerson:Boolean)
val foo = sc.cassandraTable[SubHuman]("nicecase", "human").select("id","firstname","lastname","isGoodPerson").toArray
The terse error presented by Eclipse is:
No RowReaderFactory can be found for this type
The compile error is only slightly more verbose:
> compile
[info] Compiling 1 Scala source to /home/bkarels/dev/simple-case/target/scala-2.10/classes...
[error] /home/bkarels/dev/simple-case/src/main/scala/com/bradkarels/simple/SimpleApp.scala:82: No RowReaderFactory can be found for this type
[error] val foo = sc.cassandraTable[SubHuman]("nicecase", "human").select("id","firstname","lastname","isGoodPerson").toArray
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
[error] Total time: 1 s, completed Dec 10, 2014 9:01:30 AM
>
Scala source:
package com.bradkarels.simple

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import com.datastax.spark.connector._
import com.datastax.spark.connector.rdd._
// Likely don't need this import - but throwing darts hits the bullseye once in a while...
import com.datastax.spark.connector.rdd.reader.RowReaderFactory

object CaseStudy {
  def main(args: Array[String]) {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext("spark://127.0.0.1:7077", "simple", conf)

    case class SubHuman(id: String, firstname: String, lastname: String, isGoodPerson: Boolean)
    val foo = sc.cassandraTable[SubHuman]("nicecase", "human").select("id", "firstname", "lastname", "isGoodPerson").toArray
  }
}
With the bothersome lines removed, everything compiles fine, assembly works, and I can perform other Spark operations normally. For example, if I drop in:
val rdd:CassandraRDD[CassandraRow] = sc.cassandraTable("nicecase", "human")
I get back the RDD and can work with it as expected. That said, I suspect that my sbt project, assembly plugin, etc. are not contributing to the issue. The working source (less the new attempt to map to a case class as the connector intended) can be found on github here.
But, to be more thorough, my build.sbt:
name := "Simple Case"
version := "0.0.1"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "1.1.1",
"org.apache.spark" %% "spark-sql" % "1.1.1",
"com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0" withSources() withJavadoc()
)
So the question is: what have I missed? I hope this is something silly, but if you have encountered this and can help me get past this puzzling little issue, I would very much appreciate it. Please let me know if there are any other details that would be helpful in troubleshooting.
Thank you.
This may just be my newness with Scala in general, but I resolved the issue by moving the case class declaration out of the main method. A case class declared inside a method is a local class, and the connector's implicit machinery apparently cannot derive a RowReaderFactory for it, while it can for one declared at the object level. So the simplified source now looks like this:
package com.bradkarels.simple

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import com.datastax.spark.connector._
import com.datastax.spark.connector.rdd._

object CaseStudy {
  case class SubHuman(id: String, firstname: String, lastname: String, isGoodPerson: Boolean)

  def main(args: Array[String]) {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext("spark://127.0.0.1:7077", "simple", conf)

    val foo = sc.cassandraTable[SubHuman]("nicecase", "human").select("id", "firstname", "lastname", "isGoodPerson").toArray
  }
}
The complete source (updated & fixed) can be found on github https://github.com/bradkarels/spark-cassandra-to-scala-case-class
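For what it's worth, case classes defined at the top level or inside an object both appear to work; only method-local case classes trip up the implicit RowReaderFactory derivation. A sketch of an equivalent placement in a separate container object (the Model name is hypothetical):

package com.bradkarels.simple

// Keeping row types in their own top-level object also compiles,
// since the implicit RowReaderFactory can still be materialized for them.
object Model {
  case class SubHuman(id: String, firstname: String, lastname: String, isGoodPerson: Boolean)
}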

Developing SBT plugin with Dispatch 0.11.0 results in Scala compiler's mysterious errors

I'm new to Scala and Dispatch, and I can't seem to get a basic POST request working.
I'm actually building a sbt plugin that uploads files to a third party service.
Here is my build.sbt file:
sbtPlugin := true

name := "sbt-sweet-plugin"

organization := "com.mattwalters"

version := "0.0.1-SNAPSHOT"

libraryDependencies += "net.databinder.dispatch" %% "dispatch-core" % "0.11.0"
And here is the plugin's SweetPlugin.scala:
package com.mattwalters

import sbt._
import Keys._
import dispatch._

object SweetPlugin extends Plugin {
  import SweetKeys._

  object SweetKeys {
    lazy val sweetApiToken =
      SettingKey[String]("sweetApiToken", "Required. Find yours at https://example.com/account/#api")
    lazy val someOtherToken =
      SettingKey[String]("someOtherToken", "Required. Find yours at https://example.com/some/other/token/")
    lazy val sweetFile =
      SettingKey[String]("sweetFile", "Required. File data")
    lazy val sweetNotes =
      SettingKey[String]("sweetNotes", "Required. Release notes")
    lazy val sweetUpload =
      TaskKey[Unit]("sweetUpload", "A task to upload the specified sweet file.")
  }

  override lazy val settings = Seq(
    sweetNotes := "some default notes",
    // define the upload task
    sweetUpload <<= (
      sweetApiToken,
      someOtherToken,
      sweetFile,
      sweetNotes
    ) map { (
      sweetApiToken,
      someOtherToken,
      sweetFile,
      sweetNotes
    ) =>
      // define http stuff here
      val request = :/("www.example.com") / "some" / "random" / "endpoint"
      val post = request.POST
      post.addParameter("api_token", sweetApiToken)
      post.addParameter("some_other_token", someOtherToken)
      post.addParameter("file", io.Source.fromFile(sweetFile).mkString)
      post.addParameter("notes", sweetNotes)
      val responseFuture = Http(post OK as.String)
      val response = responseFuture()
      println(response) // see if we can get something at all....
    }
  )
}
The dispatch documentation shows:
import dispatch._, Defaults._
but I get
reference to Defaults is ambiguous;
[error] it is imported twice in the same scope by
Removing , Defaults._ makes this error go away.
I also tried the recommendation from this post:
import dispatch._
Import dispatch.Default._
But alas I get:
object Default is not a member of package dispatch
[error] import dispatch.Default._
I also tried the advice from
Passing implicit ExecutionContext to contained objects/called methods:
import concurrent._
import concurrent.duration._
But I still get
Cannot find an implicit ExecutionContext, either require one yourself or import ExecutionContext.Implicits.global
Back to square one...
I'm new to Scala, so any advice at all on the code above is appreciated.
Since the recommended sbt console run works fine, it looks like one of your other wildcard imports also provides a Defaults module. Collecting implicit values in a module like this, to be passed as function parameters, is a standard approach in Scala (another common naming convention is Implicits).
The other problem is a typo/out-of-date import statement you picked up from Google Groups: it's Defaults, plural.
In summary, the best solution is to let Scala know explicitly which module you want to use:
import dispatch._
import dispatch.Defaults._
In the general case, when the library's docs don't say otherwise: the last error message is common in concurrent programming in Scala. To quote the relevant part:
either require one yourself or import ExecutionContext.Implicits.global
So, unless you want to roll your own ExecutionContext, just import scala.concurrent.ExecutionContext.Implicits.global.
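For reference, a minimal sketch of the same upload call with the disambiguated imports, assuming Dispatch 0.11.x; the host, path, and parameter values are placeholders taken from the question:

import dispatch._
import dispatch.Defaults._ // the implicit ExecutionContext Dispatch needs

object UploadSketch extends App {
  val endpoint = :/("www.example.com") / "some" / "random" / "endpoint"
  // Req is immutable: << switches the request to POST and returns a new
  // Req carrying the form parameters, so keep the result.
  val post = endpoint << Map(
    "api_token" -> "my-token",
    "notes" -> "some default notes"
  )
  val response = Http(post OK as.String)
  println(response()) // blocks for the body; the future fails on a non-2xx status
}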

Cannot find Await from akka

I get an error importing Await from Akka. Here is my build.sbt:
name := "Project1"
version := "0.1"
scalaVersion := "2.10.1"
libraryDependencies += "org.json4s" %% "json4s-native" % "3.2.4"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.10" % "2.1.4"
Here is part of the code:
import akka.actor.Actor
import akka.actor.ActorSystem
import akka.actor.Props
import akka.dispatch.Await
import akka.pattern.ask

//.......

private def resultId = {
  val someActor = context.actorSelection("../someActor123") // defined in Application object
  val future = someActor ? SomeMessage
  val result = Await.result(future, 1.timeout).asInstanceOf[String]
}
It says object Await is not a member of package akka.dispatch, value ? is not a member of akka.actor.ActorSelection, and not found: value Await.
Of course, I reloaded it and did gen-idea.
As @S.R.I noted, you should use scala.concurrent.Await instead of akka.dispatch.Await.
value ? is not a member of akka.actor.ActorSelection
There is no ask-pattern support for ActorSelection in version 2.1.4. See this commit. Ask support for ActorSelection is available only in version 2.2 and later.
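A minimal sketch of the corrected snippet, assuming the project is upgraded to Akka 2.2+ so that ask works on ActorSelection (SomeMessage and the actor path are taken from the question):

import akka.actor.Actor
import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent.Await // not akka.dispatch.Await
import scala.concurrent.duration._

case object SomeMessage

class Asker extends Actor {
  implicit val askTimeout = Timeout(1.second)

  private def resultId: String = {
    val someActor = context.actorSelection("../someActor123")
    val future = someActor ? SomeMessage // ask on ActorSelection needs Akka 2.2+
    Await.result(future, askTimeout.duration).asInstanceOf[String]
  }

  def receive = { case _ => sender ! resultId }
}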