I want to test the RMQSource class for receiving data from RabbitMQ, but I don't know how to configure the RabbitMQ virtual host for my exchange, and I think that is the problem I'm having. My code:
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.streaming.connectors.rabbitmq.RMQSource
import org.apache.flink.streaming.util.serialization.SimpleStringSchema
object rabbitjob {
  val env = StreamExecutionEnvironment.getExecutionEnvironment
  val stream = env.addSource(new RMQSource[String]("192.168.1.11", 5672, "user", "pass", "inbound.input.data", false, new SimpleStringSchema())).print
  def main(args: Array[String]) {
    env.execute("Test Rabbit")
  }
}
Error in IntelliJ IDE:
Error:(10, 29) could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[String]
val stream = env.addSource(new RMQSource[String]("192.168.1.11", 5672,"user","pass", "inbound.input.data",false, new SimpleStringSchema())).print
^
Error:(10, 29) not enough arguments for method addSource: (implicit evidence$7: org.apache.flink.api.common.typeinfo.TypeInformation[String])org.apache.flink.streaming.api.scala.DataStream[String].
Unspecified value parameter evidence$7.
val stream = env.addSource(new RMQSource[String]("192.168.1.11", 5672,"user","pass", "inbound.input.data",false, new SimpleStringSchema())).print
^
Any idea how to solve it, or any alternatives?
Thank you in advance.
Things have changed over time.
Please have a look at RMQConnectionConfig: there you can find how to specify a virtual host through the builder pattern.
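For example, a minimal sketch reusing the host, credentials, and queue from the question (the vhost name "TestVHost" is a placeholder):

import org.apache.flink.api.scala._ // for the implicit TypeInformation
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.streaming.connectors.rabbitmq.RMQSource
import org.apache.flink.streaming.connectors.rabbitmq.common.RMQConnectionConfig
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

val env = StreamExecutionEnvironment.getExecutionEnvironment

// Build the connection config, including the virtual host, via the builder.
val connectionConfig = new RMQConnectionConfig.Builder()
  .setHost("192.168.1.11")
  .setPort(5672)
  .setUserName("user")
  .setPassword("pass")
  .setVirtualHost("TestVHost") // placeholder: the vhost your queue lives in
  .build()

env.addSource(new RMQSource[String](
  connectionConfig,
  "inbound.input.data",      // queue name
  false,                     // usesCorrelationId
  new SimpleStringSchema())).print()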
The error you're seeing is a Scala compile-time error caused by a missing import. Whenever you use the Flink Scala API you should include the following:
import org.apache.flink.api.scala._
This will solve the compile-time problem you're having.
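For reference, here is the question's program with that import added (a sketch; it assumes the RMQSource constructor from the question exists in your Flink version):

import org.apache.flink.api.scala._ // provides the implicit TypeInformation[String]
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.streaming.connectors.rabbitmq.RMQSource
import org.apache.flink.streaming.util.serialization.SimpleStringSchema

object rabbitjob {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.addSource(new RMQSource[String]("192.168.1.11", 5672, "user", "pass",
      "inbound.input.data", false, new SimpleStringSchema())).print()
    env.execute("Test Rabbit")
  }
}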
You need to provide the vhost name as well. Take a look at the AMQP URI spec.
In your case the whole AMQP URI would be "amqp://user:pass@192.168.1.11:5672/TestVHost".
I'm a Scala newbie, using PySpark extensively (on Databricks, FWIW). I'm finding that Protobuf deserialization is too slow for me in Python, so I'm porting my deserialization UDF to Scala.
I've compiled my .proto files to Scala and then to a JAR using scalapb, as described here.
When I try to use these instructions to create a UDF like this:
import gnmi.gnmi._
import org.apache.spark.sql.{Dataset, DataFrame, functions => F}
import spark.implicits.StringToColumn
import scalapb.spark.ProtoSQL
// import scalapb.spark.ProtoSQL.implicits._
import scalapb.spark.Implicits._
val deserialize_proto_udf = ProtoSQL.udf { bytes: Array[Byte] => SubscribeResponse.parseFrom(bytes) }
I get the following error:
command-4409173194576223:9: error: could not find implicit value for evidence parameter of type frameless.TypedEncoder[Array[Byte]]
val deserialize_proto_udf = ProtoSQL.udf { bytes: Array[Byte] => SubscribeResponse.parseFrom(bytes) }
I've double-checked that I'm importing the correct implicits, to no avail. I'm pretty fuzzy on implicits, evidence parameters, and Scala in general.
I would really appreciate it if someone would point me in the right direction. I don't even know how to start diagnosing!
Update
It seems like frameless doesn't include an implicit encoder for Array[Byte]?
This works:
frameless.TypedEncoder[Byte]
This does not:
frameless.TypedEncoder[Array[Byte]]
The code for frameless.TypedEncoder seems to include a generic Array encoder, but I'm not sure I'm reading it correctly.
@Dymtro, thanks for the suggestion. That helped.
Does anyone have ideas about what is going on here?
Update
OK, progress: this looks like a Databricks issue. I think that the notebook does something like the following on startup:
import spark.implicits._
I'm using scalapb, which requires that you don't do that.
I'm hunting for a way to disable that automatic import now, or "unimport" or "shadow" those modules after they get imported.
If spark.implicits._ is already imported, then a way to "unimport" it (hide or shadow those implicits) is to create a duplicate object and import it too:
import org.apache.spark.sql.{SQLContext, SQLImplicits}

object implicitShadowing extends SQLImplicits with Serializable {
  protected override def _sqlContext: SQLContext = ???
}
import implicitShadowing._
Testing with case class Person(id: Long, name: String):
// no import
List(Person(1, "a")).toDS() // doesn't compile, value toDS is not a member of List[Person]
import spark.implicits._
List(Person(1, "a")).toDS() // compiles
import spark.implicits._
import implicitShadowing._
List(Person(1, "a")).toDS() // doesn't compile, value toDS is not a member of List[Person]
See also:
How to override an implicit value?
Wildcard Import, then Hide Particular Implicit?
How to override an implicit value, that is imported?
How can an implicit be unimported from the Scala repl?
Not able to hide Scala Class from Import
NullPointerException on implicit resolution
Constructing an overridable implicit
Caching the circe implicitly resolved Encoder/Decoder instances
Scala implicit def do not work if the def name is toString
Is there a workaround for this format parameter in Scala?
Please check whether this helps.
A possible problem is that you don't want to just unimport spark.implicits._ (scalapb.spark.Implicits._); you probably want to import scalapb.spark.ProtoSQL.implicits._ too. And I don't know whether implicitShadowing._ shadows some of those as well.
Another possible workaround is to resolve implicits manually and use them explicitly.
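For example, here is the general pattern (a generic sketch, not tied to scalapb or frameless):

object ExplicitResolution {
  trait Encoder[A]
  object Encoder {
    implicit val intEncoder: Encoder[Int] = new Encoder[Int] {}
  }

  // A method with an evidence parameter, like ProtoSQL.udf or toDS().
  def needsEvidence[A](a: A)(implicit enc: Encoder[A]): A = a

  def main(args: Array[String]): Unit = {
    // Resolve the instance by hand...
    val enc = implicitly[Encoder[Int]]
    // ...and pass it explicitly, sidestepping shadowed or ambiguous implicit scope.
    println(needsEvidence(42)(enc))
  }
}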
Using Scala and Play 2.5.10 (according to plugin.sbt) I have this code:
import akka.stream.scaladsl.Source
import play.api.libs.streams.Streams
import play.http._
val source = Source.fromPublisher(Streams.enumeratorToPublisher(enumerator))
Ok.sendEntity(HttpEntity.Streamed(source, None, Some("application/zip")))
The imports there are mostly left over from testing, because no matter what I try I can't get the framework to accept HttpEntity.Streamed. With this setup the error is what the title says, as taken from the console:
Looking at the documentation here I can't really figure out why it doesn't work: https://www.playframework.com/documentation/2.5.10/api/java/play/http/HttpEntity.Streamed.html
This is also what the official examples use: https://www.playframework.com/documentation/2.5.x/ScalaStream
Does anyone at least have some pointers on where to start looking? I've never used Scala or Play before so any hints are welcome.
You should import this one:
import play.api.http.HttpEntity
import play.api.libs.streams.Streams

// fileContent: a Source[ByteString, _]; the last argument is the content type
val entity: HttpEntity = HttpEntity.Streamed(fileContent, None, None)
// MemeFoRTheFile: the MIME type for the file
Result(ResponseHeader(200), entity).as(MemeFoRTheFile)
It means that HttpEntity.Streamed is not a value by itself, so you should wrap it in a Result() with its ResponseHeader and its content type.
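Putting the two together, a complete action could look like this (a sketch for Play 2.5; makeZip is a hypothetical stand-in for whatever produces the Enumerator[Array[Byte]] in the question):

import akka.stream.scaladsl.Source
import akka.util.ByteString
import play.api.http.HttpEntity
import play.api.libs.iteratee.Enumerator
import play.api.libs.streams.Streams
import play.api.mvc._

class ZipController extends Controller {
  // Hypothetical: replace with whatever builds your zip stream.
  def makeZip: Enumerator[Array[Byte]] = ???

  def download = Action {
    val source = Source
      .fromPublisher(Streams.enumeratorToPublisher(makeZip))
      .map(ByteString(_)) // HttpEntity.Streamed expects a Source of ByteString
    Ok.sendEntity(HttpEntity.Streamed(source, None, Some("application/zip")))
  }
}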
I am using Akka HTTP for REST support, and I need to use actors in another part of the server I'm developing. My understanding is that one typically needs to use exactly ONE ActorSystem instance throughout one's application. From the definition of akka.http.scaladsl.Http.apply(), it seems that when I use the Http method, as in the snippet from my code below --
val service: FooRestService = new FooRestService()
Http().bindAndHandle(service.route, "localhost", 8080) // something is supplying the apply method with an implicit ActorSystem!
-- somehow the Http object's apply() method is getting supplied with an implicit ActorSystem instance. For easy reference, Http.apply() is defined like this:
package akka.http.scaladsl
...
object Http {
...
def apply()(implicit system: ActorSystem): HttpExt = super.apply(system)
Since I need to stick to exactly one ActorSystem instance, I want to supply the other (non-REST) actor-based code in my system with the SAME reference as the one being supplied to the Http apply() method.
I guessed that my code must be doing a package import of a package with a package object containing an implicit ActorSystem, or that there must be some other way this implicit is slipping in like a ninja in the dead of night. I've poked around quite a bit, but couldn't figure it out ;^(
Any suggestions much appreciated!
Not sure I fully understood what the problem is, but in every actor you have context: ActorContext. You can obtain the ActorSystem from context.system, so you don't need to explicitly pass the ActorSystem around.
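For example, a minimal sketch (pre-2.6 Akka APIs, matching the question; the route is a placeholder):

import akka.actor.{Actor, ActorSystem, Props}
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import akka.stream.ActorMaterializer

object Main extends App {
  // The single ActorSystem for the whole application, declared implicit once.
  implicit val system: ActorSystem = ActorSystem("app")
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  // Http() picks up `system` implicitly because it is in scope here.
  val route = path("ping") { complete("pong") }
  Http().bindAndHandle(route, "localhost", 8080)

  // Plain actors are created from the very same system...
  system.actorOf(Props[Worker], "worker")
}

class Worker extends Actor {
  // ...and inside any actor, context.system is that same instance.
  def receive = { case _ => println(s"running in ${context.system.name}") }
}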
Here is how I used @expert's answer (above) to debug where my implicit was coming from. The key thing is to dump out the system variable from the actor that is getting the implicit, then look at its name to figure out where the implicit came from. In my case the implicit was coming from my own code (how dumb!). Anyway, thanks to the answer above, my problem is solved.
val http: HttpExt = Http()
val sys = http.system
System.out.println("sys: " + sys)

http.bindAndHandle(
  service.route, "localhost", injector.getInstance(classOf[Conf]).getInt(PROVISIONER_PORT)
)
I am using avro4s
https://github.com/sksamuel/avro4s
I wrote this code:
import java.io.File
import com.sksamuel.avro4s.{AvroOutputStream, AvroSchema}

implicit val schema = AvroSchema[SalesRecord]
val output = AvroOutputStream[SalesRecord](new File(outputLocation))
output.write(salesList)
output.flush()
output.close()
But I get a compile-time error:
could not find implicit value for parameter builder: shapeless.Lazy[....]
Not enough arguments for method apply
There was a bug in 1.2.x with private vals in a case class which caused the error you're seeing here. That's fixed in 1.3.0 and should solve your problem.
(If it's not private vals, you'd need to post your SalesRecord class for us to take a look at, and I'll update this answer with a solution.)
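For illustration (the field names here are made up, since the original SalesRecord wasn't posted), in 1.2.x the first shape below triggered the shapeless.Lazy error, while the second compiles fine:

// Triggered the shapeless.Lazy error in avro4s 1.2.x:
case class SalesRecordBroken(private val id: String, amount: Double)

// All-public fields work (and the private-val case is fixed in 1.3.0):
case class SalesRecord(id: String, amount: Double)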
I tried to register a class for Kryo as follows:
val conf = new SparkConf().setMaster(...).setAppName(...)
conf.registerKryoClasses(Seq(classOf[MyClass]))
val sc = new SparkContext(conf)
However, I get the following error:
value registerKryoClasses is not a member of org.apache.spark.SparkConf
I also tried conf.registerKryoClasses(classOf[MyClass]), but it still complains with the same error.
What mistake am I doing? I am using Spark 1.4.
The method SparkConf.registerKryoClasses is defined in Spark 1.4 (since 1.2). However, it expects an Array[Class[_]] as an argument. This might be the problem. Try calling conf.registerKryoClasses(Array(classOf[MyClass])) instead.
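In full, the working setup would look like this (master URL and app name are placeholders; MyClass stands in for the class being registered):

import org.apache.spark.{SparkConf, SparkContext}

class MyClass // stand-in for the class you want Kryo to serialize

object KryoDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[*]").setAppName("kryo-demo")
    conf.registerKryoClasses(Array(classOf[MyClass])) // Array[Class[_]], not Seq
    val sc = new SparkContext(conf)
    sc.stop()
  }
}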