I am writing a simple word-count Flink job, but I keep getting this error:
could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[String]
[error] .flatMap{_.toLowerCase.split("\\W+") filter {_.nonEmpty}}
I searched the net but could not find a comprehensible answer.
Here is my code:
object Job {
  def main(args: Array[String]) {
    // set up the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val dataStream = env.readTextFile("file:///home/plivo/code/flink/scala/flinkstream/test/")

    val count = dataStream
      .flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } }
      .map { (_, 1) }
      .groupBy(0)
      .sum(1)

    dataStream.print()
    env.execute("Flink Scala API Skeleton")
  }
}
You have to import
import org.apache.flink.api.scala._
to bring Flink's implicit TypeInformation derivation into scope, instead of creating an implicit value for each type that you use.
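A minimal sketch of where that import goes; the streaming variant of the import mentioned in the comment is an assumption based on the StreamExecutionEnvironment used in the question:

// The wildcard import brings createTypeInformation into scope, which derives
// TypeInformation[T] for String, (String, Int), and any other element type the job uses,
// so no per-type implicit values are needed.
// For the streaming API, org.apache.flink.streaming.api.scala._ provides the same
// implicits alongside the DataStream extensions.
import org.apache.flink.api.scala._

object Job {
  def main(args: Array[String]): Unit = {
    // ... rest of the word-count job unchanged
  }
}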
Adding implicit val typeInfo = TypeInformation.of(classOf[String]) as the first line of def main(args: Array[String]) { ... } fixed it for me.
object Job {
  def main(args: Array[String]) {
    implicit val typeInfo = TypeInformation.of(classOf[String]) // add this here

    // set up the execution environment
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val dataStream = env.readTextFile("file:///home/plivo/code/flink/scala/flinkstream/test/")

    val count = dataStream
      .flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } }
      .map { (_, 1) }
      .groupBy(0)
      .sum(1)

    dataStream.print()
    env.execute("Flink Scala API Skeleton")
  }
}
I am observing strange behavior. In one of my test cases, I am using contentAsJson, and the compiler does not complain that I have to provide implicit values for Timeout and Materializer:
class UserControllerUnitSpec extends PlaySpec with BeforeAndAfterAll with BeforeAndAfterEach with OneAppPerSuiteWithComponents {
  ..
  "User signup request with body but with incorrect profile data " should {
    "return error message " in {
      ...
      val resultFuture: Future[Result] = testEnv.controller.signupUser(request)
      val responseBodyAsJsValue: JsValue = contentAsJson(resultFuture) // works
      ...
    }
  }
But in another test case, the compiler gives an error saying that I need to provide the value:
class QuestionsControllerUnitSpec extends PlaySpec with BeforeAndAfterAll with BeforeAndAfterEach with OneAppPerSuiteWithComponents {
  ...
  "newQuestion" should {
    "should return error if the size of the body in the request is more than the maximum allowed size" in {
      ...
      val response: Accumulator[ByteString, Result] = questionController.newQuestion(request)
      val responseBody = contentAsJson(response) //(Timeout(Duration(5000,"millis")),testEnv.testEnv.mat).
      ...
    }
  }
I get the error:
Error:(1485, 39) could not find implicit value for parameter mat: akka.stream.Materializer
val responseBody = contentAsJson(response)//(Timeout(Duration(5000,"millis")),testEnv.testEnv.mat)
How can I debug why one is working but the other isn't?
UPDATE - added return types after Mario's answer.
Try providing an implicit Materializer, like so:
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import play.api.test.Helpers._

implicit val actorSystem = ActorSystem("test")
implicit val materializer = ActorMaterializer()

val responseBody = contentAsJson(response)
instead of passing testEnv.testEnv.mat explicitly:
contentAsJson(response)(Timeout(Duration(5000,"millis")),testEnv.testEnv.mat)
Regarding the difference between the two tests, note that there are two overloaded versions of contentAsJson:
def contentAsJson(of: Future[Result])(implicit timeout: Timeout, mat: Materializer = NoMaterializer): JsValue
def contentAsJson(of: Accumulator[ByteString, Result])(implicit timeout: Timeout, mat: Materializer): JsValue
In the first case a default Materializer argument is provided,
mat: Materializer = NoMaterializer
while in the second case we have to provide our own. Therefore it is likely that in the first test the type of resultFuture is Future[Result], whilst in the second test the return type of response is Accumulator[ByteString, Result].
Regarding finding out where the implicit is provided from, personally I use IntelliJ's View | Show Implicit Hints feature.
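A small sketch of the difference, reusing resultFuture and response from the question (the ActorSystem setup is an assumption; any real Materializer works):

import akka.actor.ActorSystem
import akka.stream.{ActorMaterializer, Materializer}
import play.api.test.Helpers._ // also supplies the implicit default Timeout

implicit val system: ActorSystem = ActorSystem("test")
implicit val materializer: Materializer = ActorMaterializer()

// Future[Result] overload: compiles even without a Materializer in scope,
// because the parameter defaults to NoMaterializer
val json1 = contentAsJson(resultFuture)

// Accumulator[ByteString, Result] overload: needs the Materializer above
// (or one passed explicitly, together with a Timeout)
val json2 = contentAsJson(response)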
Running the example code snippet under the subtopic parSequence in the Cats Effect documentation throws an error:
import cats._, cats.data._, cats.syntax.all._, cats.effect.IO
val anIO = IO(1)
val aLotOfIOs = NonEmptyList.of(anIO, anIO)
val ioOfList = aLotOfIOs.parSequence
<console>:44: error: could not find implicit value for parameter P: cats.Parallel[cats.effect.IO,F]
I included an implicit Timer[IO], i.e. implicit val timer = IO.timer(ExecutionContext.global), but it does not work. Please advise. Thanks.
Update #1
Here is a complete working snippet:
import cats._, cats.data._, cats.syntax.all._, cats.effect.IO
import scala.concurrent.ExecutionContext.Implicits.global
implicit val contextShift = IO.contextShift(global)
val anIO = IO(1)
val aLotOfIOs = NonEmptyList.of(anIO, anIO)
val ioOfList = aLotOfIOs.parSequence
The implicit you're looking for is defined in cats.effect.IOInstances and you can bring it in scope by importing cats.effect.IO._.
private[effect] abstract class IOInstances extends IOLowPriorityInstances {
  //....
  implicit def ioParallel(implicit cs: ContextShift[IO]): Parallel[IO, IO.Par] =
    new Parallel[IO, IO.Par] {
      final override val applicative: Applicative[IO.Par] =
        parApplicative(cs)
      final override val monad: Monad[IO] =
        ioConcurrentEffect(cs)
      final override val sequential: ~>[IO.Par, IO] =
        new FunctionK[IO.Par, IO] { def apply[A](fa: IO.Par[A]): IO[A] = IO.Par.unwrap(fa) }
      final override val parallel: ~>[IO, IO.Par] =
        new FunctionK[IO, IO.Par] { def apply[A](fa: IO[A]): IO.Par[A] = IO.Par(fa) }
    }
}

object IO extends IOInstances {
  // ...
}
Note that you will need to have an implicit ContextShift[IO] in scope if you want to use the ioParallel instance.
It is a common pattern in Scala to have implicit instances defined as part of the companion object for the class (in this case IO).
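A self-contained sketch of that pattern, using a hypothetical Show type class rather than anything from cats-effect:

// Hypothetical type class, for illustration only
trait Show[A] { def show(a: A): String }

final case class User(name: String)

object User {
  // Lives in the companion object, so implicit search finds it
  // whenever a Show[User] is required; no import is needed.
  implicit val showUser: Show[User] = (u: User) => s"User(${u.name})"
}

def render[A](a: A)(implicit s: Show[A]): String = s.show(a)

render(User("ada")) // resolves showUser from User's companion object

The same mechanism is how ioParallel is found on the IO companion once a ContextShift[IO] is in scope.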
I am trying to use fs2.io.writeOutputStream for the output of a Java AWS Lambda function. I don't know how to provide the implicit parameter it's looking for:
"no implicits found for parameter cs: ContextShift[IO]"
I found some documentation for creating my own implicit ContextShift object, but that seems like overkill for what I'm trying to do.
final def handleRequest(in: InputStream, out: OutputStream, context: Context): Unit =
  (for {
    bytes <- in.compile.toList
    str = getString(bytes)
    args <- decode(str).raiseIO
    _ <- produce(args).to(writeOutputStream(IO(out), global)).compile.drain
  } yield Unit).unsafeRunAsyncAndForget() // throws an exception in the case of Failure

// ------------------------------------------------
// produce(args: MyCaseClass): fs2.Stream[IO, Byte]
"By default, Cats Effect can provide instance of ContextShift[IO] that manages thread-pools, but only if there’s an ExecutionContext in scope or if IOApp is used."
-- Cats-effect documentation.
From an ExecutionContext.
import cats.effect.{ContextShift, IO}
import scala.concurrent.ExecutionContext.Implicits.global

implicit val contextShift: ContextShift[IO] = IO.contextShift(global)
Using IOApp.
import cats.effect.{ContextShift, ExitCode, IO, IOApp}

object Main extends IOApp {
  override def run(args: List[String]): IO[ExitCode] = {
    // IOApp puts an implicit ContextShift[IO] (and Timer[IO]) in scope
    val cs = implicitly[ContextShift[IO]]
    IO(println(cs)).as(ExitCode.Success)
  }
}
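Applied to the handler in the question, a minimal sketch; the class name and trimmed-down body are assumptions, and fs2 1.x / cats-effect 2.x are inferred from the writeOutputStream(IO(out), global) call:

import java.io.{InputStream, OutputStream}
import cats.effect.{ContextShift, IO}
import scala.concurrent.ExecutionContext.Implicits.global

class Handler {
  // One implicit in the enclosing scope is enough: fs2.io.writeOutputStream
  // (and any other call asking for a ContextShift[IO]) will pick it up.
  implicit val cs: ContextShift[IO] = IO.contextShift(global)

  final def handleRequest(in: InputStream, out: OutputStream): Unit = {
    // ... build and run the stream exactly as in the question;
    // the "no implicits found for parameter cs" error disappears.
  }
}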
I am trying to find an overloaded method using Scala reflection. Here is my code:
import scala.reflect.runtime.universe._

object Example {

  class Something {
    def printIt(s1: String, s2: String) { println(s1 + s2) }
    def printIt(s: Int) { println(s) }
    def printIt(s: String) { println(s) }
    def printInt(i: Int) { println(i) }
    def printInt(i: String) { println(i) }
  }

  def main(args: Array[String]): Unit = {
    val r = new Something()
    val mirror = runtimeMirror(getClass.getClassLoader)
    val instanceMirror = mirror.reflect(r)
    val symbols = mirror.typeOf[r.type].decl(TermName("printInt")).asMethod
  }
}
When I execute the code, I get the following exception:
Exception in thread "main" scala.ScalaReflectionException: value printInt encapsulates multiple overloaded alternatives and cannot be treated as a method. Consider invoking `<offending symbol>.asTerm.alternatives` and manually picking the required method
By following the suggestion given by the exception itself, I am able to find the overloaded method by iterating through method alternatives. But is there any way of finding the method using the argument types that the method takes?
Either use Scala reflection and iterate over the alternatives:
val m: scala.reflect.runtime.universe.MethodSymbol =
  typeOf[Something].decl(TermName("printInt")).asTerm.alternatives.find(s =>
    s.asMethod.paramLists.map(_.map(_.typeSignature)) == List(List(typeOf[Int]))
  ).get.asMethod
or use Java reflection:
val m: java.lang.reflect.Method =
  Class.forName("Example$Something").getMethod("printInt", classOf[Int])
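Either m can then be invoked; a short sketch, assuming the imports and the Something class from the question are in scope:

val r = new Something()

// Scala reflection: wrap the MethodSymbol in a MethodMirror and call it
val mirror = runtimeMirror(getClass.getClassLoader)
mirror.reflect(r).reflectMethod(m).apply(42) // prints 42

// Java reflection: invoke the java.lang.reflect.Method directly
// m.invoke(r, Int.box(42))                  // also prints 42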
I have a class A(httpClient: HttpExt)(implicit val system: ActorSystem, materializer: ActorMaterializer) with a method extractInfo(res: Future[HttpResponse]): Int that takes a future response and extracts an Int value from the response entity.
I want to write a unit test for this method and I have the following in my unit spec class:
class ASpec extends UnitSpec with MockFactory {
  "An A" - {
    implicit val as = mock[ActorSystem]
    implicit val mat = mock[ActorMaterializer]
    val stubHttpClient = stub[HttpExt]
    val a = new A(stubHttpClient)

    "return auth token from response" in {
      val futureResponse: Future[HttpResponse] = <code that returns a Future[HttpResponse]>
      val info: Future[Option[Int]] = A.extractInfo(futureResponse)
      val result = Await.result(info, 10.seconds)
      result shouldBe Some(<a int value>)
    }
  }
}
However, on the line mock[ActorMaterializer], I got the following compilation error:
Error:(19, 28) object creation impossible, since:
it has 3 unimplemented members.
/** As seen from <$anon: akka.stream.ActorMaterializer>, the missing
 *  signatures are as follows.
 *  For convenience, these are usable as stub implementations.
 */
  private[package akka] def actorOf(context: akka.stream.MaterializationContext, props: akka.actor.Props): akka.actor.ActorRef = ???
  private[package akka] def logger: akka.event.LoggingAdapter = ???
  private[package akka] def supervisor: akka.actor.ActorRef = ???
implicit val mat = mock[ActorMaterializer]
I would love some suggestions on how to solve this problem. Thank you enormously in advance!
I would suggest not mocking the ActorMaterializer but instead using a real instance, for example ActorMaterializer(), or the one you use in your production code. At least from your example, it doesn't look like you are making assertions on the ActorMaterializer itself.
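A minimal sketch of that suggestion, reusing UnitSpec, A, and HttpExt from the question:

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer

class ASpec extends UnitSpec with MockFactory {
  // Real instances instead of mocks; nothing is asserted on them
  implicit val as: ActorSystem = ActorSystem("test")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  val stubHttpClient = stub[HttpExt]
  val a = new A(stubHttpClient)

  // ... tests as before
}

Remember to terminate the ActorSystem when the suite finishes (for example in an afterAll hook) so repeated test runs don't leak threads.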