Using a package object in Scala

I have a Scala project that uses Akka. I want the execution context to be available throughout the project, so I've created a package object like this:
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import com.datastax.driver.core.Cluster
package object connector {
  implicit val system = ActorSystem()
  implicit val mat = ActorMaterializer()
  implicit val executionContext = executionContext
  implicit val session = Cluster
    .builder
    .addContactPoints("localhost")
    .withPort(9042)
    .build()
    .connect()
}
In the same package I have this file:
import akka.stream.alpakka.cassandra.scaladsl.CassandraSource
import akka.stream.scaladsl.Sink
import com.datastax.driver.core.{Row, Session, SimpleStatement}
import scala.collection.immutable
import scala.concurrent.Future
object CassandraService {
  def selectFromCassandra()() = {
    val statement = new SimpleStatement(s"SELECT * FROM animals.alpakka").setFetchSize(20)
    val rows: Future[immutable.Seq[Row]] = CassandraSource(statement).runWith(Sink.seq)
    rows.map { item =>
      print(item)
    }
  }
}
However, I am getting a compiler error that no execution context or session can be found. My understanding of the package object was that everything in that object would be available throughout the package, but that does not seem to work. Grateful if this could be explained to me!

Your implementation should be something like this; hope it helps.
package.scala

package com.app.akka

package object connector {
  // Do some codes here..
}

CassandraService.scala

package com.app.akka

import com.app.akka.connector._

object CassandraService {
  def selectFromCassandra() = {
    // Do some codes here..
  }
}

You have two issues with your current code.
First, when you compile your package object connector, it throws the error below:
Error:(14, 35) recursive value executionContext needs type
implicit val executionContext = executionContext
The problem is the line implicit val executionContext = executionContext, which defines the val in terms of itself. Note that writing implicit val executionContext = ExecutionContext only silences this error: that right-hand side is the scala.concurrent.ExecutionContext companion object, not an actual execution context. The idiomatic fix is to use the actor system's dispatcher:
implicit val executionContext: ExecutionContext = system.dispatcher
Second, when we compile CassandraService, it throws the error below:
Error:(17, 13) Cannot find an implicit ExecutionContext. You might pass
an (implicit ec: ExecutionContext) parameter to your method
or import scala.concurrent.ExecutionContext.Implicits.global.
rows.map{item =>
The error says that we either need to pass an ExecutionContext as an implicit parameter to the method or import scala.concurrent.ExecutionContext.Implicits.global; alternatively, once the package object defines a real implicit execution context as above, CassandraService picks it up automatically as long as it lives in the connector package. With both issues resolved, the project compiles successfully on my system. I have attached the code for your reference.
package com.apache.scala

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import com.datastax.driver.core.Cluster
import scala.concurrent.ExecutionContext

package object connector {
  implicit val system = ActorSystem()
  implicit val mat = ActorMaterializer()
  // an actual execution context, backed by the actor system's dispatcher
  implicit val executionContext: ExecutionContext = system.dispatcher
  implicit val session = Cluster
    .builder
    .addContactPoints("localhost")
    .withPort(9042)
    .build()
    .connect()
}
package com.apache.scala.connector

import akka.stream.alpakka.cassandra.scaladsl.CassandraSource
import akka.stream.scaladsl.Sink
import com.datastax.driver.core.{Row, SimpleStatement}
import scala.collection.immutable
import scala.concurrent.Future

object CassandraService {
  def selectFromCassandra() = {
    val statement = new SimpleStatement(s"SELECT * FROM animals.alpakka").setFetchSize(20)
    // the implicit session, materializer and execution context all come from the
    // connector package object, because this file is in that package
    val rows: Future[immutable.Seq[Row]] = CassandraSource(statement).runWith(Sink.seq)
    rows.map { item =>
      print(item)
    }
  }
}
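For completeness, the first option the error message mentions, passing what the method needs as implicit parameters instead of relying on package-level implicits, could look roughly like the sketch below (the object and package names are just illustrative):

package com.apache.scala.service // deliberately outside the connector package

import akka.stream.Materializer
import akka.stream.alpakka.cassandra.scaladsl.CassandraSource
import akka.stream.scaladsl.Sink
import com.datastax.driver.core.{Row, Session, SimpleStatement}
import scala.collection.immutable
import scala.concurrent.{ExecutionContext, Future}

object CassandraServiceExplicit {
  // everything the method needs is handed in implicitly, e.g. the values
  // defined in the connector package object at the call site
  def selectFromCassandra()(implicit session: Session,
                            mat: Materializer,
                            ec: ExecutionContext): Future[Unit] = {
    val statement = new SimpleStatement("SELECT * FROM animals.alpakka").setFetchSize(20)
    val rows: Future[immutable.Seq[Row]] = CassandraSource(statement).runWith(Sink.seq)
    rows.map(_.foreach(println))
  }
}

A caller inside the connector package (or one that does import com.apache.scala.connector._) then gets those implicit parameters filled in automatically.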

Related

ParTraverse Not a Value of NonEmptyList

I am following the instructions on the cats IO website to run a sequence of effects in parallel:
My code looks like:
val maybeNonEmptyList: Option[NonEmptyList[Urls]] = NonEmptyList.fromList(urls)
val maybeDownloads: Option[IO[NonEmptyList[Either[Error, Files]]]] = maybeNonEmptyList map { urls =>
  urls.parTraverse(url => downloader(url))
}
But I get a compile time error saying:
value parTraverse is not a member of cats.data.NonEmptyList[Urls]
[error] urls.parTraverse(url => downloader(url))
I have imported the following:
import cats.data.{EitherT, NonEmptyList}
import cats.effect.{ContextShift, IO, Timer}
import cats.implicits._
import cats.syntax.parallel._
and I also have the following implicits:
implicit val cs: ContextShift[IO] = IO.contextShift(ExecutionContext.global)
implicit val timer: Timer[IO] = IO.timer(ExecutionContext.global)
Why do I still get this problem?
This is caused by the implicit enrichment being imported twice, which makes it ambiguous:
import cats.implicits._
import cats.syntax.parallel._
As of recent versions of cats, the implicits imports are never required, only the syntax ones.
The recommended pattern is to use only import cats.syntax.all._
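A minimal sketch of the fix, assuming cats 2.2+ / cats-effect 2.x and using placeholder definitions for the question's Urls, Files, Error and downloader:

import cats.data.NonEmptyList
import cats.effect.{ContextShift, IO}
import cats.syntax.all._ // single syntax import; no cats.implicits._ alongside it
import scala.concurrent.ExecutionContext

object ParTraverseSketch {
  // Parallel[IO] needs a ContextShift in cats-effect 2.x
  implicit val cs: ContextShift[IO] = IO.contextShift(ExecutionContext.global)

  // placeholders standing in for the question's types and downloader
  type Urls = String
  type Files = String
  type Error = Throwable
  def downloader(url: Urls): IO[Either[Error, Files]] = IO.pure(Right(url))

  def downloadAll(urls: List[Urls]): Option[IO[NonEmptyList[Either[Error, Files]]]] =
    NonEmptyList.fromList(urls).map(nel => nel.parTraverse(url => downloader(url)))
}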

Completing Source[ByteString, _] in Akka-Http

I wanted to use Alpakka to handle S3 upload and download with Akka Streams. However, I got stuck using the Source produced by the S3Client within Akka HTTP routes. The error message I get is:
[error] found : akka.stream.scaladsl.Source[akka.util.ByteString,_$1] where type _$1
[error] required: akka.http.scaladsl.marshalling.ToResponseMarshallable
[error] complete(source)
I assume that it is some annoyingly trivial thing, like a missing implicit import, but I was not able to pinpoint what I am missing.
I've created a minimal example to illustrate the issue:
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.Source
import akka.util.ByteString
import scala.concurrent.ExecutionContext
class Test {
  implicit val actorSystem: ActorSystem = ActorSystem()
  implicit val materializer: ActorMaterializer = ActorMaterializer()
  implicit val executionContext: ExecutionContext = actorSystem.dispatcher

  val route = (path("test") & get) {
    def source: Source[ByteString, _] = ??? // just assume that I am able to get that value
    complete(source) // here error happens
  }

  Http().bindAndHandle(route, "localhost", 8000)
}
Do you have any suggestions for what I can try? I am using:
libraryDependencies += "com.typesafe.akka" %% "akka-http" % "10.0.5"
You need to create an HttpEntity from the source, and give it a content-type.
complete(HttpEntity(ContentTypes.`application/json`, source))
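Applied to the minimal example, that could look like the sketch below (assuming the bytes really are JSON; use whichever ContentType matches your data):

import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.{ContentTypes, HttpEntity}
import akka.http.scaladsl.server.Directives._
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.Source
import akka.util.ByteString

class Test {
  implicit val actorSystem: ActorSystem = ActorSystem()
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  val route = (path("test") & get) {
    def source: Source[ByteString, _] = ??? // as in the question
    // wrap the raw byte source in an entity with an explicit content type
    complete(HttpEntity(ContentTypes.`application/json`, source))
  }

  Http().bindAndHandle(route, "localhost", 8000)
}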

Not able to import Spark Implicits in ScalaTest

I am writing Test Cases for Spark using ScalaTest.
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}
class ClassNameSpec extends FlatSpec with BeforeAndAfterAll {
  var spark: SparkSession = _
  var className: ClassName = _

  override def beforeAll(): Unit = {
    spark = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
    className = new ClassName(spark)
  }

  it should "return data" in {
    import spark.implicits._
    val result = className.getData(input)
    assert(result.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}
When I try to compile the test suite, it gives me the following error:
stable identifier required, but ClassNameSpec.this.spark.implicits found.
[error] import spark.implicits._
[error] ^
[error] one error found
[error] (test:compileIncremental) Compilation failed
I am not able to understand why I cannot import spark.implicits._ in a test suite.
Any help is appreciated!
To do an import you need a "stable identifier", as the error message says. This means you need a val, not a var.
Since you defined spark as a var, Scala can't import from it.
To solve this you can simply do something like:
val spark2 = spark
import spark2.implicits._
or instead change the original var to a val, e.g.:
lazy val spark: SparkSession = SparkSession.builder().master("local").appName("class-name-test").getOrCreate()
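Putting it together, a sketch of the spec with a stable identifier (keeping the question's ScalaTest FlatSpec style and dropping the ClassName dependency for brevity) might look like this:

import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, FlatSpec}

class StableIdentifierSpec extends FlatSpec with BeforeAndAfterAll {
  // a lazy val is a stable identifier, so spark.implicits._ can be imported from it
  lazy val spark: SparkSession =
    SparkSession.builder().master("local").appName("class-name-test").getOrCreate()

  "spark.implicits" should "be importable from a stable identifier" in {
    import spark.implicits._
    val df = Seq(1, 2, 3).toDF("value")
    assert(df.count() == 3)
  }

  override def afterAll(): Unit = {
    spark.stop()
  }
}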

Ammonite and Akka-Http config file error

I'm trying to embed an akka-http server into my Ammonite Scala script.
Below is the code used to create a server instance:
import ammonite.ops._
import $ivy.`com.typesafe:config:1.3.1`
import $ivy.`com.typesafe.akka:akka-http_2.12:10.0.6`
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import com.typesafe.config.ConfigFactory
import java.io._
@main
def main() = {
  val fileConfig = ConfigFactory.parseFile(new File("resources/my.conf"))
  val config = ConfigFactory.load(fileConfig)
  println(config)

  implicit val actorSystem = ActorSystem("system")
  implicit val actorMaterializer = ActorMaterializer()

  val route =
    pathSingleSlash {
      get {
        complete {
          "Hello world"
        }
      }
    }

  Http().bindAndHandle(route, "localhost", 8080)
  println("server started at 8080")
}
Here is the my.conf file content:
akka {
  loglevel = INFO
  stdout-loglevel = INFO

  default-dispatcher {
    fork-join-executor {
      parallelism-min = 8
    }
  }
}
Running the script with amm server.sc I got the following error:
Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka'
The same happens with the standard application.conf filename convention too.
I can read the file and get its content correctly.
What am I missing?
Thanks a lot
You need to pass the config to the ActorSystem, like ActorSystem("system", config).
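In the script above, the relevant lines would become something like this (the rest of the main body stays the same):

val fileConfig = ConfigFactory.parseFile(new File("resources/my.conf"))
val config = ConfigFactory.load(fileConfig)

// hand the parsed config to the ActorSystem instead of letting it
// load the default application.conf from the classpath
implicit val actorSystem = ActorSystem("system", config)
implicit val actorMaterializer = ActorMaterializer()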

Schedule in Akka - not found

Here is a basic example of using schedule in Akka:
import akka.pattern
import akka.util.Timeout
import scala.concurrent.Await
import akka.actor.Actor
import akka.actor.Props
import akka.actor.ActorSystem
import akka.pattern.ask
import scala.concurrent.duration
object Application extends App {
  val supervisor = ActorSystem().actorOf(Props[Supervisor])
  implicit val timeout = Timeout(10 seconds)
  import system.dispatcher

  supervisor.scheduler.scheduleOnce(120 seconds) {
    val future = supervisor ? Supervisor.Start
    val resultIdList = Await.result(future, timeout.duration).asInstanceOf[List[MyIdList]]
    supervisor ! resultIdList
  }
}
I'm really confused by Akka's documentation. Here, in Having problems with Akka 2.1.2 Scheduler ('system' not recognized), it was said that import system.dispatcher is not a package import but something else. What is it then?
What is system? Do I have to replace it with supervisor? Even if I didn't do that and kept using system, I'd get pretty much the same errors:
//(using system)
value scheduler is not a member of akka.actor.ActorRef
not found: value system
//or (using supervisor)
not found: value system
not found: value system
Try this ;)
val system = ActorSystem()
val supervisor = system.actorOf(Props[Supervisor])
(Posting as an answer since it does not fit as a comment.)
Marius, you were referring to another question which started with this line:
val system = akka.actor.ActorSystem("system")
That is the identifier 'system' the import statement is referring to.
The line
import system.dispatcher
means that the dispatcher member of the variable system will be available in scope (from that point on, you can use the name 'dispatcher' to refer to 'system.dispatcher'). It also means that, since dispatcher is an implicit value, it will now be available for implicit resolution. Please note that the signature of scheduleOnce is:
scheduleOnce(delay: FiniteDuration, runnable: Runnable)(implicit executor: ExecutionContext): Cancellable
So it either needs an explicitly passed ExecutionContext, or an implicit one. By using the import statement you bring the dispatcher (which is an ExecutionContext) into scope, so you don't have to provide it manually.
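Putting the pieces together, a sketch of a corrected Application (assuming the Supervisor actor and MyIdList type from the question exist) could look like this:

import akka.actor.{ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent.Await
import scala.concurrent.duration._

object Application extends App {
  // keep a reference to the ActorSystem itself, not just to an actor it created
  val system = ActorSystem()
  val supervisor = system.actorOf(Props[Supervisor])

  implicit val timeout: Timeout = Timeout(10.seconds)
  import system.dispatcher // the implicit ExecutionContext that scheduleOnce needs

  // the scheduler lives on the ActorSystem, not on the ActorRef
  system.scheduler.scheduleOnce(120.seconds) {
    val future = supervisor ? Supervisor.Start
    val resultIdList = Await.result(future, timeout.duration).asInstanceOf[List[MyIdList]]
    supervisor ! resultIdList
  }
}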