Unable to upgrade to Scala 2.13.6

I am trying to migrate to Scala 2.13.6 and Akka 2.6.16, using akka-actor_2.13 version 2.6.16.
Code:
import akka.http.scaladsl.model.ContentType
import akka.http.scaladsl.model.HttpCharsets.`UTF-8`
import akka.http.scaladsl.model.MediaTypes.{`application/json`, `application/xml`}

object MyPayloadType extends Enumeration {
  val XML, JSON = Value

  def getMyPayloadType(contentType: Value): ContentType.NonBinary = contentType match {
    case XML  => ContentType(`application/xml`, `UTF-8`)
    case JSON => ContentType(`application/json`)
  }
}
The above code gives the following compilation error:
Symbol 'type scala.collection.IndexedSeqOptimized' is missing from the classpath.
This symbol is required by 'class akka.util.ByteString'
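This error typically means that something on the classpath was still compiled against Scala 2.12: scala.collection.IndexedSeqOptimized existed in 2.12 but was removed in 2.13, and akka.util.ByteString from a _2.12 Akka artifact still references it. A sketch of what to check in build.sbt (the akka-http line is only an illustration; check your own dependency list):

// build.sbt
scalaVersion := "2.13.6"

libraryDependencies ++= Seq(
  // %% appends the Scala suffix automatically, so sbt resolves the _2.13 artifacts;
  // a hard-coded _2.12 suffix such as "akka-http_2.12" would cause exactly this error
  "com.typesafe.akka" %% "akka-actor" % "2.6.16",
  "com.typesafe.akka" %% "akka-http"  % "10.2.6"
)

Running sbt evicted (or inspecting the dependency tree) can reveal which library still drags in a _2.12 artifact.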


Reactivemongo parseURI is failing when loading from config saying it can't find implicit value for parameter loader MongoConnection.ParsedURI

I am trying to use the ReactiveMongo driver in my Play application without using the Play module for ReactiveMongo.
So when I try to get the ParsedURI from my config, I get the error below:
import reactivemongo.api.MongoConnection.ParsedURI
import reactivemongo.api.AsyncDriver
import com.typesafe.config.Config
val driver = new AsyncDriver(Some(config.get[Config]("mongodb")))
val parsedUri = config.get[ParsedURI]("mongodb.uri")
Error message:
could not find implicit value for parameter loader:
   play.api.ConfigLoader[reactivemongo.api.MongoConnection.ParsedURI]
[error]     val parsedUri = config.get[ParsedURI]("mongodb.uri")
[error]                     ^
[error] one error found
My application.conf has:
mongodb {
  uri = "mongodb://127.0.0.1:27017/mydb"
  mongo-async-driver = ${akka}
}
ConfigLoader is a Play type class (like Reads) which tells Play how to read a type from the config file.
You can find an explanation here: https://www.playframework.com/documentation/2.8.x/ScalaConfig#ConfigLoader
Generally you would define one by doing something like:
// application.conf
config {
  url = "https://example.com"
}

// Config class
case class AConfig(url: String)

// ConfigLoader
implicit val configLoader: ConfigLoader[AConfig] = ConfigLoader { root => key =>
  val config = root.getConfig(key)
  AConfig(config.getString("url"))
}

// Usage
val aConfig = config.get[AConfig]("config")
In this case I would not suggest attempting to make one for ParsedURI because it is quite a complex type. Instead I would suggest doing something like:
val parsedUri: Try[ParsedURI] = MongoConnection.parseURI(config.get[String]("mongodb.uri"))
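Since parseURI returns a Try, you still need to decide what happens on failure; a minimal sketch (failing fast at startup is just one option):

import scala.util.{Failure, Success}

parsedUri match {
  case Success(uri) => // hand uri to the driver when building the connection
  case Failure(err) => sys.error(s"Invalid mongodb.uri: ${err.getMessage}")
}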

How to use TypeInformation in a generic method using Scala

I'm trying to create a generic method in Apache Flink to parse a DataSet[String] (JSON strings) using case classes. I tried to use TypeInformation as described here: https://ci.apache.org/projects/flink/flink-docs-stable/dev/types_serialization.html#generic-methods
I'm using lift-json (net.liftweb.json) to parse the JSON strings; this is my code:
import net.liftweb.json._
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.api.scala._
class Loader(settings: Map[String, String])(implicit environment: ExecutionEnvironment) {
  val env: ExecutionEnvironment = environment

  def load[T: TypeInformation](): DataSet[T] = {
    val data: DataSet[String] = env.fromElements(
      """{"name": "name1"}""",
      """{"name": "name2"}"""
    )
    implicit val formats = DefaultFormats
    data.map(item => parse(item).extract[T])
  }
}
But I got the error:
No Manifest available for T
data.map(item => parse(item).extract[T])
Then I tried to add a Manifest and drop the TypeInformation, like this:
def load[T: Manifest](): DataSet[T] = { ...
And I got this error:
could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[T]
I'm very confused about this; I'd really appreciate your help.
Thanks.
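The two errors point at two different implicit requirements: Flink's map needs a TypeInformation[T] for the result DataSet, while lift-json's extract needs a Manifest[T]. One way to satisfy both (a sketch, not tied to a specific Flink version) is to require both context bounds on the same method:

def load[T: TypeInformation: Manifest](): DataSet[T] = {
  val data: DataSet[String] = env.fromElements(
    """{"name": "name1"}""",
    """{"name": "name2"}"""
  )
  implicit val formats: DefaultFormats.type = DefaultFormats
  // Manifest satisfies extract[T]; TypeInformation satisfies the DataSet[T] result type
  data.map(item => parse(item).extract[T])
}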

Akka HTTP client - Unmarshal with Play JSON

I am using Akka HTTP as a client to do a POST request and parse the response. I am using Play JSON and I get the following compiler error:
could not find implicit value for parameter um: akka.http.scaladsl.unmarshalling.Unmarshaller[akka.http.javadsl.model.ResponseEntity,B]
[ERROR] Unmarshal(response.entity).to[B].recoverWith {
This is the dependency I added to use Play JSON instead of Spray:
"de.heikoseeberger" %% "akka-http-play-json"
My class definition is:
class HttpClient(implicit val system: ActorSystem, val materializer: Materializer) extends PlayJsonSupport {
and the method definition is:
private def parseResponse[B](response: HttpResponse)(implicit reads: Reads[B]): Future[B] = {
  if (response.status().isSuccess) {
    Unmarshal(response.entity).to[B].recoverWith {
      ....
In the imports I have:
import play.api.libs.json._
import scala.concurrent.ExecutionContext.Implicits.global
import de.heikoseeberger.akkahttpplayjson.PlayJsonSupport._
It seems to me that I have the required implicits in scope. The Marshal part has a similar logic (but with Writes instead of Reads) and compiles fine. What am I missing?
Check your other imports. Based on the error message, it appears that you're using akka.http.javadsl.model.HttpResponse instead of akka.http.scaladsl.model.HttpResponse; PlayJsonSupport only supports the Scala DSL:
private def parseResponse[B](response: HttpResponse)(implicit reads: Reads[B]): Future[B] = ???
// ^ this should be akka.http.scaladsl.model.HttpResponse
In other words, use
import akka.http.scaladsl.model._
instead of
import akka.http.javadsl.model._
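For reference, a trimmed-down version of the method after the fix; the explicit Materializer and ExecutionContext parameters here are assumptions about what the surrounding class provides (the question's HttpClient already has them implicitly in scope):

import akka.http.scaladsl.model.HttpResponse // scaladsl, not javadsl
import akka.http.scaladsl.unmarshalling.Unmarshal
import akka.stream.Materializer
import de.heikoseeberger.akkahttpplayjson.PlayJsonSupport._
import play.api.libs.json.Reads
import scala.concurrent.{ExecutionContext, Future}

private def parseResponse[B](response: HttpResponse)(
    implicit reads: Reads[B], mat: Materializer, ec: ExecutionContext): Future[B] =
  if (response.status.isSuccess()) Unmarshal(response.entity).to[B]
  else Future.failed(new RuntimeException(s"Request failed with status ${response.status}"))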

IntelliJ shows error "object X is not a member of package"

When I build my sbt project in IntelliJ I get the following error:
Error:(7, 8) object DfUtils is not a member of package com.naturalint.xspark.common.sparkutils
import com.naturalint.xspark.common.sparkutils.DfUtils
The object is part of the package, and when I run sbt compile, package, or assembly in the terminal everything compiles fine. When I move the object to a different location in the project it sometimes works for a day and then the error comes back.
This is the content of the file:
package com.naturalint.xspark.common.sparkutils

import org.apache.spark.sql._
import scala.reflect._

object DfUtils {
  def removeColumnThatAreNotInEncoder[T <: Product : ClassTag : Encoder](dataEntity: Dataset[T]): Dataset[T] = {
    val declaredFields: Array[String] = classTag[T].runtimeClass.getDeclaredFields.map(x => x.getName)
    val columns: Array[Column] = declaredFields.map(x => new Column(x))
    dataEntity.select(columns: _*).as[T]
  }
}
Any thoughts?
Thanks
Nir

Generating import statements with scala macros

I have the following code:
@mymacro @imports
val _1 = { import scala.collection.mutable.ListBuffer }

@mymacro
val _2 = { val yy: ListBuffer[Int] = ListBuffer.empty }
@mymacro is a Scala macro annotation that checks whether the definition is also annotated with @imports. Part of the implementation is as follows:
case (cc @ q"${mods: Modifiers} val $tname: ${tpt: Tree} = ${expr: Tree}") :: Nil =>
  if (tname.toString().startsWith("_"))
    if (checkImports(mods, expr)) {
      q"import scala.collection.mutable.ListBuffer"
    }
    else
      q"{$expr}"
Currently the macro is able to transform the whole val _1 = ... statement into import scala.collection.mutable.ListBuffer (without the {} brackets!). But when compilation continues, I keep getting the not found: type ListBuffer compilation error. Now I wonder if it is possible to fix this error without having to define the import statement at the top of the file.
I am using the Scala 2.10 macro paradise plugin.
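A note on why this fails, with a sketch of a workaround (assuming the macro is free to rewrite the annotated vals together): an import emitted by a macro expansion is only in scope inside the tree that the macro returns, so the expansion of val _1 cannot make ListBuffer visible to the sibling statement val _2. The usual way out is to expand into a single block that contains both the import and the code that uses it:

q"""
import scala.collection.mutable.ListBuffer
val yy: ListBuffer[Int] = ListBuffer.empty
"""

Anything that needs the import has to end up inside that same generated tree; otherwise the import really does have to live at the top of the file.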