How to send a KeyValue list to Kafka? - scala

I am trying to send a List[KeyValue] to a topic in a Kafka Streams app, but the stream expects a single KeyValue. How can I send the individual KeyValues to the stream, instead of the whole list?
class MainTopology {
  val createTopology = (sourceTopic: String, sinkTopic: String) => {
    val builder = new StreamsBuilder()
    builder.stream(sourceTopic)
      .map[String, Option[JsonMessage]]((key, value) => toJsonEvent(key, value))
      .map[String, String]((key, value) => toFormattedEvents(key, value))
      .to(sinkTopic)
    builder.build()
  }

  private val toJsonEvent = (key: String, value: String) => {
    println(value)
    val jsonEventsAsCaseClasses = jsonToClass(value)
    new KeyValue(key, jsonEventsAsCaseClasses)
  }

  private val toFormattedEvents = (key: String, value: Option[JsonMessage]) => {
    val jsonEvents: List[String] = formatEvents(value)
    jsonEvents.map(event => new KeyValue(key, event))
  }
}
The second map is not compiling due to this.
Expression of type List[KeyValue[String, String]] doesn't conform to expected type KeyValue[_ <: String, _ <: String]
I updated my code, but now it throws another error:
  val builder = new StreamsBuilder()
  builder.stream(sourceTopic)
    .map[String, Option[JsonMessage]]((key, value) => toJsonEvent(key, value))
    .flatMap(
      (key, value) => toFormattedEvents(key, value)
    )
    .to(sinkTopic)
  builder.build()
}

private val toJsonEvent = (key: String, value: String) => {
  println(value)
  val jsonEventsAsCaseClasses = jsonToClass(value)
  new KeyValue(key, jsonEventsAsCaseClasses)
}

private val toFormattedEvents = (key: String, value: Option[JsonMessage]) => {
  val jsonEvents: List[String] = formatEvents(value)
  jsonEvents.map(event => new KeyValue(key, event)).toIterable.asJava
}
Error:(15, 8) inferred type arguments [?1,?0] do not conform to method flatMap's type parameter bounds [KR,VR]
.flatMap(
Error:(16, 20) type mismatch;
found : org.apache.kafka.streams.kstream.KeyValueMapper[String,Option[org.minsait.streams.model.JsonMessage],Iterable[org.apache.kafka.streams.KeyValue[String,String]]]
required: org.apache.kafka.streams.kstream.KeyValueMapper[_ >: String, _ >: Option[org.minsait.streams.model.JsonMessage], _ <: Iterable[_ <: org.apache.kafka.streams.KeyValue[_ <: KR, _ <: VR]]]
(key, value) => toFormattedEvents(key, value)

Take a look at flatMap() and flatMapValues() to replace the second map().
https://kafka.apache.org/22/javadoc/org/apache/kafka/streams/kstream/KStream.html
For example:
.flatMap[String, Long]((key, value) => Seq(("foo", 1L), ("bar", 2L)))
If you do intend to keep using map for some reason, then see below.
The second map is not compiling due to this.
Expression of type List[KeyValue[String, String]] doesn't conform to expected type KeyValue[_ <: String, _ <: String]
The corresponding code, for reference:
.map[String, String]((key, value) => toFormattedEvents(key, value))
What you need is something like:
.map[String, List[KeyValue[KR, VR]]]((key, value) => toFormattedEvents(key, value))
where KR and VR are the key and value types, respectively, as returned by toFormattedEvents() (the actual return types are not clear from your question). For this to work you must also have Serde for the List[KeyValue[KR, VR]] type.
Let me illustrate this with a few different types so it's easier to see which part of the method call refers to which type. Assume we want the map output to have a key type of Integer and a value type of List[KeyValue[String, Long]]:
.map[Integer, List[KeyValue[String, Long]]]((key, value) => (5, List(KeyValue.pair("foo", 1L), KeyValue.pair("bar", 2L))))
Note that this example assigns the same key and value to every mapped record, but that is beside the point of the example.
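Applied to the topology in the question, one likely fix for the inference error is to spell out flatMap's type arguments, since the compiler could not infer KR and VR here (hence the [?1,?0] in the message). A minimal sketch, assuming toFormattedEvents returns a java.lang.Iterable[KeyValue[String, String]] (which the asJava version above does, with import scala.collection.JavaConverters._ in scope):
// Sketch only: explicit KR and VR so nothing is left to type inference.
builder.stream(sourceTopic)
  .map[String, Option[JsonMessage]]((key, value) => toJsonEvent(key, value))
  .flatMap[String, String]((key, value) => toFormattedEvents(key, value))
  .to(sinkTopic)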

No implicits found for parameter ev: Any <:< (T_, U_)

I'm trying to create a map based on conditions. Here's the general workflow in pseudocode:
def createMyMap(input: String): Map[String, String] = {
  val stringArray = input.split(",")
  stringArray.map(element => {
    if (condition) {
      newKey -> newVal
    }
  }).toMap
}
and I see two compile errors:
No implicits found for parameter ev: Any <:< (T_, U_) in the toMap call
For the method createMyMap:
Type mismatch.
Required: scala.Predef.Map[String, String]
Found: scala.collection.immutable.Map[Nothing, Nothing]
This makes sense since the compiler doesn't know how to create the map if the condition isn't fulfilled. For example, if I add this in the method:
if (condition) {
  newKey -> newVal
} else {
  null
}
then it'll compile. I'm just not too sure how to approach the else - how do I avoid this kind of problem? I'm running into this because I only want to create a map entry if a condition is fulfilled.
It's not clear how newKey and newVal are derived, but here is the template code using collect
def createMyMap(input: String): Map[String, String] =
  input.split(",").collect {
    case s if <condition> =>
      newKey -> newVal
  }.to(Map)
e.g.
def createMyMap(input: String): Map[String, String] =
  input.split(",").collect {
    case s if s.contains('.') =>
      s -> "data"
  }.to(Map)
You have a few good options that I'll summarize from the comments:
filter + map + toMap
// stringArray: Traversable[String]
// condition: String => Boolean
// transform: String => (String, String)
val result: Map[String, String] = stringArray
  .filter(condition)
  .map(transform)
  .toMap
map + flatten + toMap
val result: Map[String, String] = stringArray
  .map { k =>
    if (condition(k)) Some(transform(k))
    else None
  }
  .flatten
  .toMap
flatMap + toMap
val result: Map[String, String] = stringArray
  .flatMap { k =>
    if (condition(k)) Some(transform(k))
    else None
  }
  .toMap
collect + toMap
val result: Map[String, String] = stringArray
  .collect {
    case k if condition(k) => transform(k)
  }
  .toMap
See the documentation of collect for details.
In short, options 3 and 4 are especially clean (although all of them work). The one that reads best semantically, IMO, is option 4, which uses collect, with option 1 a close second.
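For instance, filling in a hypothetical condition and transform (keep only elements containing '=' and split them into a key/value pair), option 4 could look like:
// Hypothetical input and helpers, just to make the sketch self-contained.
val stringArray: Array[String] = "a=1,b=2,broken".split(",")

val result: Map[String, String] = stringArray
  .collect {
    // collect keeps only the elements the partial function is defined for
    case s if s.contains('=') =>
      val Array(k, v) = s.split("=", 2)
      k -> v
  }
  .toMap
// result: Map(a -> 1, b -> 2)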

implicit class functions in object scala generic

I wrote this
def computeMap(map: Map[String, DataFrame], f: (String) => String, g: (DataFrame) => DataFrame): Map[String, DataFrame] = {
  map.map { case (key, value) => (f(key), g(value)) }
}
My problem here is that the f and g functions are provided by two implicit classes wrapped in objects (one implicit class for the String transform and the second one for the DataFrame transform).
I would rather write:
def computeMap(map: Map[String, DataFrame], f: tobecompleted, g: tobecompleted): Map[String, DataFrame] = {
  map.map { case (key, value) => (key.f, value.g) }
}
f, for example, can be defined as:
object Test {
  implicit class Transform(s: String) {
    def colm(): String = {
      s + "ded"
    }
  }
}
Is there any solution for this, please?
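One possible approach, sketched with String standing in for DataFrame so it stays self-contained: keep plain function parameters on computeMap and pass the extension methods at the call site, after importing the implicit classes. The colm method below is the one from the question; everything else is illustrative.
object Test {
  implicit class Transform(s: String) {
    def colm(): String = s + "ded"
  }
}

object Example {
  import Test._ // brings colm into scope as an extension method on String

  // Same shape as the question's computeMap, but with String in place of DataFrame.
  def computeMap(map: Map[String, String], f: String => String, g: String => String): Map[String, String] =
    map.map { case (key, value) => (f(key), g(value)) }

  // The implicit class supplies the method; here it is simply passed as a function.
  val result = computeMap(Map("a" -> "b"), _.colm(), _.colm())
  // result: Map("aded" -> "bded")
}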

Issue with elegantly accumulating errors using Either[String, B] in Scala

I am trying to parse a CSV row where each field can be a different type. To handle error accumulation I am using Either[String, B], where the String is an error message and B is the value. The issue is that B can be different types (Option[Int], String, Array[String]), so my Map ends up typed as (String, Either[String, java.io.Serializable]), which effectively makes the Map unusable. Is there a way (I'm sure there is) to more elegantly accumulate errors while also reusing those values to populate properties on an object?
override def parseCsv(implicit fields: Map[String, String]): Either[String, User] = {
  val parsedValues = Map(
    Headers.Id -> getFieldAsString(Headers.Id),
    Headers.FirstName -> getFieldAsString(Headers.FirstName),
    Headers.LastName -> getFieldAsString(Headers.LastName),
    Headers.Email -> getFieldAsString(Headers.Email),
    Headers.TimeZone -> getFieldAsString(Headers.TimeZone),
    Headers.Region -> getOption(Headers.Region),
    Headers.Phone -> getOption(Headers.Phone),
    Headers.ProfileImage -> getFieldAsString(Headers.ProfileImage),
    Headers.Roles -> getFieldAsArray(Headers.Roles))
  val errors = parsedValues.collect { case (key, Left(errors)) => errors }
  if (!errors.isEmpty) Left(errors.mkString(", "))
  else {
    val user = new User
    user.id = getFieldAsString(Headers.Id).right.get
    user.firstName = getFieldAsString(Headers.FirstName).right.get
    user.lastName = getFieldAsString(Headers.LastName).right.get
    user.email = getFieldAsString(Headers.Email).right.get
    user.timeZone = getFieldAsString(Headers.TimeZone).right.get
    user.phoneNumber = (for {
      region <- getOption(Headers.Region).right.get
      phone <- getOption(Headers.Phone).right.get
      _ = validatePhoneNumber(phone, region)
    } yield {
      new PhoneNumber(region, phone)
    }).orNull
    user.profileImageUrl = getFieldAsString(Headers.ProfileImage).right.get
    user.roles = getFieldAsArray(Headers.Roles).right.get
    Right(user)
  }
}
Create case classes for all the types of B. These case classes must extend a common trait. While populating the user object, just pattern match and retrieve the values.
sealed trait Result {
  val paramName: String
}

case class OptionInt(override val paramName: String, value: Option[Int]) extends Result
case class ArrayString(override val paramName: String, value: Array[String]) extends Result
case class StringValue(override val paramName: String, value: String) extends Result
Now the parsed type would be Either[String, Result].
After parsing the whole file, create a List[Result].
If you are expecting age as an Option[Int] and firstName as a String, then do this:
list.foreach { result =>
  result match {
    case OptionInt("age", value) => userId.age = value.getOrElse(defaultAge)
    case StringValue("firstName", value) => userId.firstName = value
    case StringValue("email", value) => userId.email = value
    case _ => // do nothing
  }
}
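A short sketch of the step between parsing and the foreach above, using a hypothetical parsed list; collect is one way to split the errors from the values:
// Hypothetical: one Either per field after parsing a row.
val parsed: List[Either[String, Result]] = List(
  Right(StringValue("firstName", "Ada")),
  Right(OptionInt("age", Some(36))),
  Left("email: missing value")
)

// Accumulate the error messages and keep the successfully parsed values.
val errors: List[String] = parsed.collect { case Left(e) => e }
val list: List[Result] = parsed.collect { case Right(r) => r }

// Report all errors at once, as in the question, or go on to populate the object.
val outcome: Either[String, List[Result]] =
  if (errors.nonEmpty) Left(errors.mkString(", ")) else Right(list)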

Getting (Long, String) from (String) => Try[(Long, String)]

I have a method that returns a (String) => Try[(Long, String)] and I want to get a (Long, String) out of it. Any suggestions?
I thought map/flatMap would help, but it looks like they don't.
Update
def someMethod(): (Long, String) = {
  val result: (String) => Try[(Long, String)] = someOperation()
  // Need to get (Long, String) from result
}
There are several options
val exceptional: Try[(Long, String)] = ???
val default: (Long, String) = (0, "")
Providing a fallback value:
exceptional.getOrElse(default)
Handling the exception and then safely calling get:
exceptional.recover { case exception => default }.get
Or using pattern matching:
exceptional match {
  case Success(v) => v
  case Failure(exception) => default
}
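Note that in the question's someMethod, result is itself a function String => Try[(Long, String)], so it has to be applied to an input before any of the above can be used. A minimal sketch (the input string and default value are hypothetical; Try comes from scala.util):
def someMethod(): (Long, String) = {
  val result: String => Try[(Long, String)] = someOperation()
  val default: (Long, String) = (0, "")
  // Apply the function first to get a Try, then fall back to a default.
  result("some input").getOrElse(default)
}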

Type of a result after map

I have the following classes:
case class RolepermissionRow(roleid: Int, permissionid: Int)
and
case class RolePermissionJSON(
  var id: Option[Long],
  var num: Option[Int],
  var name: Option[String],
  var perms: Map[String, Boolean])
I create a map:
var s = Map("1" -> true)
I create a RolePermissionJSON:
val f = RolePermissionJSON(Some(0L), Some(0), Some("test"), s)
And I would like to convert the perms Map to RolepermissionRow using the following code:
scala> f.perms.map { case (key, value) => if (value) RolepermissionRow(key.toInt, 1) }.toSeq
res7: Seq[Any] = List(RolepermissionRow(1,1))
The result is Seq[Any] but I would like to have Seq[RolepermissionRow]
Simple changes required:
val z = f.perms.map { case (key, value) if (value) => RolepermissionRow(key.toInt, 1) }.toSeq
Now the explanation:
In your code the resulting Seq contains RolepermissionRow values when value is true and Unit otherwise.
You should filter out the "empty" elements of the map, which are what give you Unit.
UPD:
@nafg advises using collect instead of a partial match inside map, to prevent a MatchError at runtime; alternatively, filter first:
f.perms.filter(_._2).map{case (key, value) => RolepermissionRow(key.toInt, 1) }.toSeq
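A collect-based version of the same thing, as a sketch of what that suggestion looks like here:
// collect only applies the partial function where it is defined,
// so entries whose value is false are simply dropped (no MatchError).
val z: Seq[RolepermissionRow] =
  f.perms.collect { case (key, true) => RolepermissionRow(key.toInt, 1) }.toSeq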
I think you should use filter first, to drop the unwanted entries from the map.