Pureconfig read config as properties map - scala

Is it possible to make pureconfig read properties as Map[String, String]? I have the following
application.conf:
cfg {
  some.property.name: "value"
  some.another.property.name: "another value"
}
Here is the application I tried to read the config with:
import pureconfig.generic.auto._
import pureconfig.ConfigSource
import pureconfig.error.ConfigReaderException

object Model extends App {
  case class Config(cfg: Map[String, String])

  val result = ConfigSource.default
    .load[Config]
    .left
    .map(err => new ConfigReaderException[Config](err))
    .toTry

  val config = result.get
  println(config)
}
The problem is that it throws the following exception:
Exception in thread "main" pureconfig.error.ConfigReaderException: Cannot convert configuration to a Model$Config. Failures are:
at 'cfg.some':
- (application.conf # file:/home/somename/prcfg/target/classes/application.conf: 2-3) Expected type STRING. Found OBJECT instead.
at Model$.$anonfun$result$2(Model.scala:11)
at scala.util.Either$LeftProjection.map(Either.scala:614)
at Model$.delayedEndpoint$Model$1(Model.scala:11)
at Model$delayedInit$body.apply(Model.scala:5)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1(App.scala:73)
at scala.App.$anonfun$main$1$adapted(App.scala:73)
at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:553)
at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:551)
at scala.collection.AbstractIterable.foreach(Iterable.scala:920)
at scala.App.main(App.scala:73)
at scala.App.main$(App.scala:71)
at Model$.main(Model.scala:5)
at Model.main(Model.scala)
Is there a way to fix it? I expected the Map[String, String] to contain the following mappings:
some.property.name -> "value"
some.another.property.name -> "another value"

Your issue is not with pureconfig. The issue is that, by the HOCON spec, what you wrote:
cfg {
  some.property.name: "value"
  some.another.property.name: "another value"
}
is syntactic sugar for:
cfg {
  some {
    property {
      name = "value"
    }
    another {
      property {
        name = "another value"
      }
    }
  }
}
It's Typesafe Config/Lightbend Config that decides that the entries under your cfg are nested configs rather than flat string keys. Pureconfig only takes these nested configs and maps them into case classes. But it won't be able to map something that has a radically different structure than expected.
If you write:
cfg {
  some-property-name: "value"
  some-another-property-name: "another value"
}
You'll be able to decode the "cfg" path as Map[String, String] and the top-level config as case class Config(cfg: Map[String, String]). If you want to treat . as part of the key rather than as nesting... then I'm afraid you'll have to write a ConfigReader yourself, because that is non-standard usage.
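For example, with those flattened keys the derived reader from the question is enough; a minimal sketch using pureconfig's loadOrThrow instead of the load/toTry dance:

import pureconfig.ConfigSource
import pureconfig.generic.auto._

case class Config(cfg: Map[String, String])

// Each flat key under "cfg" becomes one map entry.
val config = ConfigSource.default.loadOrThrow[Config]
println(config.cfg)
// Map(some-property-name -> value, some-another-property-name -> another value)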

You can read a Map[String, String] in that way with the following ConfigReader:
implicit val strMapReader: ConfigReader[Map[String, String]] = {
  implicit val r: ConfigReader[String => Map[String, String]] =
    ConfigReader[String]
      .map(v => (prefix: String) => Map(prefix -> v))
      .orElse {
        strMapReader.map { v =>
          (prefix: String) => v.map { case (k, v2) => s"$prefix.$k" -> v2 }
        }
      }

  ConfigReader[Map[String, String => Map[String, String]]].map {
    _.flatMap { case (prefix, v) => v(prefix) }
  }
}
Note that this is a recursive val definition, because strMapReader is used within its own definition. The reason it works is that the orElse method takes its parameter by name and not by value.
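Assuming the reader above is in implicit scope where the case class reader is derived, the original dotted application.conf should then decode to the flat map the question expected; a minimal usage sketch:

import pureconfig.ConfigSource
import pureconfig.generic.auto._

object Model extends App {
  case class Config(cfg: Map[String, String])

  // strMapReader (defined above) must be visible here as an implicit,
  // so it takes priority over pureconfig's default Map reader.
  val config = ConfigSource.default.loadOrThrow[Config]
  println(config.cfg)
  // Map(some.property.name -> value, some.another.property.name -> another value)
}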

Related

MongoDB Scala driver. How to update only changed document's fields from the case class object

I have the following case class:
@GQLDirective(federation.Key("User Config"))
case class UserConfig(
  userId: Int,
  personalGroup: Option[PersonalGroup] = None,
  accountGroup: Option[AccountGroup] = None
) extends Bson {
  override def toBsonDocument[TDocument](documentClass: Class[TDocument], codecRegistry: CodecRegistry): BsonDocument = {
    new BsonDocumentWrapper[UserConfig](this, codecRegistry.get(classOf[UserConfig]))
  }
}
And I'm trying to update an already stored document this way:
def update(userConfig: UserConfig) = {
  collection.updateOne(equal("userId", 1), userConfig).headOption()
}
But when I try to do so I get the following error:
Invalid BSON field name userId
I have also tried the replaceOne method, but it replaces the whole object and erases fields that I don't want to lose.
What I am trying to achieve:
I want to save only the changed fields to the MongoDB document. These fields are given by the GraphQL request.
For now, I found a way to convert and recursively merge the new object with the old one, as shown below. If somebody has better code or an opinion, I'll be happy to chat.
val firstJson = parse(userConfigFromRequest.toBsonDocument(classOf[UserConfig], codecRegistry).toJson).extract[Map[String, Any]]
val secondJson = parse(config.toBsonDocument(classOf[UserConfig], codecRegistry).toJson).extract[Map[String, Any]]
val merged = deepMerge(firstJson, secondJson)

def deepMerge(
    map1: Map[String, Any],
    map2: Map[String, Any],
    accum: MutableMap[String, Any] = MutableMap[String, Any]()
): MutableMap[String, Any] = {
  map1 foreach {
    case (key, value: Map[String, Any] @unchecked) =>
      map2.get(key) match {
        case Some(oldValue: Map[String, Any] @unchecked) =>
          // recurse into the two nested maps, seeding the accumulator with the old values
          accum.put(key, deepMerge(value, oldValue, MutableMap(oldValue.toSeq: _*)))
        case _ =>
          accum.put(key, value)
      }
    case (key, value) =>
      accum.put(key, value)
  }
  accum
}
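A common alternative to merging the documents by hand is to build the update itself from only the fields that were actually supplied, using the driver's Updates helpers. A rough sketch (the method and field names are illustrative, and it assumes PersonalGroup and AccountGroup are registered in the collection's codec registry):

import scala.concurrent.Future
import org.mongodb.scala.MongoCollection
import org.mongodb.scala.bson.conversions.Bson
import org.mongodb.scala.model.Filters.equal
import org.mongodb.scala.model.Updates.{combine, set}

def update(collection: MongoCollection[UserConfig], userConfig: UserConfig) = {
  // one $set per optional field that is actually present in the request
  val changes: Seq[Bson] = Seq(
    userConfig.personalGroup.map(pg => set("personalGroup", pg)),
    userConfig.accountGroup.map(ag => set("accountGroup", ag))
  ).flatten

  if (changes.isEmpty) Future.successful(None)
  else collection.updateOne(equal("userId", userConfig.userId), combine(changes: _*)).headOption()
}

This only touches the fields that were sent, so untouched fields keep their stored values.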

How can I serialize Sangria responses with json4s and Akka HTTP?

I'm working through a slight variation of Sangria's Getting Started, using Akka HTTP. I'm attempting to use json4s-jackson as the serialization lib, but am running into some trouble getting the response I want.
Specifically, the serialized response I get is the JSON version of the (StatusCode, Node) tuple:
{
  "_1": {
    "allowsEntity": true,
    "defaultMessage": "OK",
    "intValue": 200,
    "reason": "OK"
  },
  "_2": {
    "data": {
      "foo": {
        "id": "1",
        "name": "Foo"
      }
    }
  }
}
The data portion is correct, but obviously I just want that and not the first element of the serialized tuple.
I'm using akka-http-json4s, so my trait with route looks like:
case class GraphQlData(query: String, operation: Option[String])

trait FooController {
  import de.heikoseeberger.akkahttpjson4s.Json4sSupport._

  implicit val serialization = jackson.Serialization
  implicit val formats = DefaultFormats

  val fooRoutes = post {
    entity(as[GraphQlData]) { data =>
      QueryParser.parse(data.query) match {
        // query parsed successfully, time to execute it!
        case Success(queryAst) =>
          complete {
            Executor
              .execute(
                SchemaDefinition.FooSchema,
                queryAst,
                new FooService,
                operationName = data.operation
              )
              .map(OK -> _)
              .recover {
                case error: QueryAnalysisError => BadRequest -> error.resolveError
                case error: ErrorWithResolver => InternalServerError -> error.resolveError
              }
          }
        // can't parse GraphQL query, return error
        case Failure(error) =>
          complete(BadRequest -> error.getMessage)
      }
    }
  }

  implicit def executionContext: ExecutionContext
}
For the life of me I can't figure out what's wrong. I've been looking at sangria-akka-http-example but it seems to be exactly the same, with the exception of using spray-json instead of json4s.
Ideas? Thanks!
Ah, figured it out. I neglected to add
import sangria.marshalling.json4s.jackson._
to the trait defining the route. Adding it does the trick.
Just wanted to provide a quick update to this answer for the full GraphQLRequest. Now the variables are included in the request.
import de.heikoseeberger.akkahttpjson4s.Json4sSupport
import org.json4s._
import org.json4s.JsonAST.JObject
import sangria.marshalling.json4s.jackson._

case class GQLRequest(query: String, operationName: Option[String], variables: JObject)

trait SomeJsonSupport extends Json4sSupport {
  implicit val serialization = jackson.Serialization
  implicit val formats = DefaultFormats
}

trait GraphQLResource extends SomeJsonSupport {
  implicit val timeout: Timeout
  implicit val system: ActorSystem
  import system.dispatcher

  def graphqlRoute: Route =
    (post & path("graphql")) {
      entity(as[GQLRequest]) { requestJson =>
        println(s"This is the requestJson = $requestJson")
        graphQLEndpoint(requestJson)
      }
    } ~
      get {
        println(s"This is working")
        getFromResource("graphiql.html")
      }

  def graphQLEndpoint(requestJson: GQLRequest): Route = {
    val route = QueryParser.parse(requestJson.query) match {
      case Success(query) =>
        println(s"This is the query $query")
        val vars = requestJson.variables match {
          case jObj: JObject => jObj
          case _             => JObject(List.empty)
        }
        val futureJValue = Executor.execute(
          clientSchema,
          query,
          NclhGqlRequest(this),
          operationName = requestJson.operationName,
          variables = vars
        )
        val futureTupleStatusCodeJValue = futureJValue.map(OK -> _).recover {
          case error: QueryAnalysisError => BadRequest -> error.resolveError
          case error: ErrorWithResolver  => InternalServerError -> error.resolveError
        }
        complete(futureTupleStatusCodeJValue)
      case Failure(error) =>
        complete(BadRequest, error.getMessage)
    }
    route
  }
}

Scala - identifier expected but '=>' found

I'm writing an endpoint in a Play Scala application that makes a request to Spotify, searching across track, album and artist types. I want to map over these types and transform the strings into Futures of the corresponding calls.
This is my code:
def index = Action.async { implicit request =>
  val futures = List("track", "album", "artist")
    .map { type => performSearch("q" -> param(request, "q"), "type" -> type) }
  Future.sequence(futures).onComplete {
    Ok
  }
}

private def performSearch(criteria: (String, String)): Future = {
  ws.url("https://api.spotify.com/v1/search")
    .withQueryString(criteria)
    .get()
}

private def param(request: Request[AnyContent], name: String): String = {
  request.queryString.get(name).flatMap(_.headOption).getOrElse("")
}
However, the map gives me the following error:
identifier expected but '=>' found
// .map { type => performSearch("q" -> param(request, "q"), "type" -> type) }
type is a keyword. Pick something else or wrap it in backticks:
.map { `type` => performSearch("q" -> param(request, "q"), "type" -> `type`) }
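For completeness, a rough sketch of the action with the parameter renamed instead (and the futures combined with map rather than onComplete, so the action returns a Future[Result]); it reuses performSearch and param from the question and assumes performSearch is declared to return Future[WSResponse] and that an implicit ExecutionContext is in scope:

def index = Action.async { implicit request =>
  // "kind" avoids the reserved word; backticks around `type` would work too
  val futures = List("track", "album", "artist")
    .map(kind => performSearch("q" -> param(request, "q"), "type" -> kind))

  // respond once all three Spotify searches have completed
  Future.sequence(futures).map(_ => Ok)
}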

Scala Play Framework Json Serializer error

I'm trying to convert a class to JSON.
Here is my class; it includes two other classes:
case class GoodEdit(good: Good, data: List[(String, Option[GoodText])])

case class Good(
  id: Long,
  partnumber: Option[String] = None
)

case class GoodText(
  goodid: Long,
  languageid: Long,
  title: String,
  description: Option[String] = None
)
And here are my writers:
object GoodWriters {
  implicit val goodWrites = new Writes[Good] {
    def writes(good: Good) = Json.obj(
      "id" -> good.id,
      "partnumber" -> good.partnumber
    )
  }

  implicit val goodTextWrites = new Writes[GoodText] {
    def writes(goodText: GoodText) = Json.obj(
      "goodid" -> goodText.goodid,
      "languageid" -> goodText.languageid,
      "title" -> goodText.title,
      "description" -> goodText.description
    )
  }

  implicit val GoodEditWrites = new Writes[GoodEdit] {
    def writes(goodEdit: GoodEdit) = Json.obj(
      "good" -> Json.toJson(goodEdit.good),
      "data" -> Json.toJson(
        for ((lang, goodTextOpt) <- goodEdit.data) yield Json.obj(lang -> goodTextOpt)
      )
    )
  }
}
Then, in the controller, I try to use it like this:
Action {
  import jwriters.GoodWriters._
  GoodEditAggregate.get(id).map { a =>
    Ok(Json.toJson(a))
  }.getOrElse(Ok(Json.toJson(Json.obj("status" -> "error", "message" -> "Can't find good with this id"))))
}
And the compiler complains about this part: Ok(Json.toJson(a))
No Json serializer found for type GoodEdit. Try to implement an
implicit Writes or Format for this type
I can't understand what is wrong. I've already imported the writers for these objects.
Try importing GoodWriters globally, i.e. at the top of the controller rather than inside the Action block.
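A rough sketch of what that could look like, with the import moved to the top of a hypothetical GoodController (shown with a pre-2.6 Play Controller; newer Play would extend AbstractController instead) so the Writes instances are in implicit scope for the whole controller:

import jwriters.GoodWriters._   // Writes[Good], Writes[GoodText], Writes[GoodEdit]
import play.api.libs.json.Json
import play.api.mvc._

class GoodController extends Controller {

  def edit(id: Long) = Action {
    GoodEditAggregate.get(id)
      .map(a => Ok(Json.toJson(a)))
      .getOrElse(Ok(Json.obj("status" -> "error", "message" -> "Can't find good with this id")))
  }
}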

Scala pattern matching on generic Map

What's the best way to handle generics and erasure when pattern matching in Scala (on a Map, in my case)? I am looking for a proper implementation without compiler warnings. I have a function that I want to return a Map[Int, Seq[String]] from. Currently the code looks like this:
def teams: Map[Int, Seq[String]] = {
  val dateam = new scala.collection.mutable.HashMap[Int, Seq[String]]
  // data.attributes is Map[String, Object] returned from JSON parsing (jackson-module-scala)
  val teamz = data.attributes.get("team_players")
  if (teamz.isDefined) {
    val x = teamz.get
    try {
      x match {
        case m: mutable.Map[_, _] =>
          m.foreach { kv =>
            kv._1 match {
              case teamId: String =>
                kv._2 match {
                  case team: Seq[_] =>
                    val tid: Int = teamId.toInt
                    dateam.put(tid, team.map(s => s.toString))
                }
            }
          }
      }
    } catch {
      case e: Exception =>
        logger.error("Unable to convert the team_players (%s) attribute.".format(x), e)
    }
    dateam
  } else {
    logger.warn("Missing team_players attribute in: %s".format(data.attributes))
  }
  dateam.toMap
}
Use a Scala JSON library to handle it. There are some based on Jackson (Play's ScalaJson, for instance -- see this article on using it stand-alone), as well as libraries not based on Jackson (my preferred one is Argonaut, though you could also go with Spray-Json).
These libraries, and others, solve this problem. Doing it by hand is awkward and prone to errors, so don't do it.
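For example, with Play's JSON library the whole conversion becomes a few lines; a rough sketch, assuming the raw JSON is available as a string and team_players is an object mapping team ids to arrays of player names:

import play.api.libs.json.Json

def teams(raw: String): Map[Int, Seq[String]] =
  (Json.parse(raw) \ "team_players")
    .asOpt[Map[String, Seq[String]]]   // None if the attribute is missing or has the wrong shape
    .getOrElse(Map.empty)
    .map { case (teamId, players) => teamId.toInt -> players }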
It could be reasonable to use a for comprehension (with its built-in pattern matching). We can also take into account that a Map is a collection of tuples, in our case of type (String, Object). For this example we ignore possible exceptions, so:
import scala.collection.mutable.HashMap

def convert(json: Map[String, Object]): HashMap[Int, Seq[String]] = {
  val converted = for {
    (id: String, description: Seq[Any]) <- json
  } yield (id.toInt, description.map(_.toString))
  HashMap[Int, Seq[String]](converted.toSeq: _*)
}
So the for comprehension keeps only the tuples of type (String, Seq[Any]), then converts the String key to an Int and the Seq[Any] to a Seq[String], and finally builds a mutable Map from the result.
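For instance, with a hypothetical attribute map the conversion looks like this:

val attributes: Map[String, Object] = Map(
  "1" -> Seq("Alice", "Bob"),
  "2" -> Seq("Carol")
)

println(convert(attributes))
// e.g. HashMap(1 -> List(Alice, Bob), 2 -> List(Carol))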