How to handle complex transformations in circe decode - scala

Let's say that I have JSON
{"some": "123"}
And I want to create a custom decoder that would decode it to
case class Test(some: Long Refined NonNegative = 0)
So first I would need to decode the JSON to a String. Then turn the String into a Long, which may throw, so I can wrap it in Try and use Decoder.decodeString.emapTry. But then I have to turn the Long into a Long Refined NonNegative, which can be done via refineV[NonNegative](x: Long), returning Either[String, Long Refined NonNegative]. And on top of all that, there is the default value to handle. It gets very messy.
How can I manage such complex transformations during decoding? Are there more powerful combinators than emap and emapTry?
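A stdlib-only toy sketch (not the real circe API; refineNonNegative stands in for refined's refineV[NonNegative]) showing that no combinator beyond emap is needed: each stage is Either-valued, and emap chains them via flatMap, so long pipelines stay flat.

```scala
import scala.util.Try

// Toy decoder: takes a raw String, produces either an error or a value.
trait Decoder[A] { self =>
  def apply(raw: String): Either[String, A]
  // emap bolts a fallible transformation onto an existing decoder
  def emap[B](f: A => Either[String, B]): Decoder[B] =
    raw => self(raw).flatMap(f)
}

object Decoder {
  val decodeString: Decoder[String] = raw => Right(raw)
}

// hypothetical stand-in for refineV[NonNegative]
def refineNonNegative(x: Long): Either[String, Long] =
  if (x >= 0) Right(x) else Left(s"$x is negative")

// String -> Long -> non-negative Long, each step composed with emap
val decodeNonNegLong: Decoder[Long] =
  Decoder.decodeString
    .emap(s => Try(s.toLong).toEither.left.map(_.getMessage))
    .emap(refineNonNegative)

println(decodeNonNegLong("123")) // Right(123)
println(decodeNonNegLong("-1"))  // Left(-1 is negative)
```

In real circe the same chain would be Decoder.decodeString.emapTry(...).emap(refineV[NonNegative](_)); the default value is a separate concern, typically handled by decoding the field as optional and falling back to 0.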

Related

How can I convert Long from String in scala

I know I can parse a Long from a String like the following
"60".toLong
or convert a Double to a Long like the following
60.0.toLong
or convert a String containing a Double like the following
"60.0".toDouble.toLong
However, I can't do the following
"60.0".toLong
So my question is whether using .toDouble.toLong is a best practice, or should I use something like try ... catch ...?
Meanwhile, there is another question: when I convert a very large number via Double, there may be some precision loss. How can I fix that?
"9223372036854775800.31415926535827932".toDouble.toLong
You should wrap the operation in a Try anyway, in case the string is not valid.
What you do inside the Try depends on whether "60.0" is a valid value in your application.
If it is valid, use the two-step conversion.
Try("60.0".toDouble.toLong) // => Success(60)
If it is not valid, use the one-step version.
Try("60.0".toLong) // => Failure(java.lang.NumberFormatException...)
Answer to updated question:
9223372036854775800.31415926535827932 exceeds the precision of a Double (which has only 53 bits of mantissa), so you need BigDecimal for that.
Try(BigDecimal("9223372036854775800.31415926535827932").toLong)
However, you are very close to the maximum value for a Long, so if the numbers really are that large I suggest avoiding Long and using BigDecimal and BigInt.
Try(BigDecimal("9223372036854775800.31415926535827932").toBigInt)
Note that toLong will not fail if the BigDecimal is too large; it just silently gives the wrong value.
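A quick demonstration of that pitfall, with values chosen just above Long.MaxValue (9223372036854775807); isValidLong is a cheap guard before converting.

```scala
// In range: toLong just drops the fraction
val exact = BigDecimal("9223372036854775800.31415926535827932")
println(exact.toLong)        // 9223372036854775800

// Out of range: toLong keeps only the low-order 64 bits, silently wrong
val tooBig = BigDecimal("9223372036854775810")
println(tooBig.isValidLong)  // false: check before converting
println(tooBig.toLong)       // silently wrong value, no exception
println(tooBig.toBigInt)     // 9223372036854775810, correct
```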

Scala function TypeTag: use type T in a function

I need to parse several json fields, which I'm using Play Json to do it. As parsing may fail, I need to throw a custom exception for each field.
To read a field, I use this:
val fieldData = parseField[String](json \ fieldName, "fieldName")
My parseField function:
def parseField[T](result: JsLookupResult, fieldName: String): T = {
  result.asOpt[T].getOrElse(throw new IllegalArgumentException(s"""Can't access $fieldName."""))
}
However, I get an error that reads:
Error:(17, 17) No Json deserializer found for type T. Try to implement
an implicit Reads or Format for this type.
result.asOpt[T].getOrElse(throw new IllegalArgumentException(s"""Can't access $fieldName."""))
Is there a way to tell the asOpt[] to use the type in T?
I strongly suggest that you do not throw exceptions. The Play JSON API has JsSuccess and JsError types that will help you encode parsing errors.
As per the documentation
To convert a Scala object to and from JSON, we use Json.toJson[T: Writes] and Json.fromJson[T: Reads] respectively. Play JSON provides the Reads and Writes typeclasses to define how to read or write specific types. You can get these either by using Play's automatic JSON macros, or by manually defining them. You can also read JSON from a JsValue using validate, as and asOpt methods. Generally it's preferable to use validate since it returns a JsResult which may contain an error if the JSON is malformed.
See https://github.com/playframework/play-json#reading-and-writing-objects
There is also a good example on the Play Discourse forum on how the API manifests in practice.
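To answer the original question directly: the fix is an implicit parameter (a context bound [T: Reads]), not a TypeTag. The stdlib-only toy below mimics the shape of Play's Reads/asOpt to show why the compiler error goes away (the names are stand-ins, not the real play-json API).

```scala
// Toy typeclass mimicking play-json's Reads: knows how to decode a T
trait Reads[T] { def read(raw: String): Option[T] }

implicit val stringReads: Reads[String] = raw => Some(raw)
implicit val intReads: Reads[Int] = raw => scala.util.Try(raw.toInt).toOption

// The context bound [T: Reads] threads the instance through implicitly,
// so "No Json deserializer found for type T" cannot happen: the caller
// must supply (or have in scope) a Reads[T].
def parseField[T: Reads](raw: String, fieldName: String): T =
  implicitly[Reads[T]].read(raw).getOrElse(
    throw new IllegalArgumentException(s"Can't access $fieldName."))

println(parseField[Int]("42", "age"))     // 42
println(parseField[String]("hi", "name")) // hi
```

In real play-json the signature would be def parseField[T: Reads](result: JsLookupResult, fieldName: String), and per the answer above, returning result.validate[T] (a JsResult) is preferable to throwing.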

Working with opaque types (Char and Long)

I'm trying to export a Scala implementation of an algorithm for use in JavaScript. I'm using @JSExport. The algorithm works with Scala Char and Long values, which are marked as opaque in the interoperability guide.
I'd like to know (a) what this means; and (b) what the recommendation is for dealing with this.
I presume it means I should avoid Char and Long and work with String plus a run-time check on length (or perhaps use a shapeless Sized collection) and Int instead.
But other ideas welcome.
More detail...
The kind of code I'm looking at is:
@JSExport("Foo")
class Foo(val x: Int) {
  @JSExport("add")
  def add(n: Int): Int = x + n
}
...which works just as expected: new Foo(1).add(2) produces 3.
Replacing the Int types with Long, the same call reports:
java.lang.ClassCastException: 1 is not an instance of scala.scalajs.runtime.RuntimeLong (and something similar with methods that take and return Char).
Being opaque means that
There is no corresponding JavaScript type
There is no way to create a value of that type from JavaScript (except if there is an @JSExported constructor)
There is no way of manipulating a value of that type (other than calling @JSExported methods and fields)
It is still possible to receive a value of that type from Scala.js code, pass it around, and give it back to Scala.js code. It is also always possible to call .toString(), because java.lang.Object.toString() is @JSExported. Besides toString(), neither Char nor Long export anything, so you can't do anything else with them.
Hence, as you have experienced, a JavaScript 1 cannot be used as a Scala.js Long, because it's not of the right type. Neither is 'a' a valid Char (but it's a valid String).
Therefore, as you have inferred yourself, you must indeed avoid opaque types, and use other types instead if you need to create/manipulate them from JavaScript. The Scala.js side can convert back and forth using the standard tools in the language, such as someChar.toInt and someInt.toChar.
The choice of which type is best depends on your application. For Char, it could be Int or String. For Long, it could be String, a pair of Ints, or possibly even Double if the possible values never use more than 52 bits of precision.
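A minimal sketch of that boundary pattern in plain Scala (the names are hypothetical, and in real Scala.js the JS-facing members would carry @JSExport): the exported surface uses Int and String, while the internals keep Char and Long.

```scala
// Internal logic uses Long; the JS-facing wrapper passes it as a
// decimal String to avoid the opaque RuntimeLong at the boundary.
class Counter(val start: Long) {
  private def add(n: Long): Long = start + n
  def addJs(n: String): String = add(n.toLong).toString
}

// Char crosses the boundary as an Int code point (or a 1-char String)
def charToJs(c: Char): Int = c.toInt
def charFromJs(code: Int): Char = code.toChar

println(new Counter(1L).addJs("2")) // 3
println(charToJs('a'))              // 97
println(charFromJs(97))             // a
```

Passing a Long as a String is lossless; the Double alternative mentioned above only works while values stay within 52 bits.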

How can I define a type to present a stream which can provide integers, but also may fail?

Suppose I want to define a "NumberLoader", which will load integers from another place on demand, so I can give it a type:
type NumberLoader = Stream[Integer]
But it may throw errors when loading (say, it loads integers via network from another computer), so there should be some NetworkError in the type.
Finally, I defined it as:
type NumberLoader = Stream[Either[NetworkError, Integer]]
It seems to work, but it feels a little strange. Is it a good approach?
You want to represent
a Stream of something which can be either a NetworkError or an Integer
Stream[Either[NetworkError, Integer]]
so the type looks appropriate and well-suited.
You can alternatively use Future or Try in place of Either, but you would lose the flexibility of specifying the kind of exception you expect, as both Future and Try failures hold a generic Throwable.
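A sketch of the proposed type in use (NetworkError is a stand-in case class here; on Scala 2.13+ you would prefer LazyList over the deprecated Stream). It shows the two consumption styles the type supports: skipping failures, or failing the whole load on the first error.

```scala
// Each element is either a failure or a loaded number
case class NetworkError(msg: String)
type NumberLoader = Stream[Either[NetworkError, Int]]

val loader: NumberLoader =
  Stream(Right(1), Left(NetworkError("timeout")), Right(3))

// keep only the successfully loaded numbers
val ok = loader.collect { case Right(n) => n }.toList // List(1, 3)

// or fail the whole load on the first error
val strict = loader.foldLeft[Either[NetworkError, List[Int]]](Right(Nil)) {
  case (acc, next) => for (xs <- acc; x <- next) yield x :: xs
}.map(_.reverse) // Left(NetworkError(timeout))
```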

In Scala, is it possible to simultaneously extend a library and have a default conversion?

For example in the following article
http://www.artima.com/weblogs/viewpost.jsp?thread=179766
Two separate examples are given:
Automatic string conversion
Addition of append method
Suppose I want to have automatic string conversion AND a new append method. Is this possible? I have been trying to do both at the same time but I get compile errors. Does that mean the two implicits are conflicting?
You can have any number of implicit conversions from a class, provided that each one can be unambiguously determined from usage. So converting an array to a String and to a rich array class containing append is fine, since String doesn't have an append method. But you couldn't also convert to StringBuffer, whose append methods would interfere with your rich array's append.
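A sketch combining the article's two examples (all names hypothetical): an automatic conversion to String and an added append method coexist, because String itself has no append, so the two implicits never compete for the same member.

```scala
import scala.language.implicitConversions

class IntArray(val xs: Array[Int]) {
  override def toString: String = xs.mkString("[", ",", "]")
}

// conversion: an IntArray can be used wherever a String is expected
implicit def intArrayToString(a: IntArray): String = a.toString

// extension: adds append without touching String
implicit class RichIntArray(val a: IntArray) {
  def append(x: Int): IntArray = new IntArray(a.xs :+ x)
}

val a = new IntArray(Array(1, 2))
val s: String = a   // uses the conversion: "[1,2]"
val b = a.append(3) // uses the extension: [1,2,3]
```

Ambiguity would only arise if two conversions both provided the member being selected, e.g. a second conversion to a type that also defines append.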