Strange behavior of collection view for string - scala

I was answering another question and tried to write this code:
val view = "0000000000".view
println(List(0,12,30,4).foldLeft(view)((s, i) => s.updated(i, '1')).mkString)
But this actually doesn't compile, and I had to convert it to seq and add an ugly type ascription for it to work.
I've noticed that "0000".view returns SeqView[Char, String] which is probably correct, but then when I do "000".view.updated(0, '1') it returns Seq[Char] = SeqViewP(...).
I'd expect it to return the same SeqView[Char, String]. Is this a bug, or am I missing something? What is the best way to fix it?

Is there a reason you need to convert the String to a view? Could you not just leave it as a String? It will be converted to a "string-like" SeqLike[Char] on an as-needed basis by an implicit conversion to StringOps, which ensures that the return type of operations like .updated(...) is still String. E.g.:
println(List(0,12,7,4).foldLeft("000000000000000")((s, i) => s.updated(i, '1')).mkString)
// prints: 100010010000100
Note: I upped the length of the String, and changed one of the values in your list just to avoid issues with index values larger than the String's length, but how you prefer to deal with that problem is a different issue.
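For what it's worth, here is a minimal sketch of one way to deal with out-of-range indices, assuming you simply want to drop them (your requirements may differ):
val s0 = "0000000000"
val result = List(0, 12, 30, 4)
  .filter(i => i >= 0 && i < s0.length)      // drop indices outside the string
  .foldLeft(s0)((s, i) => s.updated(i, '1'))
println(result) // prints: 1000100000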

Related

Var in scala not reassigning correctly

I have an ArrayList returned from a Java function.
I am applying a map to it to convert each internal element from an ArrayList to an Array.
Doing the conversion on a single line works fine, but reassigning to the var seems to create a problem and gives an error.
This seems to work fine
var rowList = readerTask.execute().get().asInstanceOf[util.ArrayList[Any]].toArray().map(x => x.asInstanceOf[util.ArrayList[Any]].toArray())
rowList.length should equal(3)
rowList(0)(0) should equal(1)
Whereas
var rowList = readerTask.execute().get().asInstanceOf[util.ArrayList[Any]].toArray()
rowList = rowList.map(x => x.asInstanceOf[util.ArrayList[Any]].toArray())
rowList.length should equal(3)
rowList(0)(0) should equal(1)
seems to give me an error in IntelliJ:
'AnyRef' does not take parameter at `rowList(0)(0) should equal(1)`
I am fairly new to Scala and don't see what must be wrong, as I am simply applying the map on a new line.
Any help is appreciated.
The problem is that rowList has a different type in the two versions of the code.
If you do not supply a type for a variable, Scala will infer the type from the value used to initialise it.
In the first case the type of rowList is inferred to be Array[Array[AnyRef]], because that is the type of the whole expression (toArray() on a java.util.ArrayList returns Array[AnyRef]).
In the second case the type of rowList is inferred to be Array[AnyRef], because that is the type of the first part of the expression. The following assignment compiles because each mapped element, itself an Array[AnyRef], is still an AnyRef, so the mapped result is again typed as Array[AnyRef]. But clearly you cannot double-index into an Array[AnyRef], hence the error.
More generally, Any, asInstanceOf, and var are best avoided; there are almost always ways to avoid using them and avoiding them usually results in better code.
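As a hedged sketch (reusing readerTask and the ScalaTest matchers from the question): either annotate the var with the final type up front, or bind the intermediate result to its own val so the nested array type is preserved:
import java.util
val raw = readerTask.execute().get().asInstanceOf[util.ArrayList[Any]].toArray() // Array[AnyRef]
val rowList: Array[Array[AnyRef]] =
  raw.map(x => x.asInstanceOf[util.ArrayList[Any]].toArray())
rowList.length should equal(3)
rowList(0)(0) should equal(1)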

Idiomatic handling of JSON null in scala upickle / ujson

I am new to Scala and would like to learn the idiomatic way to solve common problems, as in pythonic for Python. My question regards reading JSON data with upickle, where the JSON value contains a string when present, and null when not present. I want to use a custom value to replace null. A simple example:
import upickle.default._
val jsonString = """[{"always": "foo", "sometimes": "bar"}, {"always": "baz", "sometimes": null}]"""
val jsonData = ujson.read(jsonString)
for (m <- jsonData.arr) {
  println(m("always").str.length)    // this will work
  println(m("sometimes").str.length) // this will fail with ujson.Value$InvalidData: Expected ujson.Str (data: null)
}
The issue is with the field "sometimes": when null, we cannot apply .str (or any other function mapping to a static type other than null). I am looking for something like m("sometimes").str("DEFAULT").length, where "DEFAULT" is the replacement for null.
Idea 1
Using pattern matching, the following works:
val sometimes = m("sometimes") match {
  case s: ujson.Str => s.str
  case _ => "DEFAULT"
}
println(sometimes.length)
Given Scala's concise syntax, this looks a bit complicated and will be repetitive when done for a number of values.
Idea 2
Answers to a related question mention creating a case class with default values. For my problem, creating a case class seems inflexible when different replacement values are needed depending on context.
Idea 3
Answers to another question (not specific to upickle) discuss using Try().getOrElse(), i.e.:
import scala.util.Try
// ...
println(Try(m("sometimes").str).getOrElse("DEFAULT").length)
However, the discussion mentions that throwing an exception for a regular program path is expensive.
What are idiomatic, yet concise ways to solve this?
The idiomatic Scala way to do this is to use Scala's Option.
Fortunately, ujson values offer exactly that. See the strOpt method in this source code.
The problem in your code is the str calls in m("always").str and m("sometimes").str.
With this code, you are prematurely assuming that all the values are strings. That is where the strOpt method comes in: it returns the string wrapped in Some if the value is a string, and None if it is not. We can then use getOrElse to decide what to substitute when the value is None.
The following would be a clean way to handle this:
val jsonString = """[{"always": "foo", "sometimes": "bar"}, {"always": "baz", "sometimes": null}]"""
val jsonData = ujson.read(jsonString)
for (m <- jsonData.arr) {
  println(m("always").strOpt.getOrElse("").length)
  println(m("sometimes").strOpt.getOrElse("").length)
}
Output:
3
3
3
0
Here, if we get any value other than a string (null, a float, an int), the code treats it as an empty string, and its length is calculated as 0.
Basically, this is similar to your "Idea 1" approach, but this is the Scala way. Instead of "DEFAULT" I am substituting an empty string, because you probably wouldn't want a null value to report a length of 7 (the length of the string "DEFAULT").
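If the same default is needed for many fields, a hedged sketch to keep it concise is a tiny helper (strOrElse is a made-up name, not part of upickle/ujson), reusing jsonData from above:
def strOrElse(v: ujson.Value, default: String): String = v.strOpt.getOrElse(default)

for (m <- jsonData.arr) {
  println(strOrElse(m("always"), "DEFAULT").length)
  println(strOrElse(m("sometimes"), "DEFAULT").length)
}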

Anorm null values

I'm writing a server that executes any select query on my db and returns JSON. I've done most of that task, but I'm stuck on converting a nullable column to a string.
val result = SQL("SELECT * FROM Table limit 5;")().map(_.asList.map({_.toString})).toList
val jsonResp = Json.toJson(result)
For a column that can hold null values, this generates the string Some(123) instead of 123. I tried pattern matching but failed to compose it into that command. Maybe you have had a similar problem and know how to deal with that kind of response?
Edit:
I made some progress by adding pattern matching:
val result = SQL(query)()
  .map(_.asList.map({
    case Some(s) => s.toString
    case None    => ""
    case v       => v.toString
  })).toList
but I'm not sure whether this is a good way to solve the problem. Still waiting for ideas.
Anorm supports nullable columns as optional values.
Here you return each row as a list of raw values. It would be better to use the parser API to indicate how to extract the values properly. E.g.:
SQL("SELECT a, b, c ...").as((get[Option[String]]("a") ~ int("b") ~ str("c")).map { case a ~ b ~ c => MyClass(a, b, c) }.*)
This returns the SQL results as a list of MyClass, with properties of type Option[String], Int and String, in that order.
Anorm documentation has plenty of other examples.
It would probably be best to map optional entries in the database to optional fields in the JSON; most Scala JSON libraries will render a field of Option type appropriately. It also seems odd that you would want to render an integer value as a string for JSON - JSON has a perfectly good numeric type for integers.
If you definitely want to convert an Option[Int] to a string in this fashion, the clearest way is probably o.map(_.toString).getOrElse(""); you could therefore write _.asList.map{_.map{_.toString}.getOrElse("")} rather than your pattern-match. I don't know the specific SQL library well enough to know whether the results are statically known to be of type Option or not; if the values are of type Any then you probably do need to match options and non-options the way you have.
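As a minimal sketch of the first suggestion with Play JSON (which the question's Json.toJson appears to come from; the RowData case class and its fields are made up for illustration), an Option field renders naturally:
import play.api.libs.json._

case class RowData(id: Int, note: Option[String])
implicit val rowWrites: OWrites[RowData] = Json.writes[RowData]

Json.toJson(RowData(123, Some("hello"))) // {"id":123,"note":"hello"}
Json.toJson(RowData(123, None))          // {"id":123} - the None field is simply omitted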

avoid type conversion in Scala

I have this weird requirement where data comes in as name -> value pairs from a service, and every value is typed as a string (the values really are not strings semantically, but that is how the data is stored).
This is a simplified illustration.
case class EntityObject(`type`: String, value: String) // backticks because type is a reserved word
EntityObject("boolean","true")
Now, when receiving that EntityObject, if type is "boolean" then I have to make sure the value is nothing other than a boolean: first get the type out, then check the value and convert it to that type. E.g. in this case, check that the value is a boolean, so the string value has to be converted to a boolean to validate it. If it is anything else besides a boolean, it should fail.
E.g. if data came in as below, the conversion would fail and the error should be reported back to the caller:
EntityObject("boolean","1")
This weird requirement forces type conversion into the validation code, which doesn't look elegant and goes against type-safe programming. Is there an elegant way to handle this in Scala (maybe in a more type-safe manner)?
Here is where I'm going to channel an idea taken from a tweet by Miles Sabin in regards to heterogeneous mappings (see this gist on GitHub). If you know ahead of time which type each name maps to, you can use a nifty little trick which involves dependent types. Hold on, 'cause it's a wild ride:
trait AssocConv[K] { type V; def convert: String => V }

def makeConv[V0](name: String, con: String => V0) = new AssocConv[name.type] {
  type V = V0
  val convert = con
}
implicit val boolConv = makeConv("boolean", yourMappingFunc)
def convEntity(name: String, value: String)(implicit conv: AssocConv[name.type]): Try[conv.V] = Try{ conv.convert(value) }
I haven't tested this but it "should" work. I've also enclosed it in a Scala Try so that it catches exceptions thrown by your conversion function (in case you're doing things like _.toInt as the converter.)
You're really talking about conversion, not casting. Casting would be if the value really were an instance of Boolean at runtime, whereas what you have is a String representation of a Boolean.
If you're already working with a case class, I think a pattern matching expression would work pretty well here.
For example,
def convert(entity: EntityObject): Any = entity match {
  case EntityObject("boolean", "true")  => true
  case EntityObject("boolean", "false") => false
  case EntityObject("string", s)        => s
  // TODO: add Regex-based matchers for numeric types
}
Anything that doesn't match one of the specified patterns would cause a MatchError, or you could put a catchall expression at the end to throw your own exception.
In this particular example, since the function returns Any, the calling code would need to do an actual type cast to get the specific type, but at least by that point all validation/conversion would have already been performed. Alternatively, you could just put the code that uses the values directly into the above function and avoid casting. I don't know what your specific needs are, so I can't offer anything more detailed.
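If failures do need to be reported back to the caller, as the question requires, a hedged variation on the pattern match above is to return an Either instead of throwing; the error messages below are made up for illustration:
def convertChecked(entity: EntityObject): Either[String, Any] = entity match {
  case EntityObject("boolean", "true")  => Right(true)
  case EntityObject("boolean", "false") => Right(false)
  case EntityObject("boolean", other)   => Left(s"'$other' is not a valid boolean")
  case EntityObject("string", s)        => Right(s)
  case EntityObject(t, v)               => Left(s"unsupported type '$t' for value '$v'")
}

convertChecked(EntityObject("boolean", "true")) // Right(true)
convertChecked(EntityObject("boolean", "1"))    // Left('1' is not a valid boolean)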

Map inside Map in Scala

I have this code:
val total = ListMap[String, HashMap[Int, _]]()
val hm1 = new HashMap[Int,String]
val hm2 = new HashMap[Int,Int]
...
//insert values in hm1 and in hm2
...
total += "key1" -> hm1
total += "key2" -> hm2
....
val get: HashMap[Int, String] = total.get("key1") match {
  case Some(a: HashMap[Int, String]) => a
}
This works, but I would like to know if there is a better (more readable) way to do this.
Thanks to all !
It looks like you're trying to re-implement tuples as maps.
val total: (Map[Int, String], Map[Int, Int]) = ...
def get: Map[Int, String] = total._1
(edit: oh, sorry, I get it now)
Here's the thing: the code above doesn't actually work. Type parameters are erased, so the type test in the match above will ALWAYS succeed -- try it with "key2", for example.
If you want to store multiple types in a Map and retrieve them later, you'll need to use Manifest and specialized get and put methods. But this has already been answered on Stack Overflow, so I won't repeat myself here.
Your total map, containing maps with non-uniform value types, would be best avoided. The question is: when you retrieve the map at "key1" and then cast it to a map of strings, why did you choose String?
The most trivial reason might be that key1 and so on are simply constants, that you know all of them when you write your code. In that case, you probably should have a val for each of your maps, and dispense with map of maps entirely.
It might be that the calls made by the client code have this knowledge. Say the client does stringMap("key1") or intMap("key2"), or that one way or another the call implies that some given type is expected, and that the client is responsible for not mixing types and names. Again, in that case there is no reason for total: you would have a map of string maps and a map of int maps (provided you have prior knowledge of a limited number of value types).
What is your reason to have total?
First of all: this is a non-answer (as I would not recommend the approach I discuss), but it was too long for a comment.
If you haven't got too many different keys in your ListMap, I would suggest trying Malvolio's answer.
Otherwise, due to type erasure, the other approaches based on pattern matching are practically equivalent to this (which works, but is very unsafe):
val get = total("key1").asInstanceOf[HashMap[Int, String]]
the reasons why this is unsafe (unless you like living dangerously) are:
total("key1") is not returning an Option (unlike total.get("key1")). If "key1" does not exist, it will throw a NoSuchElementException. I wasn't sure how you were planning to manage the "None" case anyway.
asInstanceOf will also happily cast total("key2") - which should be a HashMap[Int, Int], but is at this point a HashMap[Int, Any] - to a HashMap[Int, String]. You will have problems later on when you try to access the Int values (which Scala now believes are Strings).
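If you do want a single heterogeneous map, here is a hedged sketch of the typed-key idea mentioned above (HKey and TypedMap are made-up names; a Manifest/ClassTag-based variant is also possible). It keeps the single unchecked cast in one place:
import scala.collection.mutable

final case class HKey[V](name: String)

class TypedMap {
  private val underlying = mutable.Map.empty[String, Any]
  def put[V](key: HKey[V], value: V): Unit = underlying(key.name) = value
  def get[V](key: HKey[V]): Option[V] = underlying.get(key.name).map(_.asInstanceOf[V])
}

val Key1 = HKey[mutable.HashMap[Int, String]]("key1")
val Key2 = HKey[mutable.HashMap[Int, Int]]("key2")

val total = new TypedMap
total.put(Key1, mutable.HashMap(1 -> "one"))
total.put(Key2, mutable.HashMap(1 -> 2))
val get: Option[mutable.HashMap[Int, String]] = total.get(Key1) // no pattern match needed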