I have a Scala Map and would like to test whether a certain value exists in it.
myMap.exists( /*What should go here*/ )
There are several different options, depending on what you mean.
If by "value" you mean a key-value pair, then you can use something like
myMap.exists(_ == ("fish",3))
myMap.exists(_ == "fish" -> 3)
If you mean the value part of a key-value pair, then you can use
myMap.values.exists(_ == 3)
myMap.exists(_._2 == 3)
If you just want to test the key of the key-value pair, then
myMap.keySet.exists(_ == "fish")
myMap.exists(_._1 == "fish")
myMap.contains("fish")
Note that although the tuple forms (e.g. _._1 == "fish") end up being shorter, the slightly longer forms are more explicit about what you want to have happen.
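If you want something explicit without going through values or keySet, a pattern-matching predicate also reads clearly. A small sketch, reusing the same hypothetical myMap:
myMap.exists { case (_, value) => value == 3 }      // same as myMap.values.exists(_ == 3)
myMap.exists { case (key, _) => key == "fish" }     // same as myMap.contains("fish")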
Do you want to know whether the value exists in the map, or the key? If you want to check the key, use isDefinedAt:
myMap isDefinedAt key
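For what it's worth, on a Map isDefinedAt and contains behave the same. A minimal sketch, assuming a simple String-to-Int map:
val myMap = Map("fish" -> 3)
myMap isDefinedAt "fish"   // true
myMap isDefinedAt "cat"    // false
myMap contains "fish"      // true, equivalent for Maps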
You provide a test that at least one of the map's entries will pass, e.g.
val mymap = Map(9->"lolo", 7->"lala")
mymap.exists(_._1 == 7) //true
mymap.exists(x => x._1 == 7 && x._2 == "lolo") //false
mymap.exists(x => x._1 == 7 && x._2 == "lala") //true
The ScalaDocs describe the method as "Tests whether a predicate holds for some of the elements of this immutable map." The catch is that the predicate receives a (key, value) tuple rather than two separate parameters.
What about this:
val map = Map(1 -> 'a', 2 -> 'b', 4 -> 'd')
map.values.toSeq.contains('c') //false
This yields true if the map contains the value 'c' (false here, since there is no 'c' among the values).
If you insist on using exists:
map.exists { case (_, value) => value == 'c' }
Per the answers above, note that exists() is significantly slower than contains(). I benchmarked with a Map containing 5000 string keys, and the ratio was a consistent 100x. I'm relatively new to Scala, but my guess is that exists() iterates over all the (key, value) tuples, whereas contains() uses the Map's hash-based lookup.
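For reference, a rough sketch of how you might reproduce that comparison yourself (the time helper and the 5000-key map are just illustrative; this is not a rigorous benchmark, and results will vary with JVM warm-up):
val bigMap = (1 to 5000).map(i => s"key$i" -> i).toMap

def time[A](label: String)(body: => A): A = {
  val start = System.nanoTime()
  val result = body
  println(s"$label took ${(System.nanoTime() - start) / 1000} microseconds")
  result
}

time("contains") { bigMap.contains("key4999") }        // hash lookup
time("exists")   { bigMap.exists(_._1 == "key4999") }  // linear scan over all entries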
I am doing an exercise that asks me to remove the elements at odd positions.
I wonder if there is a better alternative to what I came up with:
val a = List(1,2,3,4,5,6)
The first approach:
a.zipWithIndex.filter(x => (x._2 & 1) == 1).map(_._1)
and the second:
a.indices.filter(i => (i & 1) == 1).map(a(_))
Am I correct in thinking that the second approach is more efficient, since it does not need to produce an intermediate list the way zipWithIndex does?
You can use a view to avoid intermediate lists:
a.view
.zipWithIndex
.filter(x => (x._2 & 1) == 1)
.map(_._1)
.force
This will only traverse a once when force is called.
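To convince yourself of the laziness, you can drop a side effect into the pipeline; nothing prints until force runs. A small sketch on the same list a:
val lazyOdds = a.view
  .zipWithIndex
  .filter(x => (x._2 & 1) == 1)
  .map { x => println(s"mapping ${x._1}"); x._1 }   // nothing printed yet

lazyOdds.force   // prints "mapping 2", "mapping 4", "mapping 6" and yields the elements 2, 4, 6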
You can use the collect method on the zipped list, which might be a bit clearer:
a.zipWithIndex.collect {
  case (x, i) if i % 2 == 1 => x
}
https://scalafiddle.io/sf/YbureiX/0
I am not sure about the efficiency though
You can avoid forming an intermediate collection by using withFilter; you can also convert the list to a Vector so that extracting an element at a particular index takes constant time:
val a: Vector[Int] = List(1,2,3,4,5,6).toVector
val res: Seq[Int] = a.indices.withFilter(i => (i & 1) == 1).map(a(_))
println(res)
I am trying to convert a map to a list of tuples. Given a map like the one below
Map("a"->2,"a"->4,"a"->5,"b"->6,"b"->1,"c"->3)
I want output like
List(("a",2),("a",4),("a",5),("b",6),("b",1),("c",3))
I tried the following:
val seq = inputMap.toList //output (a,5)(b,1)(c,3)
var list:List[(String,Int)] = Nil
for ((k, v) <- inputMap) {
  list = list :+ (k -> v)
}  // output: List((a,5), (b,1), (c,3))
Why does it remove the duplicates? I don't see the other tuples that have "a" as the key.
That's because a Map doesn't allow duplicate keys:
val map = Map("a"->2,"a"->4,"a"->5,"b"->6,"b"->1,"c"->3)
println(map) // Map(a -> 5, b -> 1, c -> 3)
Since the map literal has duplicate keys, the duplicates are removed during map creation itself.
Map("a"->2,"a"->4,"a"->5,"b"->6,"b"->1,"c"->3)
turns into
Map(a -> 5, b -> 1, c -> 3)
so any further operations are performed on the already-shortened map.
The problem is with Map, whose keys form a Set, so you cannot have the same key twice. This is because Maps are dictionaries, which are made to access a value by its key, so keys MUST be unique. The builder therefore keeps only the last value given for the key "a".
By the way, Map already has a toList method that does exactly what you implemented.
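A quick sketch of the difference, using the same input: a Map collapses duplicate keys when it is built, while a List of pairs constructed directly keeps every entry.
val asMap  = Map("a" -> 2, "a" -> 4, "a" -> 5, "b" -> 6, "b" -> 1, "c" -> 3)
val asList = List("a" -> 2, "a" -> 4, "a" -> 5, "b" -> 6, "b" -> 1, "c" -> 3)
asMap.toList  // List((a,5), (b,1), (c,3)): the duplicates were already dropped by the Map
asList        // List((a,2), (a,4), (a,5), (b,6), (b,1), (c,3))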
I currently have two lists, List('a','b','a') and List(45,65,12), with many more elements. The elements of the second list are linked to the elements of the first list in a key-value relationship. I want to combine the elements with the same keys by adding their corresponding values, producing a map that looks like Map('a' -> 57, 'b' -> 65), since 57 = 45 + 12.
I have currently implemented it as
val keys = List('a','b','a')
val values = List(45,65,12)
val finalMap: scala.collection.mutable.Map[Char, Int] =
scala.collection.mutable.Map().withDefaultValue(0)
0 until keys.length map (w => finalMap(keys(w)) += values(w))
I feel that there should be a better (more functional) way of creating the desired map than what I am doing. How could I improve my code and do the same thing in a more functional way?
val m = keys.zip(values).groupBy(_._1).mapValues(l => l.map(_._2).sum)
EDIT: To explain how the code works, zip pairs corresponding elements of two input sequences, so
keys.zip(values) = List((a, 45), (b, 65), (a, 12))
Now you want to group together all the pairs with the same first element. This can be done with groupBy:
keys.zip(values).groupBy(_._1) = Map((a, List((a, 45), (a, 12))), (b, List((b, 65))))
groupBy returns a map whose keys are the type being grouped on, and whose values are a list of the elements in the input sequence with the same key.
The keys of this map are the characters in keys, and the values are a list of the associated pairs from keys and values. Since the keys are the ones you want in the output map, you only need to transform the values from List[(Char, Int)] to List[Int].
You can do this by summing the values from the second element of each pair in the list.
You can extract the values from each pair using map e.g.
List((a, 45), (a, 12)).map(_._2) = List(45,12)
Now you can sum these values using sum:
List(45, 12).sum = 57
You can apply this transform to all the values in the map using mapValues to get the result you want.
I was going to +1 Lee's first version, but mapValues is a view, and the variable name l (ell) always looks like a 1 to me. Just so I don't seem petty:
scala> (keys zip values) groupBy (_._1) map { case (k,v) => (k, (v map (_._2)).sum) }
res0: scala.collection.immutable.Map[Char,Int] = Map(b -> 65, a -> 57)
Hey, the answer with fold disappeared. You can't blink on SO, the action is so fast.
I'm going to +1 Lee's typing speed anyway.
Edit: to explain how mapValues is a view:
scala> keys.zip(values).groupBy(_._1).mapValues(l => l.map { v =>
| println("OK mapping")
| v._2
| }.sum)
OK mapping
OK mapping
OK mapping
res2: scala.collection.immutable.Map[Char,Int] = Map(b -> 65, a -> 57)
scala> res2('a') // recomputes
OK mapping
OK mapping
res4: Int = 57
Sometimes that is what you want, but often it is surprising. I think there is a puzzler for it.
You were actually on the right track to a reasonably efficient functional solution. If we just switch to an immutable collection and use a fold on a key-value zip, we get:
( Map[Char,Int]() /: (keys,values).zipped ) ( (m,kv) =>
m + ( kv._1 -> ( m.getOrElse( kv._1, 0 ) + kv._2 ) )
)
Or you could use withDefaultValue 0, as you did, if you want the final map to have that default. Note that .zipped is faster than zip because it doesn't create an intermediate collection. And a groupBy would create a number of other intermediate collections. Of course it may not be worth optimizing, and if it is you could do even better than this, but I wanted to show you that your line of thinking wasn't far off the mark.
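For illustration, here is the same fold written with withDefaultValue 0, as mentioned above (a sketch; m(kv._1) falls back to 0 for keys that haven't been seen yet):
( Map[Char,Int]().withDefaultValue(0) /: (keys,values).zipped ) ( (m,kv) =>
  m + ( kv._1 -> ( m(kv._1) + kv._2 ) )
)
// Map(a -> 57, b -> 65)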
I have a set of keys, say Set[MyKey] and for each of the keys I want to compute the value through some value function, lets say computeValueOf(key: MyKey). In the end I want to have a Map which maps key -> value
What is the most efficient way to do this without iterating too much?
A collection of Tuple2s can be converted to a Map, where the tuple's first element will be the key and the second element will be the value.
val setOfKeys = Set[MyKey]()
setOfKeys.map(key => (key, computeValueOf(key))).toMap
This is actually a pretty neat application for collection.breakOut, one of my favorite pieces of bizarre Scala voodoo:
type MyKey = Int
def computeValueOf(key: MyKey) = "value" * key
val mySet: Set[MyKey] = Set(1, 2, 3)
val myMap: Map[MyKey, String] =
mySet.map(k => k -> computeValueOf(k))(collection.breakOut)
See this answer for some discussion of what's going on here. Unlike the version with toMap, this won't construct an intermediate Set, saving you some allocations and a traversal. It's also much less readable, though—I only offer it because you mentioned that you wanted to avoid "iterating too much".
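For reference, with the definitions above both forms should produce the same map; the second line is the more readable toMap version mentioned above:
myMap                                         // Map(1 -> value, 2 -> valuevalue, 3 -> valuevaluevalue)
mySet.map(k => k -> computeValueOf(k)).toMap  // same result, but builds an intermediate Set[(MyKey, String)] first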
I'm trying to reduce the extent to which I write Scala (2.8) like Java. Here's a simplification of a problem I came across. Can you suggest improvements on my solutions that are "more functional"?
Transform the map
val inputMap = mutable.LinkedHashMap(1->'a',2->'a',3->'b',4->'z',5->'c')
by discarding any entries with value 'z' and indexing the characters as they are encountered
First try
var outputMap = new mutable.HashMap[Char,Int]()
var counter = 0
for (kvp <- inputMap) {
  val character = kvp._2
  if (character != 'z' && !outputMap.contains(character)) {
    outputMap += (character -> counter)
    counter += 1
  }
}
Second try (not much better, but uses an immutable map and a 'foreach')
var outputMap = new immutable.HashMap[Char,Int]()
var counter = 0
inputMap.foreach {
  case (number, character) =>
    if (character != 'z' && !outputMap.contains(character)) {
      outputMap += (character -> counter)
      counter += 1
    }
}
Nicer solution:
inputMap.toList.filter(_._2 != 'z').map(_._2).distinct.zipWithIndex.toMap
I find this solution slightly simpler than arjan's:
inputMap.values.filter(_ != 'z').toSeq.distinct.zipWithIndex.toMap
The individual steps:
inputMap.values // Iterable[Char] = MapLike(a, a, b, z, c)
.filter(_ != 'z') // Iterable[Char] = List(a, a, b, c)
.toSeq.distinct // Seq[Char] = List(a, b, c)
.zipWithIndex // Seq[(Char, Int)] = List((a,0), (b,1), (c,2))
.toMap // Map[Char, Int] = Map((a,0), (b,1), (c,2))
Note that your problem doesn't inherently involve a map as input, since you're just discarding the keys. If I were coding this, I'd probably write a function like
def buildIndex[T](s: Seq[T]): Map[T, Int] = s.distinct.zipWithIndex.toMap
and invoke it as
buildIndex(inputMap.values.filter(_ != 'z').toSeq)
First, if you're doing this functionally, you should use an immutable map.
Then, to get rid of something, you use the filter method:
inputMap.filter(_._2 != 'z')
and finally, to do the remapping, you can just use the values (but as a set) with zipWithIndex, which will count up from zero, and then convert back to a map:
inputMap.filter(_._2 != 'z').values.toSet.zipWithIndex.toMap
Since the order of values isn't going to be preserved anyway*, presumably it doesn't matter that the order may have been shuffled yet again with the set transformation.
Edit: There's a better solution in a similar vein; see Arjan's. Assumption (*) is wrong, since it was a LinkedHashMap. So you do need to preserve order, which Arjan's solution does.
I would create a "pipeline" like this, though it has a lot of operations and could probably be shortened. The two List.map calls could be merged into one, but I think you get the general idea.
inputMap
.toList // List((5,c), (1,a), (2,a), (3,b), (4,z))
.sorted // List((1,a), (2,a), (3,b), (4,z), (5,c))
.filterNot((x) => {x._2 == 'z'}) // List((1,a), (2,a), (3,b), (5,c))
.map(_._2) // List(a, a, b, c)
.zipWithIndex // List((a,0), (a,1), (b,2), (c,3))
.map((x)=>{(x._2+1 -> x._1)}) // List((1,a), (2,a), (3,b), (4,c))
.toMap // Map((1,a), (2,a), (3,b), (4,c))
Performing these operations on lists keeps the ordering of elements.
EDIT: I misread the OP question - thought you wanted run length encoding. Here's my take on your actual question:
val values = inputMap.values.filterNot(_ == 'z').toSet.zipWithIndex.toMap
EDIT 2: As noted in the comments, use toSeq.distinct or similar if preserving order is important.
val values = inputMap.values.filterNot(_ == 'z').toSeq.distinct.zipWithIndex.toMap
In my experience I have found that maps and functional languages do not play nice. You'll note that all answers so far, in one way or another, involve turning the map into a list, filtering the list, and then turning the list back into a map.
I think this is due to maps being mutable data structures by nature. Consider that when building a list, the underlying structure of the list does not change when you prepend a new element; for a true (linked) list, that is a constant-time O(1) operation. For a map, on the other hand, the internal structure can change vastly when a new element is added, e.g. when the load factor becomes too high and the add algorithm resizes the map. In this way a functional language cannot just create a series of values and pop them into a map as it goes along, because of the possible side effects of introducing a new key/value pair.
That said, I still think there should be better support for filtering, mapping and folding/reducing maps. Since we start with a map, we know the maximum size of the map and it should be easy to create a new one.
If you want to get to grips with functional programming, I'd recommend steering clear of maps to start with. Stick with the things functional languages were designed for: list manipulation.