I'm new to Scala programming.
I would like to build this kind of immutable map:
Map[(Int, Int), List[BoolVar]]
From these two lists :
val courseName = List("Course1","Course2")
val serieName = List("Serie1","Serie2")
My goal:
Map[0][0] // the List[BoolVar] for "Course1", "Serie1"
Map[0][0](0) // one BoolVar from the "Course1", "Serie1" list
....
I tried this, but the syntax is wrong:
val test = Map[ (Int,Int), (List[BoolVar]) ](
for (course <- List.range(0,courseName.length) )
for( serie <- List.range(0,serieName.length) )
yield (course,serie) ->
for (indice <- List.range(0, 48))
yield BoolVar( courseName(course) + " - " + serieName(serie) )
);
Thanks for your help
Is this what you are looking for? Just a few minor changes.
Note that lookups will use round brackets, m((0, 0)), rather than square brackets.
val courseName = List("Course1","Course2")
val serieName = List("Serie1","Serie2")
val m = {
  for {
    course <- List.range(0, courseName.length)
    serie <- List.range(0, serieName.length)
  } yield (course, serie) -> {
    for (indice <- List.range(0, 48))
      yield BoolVar(courseName(course) + " - " + serieName(serie))
  }
}.toMap
println( m )
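To mirror the lookups from the question: a Scala Map is applied with round brackets, so m((0, 0)) replaces the Map[0][0] pseudo-syntax. A runnable sketch, where BoolVar is a hypothetical stand-in for the solver class used in the question:

```scala
// Hypothetical stand-in; the real BoolVar comes from the asker's solver library.
case class BoolVar(name: String)

val courseName = List("Course1", "Course2")
val serieName = List("Serie1", "Serie2")

val m = {
  for {
    course <- courseName.indices.toList
    serie  <- serieName.indices.toList
  } yield (course, serie) -> List.fill(48)(
    BoolVar(courseName(course) + " - " + serieName(serie)))
}.toMap

println(m((0, 0)).size) // 48: the List[BoolVar] for "Course1", "Serie1"
println(m((0, 0))(0))   // one BoolVar from that list
```

Using `indices` instead of `List.range(0, xs.length)` and `List.fill` instead of an inner for/yield is equivalent but slightly more idiomatic.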
I want to test multiple methods: one that outputs a map and one that outputs a list. I have a separate test case for each method, but I want a way to combine them and test both methods at the same time.
test("test 1 map") {
  val testCases: Map[String, Map[String, Int]] = Map(
    "Andorra" -> Map("la massana" -> 7211)
  )
  for ((input, expectedOutput) <- testCases) {
    val computedOutput: mutable.Map[String, Int] = PaleBlueDot.cityPopulations(countriesFile, citiesFilename, input, "04")
    assert(computedOutput == expectedOutput, input + " -> " + computedOutput)
  }
}
test("test 1 list") {
  val testCases: Map[String, List[String]] = Map(
    "Andorra" -> List("les escaldes")
  )
  for ((input, expectedOutput) <- testCases) {
    val computedOutput: List[String] = PaleBlueDot.aboveAverageCities(countriesFile, citiesFilename, input)
    assert(computedOutput.sorted == expectedOutput.sorted, input + " -> " + computedOutput)
  }
}
Firstly, it is better to use a List rather than a Map for testCases, as a Map can return its entries in any order; a List ensures the tests run in the order they are written.
You can then make testCases a List of tuples holding the test data for both tests, like this:
test("test map and list") {
  val testCases = List(
    "Andorra" -> (Map("la massana" -> 7211), List("les escaldes"))
  )
  for ((input, (mapOut, listOut)) <- testCases) {
    val computedMap: mutable.Map[String, Int] =
      PaleBlueDot.cityPopulations(countriesFile, citiesFilename, input, "04")
    val computedList: List[String] =
      PaleBlueDot.aboveAverageCities(countriesFile, citiesFilename, input)
    assert(computedMap == mapOut, input + " -> " + computedMap)
    assert(computedList.sorted == listOut.sorted, input + " -> " + computedList)
  }
}
I'm new to the Scala programming language. In this bubble sort I need to generate 10 random integers instead of writing them down like in the code below.
Any suggestions?
object BubbleSort {
  def bubbleSort(array: Array[Int]) = {
    def bubbleSortRecursive(array: Array[Int], current: Int, to: Int): Array[Int] = {
      println(array.mkString(",") + " current -> " + current + ", to -> " + to)
      to match {
        case 0 => array
        case _ if to == current => bubbleSortRecursive(array, 0, to - 1)
        case _ =>
          if (array(current) > array(current + 1)) {
            val temp = array(current + 1)
            array(current + 1) = array(current)
            array(current) = temp
          }
          bubbleSortRecursive(array, current + 1, to)
      }
    }
    bubbleSortRecursive(array, 0, array.size - 1)
  }

  def main(args: Array[String]): Unit = {
    val sortedArray = bubbleSort(Array(10, 9, 11, 5, 2))
    println("Sorted Array -> " + sortedArray.mkString(","))
  }
}
Try this:
import scala.util.Random
val sortedArray = (1 to 10).map(_ => Random.nextInt).toArray
You can use scala.util.Random for generation. The nextInt method takes an exclusive upper bound as its argument, so the sample below generates 10 Int values from 0 (inclusive) to 100 (exclusive).
val r = scala.util.Random
for (i <- 1 to 10) yield r.nextInt(100)
You can use it this way.
val solv1 = Random.shuffle( (1 to 100).toList).take(10)
val solv2 = Array.fill(10)(Random.nextInt)
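A quick comparison of the options above, as a sketch: Random.nextInt(100) may repeat values, Random.shuffle over a range guarantees distinct values, and Random.nextInt with no argument spans the full Int range.

```scala
import scala.util.Random

val a = Array.fill(10)(Random.nextInt(100))        // 10 ints in [0, 100), repeats possible
val b = (1 to 10).map(_ => Random.nextInt(100))    // same idea, as an IndexedSeq
val c = Random.shuffle((1 to 100).toList).take(10) // 10 distinct ints in 1..100

println(a.mkString(","))
println(c.mkString(","))
```

Any of these can be passed (via .toArray where needed) to the bubbleSort from the question.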
Here is an example about countWords (Scala).
[origin]
def countWords(text: String): mutable.Map[String, Int] = {
val counts = mutable.Map.empty[String, Int]
for (rawWord <- text.split("[ ,!.]+")) {
val word = rawWord.toLowerCase
val oldCount =
if (counts.contains(word)) counts(word)
else 0
counts += (word -> (oldCount + 1))
}
return counts
}
[my code]
def countWords2(text: String):mutable.Map[String, Int] = {
val counts = mutable.Map.empty[String, Int]s
text.split("[ ,!.]").foreach(word =>
val lowWord = word.toLowerCase()
val oldCount = if (counts.contains(lowWord)) counts(lowWord) else 0
counts += (lowWord -> (oldCount + 1))
)
return counts
}
I tried to convert the for expression into a foreach, but I got a "cannot resolve symbol" error message.
How can I use foreach in this case?
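Two things appear to break the snippet: the stray s after empty[String, Int], and the parentheses around a multi-statement lambda — foreach needs braces there. A fixed sketch:

```scala
import scala.collection.mutable

def countWords2(text: String): mutable.Map[String, Int] = {
  val counts = mutable.Map.empty[String, Int] // no trailing 's'
  text.split("[ ,!.]+").foreach { word =>     // braces, not parens, for a block body
    val lowWord = word.toLowerCase
    counts += (lowWord -> (counts.getOrElse(lowWord, 0) + 1))
  }
  counts // 'return' is unnecessary in Scala
}

println(countWords2("Hello, hello world!"))
```

getOrElse replaces the contains/apply pair from the original, but the counting logic is the same.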
for (fordate <- 2 to 30) {
val dataRDD = sc.textFile("s3n://mypath" + fordate + "/*")
val a = 1
val c = fordate - 1
for (b <- a to c) {
val cumilativeRDD1 = sc.textFile("s3n://mypath/" + b + "/*")
val cumilativeRDD : org.apache.spark.rdd.RDD[String] = sc.union(cumilativeRDD1, cumilativeRDD)
if (b == c) {
val incrementalDEviceIDs = dataRDD.subtract(cumilativeRDD)
val countofIDs = incrementalDEviceIDs.distinct().count()
println(s"201611 $fordate $countofIDs")
}
}
}
I have a data set where I get deviceIDs on a daily basis. I need to figure out the incremental count per day, but when I join cumilativeRDD to itself it throws the following error:
forward reference extends over definition of value cumilativeRDD
How can I overcome this?
The problem is this line:
val cumilativeRDD : org.apache.spark.rdd.RDD[String] = sc.union(cumilativeRDD1 ,cumilativeRDD)
You're using cumilativeRDD before its declaration. Assignment works from right to left: the right-hand side of = defines the variable on the left, so you cannot use a variable inside its own definition — on the right side of the equation, the variable does not yet exist.
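A minimal repro of the same compiler error, independent of Spark (sketch):

```scala
// Inside a block, a value cannot appear in its own definition:
//   val xs: List[Int] = 1 :: xs
// fails with "forward reference extends over definition of value xs".
// The fix: declare and initialize first, then update in the loop.
var acc: Option[List[Int]] = None
for (b <- 1 to 3)
  acc = Some(acc.getOrElse(Nil) :+ b)
println(acc.get) // List(1, 2, 3)
```

This is exactly the Option-based initialization pattern applied to the Spark code.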
You have to initialize cumilativeRDD on the first run; then you can use it in the following runs:
var cumilativeRDD: Option[org.apache.spark.rdd.RDD[String]] = None
for (fordate <- 2 to 30) {
  val dataRDD = sc.textFile("s3n://mypath" + fordate + "/*")
  val c = fordate - 1
  for (b <- 1 to c) {
    val cumilativeRDD1 = sc.textFile("s3n://mypath/" + b + "/*")
    if (cumilativeRDD.isEmpty) cumilativeRDD = Some(cumilativeRDD1)
    else cumilativeRDD = Some(sc.union(cumilativeRDD1, cumilativeRDD.get))
    if (b == c) {
      val incrementalDeviceIDs = dataRDD.subtract(cumilativeRDD.get)
      val countofIDs = incrementalDeviceIDs.distinct().count()
      println("201611" + fordate + " " + countofIDs)
    }
  }
}
val db = mongoClient("test")
val coll = db("test")
val q = MongoDBObject("id" -> 100)
val result= coll.findOne(q)
How can I convert result to a map of key --> value pairs?
The result of findOne is an Option[Map[String, AnyRef]], because MongoDBObject is a Map, and a Map is already a collection of key/value pairs.
To print them, simply:
for {
  r <- result
  (key, value) <- r
} println(key + " " + value)
or
result.foreach(_.foreach { case (k, v) => println(k + " " + v) })
To serialize mongo result, try com.mongodb.util.JSON.serialize, like
com.mongodb.util.JSON.serialize(result.get)
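The same pattern works on any Option of a Map, so it can be sketched without a Mongo connection — result below is a stand-in for the real query result:

```scala
// Stand-in for coll.findOne(q): an optional document as key -> value pairs
val result: Option[Map[String, Any]] = Some(Map("id" -> 100, "name" -> "test"))

for {
  r <- result
  (key, value) <- r
} println(key + " " + value)

// Or collect the pairs instead of printing them:
val pairs = result.toList.flatMap(_.map { case (k, v) => k + " " + v })
println(pairs.mkString(", "))
```

If the query found nothing (None), both the for comprehension and flatMap simply do nothing, which is the usual reason to keep the Option rather than calling .get.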