How to import identity operations in scalaz?

syntax.IdOps seems to have no "companion" object to import its implicits from (see the selfless trait pattern), so it's hard to use it in the REPL, for example:
scala> val selfish = new scalaz.syntax.ToIdOps{} //I don't want to do this, it feels wrong
selfish: scalaz.syntax.ToIdOps = $anon$1@1adfe356
scala> import selfish._
import selfish._
Is there a way to import it?

https://github.com/scalaz/scalaz/blob/v7.1.2/core/src/main/scala/scalaz/syntax/Syntax.scala#L117
You can use scalaz.syntax.id instead of new scalaz.syntax.ToIdOps{}
import scalaz.syntax.id._
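For example, in the REPL this gives you the identity ops directly (a quick sketch assuming scalaz 7.1, where IdOps provides |> and squared, among others):
import scalaz.syntax.id._

// |> ("thrush") applies the function on the right to the value on the left
val four = 3 |> (_ + 1)   // four: Int = 4

// squared pairs a value with itself
val pair = "a".squared    // pair: (String, String) = (a,a)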

Related

Change Java List to Scala Seq?

I have the following list from my configuration:
val markets = Configuration.getStringList("markets");
To create a sequence out of it I write this code:
JavaConverters.asScalaIteratorConverter(markets.iterator()).asScala.toSeq
I wish I could do it in a less verbose way, such as:
markets.toSeq
And then from that list I get the sequence. I will have more configuration in the near future; is there a solution that provides this kind of simplicity?
I want a sequence regardless of the configuration library I am using. I don't want to have the stated verbose solution with the JavaConverters.
JavaConversions is deprecated since Scala 2.12.0. Use JavaConverters; you can import scala.collection.JavaConverters._ to make it less verbose:
import scala.collection.JavaConverters._
val javaList = java.util.Arrays.asList("one", "two")
val scalaSeq = javaList.asScala.toSeq
Yes. Just import implicit conversions:
import java.util
import scala.collection.JavaConversions._
val jlist = new util.ArrayList[String]()
jlist.toSeq
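If you are on Scala 2.13 or newer, note that both imports above are superseded by scala.jdk.CollectionConverters, which provides the same asScala syntax (a small sketch):
import scala.jdk.CollectionConverters._

val javaList = java.util.Arrays.asList("one", "two")
val scalaSeq = javaList.asScala.toSeq  // Seq("one", "two")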

How to serialize bloom filter from Google Guava?

I'm trying to use Google Guava's Bloom filter and serialize it using Scala. Creating it was easy:
import com.google.common.hash.{BloomFilter,Funnels}
val b = BloomFilter.create(Funnels.unencodedCharsFunnel,5e8.toLong,1e-6)
But I don't understand how to serialize it. I was expecting a BloomFilter.serialize method, but there isn't one. What am I missing?
The point is to turn the Bloom filter into an Array[Byte].
Did you try this:
import java.io.{FileInputStream, FileOutputStream, ObjectInputStream, ObjectOutputStream}
import com.google.common.hash.{BloomFilter, Funnels}

val b = BloomFilter.create(Funnels.unencodedCharsFunnel, 5e8.toLong, 1e-6)

// BloomFilter is Serializable, so standard Java object serialization works
val stream = new ObjectOutputStream(new FileOutputStream("/path/to/file/file.obj"))
stream.writeObject(b)
stream.close()
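If the goal is an Array[Byte] rather than a file, a sketch using Guava's own compact serialization (BloomFilter.writeTo and BloomFilter.readFrom) with in-memory streams would be:
import java.io.{ByteArrayInputStream, ByteArrayOutputStream}
import com.google.common.hash.{BloomFilter, Funnels}

val b = BloomFilter.create(Funnels.unencodedCharsFunnel, 5e8.toLong, 1e-6)

// writeTo emits Guava's compact binary representation
val out = new ByteArrayOutputStream()
b.writeTo(out)
val bytes: Array[Byte] = out.toByteArray

// readFrom needs the same funnel that was used to create the filter
val restored = BloomFilter.readFrom(new ByteArrayInputStream(bytes), Funnels.unencodedCharsFunnel)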

Scala Queue import error

If I import scala.collection._ and create a queue:
import scala.collection._
var queue = new Queue[Component]();
I get the following error:
error: not found: type Queue
However, if I also add
import scala.collection.mutable.Queue
The error will disappear.
Why is this happening?
Shouldn't scala.collection._ contain scala.collection.mutable.Queue?
You have to know how the Scala collections library is structured. It splits collections based on whether they are mutable or immutable.
There is a Queue in both the scala.collection.mutable and scala.collection.immutable packages, so you have to specify which one you want, e.g.
scala> import scala.collection.mutable._
import scala.collection.mutable._
scala> var q = new Queue[Int]()
q: scala.collection.mutable.Queue[Int] = Queue()
scala> import scala.collection.immutable._
import scala.collection.immutable._
scala> var q = Queue[Int]()
q: scala.collection.immutable.Queue[Int] = Queue()
After import scala.collection._ you can use mutable.Queue. A bare Queue would only resolve if there were a scala.collection.Queue, or a Queue in scala, java.lang, or scala.Predef (whose members are imported by default), but there isn't one.
It should be easy to see why it works this way: otherwise, the compiler (or anyone reading your code) would have no idea where they should look for the type: do you want scala.collection.Queue, scala.collection.mutable.Queue, or scala.collection.some.subpackage.from.a.library.Queue?
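Concretely, the qualified style looks like this (a minimal sketch):
import scala.collection._

val q = mutable.Queue[Int]()       // unambiguous: the mutable Queue
q.enqueue(1)

val iq = immutable.Queue(1, 2, 3)  // the immutable Queue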

getOrElse method not being found in Scala Spark

Attempting to follow an example in Sandy Ryza's book Advanced Analytics with Spark, coding in IntelliJ. Below I seem to have imported all the right libraries, but why is it not recognizing getOrElse?
Error:(84, 28) value getOrElse is not a member of org.apache.spark.rdd.RDD[String]
bArtistAlias.value.getOrElse(artistID, artistID)
^
Code:
import org.apache.spark.rdd.RDD
import org.apache.spark.rdd._
import org.apache.spark.rdd.PairRDDFunctions
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.mllib.recommendation._
val trainData = rawUserArtistData.map { line =>
  val Array(userID, artistID, count) = line.split(' ').map(_.toInt)
  val finalArtistID = bArtistAlias.value.getOrElse(artistID, artistID)
  Rating(userID, finalArtistID, count)
}.cache()
I can only make an assumption since the code listed is missing pieces, but my guess is that bArtistAlias is supposed to be a Map that SHOULD be broadcast, but isn't.
I went and found the piece of code in Sandy's book and it corroborates my guess. So, you seem to be missing this piece:
val bArtistAlias = sc.broadcast(artistAlias)
Without the full code I can't be sure what you did, but it looks like you broadcast an RDD[String], hence the error. That wouldn't work anyway, since you cannot use one RDD inside another RDD's operations.
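For completeness, a sketch of the missing piece under the book's setup (rawArtistAlias and sc are assumed names here): the alias data is collected into an ordinary Map on the driver, and that map, not an RDD, is what gets broadcast so getOrElse is available inside the closure.
// rawArtistAlias: RDD[String] of tab-separated "badID<TAB>goodID" lines (assumed)
val artistAlias: scala.collection.Map[Int, Int] = rawArtistAlias.flatMap { line =>
  val tokens = line.split('\t')
  if (tokens(0).isEmpty) None
  else Some((tokens(0).toInt, tokens(1).toInt))
}.collectAsMap()

val bArtistAlias = sc.broadcast(artistAlias)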

Creating a scala Reader from a file

How do you instantiate a scala.util.parsing.input.Reader to read from a file? The API mentions in passing something about PagedSeq and java.io.Reader, but it's not clear at all how to accomplish that.
You create a FileInputStream, pass that to an InputStreamReader and pass that to the apply method of the StreamReader companion object, which returns a StreamReader, a subtype of Reader.
scala> import scala.util.parsing.input.{StreamReader,Reader}
import scala.util.parsing.input.{StreamReader, Reader}
scala> import java.io._
import java.io._
scala> StreamReader(new InputStreamReader(new FileInputStream("test")))
res0: scala.util.parsing.input.StreamReader = scala.util.parsing.input.StreamReader@1e5a0cb
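Alternatively, since the question mentions PagedSeq, a PagedSeqReader can be built the same way (a sketch; depending on your Scala / scala-parser-combinators version, PagedSeq lives in scala.collection.immutable or in scala.util.parsing.input):
import scala.collection.immutable.PagedSeq
import scala.util.parsing.input.PagedSeqReader

// Lazily pages the file contents into a PagedSeq[Char] and wraps it in a Reader
val reader = new PagedSeqReader(PagedSeq.fromFile("test"))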