Hard time using Scala Worksheets in IntelliJ

I'm following the course Functional Programming Principles in Scala
but I am experiencing a lot of issues when using Scala Worksheets in IntelliJ to make quick tests.
For example, I have set up a new Scala project where I have created a package object called lecture5 (it's in the file src/main/scala/lecture5/package.scala).
The content of the file is:
package object lecture5 {

  def last[T](xs: List[T]): T = xs match {
    case List() => throw new Error("empty list")
    case List(x) => x
    case x :: y => last(y)
  }

  /* init should return all elements but last */
  def init[T](xs: List[T]): List[T] = xs match {
    case List() => throw new Error("List is empty")
    case List(x) => List[T]()
    case y :: ys => y :: init(ys)
  }

  def concat[T](xs: List[T], ys: List[T]): List[T] = xs match {
    case List() => ys
    case z :: zs => z :: concat(zs, ys)
  }
}
In the worksheet I have the following:
import lecture5._
val x = List("a","b","c")
val xs = List("a","b")
val ys = List("c")
last(x)
init(x)
concat(xs, ys) == x
In the settings for the worksheet I checked Interactive Mode and Make project before run, and use Run Type = REPL (Plain doesn't work for some reason) and Compiler profile = Default.
When I click on the "play" button to run the worksheet the functions init and last work, but for the function concat I get error:
Error:(13, 9) not found: value concat
concat(xs, ys) == x
Why is concat not found?
Note that if I use the Scala console from within the sbt-shell and execute the same commands then everything works.
How can I configure IntelliJ to work with a Worksheet without issues?

I replicated the issue on IntelliJ 2019.1.2 with Scala Plugin 2019.1.8. No form of building the project before running the worksheet worked. The package object was finally imported successfully after Invalidate Caches / Restart.... A workaround that works for me without restarting is to use a Scala Scratch file instead of a Scala Worksheet:
Right click project | New | Scratch file | Scala
Possibly related to issue SCL-12890

Related

Convert Seq[Try[Option[(String, Any)]]] into Try[Option[Map[String, Any]]]

How can I conveniently convert a Seq[Try[Option[(String, Any)]]] into a Try[Option[Map[String, Any]]]?
If any Try in the input is a Failure, the converted Try should be a Failure as well.
Assuming that the input type has a tuple inside the Option then this should give you the result you want:
val in: Seq[Try[Option[(String, Any)]]] = ???
val out: Try[Option[Map[String,Any]]] = Try(Some(in.flatMap(_.get).toMap))
If any of the Trys is a Failure then the outer Try will catch the exception raised by the get and return a Failure.
The Some is there to give the correct return type.
The get extracts the Option from the Try (or raises an exception).
Using flatMap rather than map removes the Option wrapper, keeping all Some values and discarding None values, giving Seq[(String, Any)].
The toMap call converts the Seq to a Map.
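A quick worked example of that one-liner with made-up sample data (the values and names here are illustrative, not from the question):
import scala.util.{Try, Success, Failure}

val sample: Seq[Try[Option[(String, Any)]]] =
  Seq(Success(Some("a" -> 1)), Success(None), Success(Some("b" -> 2)))

Try(Some(sample.flatMap(_.get).toMap))
// Success(Some(Map(a -> 1, b -> 2)))  (the None element is dropped)

Try(Some((sample :+ Failure(new Exception("boom"))).flatMap(_.get).toMap))
// Failure(java.lang.Exception: boom)  (the first Failure wins)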
Here is something that's not very clean but may help get you started. It assumes Option[(String,Any)], returns the first Failure if there are any in the input Seq and just drops None elements.
foo.scala
package foo

import scala.util.{Try, Success, Failure}

object foo {
  val x0 = Seq[Try[Option[(String, Any)]]]()
  val x1 = Seq[Try[Option[(String, Any)]]](Success(Some(("A", 1))), Success(None))
  val x2 = Seq[Try[Option[(String, Any)]]](Success(Some(("A", 1))), Success(Some(("B", "two"))))
  val x3 = Seq[Try[Option[(String, Any)]]](Success(Some(("A", 1))), Success(Some(("B", "two"))), Failure(new Exception("bad")))

  def f(x: Seq[Try[Option[(String, Any)]]]) =
    x.find(_.isFailure).getOrElse(Success(Some(x.map(_.get).filterNot(_.isEmpty).map(_.get).toMap)))
}
Example session
bash-3.2$ scalac foo.scala
bash-3.2$ scala -classpath .
Welcome to Scala 2.13.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_66).
Type in expressions for evaluation. Or try :help.
scala> import foo.foo._
import foo.foo._
scala> f(x0)
res0: scala.util.Try[Option[Equals]] = Success(Some(Map()))
scala> f(x1)
res1: scala.util.Try[Option[Equals]] = Success(Some(Map(A -> 1)))
scala> f(x2)
res2: scala.util.Try[Option[Equals]] = Success(Some(Map(A -> 1, B -> two)))
scala> f(x3)
res3: scala.util.Try[Option[Equals]] = Failure(java.lang.Exception: bad)
scala> :quit
If you're willing to use a functional support library like Cats then there are two tricks that can help this along:
Many things like List and Try are traversable, which means that (if Cats's implicits are in scope) they have a sequence method that can swap two types, for example converting List[Try[T]] to Try[List[T]] (failing if any of the items in the list are failure).
Almost all of the container types support a map method that can operate on the contents of a container, so if you have a function from A to B then map can convert a Try[A] to a Try[B]. (In Cats language they are functors but the container-like types in the standard library generally have map already.)
Cats doesn't directly support Seq, so this answer is mostly in terms of List instead.
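To make the first point concrete, here is a minimal sketch of sequence on a List[Try[Int]] (the sample values are made up), assuming cats.implicits._ is in scope:
import cats.implicits._
import scala.util.{Try, Success, Failure}

// sequence flips the two outer type constructors: List[Try[Int]] => Try[List[Int]]
val allGood: List[Try[Int]] = List(Success(1), Success(2))
val oneBad: List[Try[Int]] = List(Success(1), Failure(new Exception("boom")))

allGood.sequence // Success(List(1, 2))
oneBad.sequence  // Failure(java.lang.Exception: boom)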
Given that type signature, you can repeatedly use sequence to, in effect, push the List type down one level in the type chain, and then map over the resulting container to work on its contents. That can look like:
import cats.implicits._
import scala.util._
def convert(listTryOptionPair: List[Try[Option[(String, Any)]]]): Try[Option[Map[String, Any]]] = {
  val tryListOptionPair = listTryOptionPair.sequence
  tryListOptionPair.map { listOptionPair =>
    val optionListPair = listOptionPair.sequence
    optionListPair.map { listPair =>
      Map.from(listPair)
    }
  }
}
https://scastie.scala-lang.org/xbQ8ZbkoRSCXGDJX0PgJAQ has a slightly more complete example.
One way to approach this is by using a foldLeft:
// Let's say this is the object you're trying to convert
val seq: Seq[Try[Option[(String, Any)]]] = ???
seq.foldLeft(Try(Option(Map.empty[String, Any]))) {
  case (acc, e) =>
    for {
      accOption <- acc
      elemOption <- e
    } yield elemOption match {
      case Some(value) => accOption.map(_ + value)
      case None => accOption
    }
}
You start off with an empty Map. You then use a for comprehension to combine the accumulated Try with the current element, and finally you add the new tuple to the map if one is present.
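As a hypothetical check of this fold with made-up data (the names below are not from the question):
import scala.util.{Try, Success, Failure}

val sample: Seq[Try[Option[(String, Any)]]] =
  Seq(Success(Some("a" -> 1)), Success(None), Success(Some("b" -> 2)))

val result = sample.foldLeft(Try(Option(Map.empty[String, Any]))) {
  case (acc, e) =>
    for {
      accOption <- acc
      elemOption <- e
    } yield elemOption match {
      case Some(value) => accOption.map(_ + value)
      case None => accOption
    }
}
// result: Success(Some(Map(a -> 1, b -> 2)))
// Appending a Failure to sample would turn result into that Failure instead.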
The following solutions are based on this answer, to the point that it almost makes the question a duplicate.
Method 1: Using recursion
import scala.util.{Try, Success, Failure}

def trySeqToMap1[X, Y](trySeq: Seq[Try[Option[(X, Y)]]]): Try[Option[Map[X, Y]]] = {
  def helper(it: Iterator[Try[Option[(X, Y)]]], m: Map[X, Y] = Map()): Try[Option[Map[X, Y]]] = {
    if (it.hasNext) {
      val x = it.next()
      if (x.isFailure)
        Failure(x.failed.get)
      else if (x.get.isDefined)
        helper(it, m + (x.get.get._1 -> x.get.get._2))
      else
        helper(it, m)
    } else Success(Some(m))
  }
  helper(trySeq.iterator)
}
Method 2: direct pattern matching, in case you are able to get a LazyList (or convert to one) instead:
def trySeqToMap2[X, Y](trySeq: LazyList[Try[Option[(X, Y)]]], m: Map[X, Y] = Map.empty[X, Y]): Try[Option[Map[X, Y]]] =
  trySeq match {
    case Success(Some(h)) #:: tail => trySeqToMap2(tail, m + (h._1 -> h._2))
    case Success(None) #:: tail => trySeqToMap2(tail, m)
    case Failure(f) #:: _ => Failure(f)
    case _ => Success(Some(m))
  }
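A hypothetical check of both methods on made-up data (assuming Scala 2.13 for LazyList):
val sample = Seq[Try[Option[(String, Any)]]](
  Success(Some("a" -> 1)), Success(None), Success(Some("b" -> 2)))

trySeqToMap1(sample)
// Success(Some(Map(a -> 1, b -> 2)))

trySeqToMap2(sample.to(LazyList) :+ Failure(new Exception("boom")))
// Failure(java.lang.Exception: boom)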
note: this answer was previously using different method signatures. It has been updated to conform to the signature given in the question.

How do I shrink a list but guarantee it isn't empty?

In ScalaCheck, I have written a generator of non-empty lists of strings,
val nonEmptyListsOfString: Gen[List[String]] =
Gen.nonEmptyListOf(Arbitrary.arbitrary[String])
And then, assume I wrote a property using Prop.forAll,
Prop.forAll(nonEmptyListsOfString) { strs: List[String] =>
strs == Nil
}
This is just a simple example that is meant to fail, so that it can show how the shrinking is done by ScalaCheck to find the smallest counterexample.
However, the default shrinker in ScalaCheck doesn't respect the generator and will still shrink to an empty list, ignoring the generator's constraints.
sbt> test
[info] ! Prop.isEmpty: Falsified after 1 passed tests.
[info] > ARG_0: List()
[info] > ARG_0_ORIGINAL: List("")
[info] Failed: Total 1, Failed 1, Errors 0, Passed 0
[error] Failed tests:
[error] example.Prop
As mentioned in the comment, and re-using the example from the github issue you posted:
import cats.data.NonEmptyList
import org.scalacheck.{Arbitrary, Gen}
import org.scalatest.{FreeSpec, Matchers}
import org.scalatest.prop.PropertyChecks
class ScalaCheckTest extends FreeSpec with PropertyChecks with Matchers {

  "Test scalacheck (failing)" in {
    val gen: Gen[List[Int]] = for {
      n <- Gen.choose(1, 3)
      list <- Gen.listOfN(n, Gen.choose(0, 9))
    } yield list

    forAll(gen) { list =>
      list.nonEmpty shouldBe true
      if (list.sum < 18) throw new IllegalArgumentException("ups")
    }
  }

  "Test scalacheck" in {
    val gen1 = for {
      first <- Arbitrary.arbInt.arbitrary
      rest <- Gen.nonEmptyListOf(Arbitrary.arbInt.arbitrary)
    } yield {
      NonEmptyList(first, rest)
    }

    forAll(gen1) { list =>
      val normalList = list.toList
      normalList.nonEmpty shouldBe true
      if (normalList.sum < 18) throw new IllegalArgumentException("ups")
    }
  }
}
The first test does fail showing an empty list being used, but the second one does indeed throw the exception.
UPDATE: Cats is obviously not really needed; here I use a simple (and local) version of a non-empty list for the sake of this test.
"Test scalacheck 2" in {
case class FakeNonEmptyList[A](first : A, tail : List[A]){
def toList : List[A] = first :: tail
}
val gen1 = for{
first <- Arbitrary.arbInt.arbitrary
rest <- Gen.nonEmptyListOf(Arbitrary.arbInt.arbitrary)
} yield {
FakeNonEmptyList(first, rest)
}
forAll(gen1) { list =>
val normalList = list.toList
normalList.nonEmpty shouldBe true
if (normalList.sum < 18) throw new IllegalArgumentException("ups")
}
}
There is a way to define your own Shrink instance in ScalaCheck. However, it is neither common nor very easy to do.
Overview
Using a custom Shrink requires defining an implicit instance in the scope of your property test. Prop.forAll will then pick up your Shrink instance if it is in scope and has the appropriate type for the value that falsified the test.
Fundamentally, a Shrink instance is a function that converts the failing value, x, to a stream of "shrunken" values. Its type signature is roughly:
trait Shrink[T] {
def shrink(x: T): Stream[T]
}
You can define a Shrink with the companion object's apply method, which is roughly this:
object Shrink {
  def apply[T](s: T => Stream[T]): Shrink[T] = {
    new Shrink[T] {
      def shrink(x: T): Stream[T] = s(x)
    }
  }
}
Example: Shrinking integers
If you know how to work with a Stream collection in Scala, then it's easy to define a shrinker for Int that shrinks by halving the value:
implicit val intShrinker: Shrink[Int] = Shrink {
case 0 => Stream.empty
case x => Stream.iterate(x / 2)(_ / 2).takeWhile(_ != 0) :+ 0
}
We want to avoid returning the original value to ScalaCheck, so that's why zero is a special case.
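For instance, here is a hypothetical check of the shrink candidates this instance produces for 100:
import org.scalacheck.Shrink

// With intShrinker defined as above:
Shrink.shrink(100)(intShrinker).toList
// List(50, 25, 12, 6, 3, 1, 0)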
Answer: Non-empty lists
In the case of a non-empty list of strings, you want to re-use the container shrinking of ScalaCheck, but avoid empty containers. Unfortunately, that's not easy to do, but it is possible:
implicit def shrinkListString(implicit s: Shrink[String]): Shrink[List[String]] = Shrink {
case Nil => Stream.empty[List[String]]
case strs => Shrink.shrink(strs)(Shrink.shrinkContainer).filter(!_.isEmpty)
}
Rather than writing a generic container shrinker that avoids empty containers, the one above is specific to List[String]. It could probably be rewritten for a generic List[T].
The first pattern match against Nil is probably unnecessary.
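A sketch of how the property from the question would pick this up (assuming the implicit shrinkListString above is in scope):
import org.scalacheck.{Arbitrary, Gen, Prop}

val nonEmptyListsOfString: Gen[List[String]] =
  Gen.nonEmptyListOf(Arbitrary.arbitrary[String])

// Prop.forAll resolves the implicit Shrink[List[String]] defined above, so any
// shrunken counterexample it reports should still be non-empty.
val prop = Prop.forAll(nonEmptyListsOfString) { strs: List[String] =>
  strs == Nil
}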

Getting the element from a 1-element Scala collection

I'm learning Scala and I keep wanting an equivalent to LINQ's Single() method. For example,
val collection: Seq[SomeType]
val (desiredItem, theOthers) = collection.partition(MyFunc)
desiredItem.single.doSomething
// ^^^^^^
I could use desiredItem.head but what if MyFunc actually matched several? I want the assurance that there's only one.
Edit #2: The duplicate question says 'no there isn't, but here's how to build it'. So I am thinking that if this were a common need it would be in the base API. Do properly written Scala programs need this?
I'd use something more verbose instead of single:
(desiredItem match {
  case Seq(single) => single
  case _ => throw new IllegalStateException("Not a single element!")
}).doSomething
Its advantage over single is that it allows you to explicitly control the behavior in the exceptional case (throw an exception, return a fallback value).
Alternatively you can use destructuring assignment:
val Seq(single) = desiredItem
single.doSomething
In this case you'll get MatchError if desiredItem doesn't contain exactly one element.
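For instance (an illustrative snippet, not from the question):
val Seq(single) = Seq(1, 2)
// throws scala.MatchError: List(1, 2)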
UPD: I looked again at your code. Destructuring assignment is the way to go for you:
val collection: Seq[SomeType]
val (Seq(desiredItem), theOthers) = collection.partition(MyFunc)
desiredItem.doSomething
There's no prebuilt method in the API to do that. You can create your own method to do something similar though.
scala> def single[A](xs: List[A]) = xs match{
| case List() => None
| case x::Nil => Some(x)
| case x::xs => throw new Exception("More than one element")
| }
single: [A](xs: List[A])Option[A]
scala> single(List(1,2,3))
java.lang.Exception: More than one element
at .single(<console>:11)
... 33 elided
scala> single(List(1))
res13: Option[Int] = Some(1)
scala> single(List())
res14: Option[Nothing] = None
Like others indicated, there is no library implementation of what you seek. But it's easy to implement your own using a Pimp My Library approach. For example you can do the following.
object Main extends App {

  object PML {
    implicit class TraversableOps[T](val collection: TraversableOnce[T]) {
      def single: Option[T] = collection.toList match {
        case List(x) => Some(x)
        case _ => None
      }
    }
  }

  import PML._

  val collection: Seq[Int] = Seq(1, 2)
  val (desiredItem, theOthers) = collection.partition(_ < 2)

  println(desiredItem.single) // Some(1)
  println(collection.single)  // None
  println(List.empty.single)  // None
}

Passing a parameterized function in scala [duplicate]

This question already has answers here:
Generic Programming in Scala
(2 answers)
Closed 8 years ago.
I am writing a parameterized merge sort and passing the less-than check as a function.
However, the compiler is throwing the following error.
type mismatch;
found : y.type (with underlying type T)
required: T
Here is my full code
def mergeSort[T](list: List[T], pred: (T, T) => Boolean): List[T] = {
  def merge[T](left: List[T], right: List[T], acc: List[T]): List[T] = (left, right) match {
    case (Nil, _) => acc ++ right
    case (_, Nil) => acc ++ left
    case (x :: xs, y :: ys) => if (pred(y, x)) merge(left, ys, acc :+ y) else merge(xs, right, acc :+ x)
  }
  val m = list.length / 2
  if (m == 0) list
  else {
    val (l, r) = list splitAt m
    merge(mergeSort(l, pred), mergeSort(r, pred), List())
  }
}
The problem is at line
if(pred(y,x))
Everything seems logically correct; I can't figure out why this is happening. Help appreciated.
This happens because your inner function merge defines its own type parameter T, which shadows the one you declared on mergeSort. Just change def merge[T] to def merge and keep using T to parameterize your lists left, right, etc. That way you are telling the compiler "this is the same T I defined above in mergeSort".
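For reference, a sketch of the fixed version (only the inner signature changes relative to the code in the question):
def mergeSort[T](list: List[T], pred: (T, T) => Boolean): List[T] = {
  // No type parameter here: merge closes over the T of mergeSort, so pred(y, x) type-checks.
  def merge(left: List[T], right: List[T], acc: List[T]): List[T] = (left, right) match {
    case (Nil, _) => acc ++ right
    case (_, Nil) => acc ++ left
    case (x :: xs, y :: ys) =>
      if (pred(y, x)) merge(left, ys, acc :+ y) else merge(xs, right, acc :+ x)
  }
  val m = list.length / 2
  if (m == 0) list
  else {
    val (l, r) = list splitAt m
    merge(mergeSort(l, pred), mergeSort(r, pred), List())
  }
}

// mergeSort(List(3, 1, 2), (a: Int, b: Int) => a < b) returns List(1, 2, 3)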

Scala: Generalised method to find match and return match dependant values in collection

I wish to find a match within a List and return values dependent on the match. collectFirst works well for matching on the elements of the collection, but in this case I want to match on the member swEl of the element rather than on the element itself.
abstract class CanvNode(var swElI: Either[CSplit, VistaT]) {
  private[this] var _swEl: Either[CSplit, VistaT] = swElI
  def member = _swEl
  def member_=(value: Either[CSplit, VistaT]) { _swEl = value; attach }
  def attach: Unit
  attach

  def findVista(origV: VistaIn): Option[Tuple2[CanvNode, VistaT]] = member match {
    case Right(v) if (v == origV) => Option(this, v)
    case _ => None
  }
}
def nodes(): List[CanvNode] = topNode :: splits.map(i => List(i.n1, i.n2)).flatten
//Is there a better way of implementing this?
val temp: Option[Tuple2[CanvNode, VistaT]] =
nodes.map(i => i.findVista(origV)).collectFirst{case Some (r) => r}
Do I need a View on that, or will the collectFirst method ensure the collection is only created as needed?
It strikes me that this must be a fairly general pattern. Another example could be if one had a List member of the main List's elements and wanted to return the fourth element if it had one. Is there a standard method I can call? Failing that I can create the following:
implicit class TraversableOnceRichClass[A](n: TraversableOnce[A])
{
def findSome[T](f: (A) => Option[T]) = n.map(f(_)).collectFirst{case Some (r) => r}
}
And then I can replace the above with:
val temp: Option[Tuple2[CanvNode, VistaT]] =
nodes.findSome(i => i.findVista(origV))
This uses implicit classes from 2.10, for pre 2.10 use:
class TraversableOnceRichClass[A](n: TraversableOnce[A])
{
def findSome[T](f: (A) => Option[T]) = n.map(f(_)).collectFirst{case Some (r) => r}
}
implicit final def TraversableOnceRichClass[A](n: List[A]):
TraversableOnceRichClass[A] = new TraversableOnceRichClass(n)
As an introductory side note: The operation you're describing (return the first Some if one exists, and None otherwise) is the sum of a collection of Options under the "first" monoid instance for Option. So for example, with Scalaz 6:
scala> Stream(None, None, Some("a"), None, Some("b")).map(_.fst).asMA.sum
res0: scalaz.FirstOption[java.lang.String] = Some(a)
Alternatively you could put something like this in scope:
implicit def optionFirstMonoid[A] = new Monoid[Option[A]] {
val zero = None
def append(a: Option[A], b: => Option[A]) = a orElse b
}
And skip the .map(_.fst) part. Unfortunately neither of these approaches is appropriately lazy in Scalaz, so the entire stream will be evaluated (unlike Haskell, where mconcat . map (First . Just) $ [1..] is just fine, for example).
Edit: As a side note to this side note: apparently Scalaz does provide a sumr that's appropriately lazy (for streams—none of these approaches will work on a view). So for example you can write this:
Stream.from(1).map(Some(_).fst).sumr
And not wait forever for your answer, just like in the Haskell version.
But assuming that we're sticking with the standard library, instead of this:
n.map(f(_)).collectFirst{ case Some(r) => r }
I'd write the following, which is more or less equivalent, and arguably more idiomatic:
n.flatMap(f(_)).headOption
For example, suppose we have a list of integers.
val xs = List(1, 2, 3, 4, 5)
We can make this lazy and map a function with a side effect over it to show us when its elements are accessed:
val ys = xs.view.map { i => println(i); i }
Now we can flatMap an Option-returning function over the resulting collection and use headOption to (safely) return the first element, if it exists:
scala> ys.flatMap(i => if (i > 2) Some(i.toString) else None).headOption
1
2
3
res0: Option[java.lang.String] = Some(3)
So clearly this stops when we hit a non-empty value, as desired. And yes, you'll definitely need a view if your original collection is strict, since otherwise headOption (or collectFirst) can't reach back and stop the flatMap (or map) that precedes it.
In your case you can skip findVista and get even more concise with something like this:
val temp = nodes.view.flatMap(
  node => node.member.right.toOption.filter(_ == origV).map(node -> _)
).headOption
Whether you find this clearer or just a mess is a matter of taste, of course.