Scala worksheet evaluation results not showing up in IntelliJ

I am trying out Scala for the first time and am following this page to set up my first Scala project:
https://docs.scala-lang.org/getting-started-intellij-track/getting-started-with-scala-in-intellij.html
However, after creating a simple worksheet containing println("hello"), no results come up when I evaluate it.
What am I doing wrong?

This seems to be an issue that only happens on Scala 2.13. (I have not exhaustively tested it, though.)
I have used 2.12.9 successfully.
I opened an issue on the Scala docs project too.
https://github.com/scala/docs.scala-lang/issues/1486
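For reference, the worksheet in question is just the one-liner from the tutorial; on a working setup, evaluating it shows the output in the worksheet's results pane:
// hello.sc – evaluating this worksheet should show the output of println("hello")
// in the results pane; per the above, on Scala 2.13 the pane may stay empty.
println("hello")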

Related

Spark cannot find case class on classpath

I have an issue where Spark is failing to generate code for a case class. Here is the Spark error:
Caused by: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 52, Column 43: Identifier expected instead of '.'
Here is the referenced line in the generated code:
/* 052 */ private com.avro.message.video.public.MetricObservation MapObjects_loopValue34;
It should be noted that com.avro.message.video.public.MetricObservation is a nested case class that is part of a larger hierarchy, and it is used fine in other places in the code. It should also be noted that this pipeline works fine if I use the RDD API, but I want to use the Dataset API because I want to write the Dataset out as Parquet. Has anyone seen this issue before?
I'm using Scala 2.11 and Spark 2.1.0. I was able to upgrade to Spark 2.2.1 and the issue is still there.
Do you think that SI-7555 or something like it has any bearing on this? I have noticed in the past that Scala reflection has had issues generating TypeTags for statically nested classes. Do you think something like that is going on, or is this strictly a Catalyst issue in Spark? You might want to file a Spark ticket too.
So it turns out that changing the package name of the affected class "fixes" the problem (i.e., makes it go away). I really have no idea why this is, or even how to reproduce it in a small test case. What worked for me was simply creating a higher-level package: specifically, com.avro.message.video.public -> com.avro.message.publicVideo.
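For what it's worth, a likely explanation is that public is a reserved word in Java, and the Java source Spark generates (compiled by Janino) has to spell out the fully-qualified class name, which then fails to parse. A minimal sketch of the rename described above (the field is hypothetical; the real class definition isn't shown in the question):
// Was: package com.avro.message.video.public
// The segment "public" is legal in Scala but is a reserved word in the Java
// code that Spark's code generation emits, so the qualified name breaks there.
package com.avro.message.publicVideo

// Hypothetical field; the actual fields are not shown in the question.
case class MetricObservation(value: Double)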

Scalac hanging on phase typer of RegexParser

I have a Scala program which, among other things, has a parser combinator. This is done by extending scala.util.parsing.combinator.RegexParsers. I developed it using Scala 2.10 and all was working fine.
Yesterday I upgraded my system to Scala 2.11.4, together with IntelliJ 14.02 (not that it matters).
However, whenever I try to compile this program now, scalac hangs during this phase:
scalac: phase typer on MyParser.scala
I changed absolutely nothing in this code, and I can't understand why it is hanging or where I should start. IntelliJ had a warning about postfix operators for parser expressions like constants_def? or structure_def*, where the ? and * follow the token, so because of the language-modularization SIP I added this line:
import scala.language.postfixOps
It didn't really have any effect and the problem is still the same.
How can I troubleshoot what is going on? I can't figure out where to start understanding why the typer phase is just hanging indefinitely.
It looks like a workaround is to add an explicit type:
def da_gd: Parser[Expression with TimedCondition] =
  pref_timed_gd | da_gd_conjunction |
    (empty_temporal: Parser[Expression with TimedCondition])
A stack dump shows that it's stuck figuring out the type of the expression, and -Ytyper-debug shows the vicinity.
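For completeness, that flag can be passed to scalac via sbt like this (a minimal sketch, assuming an sbt build rather than IntelliJ's internal compiler):
// build.sbt – pass the typer-debug flag to scalac to see where it is stuck.
scalacOptions += "-Ytyper-debug"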
Since nothing good happens after midnight, I'll stop there.

Why does IntelliJ IDEA throw compilation error?

Compiling Spark gives this compile error:
To fix it, I modified the Utils.classIsLoadable method to just return true:
def classIsLoadable(clazz: String): Boolean = {
  // Try { Class.forName(clazz, false, getContextOrSparkClassLoader) }.isSuccess
  true
}
I realise this is not a good fix, but so far Spark seems to be running correctly from source. Has this compile error been experienced before, and is there a fix? Will returning true suffice for now? I'm not sure what impact modifying this return value may have.
I suggest compiling Spark from the command-line using Maven or SBT instead of trying to use your IDE's compiler. Many of the core Spark developers use IntelliJ for editing Spark's source code but still use the command-line compilers, largely because it's been difficult to get the project to build correctly inside IDEs. Even if you're using an external compiler, you should still be able to benefit from IntelliJ's syntax highlighting, type checking, etc.
Here's a relevant discussion from the Spark developer mailing list: http://apache-spark-developers-list.1001551.n3.nabble.com/IntelliJ-IDEA-cannot-compile-TreeNode-scala-td7090.html
Note that Spark users should be able to use IntelliJ to compile applications that depend on Spark; this issue only affects developers who want to build Spark itself.
If you're interested in fixing the build to work with IntelliJ, I recommend opening a ticket on the Spark issue tracker.
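To illustrate the note above about applications that merely depend on Spark, a minimal build.sbt along these lines compiles fine in IntelliJ (the version numbers here are illustrative, not taken from the thread):
// build.sbt – depending on Spark as a library works in IntelliJ;
// only building Spark itself from source is problematic. Versions are illustrative.
scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0" % "provided"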

Reading lines from file in Scala

Again, Scala's basic operations are making my life painful :D.
So I have to read lines from a file... just a trivial I/O operation.
In every example on the internet they do:
import scala.io.Source
for (line <- Source.fromPath("integerArray.txt").getLines())
  println(line)
But IntelliJ throws the error: value fromPath is not a member of object scala.io.Source.
Does anyone know what the problem is here? I installed the latest version of Scala a few months ago, and the IntelliJ Scala plugin is also up to date, so I doubt that is the reason...
There is no fromPath in Source, just a fromFile, which accepts a String path. Good luck on Coursera.
There was Source.fromPath around 2.8. Briefly.
What, you're not using this version?
It was removed here, with "Review by community."
See? We just weren't paying attention.
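Putting the accepted advice into code, the loop from the question works once fromPath is swapped for fromFile, which takes a String path:
import scala.io.Source

// fromFile accepts the path as a String (Source.fromPath no longer exists).
for (line <- Source.fromFile("integerArray.txt").getLines())
  println(line)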

Issue with BufferedReader.readLine using sbt run or sbt console

My question is quick: I'm working on a small console that reads input and then calls the appropriate code. I'm using sbt, and I've encountered an issue when I try to read input after running my program with sbt run, inside sbt console, or even in the plain old Scala interpreter.
The prompt appears to just hang, but if I hit return it does actually read the input in, though nothing I type is echoed in the shell. Here is the general code I've been trying that has been giving me the issue:
import java.io._
val s = new BufferedReader(new InputStreamReader(System.in))
val line = s.readLine
println(line)
Does anyone know why this is, and if so, is there a way to fix it? I would love to be able to see what I type when I run my program from sbt. Without seeing my typing in the shell, testing and developing my project is much less enjoyable.
This is really a Java API question, although in Scala. BufferedReader.readLine() will consume all the characters you type from System.in until it has a whole line, at which time it will return the line as you said.
Console input was difficult in Java with the original java.io classes. Prior to Java 6, I saw a couple of messy solutions to this, but fortunately a new class was introduced with that release to make it much easier: java.io.Console. I think it then becomes as simple as:
val line = System.console.readLine
println(line)
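One caveat to add, as an assumption of my own: System.console() returns null when no interactive console is attached, which can happen when running under sbt, so a guarded version of the sketch above would be:
// System.console() is null when no interactive console is attached
// (which may be the case under sbt run), so fall back to plain stdin.
val line = Option(System.console()) match {
  case Some(console) => console.readLine()
  case None =>
    new java.io.BufferedReader(new java.io.InputStreamReader(System.in)).readLine()
}
println(line)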