Reading lines from a file in Scala

Once again, basic Scala operations are making my life painful :D.
I just have to read lines from a file... a trivial I/O operation.
Every example on the internet does:
import scala.io.Source
for (line <- Source.fromPath("integerArray.txt").getLines())
  println(line)
But IntelliJ throws an error: value fromPath is not a member of object scala.io.Source.
Does anyone know what the problem is here? ... I installed the latest version of Scala a few months ago and the IntelliJ Scala plugin is also up to date, so I doubt that is the reason...

There is no fromPath in Source, just fromFile, which accepts a String path. Good luck on Coursera.
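A corrected version of the snippet from the question might look like this. It is a self-contained sketch: it writes a small sample file first (reusing the question's "integerArray.txt" name) so there is something to read.

```scala
import java.nio.file.{Files, Paths}
import scala.io.Source

object ReadLines {
  // Source.fromFile opens a file handle, so close it when done.
  def readAll(path: String): List[String] = {
    val source = Source.fromFile(path)
    try source.getLines().toList
    finally source.close()
  }

  def main(args: Array[String]): Unit = {
    // Create a sample file so the example runs standalone.
    Files.write(Paths.get("integerArray.txt"), "1\n2\n3\n".getBytes("UTF-8"))
    readAll("integerArray.txt").foreach(println)
  }
}
```

Wrapping the read in `try`/`finally` matters here because, unlike the one-liner in the question, it releases the underlying file handle even if an exception is thrown mid-read.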

There was Source.fromPath around 2.8. Briefly.
What, you're not using this version?
It was removed here, with "Review by community."
See? We just weren't paying attention.


Scala worksheet evaluation results not showing up in Intellij

I am trying out Scala for the first time and am following this page to set up my first Scala project:
https://docs.scala-lang.org/getting-started-intellij-track/getting-started-with-scala-in-intellij.html
However, after creating a simple worksheet with println("hello"), no results come up on evaluation.
What am I doing wrong?
This seems to be an issue that only happens on Scala 2.13. (I have not exhaustively tested it, though.)
I have used 2.12.9 successfully.
I opened an issue on the Scala docs project too.
https://github.com/scala/docs.scala-lang/issues/1486

Spark cannot find case class on classpath

I have an issue where Spark is failing to generate code for a case class. Here is the Spark error:
Caused by: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 52, Column 43: Identifier expected instead of '.'
Here is the referenced line in the generated code:
/* 052 */ private com.avro.message.video.public.MetricObservation MapObjects_loopValue34;
Note that com.avro.message.video.public.MetricObservation is a nested case class that is part of a larger hierarchy, and it is used without problems elsewhere in the code. Note also that this pipeline works fine if I use the RDD API, but I want to use the Dataset API because I want to write the Dataset out as Parquet. Has anyone seen this issue before?
I'm using Scala 2.11 and Spark 2.1.0. I was able to upgrade to Spark 2.2.1 and the issue is still there.
Do you think that SI-7555 or something like it has any bearing on this? I have noticed in the past that Scala reflection has had issues generating TypeTags for statically nested classes. Do you think something like that is going on, or is this strictly a Catalyst issue in Spark? You might want to file a Spark ticket too.
So it turns out that changing the package name of the affected class "fixes" (i.e. makes go away) the problem. I really have no idea why this is, or even how to reproduce it in a small test case. What worked for me was just creating a higher-level package name that works. Specifically com.avro.message.video.public -> com.avro.message.publicVideo.
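One hedged explanation, my inference rather than anything confirmed in this thread: `public` is a reserved word in Java but not in Scala, so a package segment named `public` compiles fine in Scala, while the Java source that Catalyst generates cannot reference it. That would match both the Janino error ("Identifier expected instead of '.'" exactly where `.public.` appears in the generated line) and the fact that renaming the package made it go away. A minimal sketch with a hypothetical package mirroring the question's layout:

```scala
// `public` is a legal identifier in Scala (it is not a Scala keyword),
// so this package name compiles -- but generated *Java* code could not
// refer to it, because `public` is a Java reserved word.
package com.example.video.public {
  case class MetricObservation(id: Long, value: Double)
}

object KeywordPackageDemo {
  def main(args: Array[String]): Unit =
    println(com.example.video.public.MetricObservation(1L, 0.5))
}
```

This is only an illustration of the keyword clash; it does not pull in Spark, so it cannot reproduce the Catalyst codegen failure itself.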

Casbah Scala MongoDB driver - a strange error

I am trying to use Casbah, and I get a strange error right at the beginning, on this line:
val mongoDB = MongoConnection("MyDatabase")
the error on MongoConnection says:
class file needed by MongoConnection is missing. reference type
MongoOptions of package com.mongodb refers to nonexisting symbol.
I do not know what to do with this. The jars that I have attached to my projects are:
casbah-commons_2.9.1-3.0.0-SNAPSHOT.jar
casbah-core_2.9.1-3.0.0-SNAPSHOT.jar
casbah-gridfs_2.9.1-3.0.0-SNAPSHOT.jar
casbah-query_2.9.1-3.0.0-SNAPSHOT.jar
casbah-util_2.9.1-3.0.0-SNAPSHOT.jar
which looks like a full setup of Casbah, and I do not understand what it might be yearning for. So there is question number one: what do I have to do to resolve this problem?
Question number two: the Casbah tutorial says that I could import just one thing and get the mongoConn() method, which is also not true. mongoConn() simply does not get found if I follow the instructions. So, how can I achieve that everything works as in the tutorial?
I don't know the details of your setup, but it seems like you are not referencing the dependencies of the casbah-commons module.
According to the docs, those are:
mongo-java-driver, scalaj-collection, scalaj-time, JodaTime, slf4j-api
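For an sbt build, that dependency list might be declared roughly like this. The version numbers below are guesses for the Scala 2.9.1-era Casbah shown above, not taken from the thread; check the Casbah docs for the ones matching your snapshot:

```scala
// build.sbt sketch -- version numbers are assumptions for illustration
libraryDependencies ++= Seq(
  "org.mongodb"  % "mongo-java-driver" % "2.7.3",
  "org.scalaj"  %% "scalaj-collection" % "1.2",
  "org.scalaj"  %% "scalaj-time"       % "0.6",
  "joda-time"    % "joda-time"         % "1.6.2",
  "org.slf4j"    % "slf4j-api"         % "1.6.4"
)
```

Letting sbt resolve these transitively is usually less error-prone than attaching the five Casbah jars by hand, since the Casbah modules themselves declare these as dependencies.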

Issue with BufferedReader.readLine using sbt run or sbt console

My question is quick. I'm working on a small console for reading input and then calling the appropriate code. I'm using sbt, and I've encountered an issue when I try to read input after running my program with sbt run, inside sbt console, or even in the plain old Scala interpreter.
The prompt appears to just hang, but if I hit return it does actually read the input. The shell's buffer remains empty, though. Here is the general code I've been trying that gives me the issue:
import java.io._
val s = new BufferedReader(new InputStreamReader(System.in))
val line = s.readLine
println(line)
Does anyone know why this is, and if so, is there a way to fix it? I would love to be able to see what I type when I run my program from sbt. Not seeing my typing in the shell makes testing and developing my project much less enjoyable.
This is really a Java API question, although in Scala. BufferedReader.readLine() will consume all the characters you type from System.in until it has a whole line, at which time it will return the line as you said.
Console input was difficult in Java with the original java.io classes. I've seen a couple of messy solutions to this from before Java 6, but fortunately a new class introduced in that release makes it much easier: java.io.Console. I think it then becomes as simple as:
val line = System.console.readLine
println(line)
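One caveat with that suggestion (my addition, not part of the answer): System.console() returns null whenever standard input or output is redirected, which is common when a program is launched from sbt, so a null check with a fallback is needed. A minimal sketch; the firstLine helper just demonstrates what readLine does:

```scala
import java.io.{BufferedReader, InputStreamReader, Reader}

object ReadInput {
  // readLine consumes characters up to the next newline and returns
  // the line without its terminator.
  def firstLine(reader: Reader): String =
    new BufferedReader(reader).readLine()

  def main(args: Array[String]): Unit = {
    // System.console() is null when stdin/stdout are redirected
    // (as they can be under sbt), so fall back to System.in.
    val line = Option(System.console()) match {
      case Some(console) => console.readLine()
      case None          => firstLine(new InputStreamReader(System.in))
    }
    println(line)
  }
}
```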

No strictfp in Scala - workarounds?

I searched the web for how to enforce strictfp in Scala but could not find any hint of it. There are some people complaining about it, but real solutions cannot be found. There is a bug-tracker entry about it which is almost two years old. As it seems there is no elegant fix on the way, I'm looking for workarounds.
My current idea is to somehow set the appropriate method flag ACC_STRICT in the generated bytecode myself, but I have no idea what the best way to do that would be. A Scala compiler plugin comes to mind, or just hacking flags in a hex editor. Maybe someone has faced the same challenge and can tell me his or her solution?
You could add a post-processor in your build process that would add the strictfp modifier to the generated class (i.e. setting the ACC_STRICT flag as you say).
You can implement such a post-processor using Javassist for example. This could look like this:
import java.io.*;
import javassist.*;

// Load the compiled class file produced by scalac.
CtClass clazz = ClassPool.getDefault().makeClass(
        new FileInputStream("old/HelloWorld.class"));
CtMethod method = clazz.getDeclaredMethod("testMethod");
// Set the ACC_STRICT flag on the method.
method.setModifiers(method.getModifiers() | Modifier.STRICT);
// Write out the modified bytecode, then release the CtClass.
clazz.toBytecode(new DataOutputStream(new FileOutputStream(
        "new/HelloWorld.class")));
clazz.detach();
You would then have to find a way to configure which classes/method need to be modified this way.
Scala has a strictfp annotation now:
@strictfp
def f() = …
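A minimal sketch of that annotation, assuming a Scala version that ships scala.annotation.strictfp (FpDemo and scaled are made-up names for illustration):

```scala
import scala.annotation.strictfp

object FpDemo {
  // @strictfp requests strict IEEE 754 semantics for this method's
  // floating-point arithmetic. On JDK 17+ this is a no-op, since
  // JEP 306 made all floating-point math strict by default.
  @strictfp
  def scaled(x: Double): Double = x * 2.0

  def main(args: Array[String]): Unit =
    println(scaled(1.5))
}
```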