NullPointerException when running Scala code

I installed Scala and Spark on my laptop and afterwards set up the Scala environment in IntelliJ. As you can see in the image below, I am getting a NullPointerException when running the code. Do you have any advice on how to fix it? I appreciate your help very much.
Best!

Related

java.lang.ClassNotFoundException in Scala project

I built a MapReduce application in Scala using Akka actors. I am trying to run three different JVMs in three terminal windows: one for my client, and the other two for my remote servers (which I run on localhost to simulate a distributed application). They compile fine, but when I try to run them, I see the following exception. I am not very good with sbt and am new to Scala and Akka. Does anyone have any suggestions or ideas on how to fix it?
server side error
client side error
Update: Never mind, it turns out I made a silly mistake by forgetting to delete my old package name after deleting the package folder.

spark-shell initialization error: /tmp/hive not writable

I have installed Scala, Spark, SBT, and winutils on my Windows box, and configured JAVA_HOME, SPARK_HOME, SCALA_HOME, and SBT_HOME appropriately. Still, when I try to run 'spark-shell', I run into a very common problem that many people seem to have faced.
I have tried granting access to the c:/tmp/hive directory through winutils as well as through Windows Explorer, yet the issue persists. I would appreciate it if anyone could tell me what else I might be missing.
Versions:
Spark: 2.1.1
Scala: 2.11
Below are the links to screenshots:
https://filedropper.com/filemanager/public.php?service=files&t=b5cc12549b9ae19ff969c65cb952c135
https://filedropper.com/filemanager/public.php?service=files&t=cb5b9b8f14980963be6f0a6511f71455
https://filedropper.com/filemanager/public.php?service=files&t=a2d471abfe57ca020a700c226b906faf
Appreciate your help!
Thanks,
Manish
Thank you cricket_007 and Jacek. I appreciate your comments.
It turned out to be a version-compatibility issue with winutils. I had to try several versions until the right one changed the permissions on /tmp/hive.
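For later readers, the fix above boils down to making sure the winutils.exe build matches the Hadoop version your Spark distribution was built against, pointing HADOOP_HOME at it, and then fixing the permissions. A minimal sketch; the paths are assumptions for a typical setup:

```shell
:: Assumes winutils.exe (matching Spark's Hadoop version) lives in C:\hadoop\bin
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin

:: Open up Hive's scratch directory, then verify the permissions took effect
winutils.exe chmod -R 777 \tmp\hive
winutils.exe ls \tmp\hive
```

If `ls` still does not show the new permissions, the winutils build is likely the wrong version, which was the root cause here.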

Attaching a Remote Debug session to Spark from Eclipse Scala IDE

I've been wracking my brain over this for the last two days trying to get it to work. I have a local Spark installation on my Mac that I'm trying to attach a debugger to. I set:
SPARK_JAVA_OPTS=-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005
Then I submit my job with spark-submit and launch my debug configuration in Eclipse, which is configured as a Socket Attach remote debugging session. The debugger attaches, my job resumes and executes, but none of my breakpoints are ever hit, no matter what I do.
The only way I can get it to hit a breakpoint is by attaching to a spark-shell, creating a Java Exception breakpoint and issuing
throw new java.lang.Exception()
The debugger will not stop at normal breakpoints for me.
I created a standalone Hello World Scala app and was able to attach to it and have it stop at a regular breakpoint without any issues.
Environment: Mac OS, latest Eclipse, latest Scala IDE, Spark 1.3.1, Scala 2.10.5
Thanks in advance.
I had a similar issue, and there were two things that fixed my problem:
1. The .jar file and the source were a little out of sync for me, so I had to recompile and redeploy.
2. In the Java options, I had suspend=n.
After correcting these two things, it worked for me.
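For anyone else stuck here: the JDWP agent has to end up on the JVM that actually runs the code carrying your breakpoints, and with a blanket SPARK_JAVA_OPTS it can land on a different process than you expect. A sketch of attaching explicitly to the driver on port 5005; the class and jar names are placeholders, not from the question:

```shell
# Attach the JDWP agent to the driver JVM only.
# suspend=y makes the driver wait until the Eclipse debugger connects.
spark-submit \
  --driver-java-options "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005" \
  --class com.example.MyJob \
  target/scala-2.10/myjob.jar
```

Breakpoints inside RDD transformations run on executors, not the driver, so in a non-local deployment those would need the agent on the executor JVMs instead.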

Scala compiler never ending compilation

I noticed my Scala IDE consuming all the available CPU... I tried to compile the project via SBT from the command line and ran into the same situation.
How can I get to know what's going wrong? Is there a way to find out what file or class/object/trait is being compiled?
I'm getting the same issue in 2.10.2 and 2.10.4-RC1
I found out the problem was due to importing a library from Slick 2.0 called heterogeneous.syntax. I also got in contact with the Slick development team in order to give them some code to investigate.
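To answer the "how can I find out what file is being compiled" part: one way to see what the compiler is chewing on during an apparently endless compile is to turn on scalac's verbose output for a single run. A sketch using sbt's session-level `set` (assuming a standard project layout):

```shell
# Ask scalac to report what it is doing (files read, phases run) for this compile
sbt 'set scalacOptions += "-verbose"' compile
```

The last file or phase printed before output stalls is usually the one triggering the runaway compilation, which is how an import like the Slick one above can be pinned down.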

Weka class cannot be initialized: InvocationTargetException

This is my first time using Weka, so I am sorry if my question seems naive, but I am really stuck on this problem.
I am using Weka in my own Java project in Eclipse. I have successfully imported weka.jar along with the attached wekasource.jar, but when I run the program, every Weka class fails to initialize (Attribute, FastVector, etc.). All the exceptions are the same:
InvocationTargetException
I checked the error stack, which showed: java.lang.NoClassDefFoundError: weka/core/attribute
Additional info: I tried creating a new project in Eclipse and using Weka there, and it works. But it still does not work in my own existing project.
Does anyone have any ideas on how I should solve this problem?
It seems I have solved this weird problem. The solution is simple:
right-click on the project, open the Java Build Path, and check weka.jar on the Order and Export tab.
I hope this helps anyone who faces a similar problem later.
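More generally, InvocationTargetException is just a reflective wrapper: the real failure (here, the NoClassDefFoundError caused by the build path) sits in getCause(), which is why checking the error stack was the right move. A minimal, self-contained sketch of unwrapping it; the failing method below is a stand-in, not actual Weka code:

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Method;

public class UnwrapExample {
    // Stand-in for a class whose initialization fails because a
    // dependency is missing from the runtime classpath.
    static void init() {
        throw new NoClassDefFoundError("weka/core/Attribute");
    }

    // Invoke reflectively and return the real cause hidden in the wrapper.
    static Throwable realCause() throws Exception {
        Method m = UnwrapExample.class.getDeclaredMethod("init");
        try {
            m.invoke(null);
            return null;
        } catch (InvocationTargetException e) {
            return e.getCause();  // the underlying NoClassDefFoundError
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(realCause());
        // prints: java.lang.NoClassDefFoundError: weka/core/Attribute
    }
}
```

Whenever an InvocationTargetException shows up, logging getCause() (or the full stack trace) points straight at the classpath problem instead of the reflection machinery.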