GC overhead limit exceeded (Spark Scala)

I am getting the error GC overhead limit exceeded using IntelliJ with a process written in Scala Spark, as you can see in the following image:
Error GC overhead limit exceeded
I have been searching for a solution to this error here and on other sites, but it still occurs.
These are my settings:
Scala Compiler:
Settings Scala Compiler
Intellij Compiler:
Intellij Compiler
What could I be doing wrong?
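For reference, the usual fixes for this error all come down to giving the relevant JVM more heap. A sketch, assuming the error comes either from IntelliJ's Scala Compile Server or from the Spark driver itself (the exact settings path varies by IntelliJ version):

```
# IntelliJ: File > Settings > Build, Execution, Deployment >
#   Compiler > Scala Compiler Server -> raise "JVM maximum heap size"

# For the Spark driver (spark-defaults.conf, or the same keys via SparkConf):
spark.driver.memory            2g
spark.driver.extraJavaOptions  -XX:-UseGCOverheadLimit
```

Note that -XX:-UseGCOverheadLimit merely disables the check that raises this particular error; increasing spark.driver.memory addresses the underlying shortage.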

Related

I am getting an exception while running a Scala worksheet in IntelliJ

I am using Scala 2.11.8 and running a simple program in a Scala worksheet.
I am getting the error below. Please let me know if I am missing anything here.
Internal error: null
org.jetbrains.jps.incremental.scala.remote.ClientEventProcessor.process(ClientEventProcessor.scala:22)
org.jetbrains.jps.incremental.scala.remote.RemoteResourceOwner.handle(RemoteResourceOwner.scala:47)
org.jetbrains.jps.incremental.scala.remote.RemoteResourceOwner.handle$(RemoteResourceOwner.scala:37)
org.jetbrains.plugins.scala.compiler.RemoteServerRunner.handle(RemoteServerRunner.scala:14)
org.jetbrains.jps.incremental.scala.remote.RemoteResourceOwner.$anonfun$send$5(RemoteResourceOwner.scala:30)

Spark Application Using Scala IDE

I'm trying to run a Spark classifier using Scala on my machine, but I am getting the following error:
Only one SparkContext may be running in this JVM (see SPARK-2243). To
ignore this error, set spark.driver.allowMultipleContexts = true.
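This error means a SparkContext already exists in the JVM (common when an IDE or test harness keeps the JVM alive between runs). Rather than setting spark.driver.allowMultipleContexts, the usual pattern is to reuse or stop the existing context. A minimal sketch, assuming Spark is on the classpath (the object and app names are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ClassifierApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("classifier")
      .setMaster("local[*]")

    // getOrCreate reuses an already-running SparkContext instead of
    // failing with "Only one SparkContext may be running in this JVM"
    val sc = SparkContext.getOrCreate(conf)
    try {
      // ... run the classifier here ...
    } finally {
      sc.stop() // release the context so later runs can create a fresh one
    }
  }
}
```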

sbt ignoring the command-line options? (Windows)

I am running
sbt compile
to compile a program. I am getting the
OutOfMemoryError: Java heap space
exception. Instead I tried
sbt compile -J-Xms4G -J-Xmx6G
but I don't see any difference in the program's memory usage, as if sbt were ignoring the command-line options, and I get the same exception (when it crashes it is barely using 0.8 GB).
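On Windows the sbt launcher script has historically dropped -J options given on the command line. A sketch of the usual workarounds, assuming a standard sbt installation (file locations vary by sbt version):

```
REM Option 1: set the options in the environment before launching;
REM the Windows launcher typically honors SBT_OPTS or JAVA_OPTS:
set SBT_OPTS=-Xms4G -Xmx6G
sbt compile

REM Option 2: older Windows installs read conf\sbtconfig.txt
REM inside the sbt install directory; add these lines there:
REM   -Xms4G
REM   -Xmx6G
```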

Scala Compiler doesn't terminate (programmatically invoked)

I am programmatically compiling Scala code with this piece of code:
import scala.tools.nsc.Global

val compiler = new Global(settings, reporter)
val run = new compiler.Run
run.compile(sourceFiles.map(_.fullPath).toList)
The 2.10 RC1 compiler runs for about three minutes and then crashes, whereas stable 2.10 spins forever at full CPU usage. When I invoke the compiler via sbt (rather than programmatically) it works fine and compiles in under a minute.
The shortened output looks like this (verbose mode, with about three minutes between the first line and the error):
[loaded class file C:\Program Files\scala\lib\scala-library.jar(scala/collection/mutable/StringBuilder.class) in 3ms]
Scala 2.10 stable
No further output. 100% CPU usage of 1 Core.
Scala 2.10 RC1
With RC1 I get this error after approximately 3 minutes:
error:
while compiling: Foo.scala
during phase: typer
library version: version 2.10.0-RC1
compiler version: version 2.10.0-RC1
reconstructed args:
The next piece of output (and the final output before my application crashes) is an OutOfMemoryError. I'm not sure whether its cause is the code itself or the compile error. Both options seem strange to me, since the code compiles on the sbt console, and a compile error should not consume that much memory, should it?
uncaught exception during compilation: java.lang.OutOfMemoryError
[error] (run-main) java.lang.OutOfMemoryError: Java heap space
java.lang.OutOfMemoryError: Java heap space
at scala.reflect.internal.Symbols$Symbol.createRefinementClassSymbol(Symbols.scala:1068)
at scala.reflect.internal.Symbols$Symbol.newRefinementClass(Symbols.scala:406)
at scala.reflect.internal.Types$class.refinedType(Types.scala:3504)
at scala.reflect.internal.SymbolTable.refinedType(SymbolTable.scala:12)
at scala.reflect.internal.Types$Type.narrow(Types.scala:459)
at scala.reflect.internal.Types$class.specializedBy$1(Types.scala:6125)
at scala.reflect.internal.Types$class.specializesSym(Types.scala:6129)
at scala.reflect.internal.SymbolTable.specializesSym(SymbolTable.scala:12)
at scala.reflect.internal.Types$$anonfun$thirdTry$1$2.apply(Types.scala:6021)
at scala.reflect.internal.Types$$anonfun$thirdTry$1$2.apply(Types.scala:6021)
at scala.collection.Iterator$class.forall(Iterator.scala:739)
at scala.collection.AbstractIterator.forall(Iterator.scala:1156)
at scala.collection.IterableLike$class.forall(IterableLike.scala:75)
at scala.reflect.internal.Scopes$Scope.forall(Scopes.scala:44)
at scala.reflect.internal.Types$class.thirdTry$1(Types.scala:6021)
at scala.reflect.internal.Types$class.secondTry$1(Types.scala:5982)
at scala.reflect.internal.Types$class.firstTry$1(Types.scala:5958)
at scala.reflect.internal.Types$class.isSubType2(Types.scala:6101)
at scala.reflect.internal.Types$class.isSubType(Types.scala:5710)
at scala.reflect.internal.SymbolTable.isSubType(SymbolTable.scala:12)
at scala.reflect.internal.Types$class.thirdTry$1(Types.scala:6043)
at scala.reflect.internal.Types$class.secondTry$1(Types.scala:5982)
at scala.reflect.internal.Types$class.firstTry$1(Types.scala:5958)
at scala.reflect.internal.Types$class.isSubType2(Types.scala:6101)
at scala.reflect.internal.Types$class.isSubType(Types.scala:5710)
at scala.reflect.internal.SymbolTable.isSubType(SymbolTable.scala:12)
at scala.reflect.internal.Types$class.scala$reflect$internal$Types$$specializesSym(Types.scala:6142)
at scala.reflect.internal.Types$class.specializedBy$1(Types.scala:6125)
at scala.reflect.internal.Types$class.specializesSym(Types.scala:6129)
at scala.reflect.internal.SymbolTable.specializesSym(SymbolTable.scala:12)
at scala.reflect.internal.Types$$anonfun$thirdTry$1$2.apply(Types.scala:6021)
at scala.reflect.internal.Types$$anonfun$thirdTry$1$2.apply(Types.scala:6021)
[trace] Stack trace suppressed: run 'last compile:run' for the full output.
java.lang.RuntimeException: Nonzero exit code: 1
at scala.sys.package$.error(package.scala:27)
I stumbled across Why am I getting OutOfMemoryError compilation error in Scala?. I am, however, not sure whether I'm actually simply lacking heap space for the compilation. There is no Maven involved; it's only Scala code and some JARs on the local build path.
I'm looking for the cause of the OutOfMemoryError, or a tweak to fix it.
Using jvisualvm.exe (shipped with the JDK) we found that the compiler was indeed running low on memory. The GC was working so hard to free memory that it looked like an infinite loop (to be precise, it happened when a symbol table's HashSet was enlarged).
Increasing the heap size to 2 GB fixed the issue here.
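Since the compiler here runs inside the host application's JVM, the extra heap has to be given to that JVM. When the application is launched through sbt, a sketch using standard sbt keys (syntax may differ in very old sbt versions):

```scala
// build.sbt: fork the run task so that javaOptions actually apply
fork := true
javaOptions ++= Seq("-Xmx2g") // match the 2 GB heap that fixed the issue
```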

Typesafe stack Scala error

I am using http://typesafe.com/stack/ for the first time, and I created a simple Akka project. My Scala version is 2.9.2. I get the following error.
[info] Done updating.
[info] Compiling 1 Scala source to /Users/hrishikeshparanjape/git-public/web-service/target/scala-2.9.2/classes...
[info] 'compiler-interface' not yet compiled for Scala 2.9.2. Compiling...
sbt appears to be exiting abnormally.
The log file for this session is at /var/folders/26/hqgjyf0j7192hmjdsz17f3v80000gn/T/sbt2587622650679130928.log
java.lang.OutOfMemoryError: PermGen space
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at sbt.CompletionService$$anon$1.take(CompletionService.scala:29)
at sbt.Execute.next$1(Execute.scala:74)
at sbt.Execute.processAll(Execute.scala:77)
at sbt.Execute.runKeep(Execute.scala:57)
at sbt.EvaluateTask$.run$1(EvaluateTask.scala:109)
at sbt.EvaluateTask$.runTask(EvaluateTask.scala:124)
at sbt.Aggregation$$anonfun$7.apply(Aggregation.scala:87)
at sbt.Aggregation$$anonfun$7.apply(Aggregation.scala:85)
at sbt.EvaluateTask$.withStreams(EvaluateTask.scala:87)
at sbt.Aggregation$.runTasks(Aggregation.scala:85)
at sbt.Aggregation$$anonfun$applyDynamicTasks$1.apply(Aggregation.scala:141)
at sbt.Aggregation$$anonfun$applyDynamicTasks$1.apply(Aggregation.scala:136)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:64)
at sbt.Command$$anonfun$applyEffect$2$$anonfun$apply$3.apply(Command.scala:64)
at sbt.Command$.process(Command.scala:92)
at sbt.MainLoop$$anonfun$next$1$$anonfun$apply$1.apply(Main.scala:121)
at sbt.MainLoop$$anonfun$next$1$$anonfun$apply$1.apply(Main.scala:121)
at sbt.State$$anon$1.process(State.scala:154)
at sbt.MainLoop$$anonfun$next$1.apply(Main.scala:121)
at sbt.MainLoop$$anonfun$next$1.apply(Main.scala:121)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.MainLoop$.next(Main.scala:121)
at sbt.MainLoop$.run(Main.scala:114)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(Main.scala:103)
at sbt.MainLoop$$anonfun$runWithNewLog$1.apply(Main.scala:100)
at sbt.Using.apply(Using.scala:25)
at sbt.MainLoop$.runWithNewLog(Main.scala:100)
at sbt.MainLoop$.runAndClearLast(Main.scala:83)
at sbt.MainLoop$.runLoggedLoop(Main.scala:67)
at sbt.MainLoop$.runLogged(Main.scala:60)
Error during sbt execution: java.lang.OutOfMemoryError: PermGen space
Please help.
Your project needs more memory to run (that's what the java.lang.OutOfMemoryError: PermGen space tells you). I have never used the Typesafe stack, so I don't know whether it lets you configure the memory parameters directly.
But if you run Linux you can type
env JAVA_OPTS="-Xms512m -Xmx1024m -Xss1M -XX:MaxPermSize=512m" <command>
where <command> is the command that runs your project (probably sbt). Of course, you can change the values if you need more or less space.
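If prefixing every command with env is tedious, the same options can be exported once per shell session; sbt's launcher conventionally picks up JAVA_OPTS (and, in many versions, SBT_OPTS):

```
export JAVA_OPTS="-Xms512m -Xmx1024m -Xss1M -XX:MaxPermSize=512m"
sbt
```

Note that -XX:MaxPermSize only applies to JVMs before Java 8; later JVMs removed the permanent generation and ignore or reject the flag.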