FPGrowth algorithm of Spark MLlib stalling on small dataset - scala

I am using Spark MLlib's FPGrowth algorithm to generate frequent itemsets. I pull the data from Redshift and run the code on it, but the job stalls, with a lot of shuffling between workers, when the number of rows being processed is around 5000. It keeps running for 1-2 hours until I kill it manually. The last thing I see on the console is the line "Asking to fetch output locations for shuffle 1", and nothing after it. The same code works fine if I increase the dataset to around 50000 rows. I used jstack to take a thread dump, and it looks as if there is a deadlock in FPTree when it calls Iterator.hasNext(). My configuration is 4 workers running 6 executors, each with 4 cores and 15 GB of RAM.
P.S. - I know Spark is not meant for small datasets, and this is just for testing purposes, but the apparent thread deadlock looks like a bug.
Code:
import org.apache.spark.mllib.fpm.{AssociationRules, FPGrowth}
import org.apache.spark.mllib.fpm.FPGrowth.FreqItemset
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row

// Pull the product-list column out of the Redshift DataFrame.
val data = data1.rdd.map { case Row(city: String, cart: Long, p_list: String) =>
  p_list
}
print(data.count)

// Each transaction is a comma-separated list of items.
val transactions: RDD[Array[String]] = data.map(s => s.trim.split(','))

val fpg = new FPGrowth()
  .setMinSupport(0.0001)
  .setNumPartitions(24)
val model = fpg.run(transactions)
print(model.freqItemsets.count)

val res = model.generateAssociationRules(0.001)
print(res.count)
The code stalls as soon as it reaches the model.freqItemsets.count statement.
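One possible explanation, rather than a deadlock (an observation, not a confirmed diagnosis): Spark's FPGrowth turns the relative threshold into an absolute one as minCount = ceil(minSupport * numTransactions). With 5000 rows and minSupport = 0.0001 that is ceil(0.5) = 1, so every item that appears even once counts as frequent and the itemset lattice can explode combinatorially; at 50000 rows the same setting gives minCount = 5, which prunes far more aggressively. The jstack frames inside FPTree.project and FPTree.add would then just be the job grinding through an enormous lattice. A minimal sketch that pins the absolute count instead of the ratio (the target of 5 occurrences is an assumed value to tune):

// Sketch: keep the absolute occurrence threshold fixed as the dataset shrinks.
val minCountTarget = 5L                        // assumed target; tune for your data
val n = transactions.count()

val fpg = new FPGrowth()
  .setMinSupport(minCountTarget.toDouble / n)  // minCount = ceil(minSupport * n), roughly minCountTarget
  .setNumPartitions(4)                         // fewer partitions for a ~5000-row input
val model = fpg.run(transactions)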
Thread dump:
"Attach Listener" #145 daemon prio=9 os_prio=0 tid=0x00007f7508001000 nid=0x4481 waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"block-manager-slave-async-thread-pool-6" #144 daemon prio=5 os_prio=0 tid=0x00007f7498005000 nid=0x4400 waiting on condition [0x00007f755eee7000]
java.lang.Thread.State: TIMED_WAITING (parking)
at sun.misc.Unsafe.park(Native Method)
- parking to wait for <0x00000004022797e8> (a java.util.concurrent.SynchronousQueue$TransferStack)
at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"block-manager-slave-async-thread-pool-5" #143 daemon prio=5 os_prio=0 tid=0x00007f7498002800 nid=0x43fe waiting on condition [0x00007f7549dfa000]
java.lang.Thread.State: TIMED_WAITING (parking)
at sun.misc.Unsafe.park(Native Method)
- parking to wait for <0x00000004022797e8> (a java.util.concurrent.SynchronousQueue$TransferStack)
at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"block-manager-slave-async-thread-pool-4" #142 daemon prio=5 os_prio=0 tid=0x00007f74c801f000 nid=0x43fc waiting on condition [0x00007f755c2ac000]
java.lang.Thread.State: TIMED_WAITING (parking)
at sun.misc.Unsafe.park(Native Method)
- parking to wait for <0x00000004022797e8> (a java.util.concurrent.SynchronousQueue$TransferStack)
at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:362)
at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:941)
at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1066)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1127)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"shuffle-server-1" #58 daemon prio=5 os_prio=0 tid=0x00007f7490002000 nid=0x421f runnable [0x00007f7540181000]
java.lang.Thread.State: RUNNABLE
at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:93)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked <0x0000000404d0c368> (a io.netty.channel.nio.SelectedSelectionKeySet)
- locked <0x0000000405ec34f0> (a java.util.Collections$UnmodifiableSet)
- locked <0x0000000404d0c2d0> (a sun.nio.ch.EPollSelectorImpl)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:622)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:310)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
"Executor task launch worker-3" #140 daemon prio=5 os_prio=0 tid=0x00007f74b0004000 nid=0x4183 runnable [0x00007f7540281000]
java.lang.Thread.State: RUNNABLE
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1765)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1134)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1134)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"Executor task launch worker-2" #139 daemon prio=5 os_prio=0 tid=0x00007f74b0002800 nid=0x4182 runnable [0x00007f7540382000]
java.lang.Thread.State: RUNNABLE
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:216)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1763)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1134)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1134)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"Executor task launch worker-1" #138 daemon prio=5 os_prio=0 tid=0x00007f74b0001000 nid=0x4181 runnable [0x00007f755e9e1000]
java.lang.Thread.State: RUNNABLE
at scala.collection.mutable.HashTable$class.findOrAddEntry(HashTable.scala:166)
at scala.collection.mutable.HashMap.findOrAddEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap.put(HashMap.scala:76)
at scala.collection.mutable.HashMap.update(HashMap.scala:81)
at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:194)
at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:80)
at org.apache.spark.mllib.fpm.FPTree$$anonfun$add$1.apply(FPTree.scala:43)
at org.apache.spark.mllib.fpm.FPTree$$anonfun$add$1.apply(FPTree.scala:40)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.mllib.fpm.FPTree.add(FPTree.scala:40)
at org.apache.spark.mllib.fpm.FPTree$$anonfun$org$apache$spark$mllib$fpm$FPTree$$project$1.apply(FPTree.scala:75)
at org.apache.spark.mllib.fpm.FPTree$$anonfun$org$apache$spark$mllib$fpm$FPTree$$project$1.apply(FPTree.scala:68)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
at org.apache.spark.mllib.fpm.FPTree.org$apache$spark$mllib$fpm$FPTree$$project(FPTree.scala:68)
at org.apache.spark.mllib.fpm.FPTree$$anonfun$extract$1$$anonfun$apply$2.apply(FPTree.scala:108)
at org.apache.spark.mllib.fpm.FPTree$$anonfun$extract$1$$anonfun$apply$2.apply(FPTree.scala:108)
at scala.collection.Iterator$JoinIterator.rhs$lzycompute(Iterator.scala:208)
- locked <0x00000006ce73a440> (a scala.collection.Iterator$JoinIterator)
at scala.collection.Iterator$JoinIterator.rhs(Iterator.scala:208)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:216)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$JoinIterator.hasNext(Iterator.scala:219)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1763)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1134)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1134)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner" #137 daemon prio=5 os_prio=0 tid=0x00007f74dc8c2800 nid=0x417b in Object.wait() [0x00007f755ede6000]
java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x0000000408920fd8> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
- locked <0x0000000408920fd8> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
at org.apache.hadoop.fs.FileSystem$Statistics$StatisticsDataReferenceCleaner.run(FileSystem.java:3063)
at java.lang.Thread.run(Thread.java:745)
"shuffle-client-1" #54 daemon prio=5 os_prio=0 tid=0x00007f74dc818000 nid=0x417a runnable [0x00007f755efe8000]
java.lang.Thread.State: RUNNABLE
at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:93)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked <0x0000000405ee6968> (a io.netty.channel.nio.SelectedSelectionKeySet)
- locked <0x0000000405f108e0> (a java.util.Collections$UnmodifiableSet)
- locked <0x0000000405ee68d0> (a sun.nio.ch.EPollSelectorImpl)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:622)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:310)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
"shuffle-client-0" #53 daemon prio=5 os_prio=0 tid=0x00007f74dc0dd000 nid=0x4179 runnable [0x00007f755ebe4000]
java.lang.Thread.State: RUNNABLE
at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:93)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked <0x0000000405dc6e08> (a io.netty.channel.nio.SelectedSelectionKeySet)
- locked <0x0000000405dc8ef8> (a java.util.Collections$UnmodifiableSet)
- locked <0x0000000405dc6d60> (a sun.nio.ch.EPollSelectorImpl)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:622)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:310)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
"shuffle-client-0" #67 daemon prio=5 os_prio=0 tid=0x00007f74dc094000 nid=0x40d1 runnable [0x00007f755e2b7000]
java.lang.Thread.State: RUNNABLE
at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:93)
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
- locked <0x0000000405e03260> (a io.netty.channel.nio.SelectedSelectionKeySet)
- locked <0x0000000405e03250> (a java.util.Collections$UnmodifiableSet)
- locked <0x0000000405e03208> (a sun.nio.ch.EPollSelectorImpl)
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:622)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:310)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
"Executor task launch worker-0" #66 daemon prio=5 os_prio=0 tid=0x00007f74b8002000 nid=0x40d0 runnable [0x00007f755eae2000]
java.lang.Thread.State: RUNNABLE
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:438)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$JoinIterator.next(Iterator.scala:232)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:444)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1765)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1134)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1134)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1899)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Related

Filling data from PySpark to PostgreSQL via JDBC does not work

I'm using a JDBC connector for the first time and trying to write a dataframe into a PostgreSQL database. I am doing it strictly according to:
How to use JDBC source to write and read data in (Py)Spark?
The driver is installed and the code is:
postgres_url = "jdbc:postgresql://<IP_ADRESS>:<PORT>/postgres"
properties = {
    "user": "user",
    "password": "password",
    "driver": "org.postgresql.Driver"
}
df.write.jdbc(url=postgres_url, table="newtable", mode="overwrite", properties=properties)
After executing the code, I get this error:
py4j.protocol.Py4JJavaError: An error occurred while calling o110.jdbc.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 1.0 failed 4 times, most recent failure: Lost task 1.3 in stage 1.0 (TID 49, <SERVER>, executor 9): org.postgresql.util.PSQLException: The connection attempt failed.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:331)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:223)
at org.postgresql.Driver.makeConnection(Driver.java:400)
at org.postgresql.Driver.connect(Driver.java:259)
at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect(DriverWrapper.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:610)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.postgresql.core.PGStream.createSocket(PGStream.java:241)
at org.postgresql.core.PGStream.<init>(PGStream.java:98)
at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:109)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:235)
... 22 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:935)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:933)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:933)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.saveTable(JdbcUtils.scala:834)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:82)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:515)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:331)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:223)
at org.postgresql.Driver.makeConnection(Driver.java:400)
at org.postgresql.Driver.connect(Driver.java:259)
at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect(DriverWrapper.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:610)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
... 1 more
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.postgresql.core.PGStream.createSocket(PGStream.java:241)
at org.postgresql.core.PGStream.<init>(PGStream.java:98)
at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:109)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:235)
... 22 more
The weird thing is, when I look into PostgreSQL afterwards, I can see the newly created table, but it is empty. So it seems that establishing the connection and creating the structure DOES work, but filling in the data does not. I also tried "ssl" = "false" in the properties, but that did not work either. Any suggestions would be greatly appreciated.
Best regards
The connection must be open between your driver and Postgres AND between your workers and Postgres. Did you check that? The creation of the table is probably done on the driver's side, but your workers cannot connect to your DB, therefore no data. Check that!
You probably have someone in your company who can help you with that, such as a security/network person. Otherwise, you need to test the connectivity to your DB from each worker node.
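A quick way to test exactly that (a hedged sketch in Scala, runnable from spark-shell on the same cluster; the question's code is PySpark, but the probe is language-independent): open a raw TCP socket to Postgres from every executor. If a firewall only lets the driver through, this fails with the same "connect timed out".

// Hypothetical connectivity probe; the host/port placeholders must be filled in.
val host = "<IP_ADRESS>"   // same placeholder as in the question
val port = 5432            // assumed default Postgres port; substitute your <PORT>
sc.parallelize(1 to 100, 20).foreachPartition { _ =>
  val s = new java.net.Socket()
  try s.connect(new java.net.InetSocketAddress(host, port), 5000) // 5 s timeout
  finally s.close()
}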

Java parallelStream task always waits a long time, why?

I use parallelStream to execute a background task under Spring's @Async annotation, but the task always blocks and then stops. The core code is:
@Async
public void task() {
    List<ConfigSymbol> symbols = SymbolUtils.getSaasOpenSymbol();
    symbols.parallelStream().forEach(symbol -> {
        // execute task; roughly 1 minute per iteration
    });
}
I captured the thread stack to diagnose the problem but cannot find the reason.
Why is the ForkJoinTask status WAITING?
"SimpleAsyncTaskExecutor-1609" #1777 prio=5 os_prio=0 tid=0x00007fb968006800 nid=0x6f5 in Object.wait() [0x00007fba2d119000]
java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
at java.util.concurrent.ForkJoinTask.externalAwaitDone(ForkJoinTask.java:334)
- locked <0x00000006428a0a38> (a java.util.stream.ForEachOps$ForEachTask)
at java.util.concurrent.ForkJoinTask.doInvoke(ForkJoinTask.java:405)
at java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:734)
at java.util.stream.ForEachOps$ForEachOp.evaluateParallel(ForEachOps.java:159)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateParallel(ForEachOps.java:173)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:485)
at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:650)
at com.chainup.stats.service.impl.StatsSymbolRateServiceImpl.getCalculationBTCExchangeRate(StatsSymbolRateServiceImpl.java:141)
at com.chainup.stats.service.impl.StatsSymbolRateServiceImpl.runThreadExchangeRate(StatsSymbolRateServiceImpl.java:67)
at com.chainup.stats.action.StatAct.insertStatsSymbolRate(StatAct.java:442)
at com.chainup.stats.action.StatAct$$FastClassBySpringCGLIB$$74ae31d4.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:769)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:747)
at org.springframework.aop.aspectj.MethodInvocationProceedingJoinPoint.proceed(MethodInvocationProceedingJoinPoint.java:88)
at org.springframework.cloud.sleuth.instrument.async.TraceAsyncAspect.traceBackgroundThread(TraceAsyncAspect.java:66)
at sun.reflect.GeneratedMethodAccessor277.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethodWithGivenArgs(AbstractAspectJAdvice.java:644)
at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethod(AbstractAspectJAdvice.java:633)
at org.springframework.aop.aspectj.AspectJAroundAdvice.invoke(AspectJAroundAdvice.java:70)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:747)
at org.springframework.aop.aspectj.MethodInvocationProceedingJoinPoint.proceed(MethodInvocationProceedingJoinPoint.java:88)
at com.chainup.stats.interceptor.StatsJobInterceptor.around(StatsJobInterceptor.java:99)
at sun.reflect.GeneratedMethodAccessor273.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethodWithGivenArgs(AbstractAspectJAdvice.java:644)
at org.springframework.aop.aspectj.AbstractAspectJAdvice.invokeAdviceMethod(AbstractAspectJAdvice.java:633)
at org.springframework.aop.aspectj.AspectJAroundAdvice.invoke(AspectJAroundAdvice.java:70)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:175)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:747)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:95)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:747)
at org.springframework.aop.interceptor.AsyncExecutionInterceptor.lambda$invoke$0(AsyncExecutionInterceptor.java:115)
at org.springframework.aop.interceptor.AsyncExecutionInterceptor$$Lambda$790/775437217.call(Unknown Source)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.springframework.cloud.sleuth.instrument.async.TraceRunnable.run(TraceRunnable.java:65)
at java.lang.Thread.run(Thread.java:748)
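One common cause of exactly this picture (offered as an assumption, not a confirmed diagnosis): parallelStream() executes on the shared ForkJoinPool.commonPool(), and the submitting thread parks in externalAwaitDone() until the ForEachTask completes. If the common pool's workers are tied up by other code, or each loop body blocks for a minute at a time, the submitting thread can sit in WAITING for a very long time. A sketch of the usual workaround is to invoke the terminal operation from inside a dedicated pool, which makes the stream run on that pool instead of the common pool (an established though undocumented ForkJoin behavior). symbols below is a stand-in list:

import java.util.concurrent.ForkJoinPool

object DedicatedPoolSketch extends App {
  // Stand-in data; in the real code this would be SymbolUtils.getSaasOpenSymbol().
  val symbols: java.util.List[String] = java.util.Arrays.asList("a", "b", "c")

  val pool = new ForkJoinPool(8) // assumed parallelism; size it to your workload
  try {
    // A parallel stream whose terminal operation starts inside a ForkJoinPool
    // task runs on that pool instead of the common pool.
    pool.submit(new Runnable {
      def run(): Unit = symbols.parallelStream().forEach { symbol =>
        println(s"processing $symbol") // the ~1-minute task body goes here
      }
    }).get()
  } finally pool.shutdown()
}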

Spark: java.lang.NullPointerException only in cluster mode (it works fine in local mode!)

I was running some tests on my local machine and everything went well.
Now I'm running on a cluster with the same jar file, dataset, etc., and I'm getting the following exception:
java.lang.NullPointerException
at learner.MyLearner$$anonfun$18.apply(MyLearner.scala:165)
at learner.MyLearner$$anonfun$18.apply(MyLearner.scala:165)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2062)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2087)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
at learner.MyLearner$.learn(MyLearner.scala:168)
at learner.Learner.learnAndClassify(Learner.scala:123)
at learner.Learner.run(Learner.scala:27)
at Launcher$.main(Launcher.scala:56)
at Launcher.main(Launcher.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
at learner.MyLearner$$anonfun$18.apply(MyLearner.scala:165)
at learner.MyLearner$$anonfun$18.apply(MyLearner.scala:165)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
In MyLearner.scala, line 165 is this:
val rdd2: RDD[Rule] = rdd1.map { case (id, _) => new Rule(brdcst.value.elements(id).features, 0L, 0, id.toString, variables, labels, classes) }
I don't understand the exception, or what the reason could be given that it works fine in local mode. Sorry if this is an obvious question; I have tried to figure it out but couldn't.
To check the new Rule(...) part, I tried creating a single rule, put it in an RDD, and it worked!
Info: the Rule class extends Serializable.
Also, I did the following just to check access to the broadcast variable, and it works too:
rdd1.collect().foreach{case (id, _) => println(brdcst.value.elements(id).features)}
Any suggestions or ideas?
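One pattern that produces exactly this local/cluster split is an outer-class field (variables, labels, or classes here, if they are fields) that ends up null after the enclosing object is serialized to the executors, or an elements(id) lookup that misses on the executor's copy. A hedged diagnostic sketch, reusing the identifiers from the question (Rule, rdd1, brdcst, variables, labels and classes come from the post and are not defined here): capture each outer reference in a local val so what gets shipped with the closure is explicit, and fail with a named error instead of a bare NullPointerException:

// Diagnostic sketch only; identifiers are taken from the question.
val v = variables; val l = labels; val c = classes // explicit local copies for the closure

val rdd2: RDD[Rule] = rdd1.map { case (id, _) =>
  val elems = brdcst.value.elements
  require(elems != null, "broadcast .elements is null on the executor")
  val e = elems(id)
  require(e != null, s"no element found for id=$id on the executor")
  new Rule(e.features, 0L, 0, id.toString, v, l, c)
}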

SBT 1.2.8 Resolving key references hanging

We recently upgraded our sbt build from 0.13.x to 1.2.8, and since then our build agents often (though not always) hang right after
Resolving key references (11281 settings) ...
In the thread dump I see the following lines:
- locked <0x00000000800058c0> (a java.lang.ref.Reference$Lock)
- waiting on <0x00000000800058c0> (a java.lang.ref.Reference$Lock)
Is it a bug? If not, what am I doing wrong? Can you please advise?
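For what it's worth, the dump further down shows several pool-4 threads inside Dependencies$.<clinit> and Dependencies$Compile$.<clinit> while sbt evaluates settings in parallel, which looks like a classic JVM class-initialization deadlock rather than an sbt bug as such (an assumption drawn from the dump, not a confirmed diagnosis). A minimal Scala sketch of the pattern, two objects whose initializers reference each other:

// Sketch: each thread takes one class's initialization lock and then blocks
// waiting for the other's; the JVM never times this out.
object A { val x: Int = B.y + 1 }
object B { val y: Int = A.x + 1 }

object InitDeadlockRepro extends App {
  val t1 = new Thread(() => println(A.x))
  val t2 = new Thread(() => println(B.y))
  t1.start(); t2.start()
  t1.join(); t2.join() // frequently hangs here
}

If Dependencies.scala has vals in one object referring to another object (or a nested one) whose initializer refers back, breaking the cycle, or forcing both objects to initialize on a single thread before the parallel settings evaluation starts, may avoid the hang.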
Environment details
OS Linux aldburg 3.13.0-32-generic #57~precise1-Ubuntu SMP Tue Jul 15 03:51:20 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux
SBT launcher version 1.2.8
SBT project version 1.2.8
The sbt plugins we use are:
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.5.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-site" % "1.3.1")
addSbtPlugin("com.waioeka.sbt" % "cucumber-plugin" % "0.1.7")
addSbtPlugin("com.eed3si9n" % "sbt-unidoc" % "0.4.1")
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "1.0.0")
addSbtPlugin("io.spray" % "sbt-revolver" % "0.9.0")
addSbtPlugin("org.scalariform" % "sbt-scalariform" % "1.8.2")
addSbtPlugin("com.github.gseitz" % "sbt-release" % "1.0.7")
addSbtPlugin("de.heikoseeberger" % "sbt-header" % "4.1.0")
addSbtPlugin("com.lightbend.sbt" % "sbt-aspectj" % "0.11.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.3.2")
addSbtPlugin("pl.project13.scala" % "sbt-jmh" % "0.3.4")
Thread dump
Full thread dump OpenJDK 64-Bit Server VM (25.111-b14 mixed mode):
"Attach Listener" #51 daemon prio=9 os_prio=0 tid=0x00007fbc40001000 nid=0x10bd waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"pool-4-thread-8" #50 prio=5 os_prio=0 tid=0x00007fbc0c00e000 nid=0xf6c in Object.wait() [0x00007fbc45534000]
java.lang.Thread.State: RUNNABLE
at $0e82cce6ec5c385cb470$.$anonfun$v1$10(build.sbt:58)
at $0e82cce6ec5c385cb470$$$Lambda$2732/1436136891.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:204)
at sbt.internal.util.EvaluateSettings$$Lambda$1613/382762227.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:221)
at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:164)
- locked <0x0000000083884fc8> (a sbt.internal.util.EvaluateSettings$MixedNode)
at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:87)
at sbt.internal.util.EvaluateSettings$$Lambda$1624/2021540695.apply$mcV$sp(Unknown Source)
at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:98)
at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:94)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"pool-4-thread-7" #49 prio=5 os_prio=0 tid=0x00007fbc89f2d800 nid=0xf6b in Object.wait() [0x00007fbc45230000]
java.lang.Thread.State: RUNNABLE
at Dependencies$.<init>(Dependencies.scala:144)
at Dependencies$.<clinit>(Dependencies.scala)
at $0e82cce6ec5c385cb470$.$anonfun$common$10(build.sbt:81)
at $0e82cce6ec5c385cb470$$$Lambda$2744/908744222.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:204)
at sbt.internal.util.EvaluateSettings$$Lambda$1613/382762227.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:221)
at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:164)
- locked <0x000000008388f6c0> (a sbt.internal.util.EvaluateSettings$MixedNode)
at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:87)
at sbt.internal.util.EvaluateSettings$$Lambda$1624/2021540695.apply$mcV$sp(Unknown Source)
at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:98)
at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:94)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"pool-4-thread-6" #48 prio=5 os_prio=0 tid=0x00007fbc89f2c800 nid=0xf6a in Object.wait() [0x00007fbc45332000]
java.lang.Thread.State: RUNNABLE
at $0e82cce6ec5c385cb470$.$anonfun$support$1(build.sbt:163)
at $0e82cce6ec5c385cb470$$$Lambda$2734/1113898680.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:204)
at sbt.internal.util.EvaluateSettings$$Lambda$1613/382762227.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:221)
at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:164)
- locked <0x00000000838c9798> (a sbt.internal.util.EvaluateSettings$MixedNode)
at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:87)
at sbt.internal.util.EvaluateSettings$$Lambda$1624/2021540695.apply$mcV$sp(Unknown Source)
at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:98)
at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:94)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"pool-4-thread-5" #47 prio=5 os_prio=0 tid=0x00007fbc89f2b800 nid=0xf69 in Object.wait() [0x00007fbc44a2a000]
java.lang.Thread.State: RUNNABLE
at Dependencies$Compile$.<init>(Dependencies.scala:42)
at Dependencies$Compile$.<clinit>(Dependencies.scala)
at $0e82cce6ec5c385cb470$.$anonfun$test_server$4(build.sbt:176)
at $0e82cce6ec5c385cb470$$$Lambda$2715/684426930.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:204)
at sbt.internal.util.EvaluateSettings$$Lambda$1613/382762227.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:221)
at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:164)
- locked <0x0000000083a9c6c0> (a sbt.internal.util.EvaluateSettings$MixedNode)
at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:87)
at sbt.internal.util.EvaluateSettings$$Lambda$1624/2021540695.apply$mcV$sp(Unknown Source)
at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:98)
at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:94)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"pool-4-thread-4" #46 prio=5 os_prio=0 tid=0x00007fbc0800d000 nid=0xf68 in Object.wait() [0x00007fbc4502f000]
java.lang.Thread.State: RUNNABLE
at $0e82cce6ec5c385cb470$.$anonfun$test_utils$1(build.sbt:208)
at $0e82cce6ec5c385cb470$$$Lambda$2751/171063899.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:204)
at sbt.internal.util.EvaluateSettings$$Lambda$1613/382762227.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:221)
at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:164)
- locked <0x0000000084031dd0> (a sbt.internal.util.EvaluateSettings$MixedNode)
at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:87)
at sbt.internal.util.EvaluateSettings$$Lambda$1624/2021540695.apply$mcV$sp(Unknown Source)
at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:98)
at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:94)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"pool-4-thread-3" #45 prio=5 os_prio=0 tid=0x00007fbc89f2a800 nid=0xf67 in Object.wait() [0x00007fbc45433000]
java.lang.Thread.State: RUNNABLE
at $0e82cce6ec5c385cb470$.$anonfun$benchmark$1(build.sbt:154)
at $0e82cce6ec5c385cb470$$$Lambda$2797/2015110295.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:204)
at sbt.internal.util.EvaluateSettings$$Lambda$1613/382762227.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:221)
at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:164)
- locked <0x00000000842c3310> (a sbt.internal.util.EvaluateSettings$MixedNode)
at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:87)
at sbt.internal.util.EvaluateSettings$$Lambda$1624/2021540695.apply$mcV$sp(Unknown Source)
at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:98)
at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:94)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"pool-4-thread-2" #44 prio=5 os_prio=0 tid=0x00007fbc89f2a000 nid=0xf66 in Object.wait() [0x00007fbc44f2e000]
java.lang.Thread.State: RUNNABLE
at $0e82cce6ec5c385cb470$.$anonfun$cucumber$2(build.sbt:195)
at $0e82cce6ec5c385cb470$$$Lambda$2794/1142350221.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:204)
at sbt.internal.util.EvaluateSettings$$Lambda$1613/382762227.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:221)
at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:164)
- locked <0x0000000083f0b728> (a sbt.internal.util.EvaluateSettings$MixedNode)
at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:87)
at sbt.internal.util.EvaluateSettings$$Lambda$1624/2021540695.apply$mcV$sp(Unknown Source)
at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:98)
at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:94)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"pool-4-thread-1" #43 prio=5 os_prio=0 tid=0x00007fbc89f28000 nid=0xf65 in Object.wait() [0x00007fbc45130000]
java.lang.Thread.State: RUNNABLE
at $0e82cce6ec5c385cb470$.$anonfun$domain$1(build.sbt:147)
at $0e82cce6ec5c385cb470$$$Lambda$2733/1341548823.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings.$anonfun$constant$1(INode.scala:204)
at sbt.internal.util.EvaluateSettings$$Lambda$1613/382762227.apply(Unknown Source)
at sbt.internal.util.EvaluateSettings$MixedNode.evaluate0(INode.scala:221)
at sbt.internal.util.EvaluateSettings$INode.evaluate(INode.scala:164)
- locked <0x0000000084031e40> (a sbt.internal.util.EvaluateSettings$MixedNode)
at sbt.internal.util.EvaluateSettings.$anonfun$submitEvaluate$1(INode.scala:87)
at sbt.internal.util.EvaluateSettings$$Lambda$1624/2021540695.apply$mcV$sp(Unknown Source)
at sbt.internal.util.EvaluateSettings.sbt$internal$util$EvaluateSettings$$run0(INode.scala:98)
at sbt.internal.util.EvaluateSettings$$anon$3.run(INode.scala:94)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
"Log4j2-TF-1-AsyncLogger[AsyncContext#64bf3bbf]-1" #13 daemon prio=5 os_prio=0 tid=0x00007fbc8a2df800 nid=0xf38 waiting on condition [0x00007fbc6126c000]
java.lang.Thread.State: TIMED_WAITING (parking)
at sun.misc.Unsafe.park(Native Method)
- parking to wait for <0x000000008000b4e8> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
at com.lmax.disruptor.TimeoutBlockingWaitStrategy.waitFor(TimeoutBlockingWaitStrategy.java:38)
at com.lmax.disruptor.ProcessingSequenceBarrier.waitFor(ProcessingSequenceBarrier.java:56)
at com.lmax.disruptor.BatchEventProcessor.processEvents(BatchEventProcessor.java:159)
at com.lmax.disruptor.BatchEventProcessor.run(BatchEventProcessor.java:125)
at java.lang.Thread.run(Thread.java:745)
"Service Thread" #9 daemon prio=9 os_prio=0 tid=0x00007fbc88213000 nid=0xf36 runnable [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"C1 CompilerThread3" #8 daemon prio=9 os_prio=0 tid=0x00007fbc88205800 nid=0xf35 waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"C2 CompilerThread2" #7 daemon prio=9 os_prio=0 tid=0x00007fbc88201000 nid=0xf34 waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"C2 CompilerThread1" #6 daemon prio=9 os_prio=0 tid=0x00007fbc881ff800 nid=0xf33 waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"C2 CompilerThread0" #5 daemon prio=9 os_prio=0 tid=0x00007fbc881fc800 nid=0xf32 waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"Signal Dispatcher" #4 daemon prio=9 os_prio=0 tid=0x00007fbc881fa800 nid=0xf31 runnable [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"Finalizer" #3 daemon prio=8 os_prio=0 tid=0x00007fbc881d2800 nid=0xf30 in Object.wait() [0x00007fbc62ceb000]
java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x0000000080003910> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:143)
- locked <0x0000000080003910> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:164)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:209)
"Reference Handler" #2 daemon prio=10 os_prio=0 tid=0x00007fbc881ce000 nid=0xf2e in Object.wait() [0x00007fbc62dec000]
java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x00000000800058c0> (a java.lang.ref.Reference$Lock)
at java.lang.Object.wait(Object.java:502)
at java.lang.ref.Reference.tryHandlePending(Reference.java:191)
- locked <0x00000000800058c0> (a java.lang.ref.Reference$Lock)
at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)
"main" #1 prio=5 os_prio=0 tid=0x00007fbc8800a000 nid=0xf20 waiting on condition [0x00007fbc91ed0000]
java.lang.Thread.State: WAITING (parking)
at sun.misc.Unsafe.park(Native Method)
- parking to wait for <0x0000000084031e88> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:442)
at sbt.internal.util.EvaluateSettings.run(INode.scala:72)
at sbt.internal.util.Init.applyInits(Settings.scala:286)
at sbt.internal.util.Init.make(Settings.scala:208)
at sbt.internal.util.Init.make$(Settings.scala:199)
at sbt.Def$.make(Def.scala:20)
at sbt.internal.Load$.$anonfun$apply$5(Load.scala:266)
at sbt.internal.Load$$$Lambda$1561/1579610605.apply(Unknown Source)
at sbt.internal.Load$.timed(Load.scala:1395)
at sbt.internal.Load$.apply(Load.scala:261)
at sbt.internal.Load$.defaultLoad(Load.scala:69)
at sbt.BuiltinCommands$.liftedTree1$1(Main.scala:829)
at sbt.BuiltinCommands$.doLoadProject(Main.scala:829)
at sbt.BuiltinCommands$.$anonfun$loadProjectImpl$2(Main.scala:800)
at sbt.BuiltinCommands$$$Lambda$380/810169941.apply(Unknown Source)
at sbt.Command$.$anonfun$applyEffect$4(Command.scala:142)
at sbt.Command$$$Lambda$330/1723848804.apply(Unknown Source)
at sbt.Command$.$anonfun$applyEffect$2(Command.scala:137)
at sbt.Command$$$Lambda$356/1859823482.apply(Unknown Source)
at sbt.Command$.process(Command.scala:181)
at sbt.MainLoop$.processCommand(MainLoop.scala:151)
at sbt.MainLoop$.$anonfun$next$2(MainLoop.scala:139)
at sbt.MainLoop$$$Lambda$311/198374825.apply(Unknown Source)
at sbt.State$$anon$1.runCmd$1(State.scala:246)
at sbt.State$$anon$1.process(State.scala:250)
at sbt.MainLoop$.$anonfun$next$1(MainLoop.scala:139)
at sbt.MainLoop$$$Lambda$310/1620187937.apply(Unknown Source)
at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
at sbt.MainLoop$.next(MainLoop.scala:139)
at sbt.MainLoop$.run(MainLoop.scala:132)
at sbt.MainLoop$.$anonfun$runWithNewLog$1(MainLoop.scala:110)
at sbt.MainLoop$$$Lambda$303/1924802798.apply(Unknown Source)
at sbt.io.Using.apply(Using.scala:22)
at sbt.MainLoop$.runWithNewLog(MainLoop.scala:104)
at sbt.MainLoop$.runAndClearLast(MainLoop.scala:59)
at sbt.MainLoop$.runLoggedLoop(MainLoop.scala:44)
at sbt.MainLoop$.runLogged(MainLoop.scala:35)
at sbt.StandardMain$.runManaged(Main.scala:138)
at sbt.xMain.run(Main.scala:89)
at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
at xsbt.boot.Launch$.withContextLoader(Launch.scala:128)
at xsbt.boot.Launch$.run(Launch.scala:109)
at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:35)
at xsbt.boot.Launch$.launch(Launch.scala:117)
at xsbt.boot.Launch$.apply(Launch.scala:18)
at xsbt.boot.Boot$.runImpl(Boot.scala:56)
at xsbt.boot.Boot$.main(Boot.scala:18)
at xsbt.boot.Boot.main(Boot.scala)
"VM Thread" os_prio=0 tid=0x00007fbc881c6800 nid=0xf2d runnable
"GC task thread#0 (ParallelGC)" os_prio=0 tid=0x00007fbc8801f000 nid=0xf21 runnable
"GC task thread#1 (ParallelGC)" os_prio=0 tid=0x00007fbc88020800 nid=0xf22 runnable
"GC task thread#2 (ParallelGC)" os_prio=0 tid=0x00007fbc88022800 nid=0xf27 runnable
"GC task thread#3 (ParallelGC)" os_prio=0 tid=0x00007fbc88024000 nid=0xf28 runnable
"GC task thread#4 (ParallelGC)" os_prio=0 tid=0x00007fbc88026000 nid=0xf29 runnable
"GC task thread#5 (ParallelGC)" os_prio=0 tid=0x00007fbc88027800 nid=0xf2a runnable
"GC task thread#6 (ParallelGC)" os_prio=0 tid=0x00007fbc88029800 nid=0xf2b runnable
"GC task thread#7 (ParallelGC)" os_prio=0 tid=0x00007fbc8802b000 nid=0xf2c runnable
"VM Periodic Task Thread" os_prio=0 tid=0x00007fbc88215800 nid=0xf37 waiting on condition
JNI global references: 5542
Our build server was using a different sbt launcher (0.13.x) than the one provided in the agent. The thread dump above suggests several settings-evaluation threads initializing the Dependencies$ objects concurrently, which is consistent with a deadlock during object initialization.
In our case, the issue was fixed with the following steps:
Point the launcher to sbt-launcher 1.2.8 (see the sketch after this list)
Remove the sbt cache: rm -rf ~/.sbt (be sure to back up global .sbt files, the .credentials file, etc. first)
Remove the Ivy cache: rm -rf ~/.ivy2/cache
export SBT_OPTS="-Dscala.concurrent.context.maxThreads=1" (out of 5 agents, I had to do this on only one)
I am not sure whether this fixes the issue for everyone.
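For reference, a minimal sketch of the first step, assuming a standard sbt project layout (the launcher boots whichever sbt version the build requests there):
# project/build.properties
sbt.version=1.2.8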

Sparkling Water local mode cluster error

I'm trying to extend the hamOrSpam example (https://github.com/h2oai/sparkling-water/blob/master/examples/scripts/hamOrSpam.script.scala) to make parallel predictions for a large dataset using Spark's parallel computation power (during the inference stage, not the training phase).
Below is the code I have written for this. It works perfectly fine in single-node local mode (export MASTER="local[*]"), but fails when I run with export MASTER="local-cluster[2,2,1024]", which spawns 2 worker processes with 2 cores and 1024 MB each (to check that the prediction is parallelised).
val data_test = load("smsData.txt") // should be a large (GBs) test dataset - reusing the training data here just to test the workflow
val message_test = data_test.map(r => r(1)) // pull out the message text column
message_test.take(1000).map(x => isSpam(x, dlModel, hashingTF, idfModel, h2oContext)) // score the first 1000 messages
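As an aside, message_test.take(1000) collects the first 1000 messages to the driver, so the surrounding .map runs serially on the driver; only the Spark jobs triggered inside isSpam are distributed. A minimal sketch of a batch-style alternative, assuming the hashingTF, idfModel, dlModel and h2oContext values from the script are in scope (untested; the commented conversion and scoring calls are hypothetical):
import org.apache.spark.mllib.linalg.Vector
import org.apache.spark.rdd.RDD
// featurise the whole test set with distributed Spark transforms,
// mirroring the TF and IDF steps that isSpam applies per message
val tokens: RDD[Seq[String]] = message_test.map(_.split(" ").toSeq)
val tf: RDD[Vector] = hashingTF.transform(tokens)
val tfidf: RDD[Vector] = idfModel.transform(tf)
// then convert once and score the whole batch, instead of once per message:
// val frame: H2OFrame = h2oContext.asH2OFrame(...)  // hypothetical one-shot conversion
// val predictions = dlModel.score(frame)            // hypothetical batch scoring call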
So the code fails when executing scala> val table: H2OFrame = resultRDD (https://github.com/h2oai/sparkling-water/blob/master/examples/scripts/hamOrSpam.script.scala#L110).
I have attached the error from the console below:
17/06/26 20:25:49 WARN TaskSetManager: Lost task 0.0 in stage 6.0 (TID 43, 144.27.27.98, executor 1): java.lang.NoClassDefFoundError: Could not initialize class $line32.$read$
at $line41.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$anonfun$1.apply(<console>:57)
at $line41.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$anonfun$1.apply(<console>:57)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$15.apply(RDD.scala:1010)
at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$15.apply(RDD.scala:1009)
at org.apache.spark.SparkContext$$anonfun$33.apply(SparkContext.scala:1980)
at org.apache.spark.SparkContext$$anonfun$33.apply(SparkContext.scala:1980)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
17/06/26 20:25:49 ERROR TaskSetManager: Task 0 in stage 6.0 failed 4 times; aborting job
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 6.0 failed 4 times, most recent failure: Lost task 0.3 in stage 6.0 (TID 49, 144.27.27.98, executor 0): java.lang.NoClassDefFoundError: Could not initialize class
at $anonfun$1.apply(<console>:57)
at $anonfun$1.apply(<console>:57)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$15.apply(RDD.scala:1010)
at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$15.apply(RDD.scala:1009)
at org.apache.spark.SparkContext$$anonfun$33.apply(SparkContext.scala:1980)
at org.apache.spark.SparkContext$$anonfun$33.apply(SparkContext.scala:1980)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1981)
at org.apache.spark.rdd.RDD$$anonfun$reduce$1.apply(RDD.scala:1025)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.reduce(RDD.scala:1007)
at org.apache.spark.h2o.utils.H2OSchemaUtils$.collectMaxArrays(H2OSchemaUtils.scala:229)
at org.apache.spark.h2o.utils.H2OSchemaUtils$.expandedSchema(H2OSchemaUtils.scala:107)
at org.apache.spark.h2o.converters.SparkDataFrameConverter$.toH2OFrame(SparkDataFrameConverter.scala:59)
at org.apache.spark.h2o.H2OContext.asH2OFrame(H2OContext.scala:167)
at org.apache.spark.h2o.H2OContextImplicits.asH2OFrameFromDataFrame(H2OContextImplicits.scala:54)
... 58 elided
Caused by: java.lang.NoClassDefFoundError: Could not initialize class
at $anonfun$1.apply(<console>:57)
at $anonfun$1.apply(<console>:57)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$15.apply(RDD.scala:1010)
at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$15.apply(RDD.scala:1009)
at org.apache.spark.SparkContext$$anonfun$33.apply(SparkContext.scala:1980)
at org.apache.spark.SparkContext$$anonfun$33.apply(SparkContext.scala:1980)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Any ideas? Thanks in advance.