Error using Apache Spark for Scala in IntelliJ - scala

I am trying to use Apache Spark with Scala in IntelliJ. I am importing Spark like this:
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.sql._
import org.apache.log4j._
However, when I build my project I receive the error:
object apache is not a member of package org
import org.apache.spark._
How can I fix this error?
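This error usually means the Spark libraries are not on the project's classpath. A minimal build.sbt sketch, assuming an sbt-based project with Scala 2.11 and Spark 2.4.x (adjust the versions to whatever you actually use):

name := "spark-example"
scalaVersion := "2.11.12"

// Spark is not bundled with Scala or IntelliJ; it has to be declared as a dependency
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.8",
  "org.apache.spark" %% "spark-sql"  % "2.4.8"
)

After adding the dependencies, reimport the sbt project in IntelliJ so that the org.apache.spark packages resolve.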

Related

Install ExporterUtil inside my project folder (scala)

I'm working on some Scala source code, but when compiling it doesn't find one of the imports. I'm new to Scala; how can I bring this class (ExporterUtil) into my project?
import org.joda.time.DateTime
import org.joda.time.format.DateTimeFormat
import scala.collection.mutable.ArrayBuffer
import br.net.empresa.digital.exporter.ExporterUtil
When compiling, it doesn't find this package (br.net.empresa.digital.exporter.ExporterUtil). How can I make it available to my project?
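br.net.empresa.digital.exporter looks like an internal library, so it will not be on a public repository such as Maven Central. Assuming an sbt project, one way to make it visible is to copy the library's jar (the name exporter-util.jar below is a placeholder for whatever your company actually ships) into the project's lib/ directory, which sbt treats as unmanaged dependencies. The joda-time dependency, by contrast, is public and can be declared normally:

// build.sbt
// joda-time is resolved from Maven Central
libraryDependencies += "joda-time" % "joda-time" % "2.10.10"
// The internal jar, e.g. lib/exporter-util.jar, is picked up automatically from lib/

After that, both import org.joda.time.DateTime and import br.net.empresa.digital.exporter.ExporterUtil should compile.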

not found: type SparkContext || object apache is not a member of package org

I am trying to write a simple program in Scala, but when I use SparkContext in IntelliJ it throws an error. Can someone suggest a solution?
Scala 3.1.1
Spark version 3.2.1
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
object Wordcount extends App {
  val sc = new SparkContext("local[*]", "wordcount") // master string is lowercase, e.g. "local[*]"
}
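The error usually comes from the build setup rather than the code: Spark 3.2.1 artifacts are published for Scala 2.12 and 2.13, not Scala 3, so a Scala 3.1.1 project has to depend on the Scala 2.13 build. A build.sbt sketch, assuming sbt 1.5+ (versions taken from the question):

scalaVersion := "3.1.1"

// Use the Scala 2.13 Spark artifacts from a Scala 3 project
libraryDependencies ++= Seq(
  ("org.apache.spark" %% "spark-core" % "3.2.1").cross(CrossVersion.for3Use2_13),
  ("org.apache.spark" %% "spark-sql"  % "3.2.1").cross(CrossVersion.for3Use2_13)
)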

Getting an error in Scala while integrating Spark Streaming with Kafka

I am trying to add some imports, but when I add
import org.apache.spark.streaming.kafka.kafkautils
it shows the error below:
object kafka is not a member of the object org.apache.spark.streaming.kafka.kafkautils
I am working in Eclipse with
Scala IDE 4.7,
Scala 2.11.11,
spark-2.3.0-bin-hadoop2.7 jar files,
Kafka 2.11 jars,
spark-streaming-kafka-0-10_2.11-2.3.0 jar
If you are using the spark-streaming-kafka-0-10_2.11-2.3.0 jar, then KafkaUtils lives in the org.apache.spark.streaming.kafka010 package.
So import
import org.apache.spark.streaming.kafka010.KafkaUtils
and not
import org.apache.spark.streaming.kafka.kafkautils
Hope this helps!
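For reference, a minimal sketch of consuming a Kafka topic with the kafka010 API (the broker address, group id, and topic name are placeholders):

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

object KafkaStreamExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-example").setMaster("local[*]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Standard Kafka consumer settings; adjust to your cluster
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "example-group",
      "auto.offset.reset" -> "latest"
    )

    // createDirectStream comes from org.apache.spark.streaming.kafka010.KafkaUtils
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("example-topic"), kafkaParams)
    )

    stream.map(_.value).print()

    ssc.start()
    ssc.awaitTermination()
  }
}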

Apache Spark and Scala required jars

I am new to Scala.
Can anyone suggest which jar files are required to run Apache Spark with Scala in a Linux environment? The code below is a piece of the original code. I am getting exceptions like java.lang.NoSuchMethodError: org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(Ljava/util/concurrent/Executor;I)V
java -cp ".:/opt/cloudera/parcels/CDH-5.7.1-1.cdh5.7.1.p1876.1944/jars/:./"
TestAll.scala
import org.apache.spark.SparkContext._
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext
import java.io._
import java.sql.{Connection,DriverManager}
import scala.collection._
import scala.collection.mutable.MutableList
object TestAll {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Testing App").setMaster("local")
    val sc = new SparkContext(conf)
    println("Hello, world!")
  }
}
You need to download Spark from here. Choose the "Pre-built with Hadoop" option, then follow the directions in the Quick Start; that will get you through the Hello World example. I am not sure which IDE you are using, but the most Scala-friendly one is IntelliJ IDEA.
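The NoSuchMethodError on a Netty class usually means two incompatible versions of the same library end up on the classpath when everything under the CDH jars directory is added wholesale. A common way to sidestep that, assuming the project is built with sbt, is to mark Spark as a provided dependency and launch with spark-submit so the Spark distribution supplies its own jars at runtime (versions and jar name below are illustrative):

// build.sbt
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0" % "provided"

Then build with sbt package and run with, for example:
spark-submit --class TestAll --master local target/scala-2.10/testall_2.10-0.1.jar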

joda-time import stopped working in Spark shell

I'm trying to test some code in the Spark shell (spark v. 1.3.0, using Scala version 2.10.4) and in the past was able to import joda-time libraries like this:
import org.joda.time.DateTime
import org.joda.time.format.DateTimeFormatter
import org.joda.time.format.DateTimeFormat
Today when I started the shell on my local machine, I'm getting:
scala> import org.joda.time.format.DateTimeFormat
<console>:19: error: object joda is not a member of package org
import org.joda.time.format.DateTimeFormat
^
scala> import org.joda.time.DateTime
<console>:19: error: object joda is not a member of package org
import org.joda.time.DateTime
^
scala> import org.joda.time._
<console>:19: error: object joda is not a member of package org
import org.joda.time._
^
As far as I know nothing's changed overnight. Anyone ever seen this before?
I'm not sure why I'm getting inconsistent behavior, but this seems to fix it:
spark-shell --jars ~/jars/joda-time-2.8.1.jar
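If you would rather not keep a local copy of the jar, spark-shell can also resolve it from Maven Central (the version here is just an example):
spark-shell --packages joda-time:joda-time:2.8.1
For a compiled project, the equivalent sbt dependency would be "joda-time" % "joda-time" % "2.8.1".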