I'm trying to test some code in the Spark shell (Spark v1.3.0, using Scala version 2.10.4), and in the past I was able to import the joda-time libraries like this:
import org.joda.time.DateTime
import org.joda.time.format.DateTimeFormatter
import org.joda.time.format.DateTimeFormat
Today when I started the shell on my local machine, I'm getting:
scala> import org.joda.time.format.DateTimeFormat
<console>:19: error: object joda is not a member of package org
import org.joda.time.format.DateTimeFormat
^
scala> import org.joda.time.DateTime
<console>:19: error: object joda is not a member of package org
import org.joda.time.DateTime
^
scala> import org.joda.time._
<console>:19: error: object joda is not a member of package org
import org.joda.time._
^
As far as I know nothing's changed overnight. Anyone ever seen this before?
Not sure why I'm getting inconsistent behavior, but this seems to fix it.
spark-shell --jars ~/jars/joda-time-2.8.1.jar
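An alternative, assuming the machine can reach Maven Central, is to let Spark resolve the dependency by its Maven coordinates instead of pointing at a local jar:
spark-shell --packages joda-time:joda-time:2.8.1
After the shell starts, the imports above should resolve as before.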
Related
I'm working on some Scala source code, but when compiling it doesn't find this import:
I'm new to Scala; how can I make this class (ExporterUtil) available to my code?
import org.joda.time.DateTime
import org.joda.time.format.DateTimeFormat
import scala.collection.mutable.ArrayBuffer
import br.net.empresa.digital.exporter.ExporterUtil
When compiling, it doesn't find this package (br.net.empresa.digital.exporter.ExporterUtil); how can I bring this dependency into my project?
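If ExporterUtil comes from a separate library rather than from your own sources, the usual fix is to declare that library in your build so the compiler can see the br.net.empresa.digital.exporter package. A minimal build.sbt sketch, with hypothetical coordinates since the real artifact name isn't given here:
// hypothetical coordinates; replace with the library's real group/artifact/version
libraryDependencies += "br.net.empresa" % "digital-exporter" % "1.0.0"
If ExporterUtil instead lives in another module of the same project, its source needs to be on the compile path (for example under src/main/scala/br/net/empresa/digital/exporter/).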
I am trying to write a simple program in Scala, but when I use SparkContext in IntelliJ it throws an error. Can someone give me a solution?
Scala 3.1.1
Spark version 3.2.1
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
object Wordcount extends App {
  val sc = new SparkContext("Local[*]", "wordcount")
}
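Two things worth checking: Spark 3.2.1 is published for Scala 2.12 and 2.13, not Scala 3, so with Scala 3.1.1 the spark-core dependency usually won't resolve or compile; setting scalaVersion to a 2.12.x or 2.13.x release in build.sbt is the usual fix. The master URL is also case-sensitive. A minimal sketch of the program itself, assuming a matching Scala version:
import org.apache.spark.{SparkConf, SparkContext}

object Wordcount extends App {
  // "local[*]" (lowercase) runs Spark locally using all available cores
  val conf = new SparkConf().setMaster("local[*]").setAppName("wordcount")
  val sc = new SparkContext(conf)
  println(sc.parallelize(Seq(1, 2, 3)).count()) // quick smoke test
  sc.stop()
}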
I am trying to use Apache Spark in Scala in IntelliJ. I am importing Spark like this:
import org.apache.spark._
import org.apache.spark.SparkContext._
import org.apache.spark.sql._
import org.apache.log4j._
However, when I build my project I receive the error "object apache is not a member of package org" on the line import org.apache.spark._.
How can I fix this error?
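The usual cause is that the Spark artifacts aren't on the project's compile classpath, so they need to be declared in the build and the build reimported in IntelliJ. A minimal build.sbt sketch, with example versions that should be adjusted to your Spark and Scala setup:
scalaVersion := "2.12.15"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.2.1",
  "org.apache.spark" %% "spark-sql"  % "3.2.1"
)
With Spark 3.2.x, log4j should come in transitively, so the org.apache.log4j import ought to resolve once these are in place.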
I am using a Windows machine and have installed Spark and Scala for learning. For Spark SQL I need to process JSON input data.
scala> sc
res4: org.apache.spark.SparkContext = org.apache.spark.SparkContext@7431f4b8
scala> import play.api.libs.json._
<console>:23: error: not found: value play
import play.api.libs.json._
^
scala>
How can I import the Play API in my spark-shell session?
If you want to use other libraries while you are using spark-shell, you need to run the spark-shell command with --jars and/or --packages. For example, to use Play in your Spark shell, run the following command:
spark-shell --packages "com.typesafe.play":"play_2.11":"2.6.19"
For more information, you can use spark-shell -h. I hope it helps!
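Once the shell starts with the package resolved, a quick way to confirm the library works (basic play-json parsing):
import play.api.libs.json._

val parsed = Json.parse("""{"name": "spark"}""") // returns a JsValue
val name = (parsed \ "name").as[String]          // extracts "spark"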
I am adding jars into my scala repl like so:
scala> :cp scalaj-http_2.10-2.2.1.jar
Added '/home/XXX/scalaj-http_2.10-2.2.1.jar'. Your new classpath is:
".:/home/XXX/json4s-native_2.10-3.3.0.RC3.jar:/home/XXX/scalaj-http_2.10-2.2.1.jar"
Nothing to replay.
Now when I try and import that jar for use I get an error:
scala> import scalaj.http._
<console>:7: error: not found: value scalaj
import scalaj.http._
I've tried this on another jar:
scala> :cp json4s-native_2.10-3.3.0.RC3.jar
Added '/home/XXX/json4s-native_2.10-3.3.0.RC3.jar'. Your new classpath is:
".:/home/XXX/json4s-native_2.10-3.3.0.RC3.jar"
Nothing to replay.
scala> import org.json4s.JsonDSL._
<console>:7: error: object json4s is not a member of package org
import org.json4s.JsonDSL._
I've read multiple tutorials online that all do it this way, but my REPL does not seem to be behaving in the same manner.
I am using Scala 2.10
Double-check your path; if it still is not working, you can try adding the jars at the time you start the REPL (that has always worked for me, even with 2.10):
scala -cp /home/XXX/json4s-native_2.10-3.3.0.RC3.jar:/home/XXX/scalaj-http_2.10-2.2.1.jar
Note: the delimiter between jars is ; on Windows and : otherwise.
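For example, on Windows the equivalent command would look something like this (the paths are placeholders):
scala -cp "C:\jars\json4s-native_2.10-3.3.0.RC3.jar;C:\jars\scalaj-http_2.10-2.2.1.jar"
Once the REPL is up, import scalaj.http._ and import org.json4s.JsonDSL._ should resolve.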