My project uses the following jars: scala-library (2.9.2), mongo-java-driver (2.7.3), scalaj-collection (2.9.1-1.2), casbah (util, commons, core, query, gridfs) 2.9.1-3.0.0-M2, joda-time 2.1, and joda-convert 1.2.
When I enter the following hello-worldish code:
package test

import com.mongodb.casbah.Imports._

object Test {
  def main(args: Array[String]): Unit = {
    var connection = MongoConnection()
  }
}
I get an error: "not found: value MongoConnection". The error goes away if I explicitly import com.mongodb.casbah.MongoConnection, but I thought Imports._ was supposed to take care of that. What could I be doing wrong?
In Casbah 3.0, Imports._ is deprecated. What is weird, though, is that MongoConnection is not even imported by it anymore; everything else still works, just with deprecation warnings. As those warnings state, you need to do this instead:
import com.mongodb.casbah._
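With that change the original snippet compiles as-is; a minimal sketch against the same Casbah 3.0.0-M2 jars:

package test

import com.mongodb.casbah._

object Test {
  def main(args: Array[String]): Unit = {
    // MongoConnection is pulled in by com.mongodb.casbah._
    val connection = MongoConnection()
  }
}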
I'm running Scala code on Azure Databricks without problems. Now I want to move this code from an Azure notebook to Eclipse. I installed databricks-connect following the Microsoft documentation, and the Databricks data connection test passes. I also installed sbt, imported the project into Eclipse, created a Scala object there, and added all the pyspark jar files as external jars.
package Student

import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.SparkSession
import java.util.Properties
//import com.databricks.dbutils_v1.DBUtilsHolder.dbutils

object Test {
  def isTypeSame(df: DataFrame, name: String, coltype: String) =
    (df.schema(name).dataType.toString == coltype)

  def main(args: Array[String]): Unit = {
    var Result = true
    val Borrowers = List(("col1", "StringType"), ("col2", "StringType"), ("col3", "DecimalType(38,18)"))
    val dfPcllcus22 = spark.read.format("parquet").load("/mnt/slraw/ServiceCenter=*******.parquet")
    if (Result == false) println("Test Fail, Please check") else println("Test Pass")
  }
}
When I run this code in Eclipse, it says it cannot find the main class. But if I comment out the line "val dfPcllcus22 = spark.read.format("parquet").load("/mnt/slraw/ServiceCenter=*******.parquet")", the test passes. So it seems spark.read.format cannot be recognized.
I'm new to Scala and Databricks and have been researching this for several days without solving it. If anyone can help, I'd really appreciate it. The environment is a bit complicated for me; if more information is required, please let me know.
A SparkSession is needed to run your code in Eclipse; your code never creates one, and that missing spark value is what leads to the error:

val spark = SparkSession.builder.appName("SparkDBFSParquet").master("local[*]").getOrCreate()

Please add this line and run the code; it should work.
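Putting it together, a minimal sketch of the main method with the session created up front (keeping your names; printSchema is just a placeholder action):

import org.apache.spark.sql.SparkSession

object Test {
  def main(args: Array[String]): Unit = {
    // In a Databricks notebook `spark` is predefined; in a standalone
    // Eclipse/sbt run it has to be created explicitly.
    val spark = SparkSession.builder
      .appName("SparkDBFSParquet")
      .master("local[*]")
      .getOrCreate()

    val dfPcllcus22 = spark.read.format("parquet").load("/mnt/slraw/ServiceCenter=*******.parquet")
    dfPcllcus22.printSchema()

    spark.stop()
  }
}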
I'm trying to use JanusGraph from a Scala script with TinkerPop 3. I use the gremlin-scala library (https://github.com/mpollmeier/gremlin-scala) but I get an error about HNil (see below). How do I use Gremlin and JanusGraph from a Scala script?
import gremlin.scala._
import org.apache.commons.configuration.BaseConfiguration
import org.janusgraph.core.JanusGraphFactory
import org.apache.tinkerpop.gremlin.structure.Graph

object Janus {
  def main(args: Array[String]): Unit = {
    val conf = new BaseConfiguration()
    conf.setProperty("storage.backend", "inmemory")
    val graph = JanusGraphFactory.open(conf)
    val v1 = graph.graph.addV("test")
  }
}
Error:(11, 14) Symbol 'type scala.ScalaObject' is missing from the classpath.
This symbol is required by 'trait shapeless.HNil'.
Make sure that type ScalaObject is in your classpath and check for conflicting dependencies with -Ylog-classpath.
A full rebuild may help if 'HNil.class' was compiled against an incompatible version of scala.
val v1 = graph.graph.addV("test")
Not sure what you mean by 'scala script', but it looks like you're missing many (all?) dependencies. Did you have a look at https://github.com/mpollmeier/gremlin-scala-examples/? It contains an example setup for JanusGraph.
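For example, a build.sbt along these lines (the version numbers are illustrative; take a matching pair from the janusgraph example in that repo):

// build.sbt -- versions are illustrative, check the examples repo for a compatible pair
scalaVersion := "2.12.8"

libraryDependencies ++= Seq(
  "com.michaelpollmeier" %% "gremlin-scala"   % "3.4.1.1",
  "org.janusgraph"        % "janusgraph-core" % "0.3.1"
)

The ScalaObject/HNil error usually means a shapeless jar compiled against a much older Scala version ended up on the classpath, so aligning the Scala version with the gremlin-scala build should clear it.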
I am getting an error if I import readline, as follows:

import scala.io.StdIn.{readline, readInt}

error: value readline is not a member of object scala.io.StdIn
import scala.io.StdIn.{readline, readInt}
Scala code runner version 2.12.1
If I don't import this, I get a deprecated message:
warning: there was one deprecation warning (since 2.11.0); re-run with -deprecation for details
one warning found
I get no errors if I use the full path to the function:
var x = scala.io.StdIn.readLine.toInt
Let me know if you can help me resolve the import. Thanks.
A very tiny oversight:
import scala.io.StdIn.{readLine, readInt}
readLine has an upper-case L.
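With the corrected import both names resolve, e.g. this minimal sketch:

import scala.io.StdIn.{readLine, readInt}

object ReadDemo {
  def main(args: Array[String]): Unit = {
    val name = readLine("Your name: ") // readLine optionally takes a prompt
    val age  = readInt()               // readInt reads a whole line and parses it
    println(s"$name is $age")
  }
}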
I am receiving the following error when I try to run my code:
Error:(104, 63) type mismatch;
found : hydrant.spark.hydrant.spark.IPPortPair
required: hydrant.spark.(some other)hydrant.spark.IPPortPair
IPPortPair(it.getHeader.getDestinationIP, it.getHeader.getDestinationPort))
My code uses a case class defined in the package object spark to set up the IP/Port map for each connection.
The package object looks like this:
package object spark {
  case class IPPortPair(ip: Long, port: Long)
}
And the code using the package object looks like this:
package hydrant.spark

import java.io.{File, PrintStream}

object identifyCustomers {
  ……………
  def mapCustomers(suspectTraffic: RDD[Generic]) = {
    suspectTraffic.filter(
      it => !it.getHeader.isEmtpy
    ).map(
      it => IPPortPair(it.getHeader.getDestinationIP, it.getHeader.getDestinationPort)
    )
  }
}
I am confused by the strange way my packages are being displayed, since the error makes it seem that I am in hydrant.spark.hydrant.spark, which does not exist. I am also using IntelliJ, if that makes a difference.
You need to run sbt clean (or the IntelliJ equivalent). You changed something in the project (e.g. the Scala version), and this is how the incompatibility manifests itself.
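For reference, a minimal sketch of how the package object and its consumer should line up (file paths assumed):

// src/main/scala/hydrant/spark/package.scala
package hydrant

package object spark {
  case class IPPortPair(ip: Long, port: Long)
}

// src/main/scala/hydrant/spark/identifyCustomers.scala
package hydrant.spark

object identifyCustomers {
  // IPPortPair resolves to hydrant.spark.IPPortPair, no import needed
  def pair(ip: Long, port: Long): IPPortPair = IPPortPair(ip, port)
}

If the layout already looks like this, the stale-compilation explanation above stands and a clean rebuild fixes it.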
I developed an app in scala-ide (the Eclipse plugin) with no errors or warnings. Now I'm trying to deploy it to the Stax cloud:
$ stax deploy
But it fails to compile:
compile:
[scalac] Compiling 2 source files to /home/gleontiev/workspace/rss2lj/webapp/WEB-INF/classes
error: error while loading FlickrUtils, Scala signature FlickrUtils has wrong version
expected: 4.1
found: 5.0
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/DisplaySnippet.scala:8: error: com.folone.logic.FlickrUtils does not have a constructor
val dispatcher = new FlickrUtils("8196243#N02")
^
error: error while loading Photo, Scala signature Photo has wrong version
expected: 4.1
found: 5.0
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/DisplaySnippet.scala:9: error: value link is not a member of com.folone.logic.Photo
val linksGetter = (p:Photo) => p.link
^
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/DisplaySnippet.scala:15: error: com.folone.logic.FlickrUtils does not have a constructor
val dispatcher = new FlickrUtils("8196243#N02")
^
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/DisplaySnippet.scala:16: error: value medium1 is not a member of com.folone.logic.Photo
val picsGetter = (p:Photo) => p.medium1
^
/home/gleontiev/workspace/rss2lj/src/scala/example/snippet/RefreshSnippet.scala:12: error: com.folone.logic.FlickrUtils does not have a constructor
val dispatcher = new FlickrUtils("8196243#N02")
^
7 errors found
ERROR: : The following error occurred while executing this line:
/home/gleontiev/workspace/rss2lj/build.xml:61: Compile failed with 7 errors; see the compiler error output for details.
I see two kinds of errors. First, it is complaining about the FlickrUtils class constructor, which is defined like this:
class FlickrUtils(val userId : String) {
//...
}
Second, it claims that two fields are missing from the Photo class, which is:
class Photo(val photoId: String, val userId: String, val secret: String, val server: String) {
  private val _medium1 = "/sizes/m/in/photostream"
  val link = "http://flickr.com/photos/" + userId + "/" + photoId
  val medium1 = link + _medium1
}
It seems like the Stax SDK uses the wrong compiler (?). How do I make it use the right one? If that is not it, what is the problem here, and what are some ways to resolve it?
Edit: $ scala -version says
Scala code runner version 2.8.0.final -- Copyright 2002-2010, LAMP/EPFL
I tried compiling everything with scalac manually, putting everything in place, and running stax deploy afterwards -- same result.
I actually resolved this by moving the FlickrUtils and Photo classes to the packages where the snippets originally are, but I still don't understand why it was not able to compile and use them from the other package.