object neo4j is not a member of package org - scala

I am trying to read data from Neo4j in a Spark job, and the import itself fails. When I use import org.neo4j.spark._ in IntelliJ IDEA, it shows the error "Cannot resolve symbol spark". When I try it in the spark-shell, it throws:
<console>:23: error: object neo4j is not a member of package org
import org.neo4j.spark._
Spark version - 3.1.1
dependencies
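This error usually means the Neo4j Connector for Apache Spark jar is not on the classpath at all, so the org.neo4j package simply does not exist as far as the compiler is concerned. A minimal sketch of one way to fix it, assuming Scala 2.12 and connector release 4.1.5 (adjust the coordinates to match your Spark and Scala versions):

spark-shell --packages org.neo4j:neo4j-connector-apache-spark_2.12:4.1.5_for_spark_3

// Once the connector is on the classpath, data can be read through the
// DataSource API; the URL, credentials, and label below are placeholders.
val df = spark.read
  .format("org.neo4j.spark.DataSource")
  .option("url", "bolt://localhost:7687")
  .option("authentication.basic.username", "neo4j")
  .option("authentication.basic.password", "secret")
  .option("labels", "Person")
  .load()
df.show()

Note that with the DataSource API the import org.neo4j.spark._ line is not needed at all; the compile error disappears once the jar is present.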

Related

How to run documentDb (mongoDb) in Zeppelin using AWS EMR?

I am using Zeppelin inside AWS EMR. I then created a DocumentDB instance and tried to query it through Zeppelin, but when I run the code it fails because the MongoDB Java Driver dependency is missing:
<console>:25: error: object mongodb is not a member of package org
import org.mongodb.scala._
^
<console>:26: error: object bson is not a member of package org
import org.bson._
^
Is there any way to add a Maven dependency so that I can run Mongo in Zeppelin?
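One common approach is Zeppelin's dynamic dependency loading. A sketch, assuming the Scala driver's usual Maven coordinates (the version is illustrative); the %dep paragraph must run before the Spark interpreter starts, so restart the interpreter first:

%dep
z.load("org.mongodb.scala:mongo-scala-driver_2.11:2.6.0")

Alternatively, the same artifact can be added via the spark.jars.packages property in the Spark interpreter settings.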

error not found value spark import spark.implicits._ import spark.sql

I am using Hadoop 2.7.2, HBase 1.4.9, Spark 2.2.0, Scala 2.11.8, and Java 1.8 on a Hadoop cluster composed of one master and two slaves.
When I run spark-shell after starting the cluster, it works fine.
I am trying to connect to HBase from Scala by following this tutorial: https://www.youtube.com/watch?v=gGwB0kCcdu0
But when I try, as the tutorial does, to run spark-shell with those jars passed as an argument, I get this error:
spark-shell --jars
"hbase-annotations-1.4.9.jar,hbase-common-1.4.9.jar,hbase-protocol-1.4.9.jar,htrace-core-3.1.0-incubating.jar,zookeeper-3.4.6.jar,hbase-client-1.4.9.jar,hbase-hadoop2-compat-1.4.9.jar,metrics-json-3.1.2.jar,hbase-server-1.4.9.jar"
<console>:14: error: not found: value spark
import spark.implicits._
^
<console>:14: error: not found: value spark
import spark.sql
^
After that, even if I log out and run spark-shell again, I have the same issue.
Can anyone tell me what the cause is and how to fix it?
In your import statement, spark should be an object of type SparkSession. That object should have been created for you beforehand, or you need to create it yourself (see the Spark docs). I didn't watch your tutorial video.
The point is that it doesn't have to be called spark. It could, for instance, be called sparkSession, and then you could do import sparkSession.implicits._
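For reference, this is the standard way to create such an object yourself; a minimal sketch using only the core SparkSession API:

import org.apache.spark.sql.SparkSession

// Build (or reuse) a session; the app name is arbitrary.
val sparkSession = SparkSession.builder()
  .appName("hbase-test")
  .getOrCreate()

// Now the imports from the question resolve against this object.
import sparkSession.implicits._
import sparkSession.sql

In spark-shell specifically, the spark object is normally created automatically at startup, so if it is missing, the shell most likely failed while starting (for example because one of the jars passed to --jars could not be found); the startup log should say why.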

import uploaded library to Databricks

I uploaded the spark-ts time series library to Databricks using the Maven coordinate option in Create Library. I was able to create the library and attach it to my cluster, but when I tried to import it in a Databricks notebook with import org.apache.spark.spark-ts, it threw an error stating notebook:1: error: object ts is not a member of package org.apache.spark. Please let me know how to handle this issue.
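As a point of comparison: an import must match the package the library actually ships, which is not always derived from its Maven groupId or its project name. Assuming this is the sryza/spark-timeseries (spark-ts) project, its classes live under com.cloudera.sparkts, so the import would look like this sketch rather than anything under org.apache.spark:

// spark-ts classes are packaged under com.cloudera.sparkts
import com.cloudera.sparkts._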

Eclipse says "apache is not a member of package org"

I am using Scala on Eclipse Luna and trying to connect to Cassandra. My code shows the error apache object is not a member of package org on the following line:
import org.apache.spark.SparkConf
I have already added the Scala and Spark libraries to the project. Does anyone know how I can make my program import the Spark libraries?
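If the project is built with sbt rather than by copying jars into the build path by hand, declaring Spark (and, for Cassandra, the connector) as dependencies is the usual fix; a sketch with illustrative version numbers that should be matched to your cluster:

// build.sbt
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "2.4.8",
  "org.apache.spark"   %% "spark-sql"                 % "2.4.8",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.3"
)

After an sbt refresh (or re-importing the project into Eclipse), org.apache.spark.SparkConf should resolve.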

Scala RHEL jar importing issue

I was trying to run a Scala file using the command
scala myclass.scala
However, it complains about one of the imported libraries. I included the jar using the -classpath option like this:
scala -class ncscala-time.jar myclass.scala
The error I got is:
myclass.scala:5: error: object github is not a member of package com
import com.github.nscala_time.time.Imports._
Any idea why?
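The flag shown is the likely culprit: the scala runner accepts -classpath (or -cp), not -class, so the jar was never actually added to the classpath. It also helps to include nscala-time's transitive joda jars; a sketch with illustrative jar names and versions (use : as the path separator on Linux, ; on Windows):

scala -classpath "nscala-time_2.11-2.22.0.jar:joda-time-2.10.jar:joda-convert-2.2.0.jar" myclass.scala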