Spark Eclipse Import Package Error

When I open a shell in Windows 10 and type spark-shell, it opens Scala. After that I type
import org.apache.spark.SparkContext
and it works fine in the shell; there is no problem.
But when I try to run the same import in Eclipse, it gives me this error:
error: object apache is not a member of package org
import org.apache.spark.SparkContext
Why does it give that error? What should I do now?
I did some research on the internet; some say to set the classpath, etc. I tried everything, but the problem remains unsolved.
Please help.
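
For context: spark-shell works because it ships with the Spark jars already on its classpath, while an Eclipse project only sees what is on its own build path. A minimal sketch of an sbt build definition that pulls Spark in; the Scala and Spark versions below are assumptions, so match them to your installed Spark:

// build.sbt -- a minimal sketch; version numbers are assumptions
scalaVersion := "2.12.15"
// makes org.apache.spark.SparkContext resolvable at compile time
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.1"

Importing the project into Eclipse from this build (or adding the equivalent Maven dependency) puts the jars on the build path, after which the import compiles.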

Related

Intellij error 'object update_eori is not a member of package views.html'

I am getting the below error while compiling my Scala project:
object update_eori is not a member of package views.html
import views.html.update_eori
This is the import that I am using:
import views.html.update_eori
What could possibly be going wrong here?
P.S. I am completely new to coding.
I tried running sbt clean and then compiling again, but the same error persists.
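
In a Play project, views.html.update_eori is not a hand-written package: Twirl generates it from a template file at compile time, so the import only resolves once such a template exists and has been compiled. A sketch of the pieces involved; the file contents and the argument below are hypothetical:

// app/views/update_eori.scala.html  (hypothetical Twirl template)
//   @(eori: String)
//   <h1>Update EORI: @eori</h1>

// resolvable only after Twirl compiles the template:
import views.html.update_eori
val rendered = update_eori("GB123456789000")  // hypothetical argument

If the template file is missing or sits outside app/views, sbt clean will not help, because the generated class simply never exists.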

object neo4j is not a member of package org

I am trying to read data from Neo4j using a Spark job, and the import itself is showing this error. I tried to import org.neo4j.spark._ in IntelliJ IDEA, but it shows the error "Cannot resolve symbol spark". When trying it in the spark-shell, it throws:
:23: error: object neo4j is not a member of package org
import org.neo4j.spark._
Spark version: 3.1.1
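
org.neo4j.spark._ does not ship with Spark itself; it only resolves once the Neo4j connector jar is on the classpath. A sketch of an sbt dependency; the coordinates and version below are assumptions, so check Maven Central for the artifact matching Spark 3.1.x and your Scala version:

// build.sbt -- coordinates and version are assumptions
libraryDependencies += "org.neo4j" % "neo4j-connector-apache-spark_2.12" % "4.1.5_for_spark_3"

In the spark-shell, the same coordinates can be passed via the --packages option instead.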

object apache is not a member of package org - jupyter notebook

I am trying to launch a Jupyter notebook with Scala. For that I used almond, but I am running into a problem when trying to import:
import org.apache.spark._
The error is:
object apache is not a member of package org
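
Since the almond kernel is Ammonite-based, dependencies can be pulled in from a notebook cell with import $ivy before importing Spark. A sketch; the Spark version below is an assumption, so match it to the kernel's Scala version:

// in a notebook cell -- the Spark version is an assumption
import $ivy.`org.apache.spark::spark-sql:3.1.1`
import org.apache.spark._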

Eclipse says "apache is not a member of package org"

I am using Scala on Eclipse Luna and trying to connect to Cassandra. My code shows the error object apache is not a member of package org on the following line:
import org.apache.spark.SparkConf
I have already added the Scala and Spark libraries to the project. Does anyone know how I can make my program import the Spark libraries?
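
Beyond adding jars by hand, the usual route is to let a build tool resolve both Spark and the DataStax Cassandra connector, then import the project into Eclipse. A sketch of the sbt dependencies; the versions below are assumptions and must be kept in step with each other:

// build.sbt -- versions are assumptions; keep Spark and connector aligned
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.8",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.3"
)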

Scala RHEL jar importing issue

I was trying to run a Scala file using the command
scala myclass.scala
However, it complains about one of the imported libraries. I included the jar using the -classpath option like this:
scala -classpath nscala-time.jar myclass.scala
The error I got is:
myclass.scala:5: error: object github is not a member of package com
import com.github.nscala_time.time.Imports._
Any idea why?
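
Two things tend to matter here: the option must be spelled -classpath (or -cp), and plain scala performs no dependency resolution, so nscala-time's transitive joda-time jars have to be listed on the classpath as well. A sketch; the jar names and versions below are assumptions:

scala -classpath nscala-time_2.12-2.32.0.jar:joda-time-2.10.14.jar:joda-convert-2.2.2.jar myclass.scala

// myclass.scala -- a minimal sketch using nscala-time
import com.github.nscala_time.time.Imports._
println(DateTime.now() + 2.days)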