How to create Vertx in Scala

The build.sbt is as follows:
name := "ScalaVertxTest"
version := "0.1"
scalaVersion := "2.12.8"
libraryDependencies += "io.vertx" %% "vertx-lang-scala" % "3.6.3"
In the Scala file, I am just trying to create a Vertx instance as follows:
package com.example
import io.vertx.scala.core._
object Main {
  def main(args: Array[String]): Unit = {
    println("Hello Vertx Scala")
    var vertx = Vertx.vertx()
  }
}
The sbt compile command generates the following error message:
[error] com/example/Main.scala:3:11: object vertx is not a member of package io
[error] import io.vertx.scala.core._
[error] ^
[error] com/example/Main.scala:12:17: not found: value Vertx
[error] var vertx = Vertx.vertx()
[error] ^
[error] two errors found
How do I create a Vertx instance in Scala?

I was having this problem when building in the IntelliJ IDE.
When compiled using sbt on the console, the program builds correctly.
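For reference, here is a minimal sketch that creates a Vertx instance and does something visible with it (assuming vertx-lang-scala 3.6.3 on Scala 2.12, as in the build.sbt above; the HTTP server and port 8080 are illustrative additions, not part of the original question):
package com.example

import io.vertx.scala.core._

object Main {
  def main(args: Array[String]): Unit = {
    // Create the Vertx instance
    val vertx = Vertx.vertx()
    // Illustrative: start a trivial HTTP server; Vert.x event-loop threads
    // are non-daemon, so the process stays alive after main returns
    vertx.createHttpServer()
      .requestHandler(req => req.response().end("Hello Vertx Scala"))
      .listen(8080)
  }
}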

Related

sbt - object apache is not a member of package org

I want to deploy and submit a Spark program using sbt, but it's throwing an error.
Code:
package in.goai.spark

import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache
    val c = lines.count
    println(s"There are $c lines in $fileName")
  }
}
build.sbt
name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
resolvers += Resolver.mavenLocal
Under the first/project directory, build.properties contains:
sbt.version=0.13.9
When I try to run sbt package, it throws the error given below.
[root@hadoop first]# sbt package
[info] Loading project definition from /home/training/workspace_spark/first/project
[info] Set current project to First Spark (in build file:/home/training/workspace_spark/first/)
[info] Compiling 1 Scala source to /home/training/workspace_spark/first/target/scala-2.11/classes...
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:3: object apache is not a member of package org
[error] import org.apache.spark.{SparkContext, SparkConf}
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:9: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("First Spark")
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:11: not found: type SparkContext
[error] val sc = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 4 s, completed May 10, 2018 4:05:10 PM
I have also tried extending App, but there was no change.
Please remove resolvers += Resolver.mavenLocal from build.sbt. Since spark-core is available on Maven Central, we don't need a local resolver.
After that, you can try sbt clean package.
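For reference, the build.sbt after removing that line (everything else unchanged from the question):
name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"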

Finch Hello World Error: Http not a member of com.twitter.finagle

I'm trying to use the Scala Finch library to build an API.
I have the following simple code:
package example

import io.finch._
import com.twitter.finagle.Http

object HelloWorld extends App {
  val api: Endpoint[String] = get("hello") { Ok("Hello, World!") }
  Http.serve(":8080", api.toService)
}
And a build.sbt file that looks like this:
name := "hello-finch"
version := "1.0"
scalaVersion := "2.10.6"
mainClass in (Compile, run) := Some("example.HelloWorld")
libraryDependencies ++= Seq(
  "com.github.finagle" %% "finch-core" % "0.10.0"
)
// found here: https://github.com/finagle/finch/issues/604
addCompilerPlugin(
  "org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full
)
When I compile and run the code I get this error message:
object Http is not a member of package com.twitter.finagle
[error] import com.twitter.finagle.Http
[error] ^
[error] /Users/jamesk/Code/hello_finch/hello-finch/src/main/scala/example/Hello.scala:8: wrong number of type arguments for io.finch.Endpoint, should be 2
[error] val api: Endpoint[String] = get("hello") { Ok("Hello, World!") }
[error] ^
[error] /Users/jamesk/Code/hello_finch/hello-finch/src/main/scala/example/Hello.scala:8: not found: value get
[error] val api: Endpoint[String] = get("hello") { Ok("Hello, World!") }
[error] ^
[error] /Users/jamesk/Code/hello_finch/hello-finch/src/main/scala/example/Hello.scala:10: not found: value Http
[error] Http.serve(":8080", api.toService)
[error] ^
[error] four errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 1 s, completed Aug 15, 2017 12:56:01 PM
At this point I'm running out of ideas. It looks like a good library, but it's a pain getting it working. Any help would be very much appreciated.
I have updated your example to work with the latest version of Finch ("com.github.finagle" %% "finch-core" % "0.15.1") and with Scala 2.12.
The build.sbt file:
name := "hello-finch"
version := "1.0"
scalaVersion := "2.12.2"
mainClass in (Compile, run) := Some("example.HelloWorld")
libraryDependencies ++= Seq(
  "com.github.finagle" %% "finch-core" % "0.15.1"
)
Then, the src/main/scala/example/HelloWorld.scala file:
package example

import io.finch._
import com.twitter.finagle.Http
import com.twitter.util.Await

object HelloWorld extends App {
  val api: Endpoint[String] = get("hello") { Ok("Hello, World!") }
  Await.ready(Http.server.serve(":8080", api.toServiceAs[Text.Plain]))
}
Notice also that the Await.ready() call is mandatory; without it, your program would exit right away.
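Once the server is running, a quick sanity check from a shell (illustrative; assumes the :8080 binding above):
curl http://localhost:8080/hello
Hello, World!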

Scala program isn't seeing dependencies downloaded via SBT

I'm writing a script to try to get Cassandra and Spark working together, but I can't even get the program to compile. I am using SBT as the build tool, and I have declared all the dependencies the program requires. The first time I ran sbt run it downloaded the dependencies, but it then failed with the compilation errors shown below:
[info] Compiling 1 Scala source to /home/vagrant/ScalaTest/target/scala-2.10/classes...
[error] /home/vagrant/ScalaTest/src/main/scala/ScalaTest.scala:6: not found: type SparkConf
[error] val conf = new SparkConf(true)
[error] ^
[error] /home/vagrant/ScalaTest/src/main/scala/ScalaTest.scala:9: not found: type SparkContext
[error] val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Jun 5, 2015 2:40:09 PM
This is the SBT build file
lazy val root = (project in file(".")).
  settings(
    name := "ScalaTest",
    version := "1.0"
  )

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.3.0-M1"
And this is the actual Scala program:
import com.datastax.spark.connector._

object ScalaTest {
  def main(args: Array[String]) {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
  }
}
Here is my directory structure
- ScalaTest
- build.sbt
- project
- src
- main
- scala
- ScalaTest.scala
- target
I don't know if this is the problem, but you're not importing the SparkConf and SparkContext class definitions. Try adding the following to your Scala file:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
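With those imports in place, the whole file (otherwise unchanged from the question) would look like:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import com.datastax.spark.connector._

object ScalaTest {
  def main(args: Array[String]) {
    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext("spark://192.168.10.11:7077", "test", conf)
  }
}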

SBT cannot resolve class declared in src/main/scala in a src/test/scala test class

I am trying to write my own custom CSV reader. I am using IntelliJ IDEA 14 with sbt and the specs2 test framework.
The class I declared in src/main is as follows:
import java.io.FileInputStream
import scala.io.Source

class CSVStream(filePath: String) {
  val csvStream = Source.fromInputStream(new FileInputStream(filePath)).getLines()
  val headers = csvStream.next().split("\\,", -1)
}
The content of the test file in src/test is as follows:
import org.specs2.mutable._

object CSVStreamSpec {
  val csvSourcePath = getClass.getResource("/csv_source.csv").getPath
}

class CSVStreamSpec extends Specification {
  import CSVStreamLib.CSVStreamSpec._

  "The CSV Stream reader" should {
    "Extract the header" in {
      val csvSource = CSVStream(csvSourcePath)
    }
  }
}
The build.sbt file contains the following:
name := "csvStreamLib"
version := "1.0"
scalaVersion := "2.11.4"
libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "2.4.15" % "test")
parallelExecution in Test := false
The error I get when I run test is as follows:
[error] /Users/raiyan/IdeaProjects/csvStreamLib/src/test/scala/csvStreamSpec.scala:18: not found: value CSVStream
[error] val csvSource = CSVStream(csvSourcePath)
[error] ^
[error] one error found
[error] (test:compile) Compilation failed
[error] Total time: 23 s, completed 30-Dec-2014 07:44:46
How do I make the CSVStream class accessible to the CSVStreamSpec class in the test file?
Update:
I tried it with sbt from the command line. The result is the same.
You forgot the new keyword. Without it, the compiler looks for a companion object named CSVStream, not the class. Since there is none, it complains. Add new and it'll work.
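That is, the line in the test becomes:
val csvSource = new CSVStream(csvSourcePath)
Alternatively, if you want the new-less call site to keep working, you could add a factory method in a companion object (a sketch; this object is not part of the original code):
object CSVStream {
  // Hypothetical companion so CSVStream(path) resolves without `new`
  def apply(filePath: String): CSVStream = new CSVStream(filePath)
}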

Can't Run `test` in Scala Test with SBT

src/main/scala/Testing.scala
package common

object Add1Method {
  def main(args: Array[String]) = 100 + 2
}
project/build.sbt
name := "Foo"
version := "1.0"
scalaVersion := "2.10.2"
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "1.9.1" % "test"
resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
resolvers += "Sonatype Releases" at "http://oss.sonatype.org/content/repositories/releases"
src/test/scala/Test.scala
package test

import common.Add1Method
import org.scalatest._

class Test extends FlatSpec with Matchers {
  "running main" should "return 102" in {
    val result = Add1Method.main(Array("asdf"))
    assert(result == 102)
  }
}
But when I run test from SBT, I get the following 4 compile-time errors:
[error] Test.scala:4: object scalatest is not a member of package org
[error] import org.scalatest._
[error] ^
[error] Test.scala:6: not found: type FlatSpec
[error] class Test extends FlatSpec with Matchers {
[error] ^
[error] Test.scala:6: not found: type Matchers
[error] class Test extends FlatSpec with Matchers {
[error] ^
[error] Test.scala:8: value should is not a member of String
[error] "running main" should "return 102" in {
[error] ^
[error] four errors found
Note that I tried the suggested answer in "SBT not finding scalatest for scala 2.10.1" without success.
The ScalaTest quick start example uses the same imports: http://www.scalatest.org/quick_start.
I think the problem is that your build.sbt is in the wrong place. It should not be in project/ but in the root directory, next to the src directory.
See Directories in the sbt documentation for more info.
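For reference, the intended layout looks roughly like this (a sketch assembled from the files named in the question; build.properties is the usual content of project/):
- Foo
  - build.sbt
  - project
    - build.properties
  - src
    - main
      - scala
        - Testing.scala
    - test
      - scala
        - Test.scala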