Finch Hello World Error: Http not a member of com.twitter.finagle - scala

I'm trying to use the scala finch library to build an API.
I have the following simple code:
package example
import io.finch._
import com.twitter.finagle.Http
object HelloWorld extends App {
  val api: Endpoint[String] = get("hello") { Ok("Hello, World!") }
  Http.serve(":8080", api.toService)
}
And a build.sbt file that looks like this:
name := "hello-finch"
version := "1.0"
scalaVersion := "2.10.6"
mainClass in (Compile, run) := Some("example.HelloWorld")
libraryDependencies ++= Seq(
  "com.github.finagle" %% "finch-core" % "0.10.0"
)
// found here: https://github.com/finagle/finch/issues/604
addCompilerPlugin(
  "org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full
)
When I compile and run the code I get this error message:
object Http is not a member of package com.twitter.finagle
[error] import com.twitter.finagle.Http
[error] ^
[error] /Users/jamesk/Code/hello_finch/hello-finch/src/main/scala/example/Hello.scala:8: wrong number of type arguments for io.finch.Endpoint, should be 2
[error] val api: Endpoint[String] = get("hello") { Ok("Hello, World!") }
[error] ^
[error] /Users/jamesk/Code/hello_finch/hello-finch/src/main/scala/example/Hello.scala:8: not found: value get
[error] val api: Endpoint[String] = get("hello") { Ok("Hello, World!") }
[error] ^
[error] /Users/jamesk/Code/hello_finch/hello-finch/src/main/scala/example/Hello.scala:10: not found: value Http
[error] Http.serve(":8080", api.toService)
[error] ^
[error] four errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 1 s, completed Aug 15, 2017 12:56:01 PM
At this point I'm running out of ideas; it looks like a good library, but it's a pain getting it working. Any help would be very much appreciated.

I have updated your example to work with the latest version of Finch ("com.github.finagle" %% "finch-core" % "0.15.1") and with Scala 2.12.
The build.sbt file:
name := "hello-finch"
version := "1.0"
scalaVersion := "2.12.2"
mainClass in (Compile, run) := Some("example.HelloWorld")
libraryDependencies ++= Seq(
  "com.github.finagle" %% "finch-core" % "0.15.1"
)
Then, the src/main/scala/example/HelloWorld.scala file:
package example
import io.finch._
import com.twitter.finagle.Http
import com.twitter.util.Await
object HelloWorld extends App {
  val api: Endpoint[String] = get("hello") { Ok("Hello, World!") }
  Await.ready(Http.server.serve(":8080", api.toServiceAs[Text.Plain]))
}
Notice also that the Await.ready() call is mandatory; without it your program would exit right away.
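If you later need more than one route, the same setup extends with Finch's :+: combinator. Here is a minimal sketch, assuming Finch 0.15.x coproduct syntax (the extra ping endpoint is purely illustrative):
package example
import io.finch._
import com.twitter.finagle.Http
import com.twitter.util.Await
object HelloWorld extends App {
  val hello: Endpoint[String] = get("hello") { Ok("Hello, World!") }
  val ping: Endpoint[String] = get("ping") { Ok("pong") }
  // :+: combines the endpoints into a single coproduct endpoint,
  // served as one text/plain service.
  Await.ready(Http.server.serve(":8080", (hello :+: ping).toServiceAs[Text.Plain]))
}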

Related

Can't find SttpBackends + "Error occurred in an application involving default arguments."

I'm trying to create an extremely simple Telegram bot in Scala using bot4s. I'm pretty much following the example there. Here's the code:
package info.jjmerelo.BoBot
import cats.instances.future._
import cats.syntax.functor._
import com.bot4s.telegram.api.RequestHandler
import com.bot4s.telegram.api.declarative.Commands
import com.bot4s.telegram.clients.{FutureSttpClient, ScalajHttpClient}
import com.bot4s.telegram.future.{Polling, TelegramBot}
import scala.util.Try
import scala.concurrent.Future
import com.typesafe.scalalogging.Logger
object BoBot extends TelegramBot
  with Polling
  with Commands[Future] {
  implicit val backend = SttpBackends.default
  def token = sys.env("BOBOT_TOKEN")
  override val client: RequestHandler[Future] = new FutureSttpClient(token)
  val log = Logger("BoBot")
  // val lines = scala.io.Source.fromFile("hitos.json").mkString
  // val hitos = JSON.parseFull( lines )
  // val solo_hitos = hitos.getOrElse( hitos )
  onCommand("hey") { implicit msg =>
    log.info("Hello")
    reply("Conseguí que funcionara").void
  }
}
And here's the build.sbt
name := "bobot"
version := "0.0.1"
organization := "info.jjmerelo"
libraryDependencies += "com.bot4s" %% "telegram-core" % "4.4.0-RC2"
val circeVersion = "0.12.3"
libraryDependencies ++= Seq(
  "io.circe" %% "circe-core",
  "io.circe" %% "circe-generic",
  "io.circe" %% "circe-parser"
).map(_ % circeVersion)
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
retrieveManaged := true
Circe is for later
Anyway, I managed to compile most of it, but I still get these two errors:
[info] compiling 2 Scala sources to /home/jmerelo/Asignaturas/cloud-computing/BoBot/target/scala-2.12/classes ...
[error] /home/jmerelo/Asignaturas/cloud-computing/BoBot/src/main/scala/info/jjmerelo/BoBot.scala:21:26: not found: value SttpBackends
[error] implicit val backend = SttpBackends.default
[error] ^
[error] /home/jmerelo/Asignaturas/cloud-computing/BoBot/src/main/scala/info/jjmerelo/BoBot.scala:23:49: could not find implicit value for parameter backend: com.softwaremill.sttp.SttpBackend[scala.concurrent.Future,Nothing]
[error] Error occurred in an application involving default arguments.
[error] override val client: RequestHandler[Future] = new FutureSttpClient(token)
[error] ^
[error] two errors found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 5 s, completed 11 nov. 2020 8:19:38
I can't figure out either of the two. SttpBackends is missing, that's clear, but there's nothing in the example that indicates it's needed or, for that matter, which library should be included. The second one, about the default arguments, I simply can't figure out, even if I declare token as a String or change def to val. Any ideas?
Your error messages are related to each other.
The first error tells us that the compiler couldn't find the object SttpBackends, which holds a field of type SttpBackend.
The second one tells us that the compiler couldn't find an implicit backend: SttpBackend for constructing FutureSttpClient. It requires two implicits: an SttpBackend and an ExecutionContext.
class FutureSttpClient(token: _root_.scala.Predef.String,
                       telegramHost: _root_.scala.Predef.String = { /* compiled code */ })
                      (implicit backend: com.softwaremill.sttp.SttpBackend[scala.concurrent.Future, scala.Nothing],
                       ec: scala.concurrent.ExecutionContext)
  extends com.bot4s.telegram.clients.SttpClient[scala.concurrent.Future] {...}
You can create it yourself, as in the bot4s examples.
If you look for the SttpBackends object in the bot4s repository, you will find this code in the bot4s examples:
import com.softwaremill.sttp.okhttp._
object SttpBackends {
  val default: SttpBackend[Future, Nothing] = OkHttpFutureBackend()
}
Add this object to your project to make it compile.
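Note that OkHttpFutureBackend comes from sttp's OkHttp backend module, so that module has to be on the classpath as well if telegram-core does not already pull it in. The coordinates and version below are an assumption for the sttp 1.x line that bot4s 4.4.0-RC2 builds against; match the version to whatever sttp your build actually resolves:
// build.sbt (assumed coordinates/version for the sttp 1.x OkHttp backend)
libraryDependencies += "com.softwaremill.sttp" %% "okhttp-backend" % "1.7.2"
Putting the SttpBackends object in the same package as BoBot (or importing it) lets SttpBackends.default resolve, which in turn provides the implicit SttpBackend that new FutureSttpClient(token) needs.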

h2o scala code compile error not found object ai

I am trying to compile and run simple h2o Scala code, but when I do sbt package I get errors.
Am I missing something in the sbt file?
This is my h2o Scala code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql._
import ai.h2o.automl.AutoML
import ai.h2o.automl.AutoMLBuildSpec
import org.apache.spark.h2o._
object H2oScalaEg1 {
  def main(args: Array[String]): Unit = {
    val sparkConf1 = new SparkConf().setMaster("local[2]").setAppName("H2oScalaEg1App")
    val sparkSession1 = SparkSession.builder.config(conf = sparkConf1).getOrCreate()
    val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)
    import h2oContext._
    import java.io.File
    import h2oContext.implicits._
    import water.Key
  }
}
And this is my sbt file.
name := "H2oScalaEg1Name"
version := "1.0"
scalaVersion := "2.11.12"
scalaSource in Compile := baseDirectory.value / ""
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"
libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" % "runtime" pomOnly()
When I do sbt package I get these errors:
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:7:8: not found: object ai
[error] import ai.h2o.automl.AutoML
[error] ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:8:8: not found: object ai
[error] import ai.h2o.automl.AutoMLBuildSpec
[error] ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:10:25: object h2o is not a member of package org.apache.spark
[error] import org.apache.spark.h2o._
[error] ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:20:20: not found: value H2OContext
[error] val h2oContext = H2OContext.getOrCreate(sparkSession1.sparkContext)
[error] ^
[error] /home/myuser1/h2oScalaEg1/H2oScalaEg1.scala:28:10: not found: value water
[error] import water.Key
[error] ^
[error] 5 errors found
How can I fix this problem?
My Spark version is spark-2.2.3-bin-hadoop2.7.
Thanks,
marrel
pomOnly() in build.sbt tells the dependency management handlers that no jar artifacts should be loaded for this dependency and that only the metadata should be looked up.
Try to use libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3" instead.
Edit 1: Additionally I think you are missing (at least) one library dependency:
libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"
see: https://search.maven.org/artifact/ai.h2o/h2o-automl/3.22.1.5/pom
Edit 2:
The last dependency you are missing is sparkling-water-core:
libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6" should do the trick.
Here is the GitHub of sparkling-water/core/src/main/scala/org/apache/spark/h2o.
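Putting the three edits together, the dependency section of build.sbt would look roughly like this (versions taken from the snippets above):
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.3"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"
// h2o-core as a regular jar dependency instead of pomOnly()
libraryDependencies += "ai.h2o" % "h2o-core" % "3.22.1.3"
// ai.h2o.automl.AutoML and AutoMLBuildSpec live in h2o-automl
libraryDependencies += "ai.h2o" % "h2o-automl" % "3.22.1.3"
// org.apache.spark.h2o._ and H2OContext come from Sparkling Water
libraryDependencies += "ai.h2o" % "sparkling-water-core_2.11" % "2.4.6"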

Not found value spark SBT project

Hi, I am trying to set up a small Spark application in SBT.
My build.sbt is:
import Dependencies._
name := "hello"
version := "1.0"
scalaVersion := "2.11.8"
val sparkVersion = "1.6.1"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-streaming-twitter" % sparkVersion
)
libraryDependencies += scalaTest % Test
Everything works fine and SBT resolves all the dependencies, but when I try to use spark in my hello.scala project file I get this error:
not found: value spark
My hello.scala file is:
package example
import org.apache.spark._
import org.apache.spark.SparkContext._
object Hello extends fileImport with App {
  println(greeting)
  anime.select("*").orderBy($"rating".desc).limit(10).show()
}
trait fileImport {
  lazy val greeting: String = "hello"
  var anime = spark.read.option("header", true).csv("C:/anime.csv")
  var ratings = spark.read.option("header", true).csv("C:/rating.csv")
}
Here is the error I get:
[info] Compiling 1 Scala source to C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\target\scala-2.11\classes...
[error] C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\src\main\scala\example\Hello.scala:12: not found: value spark
[error] var anime = spark.read.option("header", true).csv("C:/anime.csv")
[error] ^
[error] C:\Users\haftab\Downloads\sbt-0.13.16\sbt\alfutaim\src\main\scala\example\Hello.scala:13: not found: value spark
[error] var ratings = spark.read.option("header", true).csv("C:/rating.csv")
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 3 s, completed Sep 10, 2017 1:44:47 PM
spark is initialized for you in spark-shell only; in your own code you need to initialize the spark variable yourself:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder().appName("testings").master("local").getOrCreate
You can change the testings name to whatever you like; the .master option is optional if you want to run the code using spark-submit.
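Applied to the trait from the question, this looks roughly like the sketch below. Note that SparkSession was introduced in Spark 2.0 and lives in the spark-sql module, so you would also have to add "org.apache.spark" %% "spark-sql" to your dependencies (and use a 2.x sparkVersion); the $"rating" syntax used in Hello additionally needs import spark.implicits._ in scope.
import org.apache.spark.sql.SparkSession
trait fileImport {
  lazy val greeting: String = "hello"
  // Build the session yourself; only spark-shell pre-defines a `spark` value.
  val spark = SparkSession.builder().appName("testings").master("local").getOrCreate()
  var anime = spark.read.option("header", true).csv("C:/anime.csv")
  var ratings = spark.read.option("header", true).csv("C:/rating.csv")
}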

ToolBox Import Error

I'm getting the following error when compiling the following toy class:
package com.example
import scala.tools.reflect.ToolBox
import scala.reflect.runtime.{currentMirror => m}
object Hello {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
  }
}
[info] Loading project definition from /Users/me/Temp/Bar/project
[info] Set current project to Bar (in build file:/Users/me/Temp/Bar/)
[info] Compiling 1 Scala source to /Users/me/Temp/Bar/target/scala-2.11/classes...
[error] /Users/me/Temp/Bar/src/main/scala/com/example/Hello.scala:3: object tools is not a member of package scala
[error] import scala.tools.reflect.ToolBox
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
This is my build.sbt file:
name := """Bar"""
version := "1.0"
scalaVersion := "2.11.8"
// Change this to another test framework if you prefer
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.8"
// Uncomment to use Akka
//libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.11"
The following dependency fixed the issue:
libraryDependencies += "org.scala-lang" % "scala-compiler" % "2.11.8"
Is this the best solution?
The ToolBox class is part of the compiler, not the public reflection API.
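Adding scala-compiler is the usual way to get ToolBox on the classpath; a common refinement is to track the Scala version with scalaVersion.value instead of hard-coding it. Once it resolves, the toolbox can be used like this (a minimal sketch):
// build.sbt
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
// Hello.scala
package com.example
import scala.tools.reflect.ToolBox
import scala.reflect.runtime.{currentMirror => m}
object Hello {
  def main(args: Array[String]): Unit = {
    // mkToolBox is an enrichment on the mirror provided by the ToolBox import.
    val tb = m.mkToolBox()
    println(tb.eval(tb.parse("1 + 1"))) // prints 2
  }
}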

Can't Run `test` in Scala Test with SBT

src/main/scala/Testing.scala
package common
object Add1Method {
  def main(args: Array[String]) = 100 + 2
}
project/build.sbt
name := "Foo"
version := "1.0"
scalaVersion := "2.10.2"
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "1.9.1" % "test"
resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
resolvers += "Sonatype Releases" at "http://oss.sonatype.org/content/repositories/releases"
src/test/scala/Test.scala
package test
import common.Testing
import org.scalatest._
class Test extends FlatSpec with Matchers {
  "running main" should "return 102" in {
    val result = Add1Method.main(Array("asdf"))
    assert(result == 102)
  }
}
But when I run test from SBT, I get the following 4 compile-time errors:
[error] Test.scala:4: object scalatest is not a member of package org
[error] import org.scalatest._
[error] ^
[error] Test.scala:6: not found: type FlatSpec
[error] class Test extends FlatSpec with Matchers {
[error] ^
[error] Test.scala:6: not found: type Matchers
[error] class Test extends FlatSpec with Matchers {
[error] ^
[error] Test.scala:8: value should is not a member of String
[error] "running main" should "return 102" in {
[error] ^
[error] four errors found
Note that I tried the suggested answer in SBT not finding scalatest for scala 2.10.1 without success.
The ScalaTest example uses the same imports: http://www.scalatest.org/quick_start.
I think the problem is that your build.sbt is in the wrong place. It should not be in project/ but in the root directory, next to the src directory.
See Directories in the sbt documentation for more info.
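In other words, the expected layout is (file names taken from the question):
Foo/
├── build.sbt        <- move it here, next to src/
├── project/         <- only sbt's own configuration (e.g. build.properties) belongs here
└── src/
    ├── main/scala/Testing.scala
    └── test/scala/Test.scala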