Not found: type Build (unresolved 'Build') - Scala

I'm getting the following set of errors, which I believe are caused by the sbt-assembly plugin that is used.
The failure is in the object declaration:
object Build extends Build { (here Build is unresolved).
The error is as follows,
Error: Error while importing SBT project:
[info] Loading settings from assembly.sbt,plugins.sbt ...
[info] Loading project definition from C:\Users\ShakthiW\IdeaProjects\TestProject\project
[error] <path>\Build.scala:4:22: not found: type Build
[error] object Build extends Build{
[error] ^
[error] <path>\Build.scala:8:80: value map is not a member of (sbt.TaskKey[sbt.librarymanagement.UpdateReport], sbt.SettingKey[java.io.File], sbt.SettingKey[String])
[error] def copyDepTask = copyDependencies <<= (update, crossTarget, scalaVersion) map {
[error] ^
[error] <path>\Build.scala:19:16: too many arguments (3) for method apply: (id: String, base: java.io.File)sbt.Project in object Project
[error] Note that 'settings' is not a parameter name of the invoked method.
[error] settings = Defaults.defaultSettings ++ Seq(
[error] ^
[error] three errors found
[error] (compile:compileIncremental) Compilation failed
A quick resolution would be highly appreciated.
My Build.scala looks like this:
import sbt.Keys._
import sbt._

object MyBuild extends Build {

  lazy val copyDependencies = TaskKey[Unit]("copy-dependencies")

  def copyDepTask = copyDependencies <<= (update, crossTarget, scalaVersion) map {
    (updateReport, out, scalaVer) =>
      updateReport.allFiles foreach { srcPath =>
        val destPath = out / "lib" / srcPath.getName
        IO.copyFile(srcPath, destPath, preserveLastModified = true)
      }
  }

  lazy val root = Project(
    "root",
    file("."),
    settings = Defaults.defaultSettings ++ Seq(
      copyDepTask
    )
  )
}
Also, I reckon there is an issue with sbt-assembly upgrades as well, which I'm not entirely aware of.

In sbt 1.0.x, the Build trait and key dependency operators such as <<= were removed. See the migration docs: https://www.scala-sbt.org/0.13/docs/Migrating-from-sbt-012x.html
Here is a short tutorial on writing Build.scala for sbt 1.0.x: https://alvinalexander.com/scala/sbt-how-to-use-build.scala-instead-of-build.sbt.
You can also refer to the build.scala of an existing project for reference, e.g. scalaz.
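For illustration, here is a minimal, untested sketch of how the build from the question could be expressed in sbt 1.x syntax as a build.sbt (assumptions: the goal is still to copy all resolved dependencies into crossTarget/lib, and the unused scalaVersion parameter is dropped):

// build.sbt — sketch only; the Build trait and <<= are gone in sbt 1.x,
// so the task is wired up with := and .value instead.
lazy val copyDependencies = taskKey[Unit]("copy-dependencies")

lazy val root = (project in file("."))
  .settings(
    copyDependencies := {
      val updateReport = update.value  // replaces the (update, ...) map tuple
      val out = crossTarget.value
      updateReport.allFiles.foreach { srcPath =>
        val destPath = out / "lib" / srcPath.getName
        IO.copyFile(srcPath, destPath, preserveLastModified = true)
      }
    }
  )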

How to fix "Could not find proxy for val base" error in sbt 1.3.0 loading project

I upgraded to sbt 1.3.0 and the related plugins.sbt. When I try to start sbt for my project, it fails to initialize with the error:
java.lang.IllegalArgumentException: Could not find proxy for val base: sbt.SettingKey in List(value base, method sbtdef$1, method $sbtdef, object $bd1712fb73ddc970045f, package <empty>, package <root>) (currentOwner= method $sbtdef )
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:316)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.$anonfun$proxy$4(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.searchIn$1(LambdaLift.scala:321)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.proxy(LambdaLift.scala:330)
[error] at scala.tools.nsc.transform.LambdaLift$LambdaLifter.proxyRef(LambdaLift.scala:370)
I did find this Stack Overflow question, Could not find proxy for ... in Macro, but I don't think it helps with my error.
I think the offending code is:
// Ensure that version.sbt is included with each package.
mappings in Universal ++= {
  val h = (packageBin in Compile, baseDirectory)
  val base = h._2
  val versionFile = base.value / "version.sbt"
  versionFile.get.map(file => file -> file.name)
}
and for some reason base is not storing (packageBin in Compile, baseDirectory) properly?
Edit:
I'm not 100% sure, but I think I fixed it by removing the intermediate variables and one-lining it. So something like this:
mappings in Universal ++= {
  ((packageBin in Compile, baseDirectory)._2.value / "version.sbt").get.map(file => file -> file.name)
}
I don't know why it fixed it though...
I think the OP has confused the example with the ineffectual tuple use; there seems to be some misunderstanding of the sbt API/DSL, in that packageBin in Compile is never actually used or resolved (for its side effect).
I believe the error, however, has more to do with expressing the mappings in Universal task value in a way the settings macro cannot process. It gets confused when, for instance, it is expected to find the task key inside a variable like base, which is the _2 of a Tuple2.
The example could be rewritten as
mappings in Universal ++= {
  (baseDirectory.value / "version.sbt").get.map(file => file -> file.name)
}
or
mappings in Universal ++= {
  ((baseDirectory in (Compile, packageBin)).value / "version.sbt").get.map(file => file -> file.name)
}
Depending on what the intention was (probably the latter).
Of course, the new sbt 1.x slash syntax would be:
mappings in Universal ++= {
  ((Compile / packageBin / baseDirectory).value / "version.sbt").get.map(file => file -> file.name)
}

sbt - object apache is not a member of package org

I want to deploy and submit a Spark program using sbt, but it's throwing an error.
Code:
package in.goai.spark

import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache
    val c = lines.count
    println(s"There are $c lines in $fileName")
  }
}
build.sbt
name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
resolvers += Resolver.mavenLocal
Under the first/project directory, build.properties contains:
bt.version=0.13.9
When I try to run sbt package, it throws the error given below.
[root@hadoop first]# sbt package
[info] Loading project definition from /home/training/workspace_spark/first/project
[info] Set current project to First Spark (in build file:/home/training/workspace_spark/first/)
[info] Compiling 1 Scala source to /home/training/workspace_spark/first/target/scala-2.11/classes...
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:3: object apache is not a member of package org
[error] import org.apache.spark.{SparkContext, SparkConf}
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:9: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("First Spark")
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:11: not found: type SparkContext
[error] val sc = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 4 s, completed May 10, 2018 4:05:10 PM
I have tried with extends App too, but no change.
Remove resolvers += Resolver.mavenLocal from build.sbt. Since spark-core is available on Maven Central, there is no need for a local resolver.
After that, you can try sbt clean package.
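For reference, the cleaned-up build.sbt would look like this (same coordinates as in the question, just without the local resolver):

name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"

Also double-check project/build.properties: the question shows bt.version=0.13.9, but the key sbt reads is sbt.version, so the line should be sbt.version=0.13.9.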

SBT cannot resolve class declared in src/main/scala in a src/test/scala test class

I am trying to write my own custom CSV reader. I am using IntelliJ IDEA 14 with sbt and the specs2 test framework.
The class I declared in src/main is as follows:
import java.io.FileInputStream
import scala.io.Source

class CSVStream(filePath: String) {
  val csvStream = Source.fromInputStream(new FileInputStream(filePath)).getLines()
  val headers = csvStream.next().split("\\,", -1)
}
The content of the test file in src/test is as follows:
import org.specs2.mutable._

object CSVStreamSpec {
  val csvSourcePath = getClass.getResource("/csv_source.csv").getPath
}

class CSVStreamSpec extends Specification {
  import CSVStreamLib.CSVStreamSpec._

  "The CSV Stream reader" should {
    "Extract the header" in {
      val csvSource = CSVStream(csvSourcePath)
    }
  }
}
The build.sbt file contains the following:
name := "csvStreamLib"
version := "1.0"
scalaVersion := "2.11.4"
libraryDependencies ++= Seq("org.specs2" %% "specs2-core" % "2.4.15" % "test")
parallelExecution in Test := false
The error I am getting when I type test is as follows:
[error] /Users/raiyan/IdeaProjects/csvStreamLib/src/test/scala/csvStreamSpec.scala:18: not found: value CSVStream
[error] val csvSource = CSVStream(csvSourcePath)
[error] ^
[error] one error found
[error] (test:compile) Compilation failed
[error] Total time: 23 s, completed 30-Dec-2014 07:44:46
How do I make the CSVStream class accessible to the CSVStreamSpec class in the test file?
Update:
I tried it with sbt on the command line; the result is the same.
You forgot the new keyword. Without it, the compiler looks for the companion object named CSVStream, not the class. Since there is none, it complains. Add new and it'll work.
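A minimal sketch of the fix in the test body:

val csvSource = new CSVStream(csvSourcePath)

Alternatively, if you prefer the constructor-less call syntax, you could add a companion object with an apply method to the class in src/main (an optional addition, not in the original code):

object CSVStream {
  // Forwards to the class constructor so CSVStream(path) works without new.
  def apply(filePath: String): CSVStream = new CSVStream(filePath)
}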

Writing a custom matcher for NodeSeq

I'm trying to write a simple custom matcher for NodeSeq with ScalaTest v2.0.M5b.
package test

import org.scalatest.matchers.{MatchResult, Matcher, ShouldMatchers}
import scala.xml.NodeSeq
import org.scalatest.FunSpec

class MySpec extends FunSpec with ShouldMatchers with MyMatcher {
  describe("where is wrong?") {
    it("showOK") {
      val xml = <span>abc</span>
      xml should contn("b")
    }
  }
}

trait MyMatcher {
  class XmlMatcher(str: String) extends Matcher[NodeSeq] {
    def apply(xml: NodeSeq) = {
      val x = xml.toString.contains(str)
      MatchResult(
        x,
        "aaa",
        "bbb"
      )
    }
  }

  def contn(str: String) = new XmlMatcher(str)
}
When I compile it, it reports this error:
[error] /Users/freewind/src/test/scala/test/MyMacher.scala:14: overloaded method value should with alternatives:
[error] (beWord: MySpec.this.BeWord)MySpec.this.ResultOfBeWordForAnyRef[scala.collection.GenSeq[scala.xml.Node]] <and>
[error] (notWord: MySpec.this.NotWord)MySpec.this.ResultOfNotWordForAnyRef[scala.collection.GenSeq[scala.xml.Node]] <and>
[error] (haveWord: MySpec.this.HaveWord)MySpec.this.ResultOfHaveWordForSeq[scala.xml.Node] <and>
[error] (rightMatcher: org.scalatest.matchers.Matcher[scala.collection.GenSeq[scala.xml.Node]])Unit
[error] cannot be applied to (MySpec.this.XmlMatcher)
[error] xml should contn("b")
[error] ^
[error] one error found
[error] (test:compile) Compilation failed
What is wrong?
Update:
The build.sbt file I use:
name := "scalatest-test"
scalaVersion := "2.10.1"
version := "1.0"
resolvers ++= Seq(
  "snapshots" at "http://oss.sonatype.org/content/repositories/snapshots",
  "releases" at "http://oss.sonatype.org/content/repositories/releases",
  "googlecode" at "http://sass-java.googlecode.com/svn/repo"
)
libraryDependencies += "org.scalatest" %% "scalatest" % "2.0.M5b" % "test"
And a demo project: https://github.com/freewind/scalatest-test
For the reason why the Scala compiler complains, see this answer.
But it seems that the ScalaTest API has changed quite a bit since then, so the two proposed solutions both need some modification (tested with ScalaTest 2.0.M5b):
Replace all instances of NodeSeq with GenSeq[Node] so that the types match everywhere.
See the SeqShouldWrapper class of ScalaTest.
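Applied to the code in the question, a sketch of that first option (the placeholder "aaa"/"bbb" failure messages are kept as-is):

import org.scalatest.matchers.{MatchResult, Matcher}
import scala.collection.GenSeq
import scala.xml.Node

trait MyMatcher {
  // The matcher is typed at GenSeq[Node], which is the type ScalaTest's
  // should-wrapper resolves XML literals to (see the error message above).
  class XmlMatcher(str: String) extends Matcher[GenSeq[Node]] {
    def apply(xml: GenSeq[Node]) = {
      val x = xml.toString.contains(str)
      MatchResult(x, "aaa", "bbb")
    }
  }

  def contn(str: String) = new XmlMatcher(str)
}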
Alternatively, wrap xml explicitly with the conversion function, i.e. manually set the required type. I don't recommend this, because it makes client code ugly:
new AnyRefShouldWrapper(xml).should(contn("b"))
By the way, it is good to have a small but complete project on GitHub for others to tweak; it makes the question much more attractive.

actual argument play.data.Form cannot be converted to play.api.data.Form

Play doesn't convert my Java form object to the Scala world.
[error] /home/myproject/split/frontend/app/controllers/frontend/Configuration.java:46: error: method render in class settings cannot be applied to given types;
[error] return ok(settings.render(settingsForm.fill(userSettings)));
[error] ^
[error] required: play.api.data.Form<Settings>
[error] found: play.data.Form<Settings>
[error] reason: actual argument play.data.Form<Settings> cannot be converted to play.api.data.Form<Settings> by method invocation conversion
The view template looks like this:
@(settingsForm: Form[Settings])

@import play.i18n._
@import helper._
@import helper.twitterBootstrap._

@main {
  @helper.form(action = controllers.frontend.routes.Configuration.setSettings) {
Any idea?
I should also mention that we use a project split: main->frontend->common and main->backend->common. We moved this page (view and controller) from common to frontend. It worked fine in common; now in frontend I get this error.
I actually had a similar problem with a java.util.List and I had to add templatesImport ++= Seq("java.util._", ... to the settings:
val frontend = play.Project(
  appName + "-frontend", appVersion, path = file("main/frontend")
).settings(
  templatesImport ++= Seq("java.util._", "models.frontend._")
).dependsOn(common).aggregate(common)
I tried with play.data._ already, didn't help.
Your frontend project is a Scala project, not a Java project. Add a dependency on javaCore to it, and it will become a Java project. Then run play clean compile, and everything should work. E.g.:
val frontend = play.Project(
  appName + "-frontend", appVersion, Seq(javaCore), path = file("main/frontend")
).settings(
  templatesImport ++= Seq("java.util._", "models.frontend._")
).dependsOn(common).aggregate(common)