I am porting a small legacy library from Scala 2.12 to Scala 2.13. The sbt version is 1.3.3. The project is flat and relatively simple, and the scalaVersion declared in the project is 2.13.1.
I am executing the clean and compile tasks, and then publishing to both the local Ivy repository and to Artifactory.
The process seemingly goes fine and creates the artifact with the _2.13 suffix. When this binary gets executed against the Scala 2.13 runtime, it fails with a NoSuchMethodError. Further introspection shows that the artifact was compiled for 2.12, not for 2.13.
Does anybody have an idea why a different compiler version was used by sbt, and how to fix this problem?
I just had a similar issue: sbt was compiling my project against the wrong Scala version, and I found this question on Google without any answers.
My problem turned out to be very simple. It turns out you need to start sbt in the project root directory (where build.sbt is located). I was running it from the directory where all my .scala files were located, so it never parsed build.sbt and compiled the project using the default Scala version (2.12 in my case).
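For reference, here is a minimal build.sbt sketch along those lines (the name and organization are hypothetical; the versions mirror the ones above). With this file in the project root, starting sbt from that directory picks up the declared Scala version:

// build.sbt -- minimal sketch; name and organization are illustrative
name := "legacy-library"
organization := "com.example"
version := "1.0.0"
// Without this setting (or when build.sbt is never parsed because sbt was
// started in the wrong directory), sbt falls back to its default Scala version.
scalaVersion := "2.13.1"

Running sbt clean compile publishLocal from the directory containing this file should then produce a _2.13 artifact.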
I have been using sbt on Windows with a custom build.sbt script, in conjunction with an import Chisel._ in the top-level file, to generate Verilog from my Chisel source successfully.
I'm trying to get an IDE working on Windows to expedite Chisel development. I've gone with the Eclipse-based Scala IDE: http://scala-ide.org/download/sdk.html/
I want to compile the Chisel library so that the import Chisel._ can be resolved locally, without having to go off and download the source from the repository each time and recompile it. When I download the Chisel-master repo from Git and include the src\main folder in my Scala project in the Scala IDE, I get lots of syntax errors in the Chisel Scala files that prevent me from building the project.
Has anyone done anything like this before on Windows, or does anyone have knowledge of working with the Scala IDE? It may just be a case of undefined symbols in the project configuration.
I'm not sure exactly what you did with build.sbt with respect to recompiling (I think it downloads the source only the first time, then caches it for the future). But I'm using the Scala IDE for Chisel on Linux with the default build.sbt files; maybe you can try to get that working out of the box first to help narrow down the issue.
Here are the steps I took to get the Scala IDE working with Chisel:
- The latest Scala IDE uses Scala 2.11.8, while the current Chisel repository defaults to 2.11.7, so I had to change every scalaVersion reference in the build.sbt files from 2.11.7 to 2.11.8.
- I used sbteclipse (https://github.com/typesafehub/sbteclipse) to create an importable workspace and set up the compilation dependencies; see the snippet after these steps.
- The exception is chiselFrontend: for some reason this package is not added as a dependency, so I have to add chiselFrontend as a Java build path dependency manually (Properties > Java Build Path, under Projects) for my own projects.
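For completeness, here is a sketch of how sbteclipse is typically wired in; the plugin version below is an assumption, so check the sbteclipse README for the one matching your sbt:

// project/plugins.sbt -- registers the sbteclipse plugin (version is illustrative)
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.2.4")

Then run sbt eclipse in the project root to generate the .project and .classpath files that the Scala IDE can import.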
To resolve undefined symbols, you can also add a JAR onto the project build path using Project Properties > Java Build Path > Libraries > Add External JARs...
If you are getting your JARs through Maven / sbt dependency resolution, they should be in the Ivy cache:
C:\Users\<name>\.ivy2\cache\edu.berkeley.cs\chisel3_2.11\jars
If you are using publishLocal with chisel3, your JARs should be in the local Ivy repository:
C:\Users\<name>\.ivy2\local\edu.berkeley.cs\chisel3_2.11\jars
Note that chisel3 is compiled into one JAR, including the coreMacros and chiselFrontend sub-projects.
Of course, this is a quick-and-dirty solution compared to something that can parse the sbt build files.
I have some main object:
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object Main {
  def main(args: Array[String]) {
    val sc = new SparkContext(
      new SparkConf().setMaster("local").setAppName("FakeProjectName")
    )
  }
}
...then I add spark-assembly-1.3.0-hadoop2.4.0.jar to the build path in Eclipse from
Project > Properties... > Java Build Path :
...and this warning appears in the Eclipse console:
More than one scala library found in the build path
(C:/Program Files/Eclipse/Indigo 3.7.2/configuration/org.eclipse.osgi/bundles/246/1/.cp/lib/scala-library.jar,
C:/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar).
This is not an optimal configuration, try to limit to one Scala library in the build path.
FakeProjectName Unknown Scala Classpath Problem
Then I remove Scala Library [2.10.2] from the build path, and it still works. Except now this warning appears in the Eclipse console:
The version of scala library found in the build path is different from the one provided by scala IDE:
2.10.4. Expected: 2.10.2. Make sure you know what you are doing.
FakeProjectName Unknown Scala Classpath Problem
Is this a non-issue? Either way, how do I fix it?
This is often a non-issue, especially when the version difference is small, but there are no guarantees...
The problem is (as stated in the warning) that your project has two Scala libraries on the class path. One is explicitly configured as part of the project; this is version 2.10.2 and is shipped with the Scala IDE plugins. The other copy has version 2.10.4 and is included in the Spark jar.
One way to fix the problem is to install a different version of Scala IDE, one that ships with 2.10.4. But this is not ideal. As noted here, Scala IDE requires every project to use the same library version:
http://scala-ide.org/docs/current-user-doc/gettingstarted/index.html#choosing-what-version-to-install
A better solution is to clean up the classpath by replacing the Spark jar you are using. The one you have is an assembly jar, which means it includes every dependency used in the build that produced it. If you are using sbt or Maven, you can remove the assembly jar and simply add Spark 1.3.0 and Hadoop 2.4.0 as dependencies of your project; every other dependency will be pulled in during your build. If you're not using sbt or Maven yet, then perhaps give sbt a spin: it is really easy to set up a build.sbt file with a couple of library dependencies, and sbt has a degree of support for specifying which library version to use.
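For example, a minimal build.sbt along these lines (a sketch, not a drop-in file; the versions match the ones discussed above):

// build.sbt -- replaces the assembly jar with managed dependencies
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  // %% appends the Scala binary version (_2.10) to the artifact name
  "org.apache.spark" %% "spark-core" % "1.3.0",
  "org.apache.hadoop" % "hadoop-client" % "2.4.0"
)

After an sbt update, your IDE integration can pick up the resolved jars, and only one Scala library remains on the classpath.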
The easiest solution:
In Eclipse:
1. Right-click the project and choose Properties
2. Go to Scala Compiler
3. Click Use Project Settings
4. Set Scala Installation to a compatible version, generally Fixed Scala Installation: 2.XX.X (built-in)
5. Rebuild the project.
There are 2 types of Spark JAR files (just by looking at the name):
- Name includes the word "assembly" and not "core" (has Scala inside)
- Name includes the word "core" and not "assembly" (no Scala inside)
You should include the "core" type in your build path via "Add External JARs"
(the version you need), since the Scala IDE already provides a Scala library for you.
Alternatively, you can just take advantage of sbt and add the following
dependency (again, pay attention to the versions you need):
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
Then you should NOT "forcefully" include any Spark JAR in the build path.
Happy sparking!
Zar
The reason I ask is that it's possible to specify a Scala version in the build.sbt file (using the scalaVersion setting), and once you do that, sbt will automatically download that Scala version to use with the project.
I also seem to remember that, despite having Scala 2.11.1 on my system, sbt would compile and run with Scala 2.10 if no version was specified.
So the question is: do I need to install Scala separately if I have sbt installed?
No, you don't need it. sbt will download Scala for you.
If you install sbt-extras (basically just a script), you don't even need to download sbt: it will automatically download the sbt launcher you need. Very handy, since you just need to specify sbt.version in your project/build.properties and you're good to go.
Edit: removed my comment about not being able to do sbt console in an empty directory, since both sbt and sbt-extras support it now.
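To make this concrete, here is a sketch of the two files involved (standard sbt layout; the version numbers are illustrative):

# project/build.properties -- pins the sbt launcher version
sbt.version=1.3.3

// build.sbt -- pins the Scala version; sbt downloads this compiler on first run
scalaVersion := "2.13.1"

With these two lines in place, a machine with only sbt (or sbt-extras) installed can build the project; both the launcher and the compiler are fetched automatically.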
I was getting unresolved dependency errors like the ones in the question here.
Getting org.scala-tools.sbt sbt_2.9.1 0.7.7 ...
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.scala-tools.sbt#sbt_2.9.1;0.7.7: not found
::::::::::::::::::::::::::::::::::::::::::::::
SBT 0.7.7 uses Scala 2.7.7 for the project configuration. SBT 0.11 uses Scala 2.9.1. You can use SBT 0.7.7 for configurations written against versions up to that one, but versions of SBT newer than 0.7.7 use an incompatible configuration file.
Note that this is not related to the Scala version that will be used to compile the project itself, just the Scala version that is used to compile the configuration file. These are different things: you can use whatever version of Scala you want to compile your project, but you must use the version of Scala mandated by the SBT version to compile the project configuration.
To get an error message like that you must either have changed the Scala version for the project configuration, or used a newer SBT with a project written for an older version of SBT. Find out which it was, and correct the problem as needed.
I keep an SBT 0.7.7 script for projects that have not yet migrated to the new version, and the latest SBT for everything else. Put a different name on each script, and you are good to go.
While sbt-launch.jar attempts to download and use the version of sbt specified in the project's project/build.properties, the launcher and the project definition must be compiled with matching versions of Scala. I think sbt 0.7.7 was compiled with Scala 2.7, but the most recent versions of sbt are compiled with Scala 2.9.
Most folks now just use a version of sbt-launch.jar that matches the version specified in project/build.properties. If you're running on Linux, OS X, or pretty much anything that can run a Bash script, I highly recommend the launch script from sbt-extras. It will automagically use the version of sbt-launch.jar according to what's specified in project/build.properties, and it adds some other handy command line parameters.
If that doesn't work for you, I think your best bet is different launch scripts for the different minor versions of sbt, such as sbt7, sbt10, and sbt11, which launch 0.7.7, 0.10.1, and 0.11.2, respectively.
I have noticed that the issue can be resolved by updating or removing the sbt.version entry in project_root_folder/project/build.properties. My current sbt version is 0.11.2, so I updated the entry to sbt.version=0.11.2. Removing the entry also works.
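In other words, the fix is a one-line edit (0.11.2 here stands in for whatever launcher version you actually run):

# project/build.properties -- keep in sync with the sbt launcher,
# or delete the line to let the launcher use its own version
sbt.version=0.11.2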
Firing up the SBT console, it reads:
[info] Building project AYLIEN 1.0 against Scala 2.8.1
[info] using MyProject with sbt 0.7.4 and Scala 2.7.7
How can I make it use MyProject with sbt 0.7.4 and Scala 2.8.1? Please note that I'm not asking about the Scala version used to build my project (it is 2.8.1, as you can see); rather, I want to make sbt use MyProject with Scala 2.8.1. Apparently sbt uses its own Scala version to work with the project definition (MyProject here), which is different from the one it uses to actually build the project! Or perhaps I'm missing something...?
I can see your concern about SBT still using 2.7.7 internally, but it doesn't really matter, since SBT downloads that version on its own. You do not have to install 2.7.7 or anything; just forget about it and pretend your environment is pure Scala 2.8.
The configuration file that holds the SBT version setting is: project/build.properties. The content looks like this:
project.organization=com.ab.web
project.name=cool_proj
sbt.version=0.7.4
project.version=1.0
build.scala.versions=2.8.0
project.initialize=false
When you want to move up to the next SBT version, just change 0.7.4 to that version and SBT will update itself. Eventually SBT will use some other Scala version internally, but this will not matter to the user.
SBT 0.7.* won't work with Scala 2.8.* for your project definition.
Mark Harrah is currently working on the next version of SBT which will work with 2.8.*. This means that you can't use any Scala features or functionality that was added after Scala 2.7.7 in your project definition or plugins. Your project itself is free to use 2.8.*.