I installed Scala^Z3 successfully on Mac OS X (Mountain Lion, JDK 7, Scala 2.10, Z3 4.3), following this: http://lara.epfl.ch/w/ScalaZ3. Everything went fine, except that I cannot run any example from this website (http://lara.epfl.ch/w/jniz3-scala-examples) without getting this nasty error:
java.lang.NoClassDefFoundError: scala/reflect/ClassManifest
at .<init>(<console>:8)
at .<clinit>(<console>)
at .<init>(<console>:7)
...
Caused by: java.lang.ClassNotFoundException: scala.reflect.ClassManifest
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 29 more
I think this happens because of the incompatibility between Scala 2.9.x and 2.10.x in how they handle reflection, since I was able to run the same set of examples under Scala 2.9.x. My question is: is there any way to get around this and run Scala^Z3 under Scala 2.10?
From looking at the project properties and build file (https://github.com/psuter/ScalaZ3/blob/master/project/build.properties and https://github.com/psuter/ScalaZ3/blob/master/project/build/scalaZ3.scala) I infer that ScalaZ3 is currently provided for Scala 2.9.2 only. There is no cross-version support at the moment.
You might try to get the code and compile it yourself after having changed the version to scala 2.10.0 in the "build.properties" file.
See this page for instructions on how to compile it: https://github.com/psuter/ScalaZ3.
If you're lucky, the code will compile as is under scala 2.10. If you're not, there might be some small fixes to do. Cross your fingers.
If you are not in a hurry, you could also bug the Scala^Z3 authors and ask them for scala 2.10 version of the library.
I'm copying the instructions from my response to your issue on GitHub, as it may help someone in the future.
The current status is that the old sbt project does not seem to mix well with Scala 2.10. Here are the instructions for a "manual" compilation of the project, for Linux. This works for me with Z3 4.3 (grabbed from the Z3 git repo) and Scala 2.10. After installing Z3 per the original instructions:
First compile the Java files:
$ mkdir bin
$ find src/main -name '*.java' -exec javac -d bin {} +
Then compile the C files. For this, you need to generate the JNI headers first, then compile the shared library. The options in the commands below are for Linux. To find out where the JNI headers are, I run new java.io.File(System.getProperty("java.home")).getParent in a Scala console (and add /include to the result).
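For reference, here is that lookup as a self-contained Scala REPL snippet (standard JDK system properties only; the resulting path will of course differ per machine, and you still need the platform subdirectory such as include/linux for the second -I flag):

scala> // java.home usually points at the JRE inside the JDK, so its parent is the JDK home
scala> new java.io.File(System.getProperty("java.home")).getParent + "/include"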
$ javah -classpath bin -d src/c z3.Z3Wrapper
$ gcc -o lib-bin/libscalaz3.so -shared -Wl,-soname,libscalaz3.so \
-I/usr/lib/jvm/java-6-sun-1.6.0.26/include \
-I/usr/lib/jvm/java-6-sun-1.6.0.26/include/linux \
-Iz3/4.3/include -Lz3/4.3/lib \
-g -lc -Wl,--no-as-needed -Wl,--copy-dt-needed -lz3 -fPIC -O2 -fopenmp \
src/c/*.[ch]
Now compile the Scala files:
$ find src/main -name '*.scala' -exec scalac -classpath bin -d bin {} +
You'll get "feature warnings", which is typical when moving to 2.10, and another warning about a non-exhaustive pattern match.
Now let's make a jar file out of everything...
$ cd bin
$ jar cvf scalaz3.jar z3
$ cd ..
$ jar uf bin/scalaz3.jar lib-bin/libscalaz3.so
...and now you should have bin/scalaz3.jar containing everything you need. Let's try it out:
$ export LD_LIBRARY_PATH=z3/4.3/lib
$ scala -cp bin/scalaz3.jar
scala> z3.scala.version
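If that prints a version string, a slightly bigger sanity check is to pose a toy constraint. The method names below follow the LARA examples page linked in the question, so treat them as assumptions and adapt them to whatever your checkout actually exposes:

scala> import z3.scala._
scala> val ctx = new Z3Context("MODEL" -> true)                     // fresh context with model construction enabled
scala> val x = ctx.mkConst(ctx.mkStringSymbol("x"), ctx.mkIntSort)  // an integer variable x
scala> ctx.assertCnstr(ctx.mkGT(x, ctx.mkInt(41, ctx.mkIntSort)))   // assert x > 41
scala> ctx.checkAndGetModel                                         // should report satisfiability together with a model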
Hope this helps!
This does not directly answer the question, but might help others trying to build scalaz3 with scala 2.10.
I built ScalaZ3 with Scala 2.10.1 and Z3 4.3.0 on Windows 7. I tested it with the integer constraints example at http://lara.epfl.ch/w/jniz3-scala-examples and it is working fine.
Building Z3
The Z3 4.3.0 download at CodePlex does not include the libZ3.lib file, so I had to download the source and build it on my machine. The build process is quite simple.
Building ScalaZ3
Currently, build.properties has sbt version 0.7.4 and scala version 2.9.2. This builds fine. (I had to make some minor modifications to the build.scala file: modify z3LibPath(z3VN).absolutePath to z3LibPath(z3VN).absolutePath + "\\libz3.lib" in the gcc task.)
Now if I change the scala version to 2.10.1 in build.properties, I get an "Error compiling sbt component 'compiler-interface'" error on launching sbt. I have no clue why this happens.
I then changed the sbt version to 0.12.2 and the scala version to 2.10.1, and started over with a fresh copy of the source. I also added a build.sbt in the project root folder containing scalaVersion := "2.10.1". This is required since in sbt 0.12.2 the file build.properties is supposed to be used for specifying the sbt version only. More info about sbt version differences is at https://github.com/harrah/xsbt/wiki/Migrating-from-SBT-0.7.x-to-0.10.x.
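To be explicit, with sbt 0.12.2 project/build.properties only pins the sbt version (sbt.version=0.12.2), while the new build.sbt at the project root carries the Scala version. A minimal sketch (only the scalaVersion line comes from my actual setup):

// build.sbt at the ScalaZ3 project root
scalaVersion := "2.10.1"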
I then get the error Z3Wrapper.java:27: cannot find symbol LibraryChecksum. This happens because the file LibraryChecksum.java, which is supposed to be generated by the build (project\build\build.scala), is not generated. It looks like the package task does not execute the tasks in project\build\build.scala: the tasks compute-checksum, javah, and gcc are not executed. This may be happening because sbt 0.12.2 expects the build.scala file to be directly under the project folder.
I then copied the LibraryChecksum.java generated by the previous build, and the build then goes through. The generated jar file, however, does not contain scalaz3.dll.
I then executed the javah and gcc tasks manually. The commands for these tasks can be copied from the log of a successful build with scala 2.9.2 (I made the appropriate modifications to the commands for scala 2.10.1). Here too I had to make some changes: I had to explicitly add the full path of scala-library.jar to the classpath of the javah task.
I then added the lib-bin\scalaz3.dll to the jar file using jar uf target\scala-2.10\scalaz3.jar lib-bin/scalaz3.dll.
If you follow the steps at the official Scala 3 sites, like Dotty or Scala Lang, they recommend using Coursier to install Scala 3. The problem is that neither of these explains how to run a compiled Scala 3 application after following the steps.
Scala 2:
> cs install scala
> scalac HelloScala2.scala
> scala HelloScala2
Hello, Scala 2!
Scala 3:
> cs install scala3-compiler
> scala3-compiler HelloScala3.scala
Now how do you run the compiled application with Scala 3?
Currently there does not seem to be a way to launch a runner for Scala 3 using coursier; see this issue. As a workaround, you can install the binaries from the GitHub release page. Scroll all the way down past the contribution list to find the .zip file, then download and unpack it to some local folder and put the unpacked bin directory on your path. After a restart you will get the scala command (and scalac etc.) in the terminal.
Another workaround is using the java runner directly with a classpath from coursier by this command:
java -cp $(cs fetch -p org.scala-lang:scala3-library_3:3.0.0):. myMain
Replace myMain with the name of your @main def function. If it is in a package myPack you need to say myPack.myMain (as usual).
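For completeness, a minimal source file matching those names (myPack and myMain come from the answer above; the rest is just the standard Scala 3 entry-point syntax):

// A Scala 3 entry point; @main generates a runnable class named myPack.myMain
package myPack

@main def myMain(): Unit =
  println("Hello, Scala 3!")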
Finally, it seems that it is possible to run a Scala application just like in the Scala 2 version, using scala3 from Coursier:
cs install scala3
Then, you can compile it with scala3-compiler and run with scala3:
scala3-compiler Main.scala
scala3 Main.scala
This work-around seems to work for me:
cs launch scala3-repl:3+ -M dotty.tools.MainGenericRunner -- YourScala3File.scala
This way, you don't even have to compile the source code first.
In case your source depends on third-party libraries, you can specify the dependencies like this:
cs launch scala3-repl:3+ -M dotty.tools.MainGenericRunner -- -classpath \
$(cs fetch --classpath io.circe:circe-generic_3:0.14.1):. \
YourScala3File.scala
This would be an example where you use the circe library that's compiled with Scala 3. You should be able to specify multiple third-party libraries with the fetch sub-command.
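As an illustration, the kind of source file that would exercise that dependency might look like the sketch below (the Greeting case class and its fields are made up, and the semiauto imports are my assumption about the circe 0.14.x API for Scala 3):

// YourScala3File.scala: derive a JSON encoder with circe's semiauto derivation and print the result
import io.circe.Encoder
import io.circe.generic.semiauto.deriveEncoder
import io.circe.syntax._

case class Greeting(language: String, version: Int)

given Encoder[Greeting] = deriveEncoder[Greeting]

@main def show(): Unit =
  println(Greeting("Scala", 3).asJson.noSpaces)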
There are multiple binary-incompatible Scala 2 versions; however, the documentation says the installation is done either via an IDE or via sbt.
DOWNLOAD SCALA 2
Then, install Scala:...either by installing an IDE such as IntelliJ, or sbt, Scala's build tool.
Spark 3 needs Scala 2.12.
Spark 3.1.2 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x).
Then how can we make sure the scala version is 2.12 if we install sbt?
Or is the documentation inaccurate, and should it instead say "to use a specific version of Scala, you need to download that specific Scala version on your own"?
Updates
As per the answer by mario-galic, in ONE-CLICK INSTALL FOR SCALA it is said:
Installing Scala has always been a task more challenging than necessary, with the potential to drive away beginners. Should I install Scala itself? sbt? Some other build tools? What about a better REPL like Ammonite? Oh and before all that I need to install Java?
To solve this problem, the Scala Center contracted Alexandre Archambault in January 2020 to add a one-click install of Scala through coursier. For example, on Linux, all we now need is:
$ curl -Lo cs https://git.io/coursier-cli-linux && chmod +x cs && ./cs setup
The Scala version is specified in the build.sbt file so SBT will download the appropriate version of Scala as necessary.
I personally use SDKMAN! to install Java and then SBT.
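For example, a minimal build.sbt pinning the Scala version Spark 3 expects might look like this (the exact patch version is only an illustration; sbt fetches it automatically on the first build):

// build.sbt: sbt downloads this Scala version on demand, no system-wide install required
name := "spark-app"
scalaVersion := "2.12.15"
// Spark itself is then an ordinary library dependency, e.g.:
// libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.2"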
The key concept to understand is the difference between system-wide installation and project-specific version. System-wide installation ends up somewhere on the PATH like
/usr/local/bin/scala
and can be installed in various ways, personally I recommend coursier one-click install for Scala
curl -Lo cs https://git.io/coursier-cli-linux && chmod +x cs && ./cs setup
The project-specific version is specified by the scalaVersion sbt setting, which downloads Scala to the coursier cache location. To see the Scala version and location used by a particular project, try show scalaInstance, which inspect scalaInstance describes as follows:
inspect scalaInstance
[info] Task: sbt.internal.inc.ScalaInstance
[info] Description:
[info] Defines the Scala instance to use for compilation, running, and testing.
Scala should be binary compatible within a minor version (versions being major.minor.patch), so Spark 3 or any other software built against one Scala 2.12.x version should work with any other Scala 2.12.x version. Note that binary compatibility is not guaranteed for internal compiler APIs, so, for example, when publishing compiler plugins the best practice is to publish against the full, specific Scala version (see the sbt sketch after the links below). For example, notice how the kind-projector compiler plugin is published against the full Scala version 2.13.6
https://repo1.maven.org/maven2/org/typelevel/kind-projector_2.13.6/
whilst scala-cats application-level library is published against any Scala 2.13.x version
https://repo1.maven.org/maven2/org/typelevel/cats-core_2.13/
Similarly spark is published against any Scala 2.12.x version
https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.12/
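In sbt terms the difference looks roughly like this (the version numbers are illustrative):

// Compiler plugin: resolved against the full Scala version (artifact suffix _2.13.6)
addCompilerPlugin("org.typelevel" % "kind-projector" % "0.13.2" cross CrossVersion.full)
// Ordinary library: resolved against the binary version only (artifact suffix _2.13)
libraryDependencies += "org.typelevel" %% "cats-core" % "2.6.1"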
Regarding system-wide installation, one trick I use for quick switching of versions is to put scala-runners on the PATH; then different versions can be launched via the --scala-version argument:
scala --scala-version 2.12.14
Using coursier or scala-runners you can even switch the JDK quickly via -C--jvm, for example:
scala --scala-version 2.12.14 -C--jvm=11
For a project, there should be no need to manually download a specific version of Scala. sbt, either directly or indirectly via an IDE, will download all the dependencies behind the scenes for you, so the only thing to specify is the sbt setting scalaVersion.
Using Python as an analogy to Scala, and Pipenv as an analogy to sbt, python_version in a Pipfile is similar to scalaVersion in build.sbt. After executing pipenv shell and pipenv install you end up with a project-specific shell environment with a project-specific Python version and dependencies. sbt similarly downloads the project-specific Scala version and dependencies based on build.sbt, although it has no need for lock files or for modifying your shell environment.
So I am editing MovieLensALS.scala and I want to just recompile the examples jar with my modified MovieLensALS.scala.
I used build/mvn -pl :spark-examples_2.10 compile followed by build/mvn -pl :spark-examples_2.10 package, both of which finish normally. I have SPARK_PREPEND_CLASSES=1 set.
But when I re-run MovieLensALS using bin/spark-submit --class org.apache.spark.examples.mllib.MovieLensALS examples/target/scala-2.10/spark-examples-1.4.0-hadoop2.4.0.jar --rank 5 --numIterations 20 --lambda 1.0 --kryo data/mllib/sample_movielens_data.txt I get java.lang.StackOverflowError even though all I added to MovieLensALS.scala is a println saying that this is the modified file, with no other modifications whatsoever.
My Scala version is 2.11.8, my Spark version is 1.4.0, and I am following the discussion in this thread to do what I am doing.
Help will be appreciated.
So I ended up figuring it out myself. I compiled using mvn compile -rf :spark-examples_2.10 followed by mvn package -rf :spark-examples_2.10 to generate the .jar file. Note that the jar file produced here is spark-examples-1.4.0-hadoop2.2.0.jar.
On the other hand, the StackOverflowError was because of a long lineage. For that I could either use checkpoints or reduce numIterations; I did the latter. I followed this for it.
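For reference, the checkpointing alternative boils down to giving Spark a checkpoint directory before training, roughly as in the sketch below (the paths and the assumption that sample_movielens_data.txt uses a user::product::rating layout are illustrative, and the ALS.train call mirrors the MLlib API rather than the example's command-line flags):

// Truncate the long lineage of ALS's intermediate RDDs by enabling checkpointing
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.recommendation.{ALS, Rating}

val sc = new SparkContext(new SparkConf().setAppName("als-checkpoint-demo").setMaster("local[*]"))
sc.setCheckpointDir("/tmp/spark-checkpoints")        // any directory Spark can write to
val ratings = sc.textFile("data/mllib/sample_movielens_data.txt").map { line =>
  val fields = line.split("::")                      // assumed user::product::rating layout
  Rating(fields(0).toInt, fields(1).toInt, fields(2).toDouble)
}
val model = ALS.train(ratings, 5, 20, 1.0)           // rank, iterations, lambda as in the spark-submit flags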
I am a bit confused right now. I have installed IntelliJ and the Scala SDK using the installation wizard provided by IntelliJ. I selected the latest version, 2.11.8, but was not able to run it in the terminal.
After that I ran
sudo apt-get install scala
which is apparently version 2.9.2
scala -version
Scala code runner version 2.9.2 -- Copyright 2002-2011, LAMP/EPFL
It appears that I now have installed scala under
/home/<user>/.ivy2/cache
(which comes from IntelliJ and has version 2.11.8) and also
whereis scala
scala: /usr/bin/scala /usr/bin/X11/scala /usr/share/man/man1/scala.1.gz
which comes from the package manager and has version 2.9.2.
How would I clean up this mess?
This is not really a mess. It's perfectly possible to have a system-wide Scala installation alongside project-specific ones. The system-wide one makes little sense except for running shell scripts with scala script.scala. You can also see that those Debian and Ubuntu packages are horribly outdated.
Since Scala is not an end-user application but a compiler or library required by specific project builds, one usually does not need and/or use a global installation. When you build a project using sbt or IntelliJ, your project can freely choose the Scala version it wants to use, whether that's 2.9.2, 2.10.x, 2.11.x or 2.12.0-Milestone... And therefore, these installations are all kept concurrently in a cache such as the Ivy cache you are seeing from IntelliJ/sbt.
$ ls -la ~/.ivy2/cache/org.scala-lang/scala-compiler/jars/
... scala-compiler-2.10.0.jar
... scala-compiler-2.10.2.jar
... scala-compiler-2.10.3.jar
... scala-compiler-2.10.4.jar
... scala-compiler-2.10.5.jar
... scala-compiler-2.10.6.jar
... scala-compiler-2.11.0.jar
... scala-compiler-2.11.1.jar
... scala-compiler-2.11.2.jar
... scala-compiler-2.11.5.jar
... scala-compiler-2.11.6.jar
... scala-compiler-2.11.7.jar
... scala-compiler-2.11.8.jar
... scala-compiler-2.12.0-M2.jar
... scala-compiler-2.12.0-M3.jar
... scala-compiler-2.7.3.jar
... scala-compiler-2.8.0.jar
... scala-compiler-2.8.2.jar
... scala-compiler-2.9.0.jar
... scala-compiler-2.9.2.jar
If you want a global installation because you want to run scripts without setting up a minimal sbt project, I recommend not using apt (you can uninstall scala again) but simply downloading the latest version from http://scala-lang.org/download/ - I'm not aware of any .deb package depending on a .deb Scala installation.
A scalacheck jar was accidentally included in the standard distribution of Scala 2.9.2, in the lib directory, along with the standard scala runtime classes (e.g. scala-library.jar). This was discovered, and fixed for subsequent Scala distributions.
I'd like to run the scala 2.9.2 interpreter and use a different version of scalacheck, but I can't get it to ignore the version in lib.
I tried:
$ LOAD_SCALACHECK='import org.scalacheck.Gen; println(Gen.choose(0, 1).sample.get)'
$ scala -e "$LOAD_SCALACHECK"
$ scala -nobootcp -e "$LOAD_SCALACHECK"
$ scala -Dscala.usejavacp=false -e "$LOAD_SCALACHECK"
$ scala -Dscala.usejavacp=false -nobootcp -e "$LOAD_SCALACHECK"
All of these still used the scalacheck.jar. Is there any way, aside from deleting the jar from lib, to run the interpreter excluding a jar from lib on the classpath?
You could always just excise the interloper, at least for a non-SBT-managed Scala... If you did that in an SBT-managed Scala I'm pretty sure SBT would just put it back.
In the SBT case, you might try revoking all permissions to read that JAR. I don't know what the classloader would do in that case, but one hopes it's robust enough to skip over it and keep looking through the class-path...