scalac's -dependencyfile option doesn't work?

I was reading a previous stack overflow question about setting up ant tasks for scalac:
FSC recompiles every time
The weird thing is that when I test the -dependencyfile option, no dependency file is generated anywhere. I am testing this with Ubuntu's scalac 2.9.1 and also with the official 2.10.2 jars via the Ant task. Both the command-line tool and the Ant task accept the option without complaint (so at least it seems to be syntactically valid?).
Am I using this feature in the wrong way?
E.g., from the command line:
scalac -dependencyfile scala_dependencies *.scala
From an Ant task:
<!-- inside a macro definition -->
<scalac destdir="#{destdir}" deprecation="yes"
force="yes" dependencyfile="scala_dependencies"
addparams="#{addparams}"
scalacdebugging="true">
<!-- src, classpath, patternset and etc -->
</scalac>
Update:
I added the -make:transitive option on the command line and it does generate a file for me, but when I ran it a second time, scalac simply crashed.
$ scalac -make:transitive -dependencyfile scala_dependencies *.scala
Recompiling 2 files
warning: there were 1 deprecation warnings; re-run with -deprecation for details
error: java.lang.NullPointerException
at scala.tools.nsc.io.Path$.apply(Path.scala:73)
at scala.tools.nsc.dependencies.DependencyAnalysis$$anonfun$saveDependencyAnalysis$1.apply(DependencyAnalysis.scala:33)
at scala.tools.nsc.dependencies.DependencyAnalysis$$anonfun$saveDependencyAnalysis$1.apply(DependencyAnalysis.scala:33)
at scala.tools.nsc.dependencies.Files$FileDependencies$$anonfun$emit$1$2$$anonfun$apply$6.apply(Files.scala:96)
at scala.tools.nsc.dependencies.Files$FileDependencies$$anonfun$emit$1$2$$anonfun$apply$6.apply(Files.scala:96)
at scala.collection.mutable.HashSet.foreach(HashSet.scala:72)
at scala.tools.nsc.dependencies.Files$FileDependencies$$anonfun$emit$1$2.apply(Files.scala:96)
at scala.tools.nsc.dependencies.Files$FileDependencies$$anonfun$emit$1$2.apply(Files.scala:96)
at scala.collection.mutable.OpenHashMap$$anonfun$foreach$1.apply(OpenHashMap.scala:221)
at scala.collection.mutable.OpenHashMap$$anonfun$foreach$1.apply(OpenHashMap.scala:219)
at scala.collection.mutable.OpenHashMap$$anonfun$foreachUndeletedEntry$1.apply(OpenHashMap.scala:226)
at scala.collection.mutable.OpenHashMap$$anonfun$foreachUndeletedEntry$1.apply(OpenHashMap.scala:226)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
at scala.collection.mutable.ArrayOps.foreach(ArrayOps.scala:38)
at scala.collection.mutable.OpenHashMap.foreachUndeletedEntry(OpenHashMap.scala:226)
at scala.collection.mutable.OpenHashMap.foreach(OpenHashMap.scala:219)
at scala.tools.nsc.dependencies.Files$FileDependencies.emit$1(Files.scala:96)
at scala.tools.nsc.dependencies.Files$FileDependencies.writeTo(Files.scala:103)
at scala.tools.nsc.dependencies.Files$FileDependencies$$anonfun$writeTo$1.apply(Files.scala:92)
at scala.tools.nsc.dependencies.Files$FileDependencies$$anonfun$writeTo$1.apply(Files.scala:92)
at scala.tools.nsc.dependencies.Files$class.writeToFile(Files.scala:163)
at scala.tools.nsc.Global$dependencyAnalysis$.writeToFile(Global.scala:498)
at scala.tools.nsc.dependencies.Files$FileDependencies.writeTo(Files.scala:92)
at scala.tools.nsc.dependencies.DependencyAnalysis$class.saveDependencies(DependencyAnalysis.scala:87)
at scala.tools.nsc.Global$dependencyAnalysis$.saveDependencies(Global.scala:498)
at scala.tools.nsc.dependencies.DependencyAnalysis$class.saveDependencyAnalysis(DependencyAnalysis.scala:32)
at scala.tools.nsc.Global$dependencyAnalysis$.saveDependencyAnalysis(Global.scala:498)
at scala.tools.nsc.Global$Run.compileSources(Global.scala:1022)
at scala.tools.nsc.Global$Run.compile(Global.scala:1038)
at scala.tools.nsc.Main$.process(Main.scala:106)
at scala.tools.nsc.Main$.main(Main.scala:123)
at scala.tools.nsc.Main.main(Main.scala)
For 2.10.2, the compiler complains:
warning: -make is deprecated: this option is unmaintained. Use sbt or an IDE for selective recompilation.
and I still can't find the generated file.
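The deprecation warning points at the practical replacement: sbt's incremental compiler tracks dependencies itself, so no -dependencyfile is needed. A minimal sketch of such a build (the project name is hypothetical; the Scala version matches the one tested above):

```scala
// build.sbt — minimal sketch; the name is hypothetical.
// `sbt compile` then recompiles selectively with no -make/-dependencyfile flags.
name := "dependency-test"
scalaVersion := "2.10.2"
```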

Related

Compiling with scalac does not find sbt dependencies

I tried running my Scala code in the VS Code editor. I am able to run my script via the spark-submit command. But when I try to compile with scalac, I get:
.\src\main\scala\sample.scala:1: error: object apache is not a member of package org
import org.apache.spark.sql.{SQLContext,SparkSession}
I have already added respective library dependencies to build.sbt.
Have you tried running sbt compile?
Running scalac directly means you're compiling only one file, without the benefits of sbt, and in particular without the dependencies you have added in your build.sbt file.
In an sbt project, there's no reason to use scalac directly; that defeats the purpose of sbt.
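For reference, sbt only resolves what is declared in build.sbt; a sketch of the kind of entries the question refers to (the Spark version and "provided" scope here are assumptions, not taken from the question):

```scala
// build.sbt — sketch; version numbers are assumptions
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.8" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.4.8" % "provided"
)
```

`sbt compile` downloads these jars and puts them on the compiler's classpath, which a bare scalac invocation never does.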

Where to put dependencies for scala

I'm getting this compile error:
owner#PC ~/scala/fxml: scalac x.scala
x.scala:1: error: object asynchttpclient is not a member of package org
import org.asynchttpclient.*;
^
one error found
I figured I needed to download the .java files for org.asynchttpclient.*, so I copied those to c:\classes and set CLASS_PATH to c:\classes, but that didn't work.
Note: I know about sbt and maven but I just want to get scalac working.
The error is with the dependency for x.scala. You need to download the asynchttpclient jar if you don't have it, then use the following command to include it in the compilation.
scalac -classpath "asynchttpclient.jar:other dependent jars" x.scala
It's CLASSPATH, not CLASS_PATH. You can also use -classpath ... as an option to scalac.
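Putting both corrections together, the invocation looks like this (netty-all.jar stands in for whatever transitive jars asynchttpclient needs; the exact names depend on the version you downloaded). Note the separator: `:` on Unix, `;` on Windows:

```shell
# Unix / macOS: classpath entries are separated by ':'
scalac -classpath "asynchttpclient.jar:netty-all.jar" x.scala

# Windows (the asker's platform): entries are separated by ';'
scalac -classpath "asynchttpclient.jar;netty-all.jar" x.scala
```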

How to compile and build spark examples into jar?

So I am editing MovieLensALS.scala and I want to just recompile the examples jar with my modified MovieLensALS.scala.
I used build/mvn -pl :spark-examples_2.10 compile followed by build/mvn -pl :spark-examples_2.10 package which finish normally. I have SPARK_PREPEND_CLASSES=1 set.
But when I re-run MovieLensALS using bin/spark-submit --class org.apache.spark.examples.mllib.MovieLensALS examples/target/scala-2.10/spark-examples-1.4.0-hadoop2.4.0.jar --rank 5 --numIterations 20 --lambda 1.0 --kryo data/mllib/sample_movielens_data.txt I get java.lang.StackOverflowError even though all I added to MovieLensALS.scala is a println saying that this is the modified file, with no other modifications whatsoever.
My scala version is 2.11.8 and spark version is 1.4.0 and I am following the discussion on this thread to do what I am doing.
Help will be appreciated.
So I ended up figuring it out myself. I compiled using mvn compile -rf :spark-examples_2.10 followed by mvn package -rf :spark-examples_2.10 to generate the .jar file. Note that the jar file produced here is spark-examples-1.4.0-hadoop2.2.0.jar.
On the other hand, the StackOverflowError was because of a long lineage. For that I could either use checkpoints or reduce numIterations; I did the latter. I followed this for it.
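A sketch of the checkpointing alternative mentioned above (the directory path and interval are assumptions; this also assumes a Spark version whose MLlib ALS exposes setCheckpointInterval, as 1.4.x does):

```scala
// Sketch: truncating ALS's long lineage via checkpointing instead of
// reducing numIterations. Path and interval are assumptions.
import org.apache.spark.mllib.recommendation.ALS

sc.setCheckpointDir("/tmp/spark-checkpoints")  // sc: the existing SparkContext
val model = new ALS()
  .setRank(5)
  .setIterations(20)
  .setLambda(1.0)
  .setCheckpointInterval(10)  // checkpoint every 10 iterations to cut the lineage
  .run(ratings)               // ratings: RDD[Rating], prepared as in MovieLensALS
```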

How do I distribute a Scala macro as a project?

Suppose I have a Scala compile-time macro that I find useful and would like to share it (I do). How do I create a JAR file that when loaded into another project would execute the macro when compiling the new project?
Specifically, I've made a StaticAnnotation that rewrites the AST of the class that it wraps at compile time. This works in my Maven build (macro defined in the main directory, runs on test cases in the test directory) because I have
<compilerPlugins>
<compilerPlugin>
<groupId>org.scalamacros</groupId>
<artifactId>paradise_2.10.5</artifactId>
<version>2.1.0-M5</version>
</compilerPlugin>
</compilerPlugins>
in my scala-maven-plugin. (I'm starting with a Scala 2.10 project and if it works, will provide both 2.10 and 2.11.)
But if I put the resulting JAR on a Scala console classpath, in a Scala script, or into another Maven project (without special compiler plugins), it simply ignores the macro: the AST does not get overwritten and my compile-time println statements don't execute. If I use the @compileTimeOnly annotation on my macro (new in Scala 2.11), then it complains with the @compileTimeOnly error message.
Do I really need to tell my users to add compiler plugins in their pom.xml files, as well as alternate instructions for SBT and other build tools? Other packages containing macros (MacWire, Log4s) don't come with complicated build instructions: they just say, "point to this dependency in Maven Central." I couldn't find the magic in their build process that makes this work. What am I missing?
If you're relying on a macro-paradise-only feature then yes, you do need to tell your users to add compiler plugins. See http://docs.scala-lang.org/overviews/macros/annotations.html . The projects you mention are only using the scala compiler's built-in (non-paradise) macro features, not macro annotations.
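Concretely, the sbt counterpart of the Maven <compilerPlugin> snippet in the question is a single line that users of the macro must add to their own build (this is the standard macro-paradise incantation; the version matches the Maven example above):

```scala
// build.sbt of the *consuming* project — enables macro annotations via paradise
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0-M5" cross CrossVersion.full)
```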

Running Scala^Z3 with Scala 2.10

I installed Scala^Z3 on my Mac OSX (Mountain Lion, JDK 7, Scala 2.10, Z3 4.3) successfully (following this: http://lara.epfl.ch/w/ScalaZ3). Everything went fine except that I cannot run any example from this website (http://lara.epfl.ch/w/jniz3-scala-examples) without getting this nasty error:
java.lang.NoClassDefFoundError: scala/reflect/ClassManifest
at .<init>(<console>:8)
at .<clinit>(<console>)
at .<init>(<console>:7)
...
Caused by: java.lang.ClassNotFoundException: scala.reflect.ClassManifest
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
... 29 more
I think this happens because of an incompatibility between Scala 2.9.x and 2.10.x in handling reflection, since I was able to run the same set of examples under Scala 2.9.x. My question is: is there any way to get around this and run Scala^Z3 under Scala 2.10?
From looking at the project properties and build files (https://github.com/psuter/ScalaZ3/blob/master/project/build.properties and https://github.com/psuter/ScalaZ3/blob/master/project/build/scalaZ3.scala) I infer that ScalaZ3 is currently provided for Scala 2.9.2 only. There is no cross-version support at the moment.
You might try to get the code and compile it yourself after having changed the version to scala 2.10.0 in the "build.properties" file.
See this page for instructions on how to compile it: https://github.com/psuter/ScalaZ3.
If you're lucky, the code will compile as is under scala 2.10. If you're not, there might be some small fixes to do. Cross your fingers.
If you are not in a hurry, you could also bug the Scala^Z3 authors and ask them for scala 2.10 version of the library.
I'm copying the instructions from my response to your issue on GitHub, as it may help someone in the future.
The current status is that the old sbt project does not seem to mix well with Scala 2.10. Here are the instructions for a "manual" compilation of the project, for Linux. This works for me with Z3 4.3 (grabbed from the Z3 git repo) and Scala 2.10. After installing Z3 per the original instructions:
First compile the Java files:
$ mkdir bin
$ find src/main -name '*.java' -exec javac -d bin {} +
Then compile the C files. For this, you need to generate the JNI headers first, then compile the shared library. The options in the commands below are for Linux. To find out where the JNI headers are, I run new java.io.File(System.getProperty("java.home")).getParent in a Scala console (and add /include to the result).
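Spelled out, that console one-liner is the following (the /include suffix assumes the JDK 6/7 directory layout, where java.home points at the JRE inside the JDK):

```scala
// Run in a Scala console: derive the JNI include dir from java.home
val jdkHome = new java.io.File(System.getProperty("java.home")).getParent
val jniInclude = jdkHome + "/include"
println(jniInclude)  // pass this as -I... to gcc below
```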
$ javah -classpath bin -d src/c z3.Z3Wrapper
$ gcc -o lib-bin/libscalaz3.so -shared -Wl,-soname,libscalaz3.so \
-I/usr/lib/jvm/java-6-sun-1.6.0.26/include \
-I/usr/lib/jvm/java-6-sun-1.6.0.26/include/linux \
-Iz3/4.3/include -Lz3/4.3/lib \
-g -lc -Wl,--no-as-needed -Wl,--copy-dt-needed -lz3 -fPIC -O2 -fopenmp \
src/c/*.[ch]
Now compile the Scala files:
$ find src/main -name '*.scala' -exec scalac -classpath bin -d bin {} +
You'll get "feature warnings", which is typical when moving to 2.10, and another warning about a non-exhaustive pattern match.
Now let's make a jar file out of everything...
$ cd bin
$ jar cvf scalaz3.jar z3
$ cd ..
$ jar uf bin/scalaz3.jar lib-bin/libscalaz3.so
...and now you should have bin/scalaz3.jar containing everything you need. Let's try it out:
$ export LD_LIBRARY_PATH=z3/4.3/lib
$ scala -cp bin/scalaz3.jar
scala> z3.scala.version
Hope this helps!
This does not directly answer the question, but might help others trying to build scalaz3 with scala 2.10.
I built ScalaZ3 with Scala 2.10.1 and Z3 4.3.0 on Windows 7. I tested it with the integer constraints example at http://lara.epfl.ch/w/jniz3-scala-examples and it is working fine.
Building Z3
The Z3 4.3.0 download at CodePlex does not include the libZ3.lib file, so I had to download the source and build it on my machine. The build process is quite simple.
Building ScalaZ3
Currently, build.properties has sbt version 0.7.4 and Scala version 2.9.2. This builds fine. (I had to make a minor modification to the build.scala file: change z3LibPath(z3VN).absolutePath to z3LibPath(z3VN).absolutePath + "\\libz3.lib" in the gcc task.)
Now if I change the Scala version to 2.10.1 in build.properties, I get an "Error compiling sbt component 'compiler-interface'" error on launching sbt. I have no clue why this happens.
I then changed the sbt version to 0.12.2 and the Scala version to 2.10.1, and started with a fresh copy of the source. I also added a build.sbt in the project root folder containing scalaVersion := "2.10.1". This is required since in sbt 0.12.2 the build.properties file is supposed to specify only the sbt version. More info about the sbt version differences: https://github.com/harrah/xsbt/wiki/Migrating-from-SBT-0.7.x-to-0.10.x.
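The resulting split, for clarity (values taken from the paragraph above):

```scala
// build.sbt at the project root — with sbt 0.12.x the Scala version lives here,
// while project/build.properties contains only `sbt.version=0.12.2`
scalaVersion := "2.10.1"
```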
I get the error Z3Wrapper.java:27: cannot find symbol LibraryChecksum. This happens because the file LibraryChecksum.java, which is supposed to be generated by the build (project\build\build.scala), is not generated. It looks like the package task does not execute the tasks in project\build\build.scala: compute-checksum, javah, and gcc are not run. This may be happening because sbt 0.12.2 expects the build.scala file to be directly under the project folder.
I then copied in the LibraryChecksum.java generated by the previous build, and the build then goes through. However, the generated jar file does not contain scalaz3.dll.
I then executed the javah and gcc tasks manually. The commands for these tasks can be copied from the log of a successful build with Scala 2.9.2 (I made the appropriate modifications to the commands for Scala 2.10.1). Here too I had to make a change: I had to explicitly add the full path of scala-library.jar to the classpath of the javah task.
I then added the lib-bin\scalaz3.dll to the jar file using jar uf target\scala-2.10\scalaz3.jar lib-bin/scalaz3.dll.