Running ScalaZ3 on Mac - scala

I'm running a program that uses ScalaZ3. This is on a Mac with OS X. At run time I get an exception as follows:
Could not find /lib-bin/libscalaz3.dylib
java.io.FileNotFoundException: Could not find /lib-bin/libscalaz3.dylib
at z3.Z3Wrapper.loadLib(Z3Wrapper.java:74)
at z3.Z3Wrapper.loadFromJar(Z3Wrapper.java:52)
at z3.Z3Wrapper.<clinit>(Z3Wrapper.java:32)
at z3.scala.Z3Config.<init>(Z3Config.scala:6)
Later I get a second stack trace:
java.lang.UnsatisfiedLinkError: z3.Z3Wrapper.mkConfig()J
at z3.Z3Wrapper.mkConfig(Native Method)
at z3.scala.Z3Config.<init>(Z3Config.scala:6)
at thesis.z3option.Z3SessionCore$class.open(Z3SessionCore.scala:89)
I'm using the jar file at
jar-releases/64/scala-2.9.1/scalaz3-3.2.c.jar
in the GitHub repository.
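A quick way to check whether the macOS binary is actually packaged in that jar is to look it up as a classpath resource before constructing a Z3Config. This is only a minimal diagnostic sketch (CheckScalaZ3Native is not part of ScalaZ3); the resource path is taken from the exception above:
// Run with the scalaz3 jar on the classpath. A null URL means the dylib is not
// packaged under /lib-bin in this jar, so Z3Wrapper.loadFromJar cannot succeed;
// a non-null URL points at the jar entry it would try to extract.
object CheckScalaZ3Native {
  def main(args: Array[String]): Unit = {
    val url = classOf[z3.Z3Wrapper].getResource("/lib-bin/libscalaz3.dylib")
    println("libscalaz3.dylib resource: " + url)
    println("java.library.path: " + System.getProperty("java.library.path"))
  }
}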
Edit: Here are the errors I got on step (3) of the setup process, sbt package:
n060h152:ScalaZ3 theo$ sbt package
[info] Building project ScalaZ3 4.0a against Scala 2.9.2
[info]
[info] == compute-checksum ==
[info] == compute-checksum ==
[info]
[info] == compile ==
[info] Source analysis: 0 new/modified, 0 indirectly invalidated, 0 removed.
[info] Compiling main sources...
[info] Nothing to compile.
[info] Post-analysis: 273 classes.
[info] == compile ==
[info]
[info] == javah ==
[info] Generating JNI C headers
[info] javah -classpath /Users/theo/Documents/ScalaZ3/target/scala_2.9.2/classes -d /Users/theo/Documents/ScalaZ3/src/c z3.Z3Wrapper
[error] Cannot find annotation method 'bytes()' in type 'scala.reflect.ScalaSignature': class file for scala.reflect.ScalaSignature not found
[error] Error: Class scala.ScalaObject could not be found.
[info] == javah ==
[error] Error running javah: Non-zero exit code.
[info]
[info] Total time: 4 s, completed 24-May-2013 2:52:36 PM
[info]
[info] Total session time: 6 s, completed 24-May-2013 2:52:36 PM

Related

sbt-scoverage + specs2 not working; no tests discovered during analysis

I am unable to get sbt-scoverage to discover and analyze our tests. We should have pretty good test coverage (at least 80%); all tests run and pass fine. But the scoverage report shows nearly zero coverage (~ 3%).
The only thing I can think of is that we are using specs2 (and also ScalaCheck). Most of the examples I've seen use ScalaTest.
For example: We have four classes in the models package. They are showing zero coverage, even though the test results show that they have been tested thoroughly:
[info] Compiling 5 Scala sources to /Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/test-classes...
[info] TestCreditModel
[info] + Given a credit
[info] When validation is performed
[info] Then credits between 1 and 60 should be acceptable
[info] + And credits less than 1 or greater than 60 should not
[info]
[info] + Given a category and attribute credits
[info] When validation is performed
[info] Then we should get validated category credits
[info] + And we should get validated attribute credits
[info] + And invalid category credits should not pass validation
[info] + And invalid attribute credits should not pass validation
[info]
[info] Total for specification TestCreditModel
[info] Finished in 86 ms
[info] 6 examples, 600 expectations, 0 failure, 0 error
[info]
[info] TestCategoryModel
[info] + Given a well formed category
[info] When validation is performed
[info] Categories with good abbreviations and descriptions should be valid
[info] + And those with invalid abbreviations should not
[info] + And those with invalid descriptions should not
[info]
[info]
[info] Total for specification TestCategoryModel
[info] Finished in 167 ms
[info] 3 examples, 300 expectations, 0 failure, 0 error
But I see scoverage is definitely running:
[info] Compiling 11 Scala sources and 1 Java source to /Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/classes...
[info] [info] Cleaning datadir [/Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/scoverage-data]
[info] [info] Beginning coverage instrumentation
[info] [info] Instrumentation completed [31 statements]
[info] [info] Wrote instrumentation file [/Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/scoverage-data/scoverage.coverage.xml]
[info] [info] Will write measurement data to [/Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/scoverage-data]
But the coverage report is clearly not picking up the classes that we are testing:
[IDC] $ coverageReport
[info] Waiting for measurement data to sync...
[info] Reading scoverage instrumentation [/Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/scoverage-data/scoverage.coverage.xml]
[info] Reading scoverage measurements...
[info] Generating scoverage reports...
[info] Written Cobertura report [/Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/coverage-report/cobertura.xml]
[info] Written XML coverage report [/Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/scoverage-report/scoverage.xml]
[info] Written HTML coverage report [/Users/zbeckman/Projects/Accenture/Trident_IDC_Training/target/scala-2.11/scoverage-report/index.html]
[info] Statement coverage.: 3.23%
[info] Branch coverage....: 100.00%
[info] Coverage reports completed
[error] Coverage is below minimum [3.23% < 80.0%]
[trace] Stack trace suppressed: run last *:coverageReport for the full output.
[error] (*:coverageReport) Coverage minimum was not reached
[error] Total time: 1 s, completed Aug 25, 2016 1:11:27 PM
When viewing the report, I see that the four classes in the models package show as completely untested.
We have the following in our build.sbt:
coverageEnabled := true
coverageMinimum := 80
coverageFailOnMinimum := true
Obviously, since we have the above settings, and since scoverage is failing to find the tests, we get a build failure:
[error] Coverage is below minimum [3.23% < 80.0%]
[error] (*:coverageReport) Coverage minimum was not reached
[error] Total time: 1 s, completed Aug 25, 2016 1:39:12 PM
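For reference (not necessarily the fix here), a minimal sketch of the conventional sbt-scoverage wiring. The plugin version is illustrative, and the sbt-scoverage README enables instrumentation per run with the coverage command (e.g. sbt clean coverage test coverageReport) rather than hard-coding coverageEnabled := true, so the tests run against the instrumented classes:
// project/plugins.sbt (plugin version is illustrative)
addSbtPlugin("org.scoverage" % "sbt-scoverage" % "1.3.5")

// build.sbt -- keep the thresholds, let the coverage command toggle instrumentation
coverageMinimum := 80
coverageFailOnMinimum := true
// optional: keep generated or framework code out of the percentages
coverageExcludedPackages := "<empty>;router\\..*"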

Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2

While compiling the Maven project, the following error occurred:
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) # spark-streaming-flume-sink_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes...
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found.
[ERROR] with Logging {
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.992s
[INFO] Finished at: Fri Apr 15 17:44:33 CEST 2016
[INFO] Final Memory: 25M/350M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed ->
[Help 1]
[ERROR]
I removed the property <useZincServer>true</useZincServer> from pom.xml, but the Logging error still persists.
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) # spark-streaming-flume-sink_2.10 ---
[INFO] Using incremental compilation
[INFO] Compiling 6 Scala sources and 3 Java sources to /home/gorlec/Desktop/test/external/flume-sink/target/scala-2.10/classes...
[ERROR] /home/gorlec/Desktop/test/external/flume-sink/src/main/scala/org/apache/spark/streaming/flume/sink/SparkAvroCallbackHandler.scala:47: identifier expected but 'with' found.
[ERROR] with Logging {
[ERROR] ^
[ERROR] one error found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.814s
[INFO] Finished at: Fri Apr 15 17:41:00 CEST 2016
[INFO] Final Memory: 25M/335M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-streaming-flume-sink_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed ->
[Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
I checked that PATH and JAVA_HOME are defined in ~/.bashrc as follows:
export PATH=$PATH:/usr/lib/jvm/java-7-openjdk-amd64/bin
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
The only issue that I noticed is that echo $JAVA_HOME gives an empty output, though I did source ~/.bashrc.
Any help is highly appreciated.
The problem could be this: [INFO] Using incremental compilation.
In your pom.xml, try removing the line <recompileMode>incremental</recompileMode>
and then try again.
It is strange that echo $JAVA_HOME gives empty output. While compiling the Spark source, I imported the project (which had built successfully with mvn clean package) into Eclipse and hit the same problem. I found the solution here:
How to solve “Plugin execution not covered by lifecycle configuration” for Spring Data Maven Builds
I think you were compiling Spark with Scala 2.10. If so, you should do as follows.
cd /path/to/Spark
./dev/change-scala-version.sh 2.10
./build/mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package
Hope this helps.

Building Spark using Maven fails

I have a Spark program written in Scala and I'm trying to build it using Maven. However, the Maven build fails without any obvious error. Here is the error message:
[INFO] Compiling 1 source files to D:\Scala-IDE\Workspace\ClassifierMaven\classifier\target\classes at 1438214677096
[ERROR] error: class file needed by package is missing.
[INFO] reference type ClassTag of package reflect refers to nonexisting symbol.
[ERROR] one error found
[INFO] Picked up _JAVA_OPTIONS: -Xmx2G
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17.022 s
[INFO] Finished at: 2015-07-29T17:04:41-07:00
[INFO] Final Memory: 30M/879M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scala-tools:maven-scala-plugin:2.15.0:compile (default) on project classifier: wrap: org.apache.commons.exec.ExecuteException: Process
exited with an error: 1(Exit value: 1) -> [Help 1]
What is the problem?
SOLUTION: the problem was a missing dependency in my POM. I added the dependency and now I'm getting a new error message. Here is the error I'm getting now:
[INFO] excludes = []
[INFO] D:\Scala-IDE\Workspace\ClassifierMaven\classifier\src\main\scala:-1: info: compiling
[INFO] Compiling 1 source files to D:\Scala-IDE\Workspace\ClassifierMaven\classifier\target\classes at 1438274292378
[INFO] java.lang.reflect.InvocationTargetException
[INFO] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[INFO] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
[INFO] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[INFO] at java.lang.reflect.Method.invoke(Method.java:606)
[INFO] at org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
[INFO] at org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
[ERROR] Caused by: java.lang.AssertionError: assertion failed: List(object Byte, object Byte)
[INFO] at scala.tools.nsc.symtab.Symbols$Symbol.suchThat(Symbols.scala:1063)
[INFO] at scala.tools.nsc.symtab.Symbols$Symbol.companionModule0(Symbols.scala:1269)
[INFO] at scala.tools.nsc.symtab.Symbols$Symbol.companionModule(Symbols.scala:1277)
[INFO] at scala.tools.nsc.symtab.Symbols$Symbol.linkedClassOfClass(Symbols.scala:1296)
[INFO] at scala.tools.nsc.symtab.Definitions$definitions$.addModuleMethod$1(Definitions.scala:707)
[INFO] at scala.tools.nsc.symtab.Definitions$definitions$.initValueClasses(Definitions.scala:710)
[INFO] at scala.tools.nsc.symtab.Definitions$definitions$.init(Definitions.scala:787)
[INFO] at scala.tools.nsc.Global$Run.<init>(Global.scala:597)
[INFO] at scala.tools.nsc.Main$.process(Main.scala:107)
[INFO] at scala.tools.nsc.Main$.main(Main.scala:122)
[INFO] at scala.tools.nsc.Main.main(Main.scala)
[INFO] ... 6 more
[INFO] Picked up _JAVA_OPTIONS: -Xmx2G
It seems that you are missing the scala-reflect package from your classpath. I'd suggest looking at some of the sample POMs (like Learning Spark, or some of the other Databricks ones) and using one of those as a starting point for your build.
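For context, ClassTag lives in the scala.reflect package; a minimal, hypothetical snippet of the kind that makes the compiler resolve it (none of these names come from the asker's project) looks like this:
// Any generic code that builds an Array[T] needs an implicit scala.reflect.ClassTag[T].
// If the compiler reports ClassTag as a nonexisting symbol, the Scala artifacts on the
// compile classpath are incomplete or their versions are mismatched.
import scala.reflect.ClassTag

object ClassTagDemo {
  def asArray[T: ClassTag](xs: Seq[T]): Array[T] = xs.toArray

  def main(args: Array[String]): Unit =
    println(asArray(Seq(1, 2, 3)).mkString(","))
}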
I got the same issue; check the Java version in your POM. It must be 1.8.

Can't run GwtTestCase when updating GWT to 2.6.0

After updating my GWT version to 2.6.0, I got this error when running my old GwtTestCase:
[ERROR] The -out option is deprecated. This option will be removed in future GWT release and will throw an error if it is still used. Please use -war option instead.
[INFO] [ERROR] RunStyleHtmlUnit: Unknown browser name FF3, expected browser name: one of [IE8, IE9, Chrome, FF17]
[INFO] [ERROR] shell failed in doStartup method
[INFO] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 1.682 sec <<< FAILURE!
[INFO] testSerializeDeserialize(m6.sherpa.portal.widget.base.client.components.frame.GwtTestSerializer) Time elapsed: 1.641 sec <<< ERROR!
[INFO] com.google.gwt.junit.JUnitFatalLaunchException: Shell failed to start
[INFO] at com.google.gwt.junit.JUnitShell.getUnitTestShell(JUnitShell.java:732)
[INFO] at com.google.gwt.junit.JUnitShell.runTest(JUnitShell.java:705)
[INFO] at com.google.gwt.junit.client.GWTTestCase.runTest(GWTTestCase.java:421)
[INFO] at junit.framework.TestCase.runBare(TestCase.java:141)
[INFO] at junit.framework.TestResult$1.protect(TestResult.java:122)
[INFO] at junit.framework.TestResult.runProtected(TestResult.java:142)
[INFO] at junit.framework.TestResult.run(TestResult.java:125)
[INFO] at junit.framework.TestCase.run(TestCase.java:129)
[INFO] at com.google.gwt.junit.client.GWTTestCase.run(GWTTestCase.java:247)
[INFO] at junit.framework.TestSuite.runTest(TestSuite.java:255)
[INFO] at junit.framework.TestSuite.run(TestSuite.java:250)
[INFO] at junit.framework.TestSuite.runTest(TestSuite.java:255)
[INFO] at junit.framework.TestSuite.run(TestSuite.java:250)
[INFO] at junit.framework.TestSuite.runTest(TestSuite.java:255)
[INFO] at junit.framework.TestSuite.run(TestSuite.java:250)
[INFO] at org.codehaus.mojo.gwt.test.MavenTestRunner.doRun(MavenTestRunner.java:105)
[INFO] at junit.textui.TestRunner.start(TestRunner.java:183)
[INFO] at org.codehaus.mojo.gwt.test.MavenTestRunner.main(MavenTestRunner.java:63)
Any idea?
The best way to fix it is to just update your gwt-maven-plugin to 2.6.0 too.
In previous versions (up to and including 2.5.1), the <htmlunit> configuration property defaulted to FF3, but that value is no longer valid in GWT 2.6.0. gwt-maven-plugin 2.6.0 now defaults the property to FF17.
It looks like GWT 2.6.0 no longer supports Firefox 3; FF3 has been replaced with FF17.

How to set up a local DBpedia mirror

I am trying to set up a local DBpedia Information Extraction Framework, but there are some problems I cannot deal with. I followed the instructions on the official site using IntelliJ IDEA, and it went well until I got to the final step, which says "select DBpedia dump extraction -> Plugins -> scala -> scala:run". The error message is shown below:
D:\program\jdk\bin\java -Dmaven.home=D:\program\apache-maven-3.0.5 -Dclassworlds.conf=D:\program\apache-maven-3.0.5\bin\m2.conf -Didea.launcher.port=7534 "-Didea.launcher.bin.path=D:\program\IntelliJ IDEA Community Edition 12.1.3\bin" -Dfile.encoding=UTF-8 -classpath "D:\program\apache-maven-3.0.5\boot\plexus-classworlds-2.4.jar;D:\program\IntelliJ IDEA Community Edition 12.1.3\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain org.codehaus.classworlds.Launcher --fail-fast --strict-checksums org.scala-tools:maven-scala-plugin:2.15.2:run
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building DBpedia Dump Extraction 3.8
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] >>> maven-scala-plugin:2.15.2:run (default-cli) # dump >>>
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) # dump ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory E:\code\Scala\extraction_framework\dump\src\main\resources
[INFO]
[INFO] --- maven-scala-plugin:2.15.2:compile (process-resources) # dump ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.scala,**/*.java,]
[INFO] excludes = []
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) # dump ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-scala-plugin:2.15.2:compile (compile) # dump ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.scala,**/*.java,]
[INFO] excludes = []
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) # dump ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory E:\code\Scala\extraction_framework\dump\src\test\resources
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) # dump ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-scala-plugin:2.15.2:testCompile (test-compile) # dump ---
[INFO] Checking for multiple versions of scala
[INFO] includes = [**/*.scala,**/*.java,]
[INFO] excludes = []
[WARNING] No source files found.
[INFO]
[INFO] org.dbpedia.extraction.dump.sql.Import
parsing \home\release\wikipedia\wikipedias.csv
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org_scala_tools_maven_executions.MainHelper.runMain(MainHelper.java:161)
at org_scala_tools_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
Caused by: java.io.FileNotFoundException: \home\release\wikipedia\wikipedias.csv (The system cannot find the path specified)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.(FileInputStream.java:138)
at scala.io.Source$.fromFile(Source.scala:91)
at scala.io.Source$.fromFile(Source.scala:76)
at org.dbpedia.extraction.util.WikiInfo$.fromFile(WikiInfo.scala:26)
at org.dbpedia.extraction.util.ConfigUtils$.parseLanguages(ConfigUtils.scala:43)
at org.dbpedia.extraction.dump.sql.Import$.main(Import.scala:26)
at org.dbpedia.extraction.dump.sql.Import.main(Import.scala)
... 6 more
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.104s
[INFO] Finished at: Fri May 24 18:34:48 CST 2013
[INFO] Final Memory: 10M/175M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scala-tools:maven-scala-plugin:2.15.2:run (default-cli) on project dump: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: -10000(Exit value: -10000) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
Process finished with exit code 1
What should I do to fix it?
PS: I did all of this on Windows 7, with IntelliJ IDEA 12 Community Edition, Maven 3, and JDK 7 installed. If any other info is needed, please let me know. Thanks.
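One observation from the stack trace: the Import step is reading \home\release\wikipedia\wikipedias.csv, which looks like a Unix-style default path being resolved against the current Windows drive, and the FileNotFoundException says the system cannot find that path. A purely diagnostic sketch (CheckWikipediasCsv is not part of the framework; the default path below is just the one from the trace) can confirm what the path resolves to before running scala:run:
// Prints whether the CSV path exists and what it resolves to on this machine.
import java.io.File

object CheckWikipediasCsv {
  def main(args: Array[String]): Unit = {
    val path = if (args.nonEmpty) args(0) else """\home\release\wikipedia\wikipedias.csv"""
    val f = new File(path)
    println(path + " exists: " + f.exists + " (resolved to " + f.getAbsolutePath + ")")
  }
}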