How do I run UIMA Ruta scripts from the command line?
I tried this, but I am not sure it is the right command.
javac DataExtraction.ruta
Error: error: Class names, 'DataExtraction.ruta', are only accepted if annotation processing is explicitly requested
Unlike .java classes, UIMA Ruta scripts cannot be compiled and run directly from the command line. You could, however, wrap the Ruta script in a UIMA Java annotator and run that from the command line as a regular Java program.
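For example, with uimaFIT and ruta-core on the classpath (an assumed setup; the script name comes from the question, everything else is illustrative), a minimal wrapper might look roughly like this:
import org.apache.uima.analysis_engine.AnalysisEngine;
import org.apache.uima.fit.factory.AnalysisEngineFactory;
import org.apache.uima.fit.factory.JCasFactory;
import org.apache.uima.jcas.JCas;
import org.apache.uima.ruta.engine.RutaEngine;

public class RunDataExtraction {
    public static void main(String[] args) throws Exception {
        // Wrap the Ruta script in an ordinary UIMA analysis engine
        // (script name and location are assumptions; adjust to your layout).
        AnalysisEngine ae = AnalysisEngineFactory.createEngine(
                RutaEngine.class,
                RutaEngine.PARAM_MAIN_SCRIPT, "DataExtraction");
        JCas jcas = JCasFactory.createJCas();
        jcas.setDocumentText("Some input text to process.");
        ae.process(jcas);
        // Inspect jcas here for the annotations the script produced.
    }
}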
You can use the main method of org.apache.uima.ruta.ide.launching.RutaLauncher, but you need to build the classpath yourself, e.g., uimaj-core, ruta-ep-ide-ui and ruta-core together with their dependencies.
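For example (a sketch only; the jar names are illustrative, and the launcher's arguments depend on your Ruta version, so check its usage output first):
java -cp "uimaj-core.jar:ruta-core.jar:ruta-ep-ide-ui.jar:<further dependencies>" org.apache.uima.ruta.ide.launching.RutaLauncher <launcher arguments>
On Windows, use ; instead of : as the classpath separator.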
Please note that I am more of a data miner than a programmer.
I am trying to run the examples from the book "Advanced Analytics with Spark" by Sandy Ryza (the code examples can be downloaded from https://github.com/sryza/aas), and I have run into the following problem.
When I open this project in IntelliJ IDEA and try to run it, I get the error "Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD".
Does anyone know how to solve this issue? Does this mean I am using the wrong version of Spark?
When I first tried to run this code, I got the error "Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product", but I solved it by setting the scala-library dependency to the compile scope in Maven.
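A sketch of what that dependency change amounts to in the pom.xml (the version matches the setup described below; adjust as needed):
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.7</version>
    <scope>compile</scope>
</dependency>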
I use Maven 3.3.9, Java 1.7.0_79, Scala 2.11.7 and Spark 1.6.1. I have tried both IntelliJ IDEA 14 and 15, and different versions of Java (1.7), Scala (2.10) and Spark, but with no success.
I am also using Windows 7.
My SPARK_HOME and Path variables are set, and I can execute spark-shell from the command line.
The examples in this book show a --master argument to spark-shell, but you will need to specify arguments appropriate for your environment. If you don't have Hadoop installed, you need to start spark-shell locally. To execute the samples, you can simply pass paths with a local file reference (file:///) rather than an HDFS reference (hdfs://).
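For example, to run locally and read a local file instead of HDFS (the file path is hypothetical):
spark-shell --master local[2]
scala> val rawData = sc.textFile("file:///path/to/data.csv")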
The author suggests a hybrid development approach:
Keep the frontier of development in the REPL, and, as pieces of code
harden, move them over into a compiled library.
Hence the sample code is considered a compiled library rather than a standalone application. You can make the compiled JAR available to spark-shell by passing it to the --jars property, while Maven is used for compiling and managing dependencies.
In the book the author describes how the simplesparkproject can be executed:
Use Maven to compile and package the project:
cd simplesparkproject/
mvn package
Start the spark-shell with the JAR dependencies:
spark-shell --master local[2] --driver-memory 2g --jars ../simplesparkproject-0.0.1.jar ../README.md
Then you can access your object within the spark-shell as follows:
val myApp = com.cloudera.datascience.MyApp
However, if you want to execute the sample code as a standalone application and run it within IDEA, you need to modify the pom.xml.
Some of the dependencies are required for compilation but are already available in a Spark runtime environment. These dependencies are therefore marked with the provided scope in the pom.xml:
<!--<scope>provided</scope>-->
If you remove the provided scope (as in the commented-out line above), you will be able to run the samples within IDEA. However, you can no longer provide this JAR as a dependency for the spark-shell.
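For example, a Spark dependency entry would then look roughly like this (artifact and version are illustrative; take the real values from the project's pom.xml):
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.1</version>
    <!--<scope>provided</scope>-->
</dependency>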
Note: use Maven 3.0.5 and Java 7+. I had problems with the plugin versions when using Maven 3.3.X.
I have an application which consists of some Java classes and some Python scripts; I use Jython from my Java code to invoke the Python scripts.
My application is built using Maven and produces an ordinary .jar file containing the compiled Java classes, with the Python scripts as resources. I also use the maven-assembly-plugin to generate a myapp-with-dependencies.jar that contains the same content plus the contents of the Jython dependency bundled in.
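For reference, such an assembly is typically configured along these lines (a sketch; the plugin version is omitted, add one appropriate for your build):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <configuration>
        <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
    </configuration>
</plugin>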
When I run my application using the with-dependencies jar:
java -classpath target/myapp-1.0.0-SNAPSHOT-jar-with-dependencies.jar com.example.myapp.Main
...it works as expected. But when I run it using two separate jars:
java -classpath "target/myapp-1.0.0-SNAPSHOT.jar:/Users/richard/.m2/repository/org/python/jython-standalone/2.5.3/jython-standalone-2.5.3.jar" com.example.myapp.Main
...it fails: when I ask Jython to execute "import mylib" I get an exception:
ImportError: No module named mylib
In both cases the contents of the classpath should be identical; the only difference is that in the first case everything is in one jar, while in the second case it is split across two jars.
What could be causing this behaviour?
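A hypothetical diagnostic that may help narrow this down: print Jython's module search path in both runs and compare the output.
import org.python.util.PythonInterpreter;

// Placed in Main before the failing import (a sketch):
PythonInterpreter interpreter = new PythonInterpreter();
interpreter.exec("import sys; print sys.path");  // Jython 2.5 uses Python 2 syntax
If the two paths differ, that points at how the interpreter locates mylib.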
I have a class written in Scala and I am trying to make it available to the Scala Context so that I can make use of it for further processing. The problem is that I need to run this from the shell and I am having a hard time figuring out how to compile the class and make it available to the context.
I am aware of compiling the class and using it directly, but I am not able to figure out how to do the same from the Scala shell. Any pointers in this regard would be great.
In the Scala REPL you can use the command :cp <path> to add a directory or JAR (that contains your compiled Scala class) to the classpath, so that it is available for the REPL to use.
(Of course, replace <path> in that command with the actual directory or JAR path.)
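For example (the jar path and class name are hypothetical):
scala> :cp /path/to/myclasses.jar
scala> val instance = new com.example.MyClass()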
To see what other commands are available in the Scala REPL, use the command :help.
I am using the Stanford Parser in my code, and I have added all the relevant libraries to the project. When I run my code from the console it works perfectly fine. But after creating a 'runnable jar' of the source with the option "Copy required libraries into a sub-folder next to the generated JAR" and running it from the command prompt, it throws an error:
Exception in thread "Thread-2" java.lang.NoSuchMethodError:
edu.stanford.nlp.process.DocumentPreprocessor.<init>(Ljava/io/Reader;)V
at edu.stanford.nlp.tagger.maxent.MaxentTagger.tokenizeText(MaxentTagger.java:852)
at edu.stanford.nlp.tagger.maxent.MaxentTagger.tokenizeText(MaxentTagger.java:837)
I have provided all the required libraries on the classpath, and the method tokenizeText is present in MaxentTagger. Please suggest a solution.
This almost certainly means that you have combined incompatible releases of the parser and tagger. For example, perhaps the version of the tagger being packed into the jar file is different from the one picked up when you run the code from the command line. Which versions of the parser and tagger are you using? From the line numbers in the stack trace, it appears not to be the latest version.
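A quick way to verify this (a standard JDK trick, not specific to Stanford NLP) is to print where the suspect classes are actually loaded from when running the packaged jar:
System.out.println(edu.stanford.nlp.tagger.maxent.MaxentTagger.class
        .getProtectionDomain().getCodeSource().getLocation());
System.out.println(edu.stanford.nlp.process.DocumentPreprocessor.class
        .getProtectionDomain().getCodeSource().getLocation());
If the two locations point at jars from different releases, that is the mismatch.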
A program of mine (written in Scala 2.8) works fine when launched from the NetBeans IDE, but when I try to run it from outside with "java -jar", it says "Exception in thread "main" java.lang.NoClassDefFoundError: scala/ScalaObject...". Putting all the libraries, including the Scala runtime, in the same directory as the jar to be run doesn't help. If I try to run the jar with scala itself, it complains that it can't decode it as UTF-8 (it expects a Scala source file rather than a jar, I suppose). So how do I run a Scala application at all?
UPDATE: For those who come here later with the same question, I'd recommend reading the comments under barjak's answer (including the latest hidden ones); the answer is there. VonC also gives some interesting links on the subject.
The -jar and -classpath options of the java command are mutually exclusive: you can't do java -jar YourScalaProg.jar -classpath scala-library.jar.
If you want to run your application with java -jar, then the full classpath must be specified in the Class-Path section of the jar's manifest.
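For example, the MANIFEST.MF might contain entries like these (names are illustrative; Class-Path entries are relative URLs separated by spaces):
Main-Class: your.package.MainClass
Class-Path: scala-library.jar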
Alternatively, you can run your application using only -classpath, like this: java -classpath YourScalaProg.jar:scala-library.jar your.package.MainClass.
Are you using scala-library.jar as described in the Adventures with Scala blog post?
java -classpath scala-library.jar:. YourClass
or:
java -classpath scala-library.jar:yourApp.jar YourClass
Where YourClass is your scalac-compiled Scala code.
You will find the same scala-library.jar used in the SO question "Creating a jar file from a Scala file" (or in the blog post "the not so elegant way of creating an executable jar from scala code").