Unable to run data-validator_2.11-0.12.1.jar on my local machine - scala

I'm trying to use Target's data-validator, but when I run it from cmd, it shows me an error like:
D:\Spark\DemoTime>spark-submit --master local data-validator_2.11-0.12.1.jar --help
21/12/28 15:05:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.spark.SparkException: No main class set in JAR; please specify one with --class.
at org.apache.spark.deploy.SparkSubmit.error(SparkSubmit.scala:972)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:492)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:898)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
But if I specify the main class name, it shows:
D:\Spark\DemoTime>spark-submit --class com.target.data_validator.Main data-validator_2.11-0.13.0.jar --help
Exception in thread "main" java.lang.NoClassDefFoundError: com/typesafe/scalalogging/Logger
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:42)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.typesafe.scalalogging.Logger
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 13 more
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
I'm following Target's data-validator tool:
https://github.com/target/data-validator
but I'm unable to sort this out.
You can find the jar file at:
https://github.com/target/data-validator/releases
I would appreciate any help in running this tool.
Thanks.

From the GitHub README:
Assemble fat jar: sbt clean assembly
spark-submit --master local data-validator-assembly-0.13.0.jar --help
You need a fat jar of data-validator plus its dependencies to be able to run it; you've provided only the plain data-validator jar.
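A minimal sketch of the full workflow, assuming you build from a local clone (the exact assembly output path can vary with the sbt configuration):
git clone https://github.com/target/data-validator.git
cd data-validator
sbt clean assembly
spark-submit --master local target/scala-2.11/data-validator-assembly-0.13.0.jar --help
Because the assembly bundles com.typesafe.scalalogging and the rest of the dependency tree, and sets the main class in its manifest, neither the --class flag nor the NoClassDefFoundError should come up.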

Related

java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamingWriteSupportProvider trying to pull from kafka topic in scala

I'm using a spark-shell instance to test pulling data from a client's Kafka source. To launch the instance I am using the command spark-shell --jars spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar (all jars are present in the working dir).
However, when I run the command val df = spark.read.format("kafka")..........., after a few seconds it crashes with the following:
java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamingWriteSupportProvider
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:533)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:304)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.StreamingWriteSupportProvider
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 79 more
HOWEVER - if I change the order of the jars in the spark-shell command to spark-shell --jars kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar, spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, it instead crashes with:
java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<init>(KafkaSourceProvider.scala:376)
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<clinit>(KafkaSourceProvider.scala)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.validateBatchOptions(KafkaSourceProvider.scala:330)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.createRelation(KafkaSourceProvider.scala:113)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:309)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 55 more
I am developing behind a very strict proxy managed by our client and am unable to use --packages instead, so I am at a bit of a loss here. Am I unable to load all 3 dependencies at the launch of the shell? Am I missing another step somewhere?
In the Structured Streaming + Kafka Integration Guide it says:
For experimenting on spark-shell, you need to add this above library and its dependencies too when invoking spark-shell.
The library you are using seems to be a custom build that is not publicly available in the Maven Central repository, so I cannot look into its dependencies.
However, looking at the latest stable version, 2.4.5, its dependency according to Maven Central is kafka-clients version 2.0.0.
You are also mixing libraries built for different Scala versions (2.11 and 2.12).
Please use the same Scala version for all libraries; see below for how to load them into spark-shell.
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.5,org.apache.kafka:kafka_2.11:2.4.1,org.apache.kafka:kafka-clients:2.4.1
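If the proxy blocks --packages, the same artifacts can be downloaded once and passed with --jars instead. Note that the jar list must be comma-separated with no spaces; in your commands above, everything after the first space is no longer part of --jars, which is consistent with only one jar being visible at a time. A sketch with hypothetical local file names matching the versions above:
spark-shell --jars spark-sql-kafka-0-10_2.11-2.4.5.jar,kafka_2.11-2.4.1.jar,kafka-clients-2.4.1.jar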
One occasionally disruptive issue is dealing with dependency conflicts in cases where a user application and Spark itself both depend on the same library. This comes up relatively rarely, but when it does, it can be vexing for users. Typically, this will manifest itself when a NoSuchMethodError, a ClassNotFoundException, or some other JVM exception related to class loading is thrown during the execution of a Spark job.
There are two solutions to this problem. The first is to modify your application to depend on the same version of the third-party library that Spark does. The second is to modify the packaging of your application using a procedure that is often called "shading." The Maven build tool supports shading through advanced configuration of the plug-in shown in Example 7-5 (in fact, the shading capability is why the plugin is named maven-shade-plugin).
Shading allows you to make a second copy of the conflicting package under a different namespace and rewrites your application's code to use the renamed version. This somewhat brute-force technique is quite effective at resolving runtime dependency conflicts. For specific instructions on how to shade dependencies, see the documentation for your build tool.
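An equivalent of maven-shade-plugin's relocation also exists for sbt builds via the sbt-assembly plugin's shade rules. A minimal build.sbt sketch, assuming sbt-assembly is enabled and using Guava as a hypothetical conflicting dependency:
// Relocate the bundled copy of Guava so it cannot clash with the
// version that ships inside the Spark distribution.
assembly / assemblyShadeRules := Seq(
  ShadeRule.rename("com.google.common.**" -> "shaded.guava.@1").inAll
)
The fat jar then contains the dependency under the shaded.guava namespace, and the application bytecode is rewritten to reference that name.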
I would first check the Scala version of your spark-shell, because this can be a Scala version issue:
scala> util.Properties.versionString
res3: String = version 2.11.8
If not, then check which Spark version you are using and which versions of the third-party libraries you depend on, because there is most likely one that is newer or older than what your Spark version supports.
I hope this helps.
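The running Spark version can be checked from the same shell:
scala> sc.version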

cloudera hadoop : caused by: java.lang.ClassNotFoundException: org.apache.htrace.core.Tracer$Builder

I just ran an example program following the tutorial:
http://web.stanford.edu/class/cs246/homeworks/tutorial.pdf
and got the following error:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
16/10/24 21:48:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder
at org.apache.hadoop.fs.FsTracer.get(FsTracer.java:42)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2630)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:93)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2680)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2662)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:379)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:178)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:363)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:520)
at edu.stanford.cs246.wordcount.WordCount.run(WordCount.java:43)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at edu.stanford.cs246.wordcount.WordCount.main(WordCount.java:24)
Caused by: java.lang.ClassNotFoundException: org.apache.htrace.core.Tracer$Builder
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 13 more
How do I fix this problem so I can run the program? Thanks!
You just need to add the htrace jar file. For me, htrace-core4-4.0.1-incubating.jar from hadoop-hdfs/libs worked.
Add external JARs
I added 6 external JARs, and then finally ran WordCount.java successfully. They are:
htrace-core4-4.0.1-incubating.jar
httpclient-4.5.2.jar
httpcore-4.4.1.jar
jackson-core-asl-1.9.13.jar
jackson-mapper-asl-1.9.13.jar
slf4j-nop-1.6.1.jar
Hope it is helpful to you.
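If you later run the job from the command line instead of Eclipse, a rough equivalent (with hypothetical paths and jar name) is to put the extra jars on HADOOP_CLASSPATH before invoking hadoop:
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/path/to/htrace-core4-4.0.1-incubating.jar:/path/to/slf4j-nop-1.6.1.jar"
hadoop jar wordcount.jar edu.stanford.cs246.wordcount.WordCount input output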

Mapreduce using Scala Error: java.lang.ClassNotFoundException: scala.Predef$

I tried to implement a simple MapReduce job in Scala. However, when I run the package using the command
hadoop jar hadoop.jar mapreduce.MaxTemperature hdfs://sandbox/user/ajay/input hdfs://sandbox/user/ajay/output
I get the error,
16/09/06 16:06:12 INFO mapreduce.Job: Task Id : attempt_1473177830264_0002_m_000001_2, Status : FAILED Error: java.lang.ClassNotFoundException: scala.Predef$
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at mapreduce.MaxTemperatureMapper.map(MaxTemperatureMapper.scala:17)
at mapreduce.MaxTemperatureMapper.map(MaxTemperatureMapper.scala:9)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Even though I've added the scala-library to my class path, I get the above error.
hadoop version: Hadoop 2.7.1.2.3.0.0-255
scala version: 2.11.8
java version 1.7.0_85
Any suggestion is appreciated.
Apart from adding the Scala library to the client's classpath, it has to be added on all nodes where the tasks get executed. This can be achieved through ToolRunner's -libjars option:
hadoop jar scala-2.11/hadoop_2.11-0.1.0.jar mapreduce.WordCount -libjars /usr/lib/scala-2.11.8/lib/scala-library.jar
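For reference, a minimal sketch of such a driver in Scala (names are hypothetical and the mapper/reducer wiring is elided); extending Configured with Tool is what lets ToolRunner's option parser consume -libjars and ship those jars to the task nodes:
import org.apache.hadoop.conf.Configured
import org.apache.hadoop.fs.Path
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat
import org.apache.hadoop.util.{Tool, ToolRunner}

object MaxTemperatureDriver extends Configured with Tool {
  override def run(args: Array[String]): Int = {
    // getConf already reflects the generic options, including -libjars.
    val job = Job.getInstance(getConf, "max temperature")
    job.setJarByClass(getClass)
    // setMapperClass / setReducerClass / output types go here.
    FileInputFormat.addInputPath(job, new Path(args(0)))
    FileOutputFormat.setOutputPath(job, new Path(args(1)))
    if (job.waitForCompletion(true)) 0 else 1
  }

  def main(args: Array[String]): Unit =
    System.exit(ToolRunner.run(MaxTemperatureDriver, args))
}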

Sparkling water droplet project running issue

I cloned the Sparkling Water droplet project from https://github.com/h2oai/h2o-droplets/tree/master/sparkling-water-droplet, then cleaned and built it using ./gradlew clean and ./gradlew build respectively. After that I tried to run the project using the command
spark-submit --class water.droplets.SparklingWaterDroplet build/libs/sparkling-water-droplet-app.jar
Then I got the following error message:
Exception in thread "main" java.lang.NoClassDefFoundError: water/fvec/Frame
at water.droplets.SparklingWaterDroplet.main(SparklingWaterDroplet.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: water.fvec.Frame
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more
You have to run ./gradlew shadowJar and use the jar it creates, build/libs/sparkling-water-droplet-app.jar, instead, since shadowJar produces a fat jar containing all the classes required when submitting a job (the build task does not do that).
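Before submitting, you can sanity-check that the fat jar really contains the class that was missing:
jar tf build/libs/sparkling-water-droplet-app.jar | grep water/fvec/Frame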

java.lang.ClassNotFoundException: org.slf4j.LoggerFactory Maven

I just installed Maven on Eclipse. If I run the pom.xml, I get the error below.
People who use Windows don't get this error. I have Mac OS X.
When I add slf4j-simple-1.7.5.jar and slf4j-api-1.7.5.jar to the build path, it works.
Is there a way NOT to add external jars?
EDIT 2:
I ran Maven clean from the Terminal and that worked. I think the problem is internal to Eclipse's Maven integration.
EDIT 3:
I deleted the projects and checked everything out again from SVN, and it worked somehow.
[INFO] Scanning for projects...
[ERROR] Internal error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException: org/slf4j/LoggerFactory: org.slf4j.LoggerFactory -> [Help 1]
org.apache.maven.InternalErrorException: Internal error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:168)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:534)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.eclipse.tycho.core.maven.utils.MavenCompatiblityHelper.getPluginDescriptor(MavenCompatiblityHelper.java:86)
at org.eclipse.tycho.core.maven.utils.PluginRealmHelper.execute(PluginRealmHelper.java:95)
at org.eclipse.tycho.p2.resolver.P2TargetPlatformResolver.getDependencyMetadata(P2TargetPlatformResolver.java:144)
at org.eclipse.tycho.p2.resolver.P2TargetPlatformResolver.setupProjects(P2TargetPlatformResolver.java:126)
at org.eclipse.tycho.core.resolver.DefaultTychoDependencyResolver.setupProject(DefaultTychoDependencyResolver.java:87)
at org.eclipse.tycho.core.maven.TychoMavenLifecycleParticipant.afterProjectsRead(TychoMavenLifecycleParticipant.java:77)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:273)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
... 11 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.eclipse.tycho.core.maven.utils.MavenCompatiblityHelper.getPluginDescriptor(MavenCompatiblityHelper.java:68)
... 18 more
Caused by: java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
at com.ning.http.client.AsyncHttpClient.<clinit>(AsyncHttpClient.java:147)
at org.sonatype.aether.connector.async.AsyncRepositoryConnector.<init>(AsyncRepositoryConnector.java:150)
at org.sonatype.aether.connector.async.AsyncRepositoryConnectorFactory.newInstance(AsyncRepositoryConnectorFactory.java:110)
at org.sonatype.aether.impl.internal.DefaultRemoteRepositoryManager.getRepositoryConnector(DefaultRemoteRepositoryManager.java:333)
at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:456)
at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:220)
at org.sonatype.aether.impl.internal.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:197)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:268)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.readArtifactDescriptor(DefaultArtifactDescriptorReader.java:173)
at org.sonatype.aether.impl.internal.DefaultRepositorySystem.readArtifactDescriptor(DefaultRepositorySystem.java:316)
at org.apache.maven.plugin.internal.DefaultPluginDependenciesResolver.resolve(DefaultPluginDependenciesResolver.java:108)
at org.apache.maven.plugin.internal.DefaultMavenPluginManager.getPluginDescriptor(DefaultMavenPluginManager.java:143)
... 23 more
Caused by: java.lang.ClassNotFoundException: org.slf4j.LoggerFactory
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:244)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:230)
... 35 more
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/InternalErrorException
If your code makes use of slf4j, you'll also have to add the API and one of the implementation jars to your project. So the answer is: no.
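For example, a minimal pom.xml sketch of the two dependencies (the version number is illustrative, matching the jars mentioned above):
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.7.5</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.7.5</version>
</dependency>
Declared this way, Maven manages the jars for you, so nothing has to be added to the Eclipse build path by hand.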