Not sure if this is the right place to post this question. (Apologies if it isn't. And if so, please point me in the right direction.)
I am attempting to compile Apache Bahir to generate Scala 2.11 artifacts (via mvn clean install -P scala-2.11 -Dscala-2.11 -DskipTests). When attempting to do so, I am running into the following build issues:
[INFO] Reactor Summary for Apache Bahir - Parent POM 3.0.0-SNAPSHOT:
[INFO]
[INFO] Apache Bahir - Parent POM .......................... SUCCESS [ 4.166 s]
[INFO] Apache Bahir - Common .............................. SUCCESS [ 20.276 s]
[INFO] Apache Bahir - Spark SQL Cloudant DataSource ....... FAILURE [ 0.147 s]
[INFO] Apache Bahir - Spark SQL Streaming Akka ............ SKIPPED
[INFO] Apache Bahir - Spark SQL Streaming MQTT ............ SKIPPED
[INFO] Apache Bahir - Spark SQL Streaming JDBC ............ SKIPPED
[INFO] Apache Bahir - Spark SQL Streaming SQS ............. SKIPPED
[INFO] Apache Bahir - Spark Streaming Akka ................ SKIPPED
[INFO] Apache Bahir - Spark Streaming MQTT ................ SKIPPED
[INFO] Apache Bahir - Spark Streaming PubNub .............. SKIPPED
[INFO] Apache Bahir - Spark Streaming Google PubSub ....... SKIPPED
[INFO] Apache Bahir - Spark Streaming Twitter ............. SKIPPED
[INFO] Apache Bahir - Spark Streaming ZeroMQ .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 24.877 s
[INFO] Finished at: 2020-10-07T18:42:46-07:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project spark-sql-cloudant_2.12: Could not resolve dependencies for project org.apache.bahir:spark-sql-cloudant_2.12:jar:3.0.0-SNAPSHOT: The following artifacts could not be resolved: org.apache.bahir:bahir-common_2.11:jar:3.0.0-SNAPSHOT, org.apache.bahir:bahir-common_2.11:jar:tests:3.0.0-SNAPSHOT: Failure to find org.apache.bahir:bahir-common_2.11:jar:3.0.0-SNAPSHOT in https://repository.apache.org/snapshots was cached in the local repository, resolution will not be reattempted until the update interval of apache.snapshots has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <args> -rf :spark-sql-cloudant_2.12
Upon examining the common/target directory, I see that there is a scala-2.11 sub-directory containing compiled class files, but the corresponding bahir-common_2.11-3.0.0-SNAPSHOT*.jar files are not being generated (whereas the bahir-common_2.12-3.0.0-SNAPSHOT*.jar files are generated just fine).
I was wondering if anyone here could potentially help with this. Thanks in advance!
I suppose this is one way to do it(?):
$ cd <project_root_dir>
$ ./dev/change-scala-version.sh 2.11
$ mvn clean package -Pscala-2.11 -Dscala-2.11 -DskipTests
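After that, it is worth sanity-checking that the 2.11 jars were actually produced before the downstream modules try to resolve them (just a quick check; the path follows from the common/target layout mentioned above):
$ ls common/target/bahir-common_2.11-3.0.0-SNAPSHOT*.jar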
PS: If anyone has an authoritative answer to this, please chime in. Thanks!
Related
I cloned the 'master' branch of Apache Zeppelin from https://github.com/apache/zeppelin into my Eclipse workspace and followed these installation steps:
Created a new Java project and imported zeppelin.
Converted it into a Maven project
I had previously installed NodeJS and subsequently bower.
Activated Maven repository index updates
Window => Preferences => Maven and checked the following:
Download Artifact Sources
Download Artifact JavaDoc
Download repository index updates on startup
Update Maven projects on startup
Made sure I had a JDK installed
C:/Program Files (or x86)/Java/.. should contain a JDK (NOT a JRE)
Window => Preferences => search for 'jre'
Installed JREs => Add… => Standard VM
JRE home: installed JDK folder location/jdk1.8.***
Right-click on the project => Run As => Run Configurations
Double click Maven Build
Name: clean package
Base directory: the zeppelin directory
Goals: clean package
Check ‘Skip Tests’
JRE tab => choose the JDK (installed above) instead of a JRE
Run (a command-line equivalent is sketched below)
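For reference, the rough command-line equivalent of the Run Configuration above (a sketch, assuming you run it from the root of the cloned zeppelin directory) is:
$ cd <zeppelin_root_dir>
$ mvn clean package -DskipTests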
Most modules build successfully, but everything from zeppelin-web onward fails:
[INFO] Reactor Summary:
[INFO]
[INFO] Zeppelin ........................................... SUCCESS [ 20.674 s]
[INFO] Zeppelin: Interpreter .............................. SUCCESS [01:55 min]
[INFO] Zeppelin: Zengine .................................. SUCCESS [02:36 min]
[INFO] Zeppelin: Display system apis ...................... SUCCESS [01:21 min]
[INFO] Zeppelin: Spark dependencies ....................... SUCCESS [03:32 min]
[INFO] Zeppelin: Spark .................................... SUCCESS [04:05 min]
[INFO] Zeppelin: Markdown interpreter ..................... SUCCESS [ 6.287 s]
[INFO] Zeppelin: Angular interpreter ...................... SUCCESS [ 3.118 s]
[INFO] Zeppelin: Shell interpreter ........................ SUCCESS [ 3.650 s]
[INFO] Zeppelin: Livy interpreter ......................... SUCCESS [ 30.402 s]
[INFO] Zeppelin: HBase interpreter ........................ SUCCESS [04:02 min]
[INFO] Zeppelin: Apache Pig Interpreter ................... SUCCESS [03:37 min]
[INFO] Zeppelin: PostgreSQL interpreter ................... SUCCESS [ 16.102 s]
[INFO] Zeppelin: JDBC interpreter ......................... SUCCESS [ 17.661 s]
[INFO] Zeppelin: File System Interpreters ................. SUCCESS [ 16.002 s]
[INFO] Zeppelin: Flink .................................... SUCCESS [04:32 min]
[INFO] Zeppelin: Apache Ignite interpreter ................ SUCCESS [02:22 min]
[INFO] Zeppelin: Kylin interpreter ........................ SUCCESS [ 5.049 s]
[INFO] Zeppelin: Python interpreter ....................... SUCCESS [ 4.519 s]
[INFO] Zeppelin: Lens interpreter ......................... SUCCESS [02:00 min]
[INFO] Zeppelin: Apache Cassandra interpreter ............. SUCCESS [03:49 min]
[INFO] Zeppelin: Elasticsearch interpreter ................ SUCCESS [01:47 min]
[INFO] Zeppelin: BigQuery interpreter ..................... SUCCESS [ 24.181 s]
[INFO] Zeppelin: Alluxio interpreter ...................... SUCCESS [02:17 min]
[INFO] Zeppelin: web Application .......................... FAILURE [01:27 min]
[INFO] Zeppelin: Server ................................... SKIPPED
[INFO] Zeppelin: Packaging distribution ................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 41:58 min
[INFO] Finished at: 2016-10-20T10:47:21-05:00
[INFO] Final Memory: 141M/508M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal com.github.eirslett:frontend-maven-plugin:0.0.25:grunt (grunt build) on project zeppelin-web: Failed to run task: 'grunt build --no-color' failed. (error code 3) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :zeppelin-web
When I re-ran with -e and -X for the full error stack and debug logging, I got this:
[ERROR] Failed to execute goal com.github.eirslett:frontend-maven-plugin:0.0.25:grunt (grunt build) on project zeppelin-web: Failed to run task: 'grunt build --no-color' failed. (error code 3) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal com.github.eirslett:frontend-maven-plugin:0.0.25:grunt (grunt build) on project zeppelin-web: Failed to run task
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoFailureException: Failed to run task
at com.github.eirslett.maven.plugins.frontend.mojo.AbstractFrontendMojo.execute(AbstractFrontendMojo.java:66)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
... 20 more
Caused by: com.github.eirslett.maven.plugins.frontend.lib.TaskRunnerException: 'grunt build --no-color' failed. (error code 3)
at com.github.eirslett.maven.plugins.frontend.lib.NodeTaskExecutor.execute(NodeTaskExecutor.java:59)
at com.github.eirslett.maven.plugins.frontend.mojo.GruntMojo.execute(GruntMojo.java:64)
at com.github.eirslett.maven.plugins.frontend.mojo.AbstractFrontendMojo.execute(AbstractFrontendMojo.java:64)
... 22 more
[ERROR]
I have been searching SO for weeks, and the closest post I found is Apache Zeppelin installation grunt build error. I even asked the poster whether a solution was found, but got no response, and the solution posted there didn't work for me.
I am doing all this on a Windows Server 2008 R2 Standard box. I'd be deeply grateful if someone could point me toward a solution.
It looks like the frontend web application build is failing on your Windows box.
Please try running npm run build manually on the latest sources; that will give more verbose logs and should help pinpoint the reason.
At least on Linux there are a few prerequisites for it to work, e.g. libfontconfig.
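For example, something along these lines should reproduce the failing step outside of Maven and surface the underlying errors (a sketch; the exact frontend tooling can differ between Zeppelin versions, but the grunt task name comes straight from the error above):
$ cd zeppelin-web
$ npm install
$ bower install
$ grunt build --no-color
If grunt or bower are not installed globally, the copies pulled in by npm usually end up under zeppelin-web/node_modules/.bin.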
I am officially porting OrientDB (community release) to powerpc (ppc64le). I have cloned the repo from https://github.com/orientechnologies/orientdb and followed the build steps below:
$ git clone https://github.com/orientechnologies/orientdb
$ cd orientdb
$ git checkout develop
$ mvn clean install
However, I am getting test failures as below:
Failed tests:
TestBatchRemoteResultSet.runBatchQuery:59
Tests in error:
OCommandExecutorSQLDeleteVertexTest.testDeleteVertexBatch:87 » ODatabase Datab...
OCommandExecutorSQLDeleteVertexTest.testDeleteVertexFromSubquery:114 » ODatabase
OCommandExecutorSQLDeleteVertexTest.testDeleteVertexFromSubquery2:127 » ODatabase
OCommandExecutorSQLDeleteVertexTest.testDeleteVertexLimit:72 » ODatabase Datab...
SimplePropertyLinkTest.testSimplePropertyLink:27 ClassCast java.lang.String in...
Tests run: 384, Failures: 1, Errors: 5, Skipped: 123
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] OrientDB ........................................... SUCCESS [ 3.305 s]
[INFO] OrientDB Test Commons .............................. SUCCESS [ 5.100 s]
[INFO] OrientDB Core ...................................... SUCCESS [03:11 min]
[INFO] OrientDB Client .................................... SUCCESS [ 7.290 s]
[INFO] OrientDB Object .................................... SUCCESS [ 15.109 s]
[INFO] OrientDB Tools ..................................... SUCCESS [ 9.159 s]
[INFO] OrientDB Server .................................... SUCCESS [ 39.269 s]
[INFO] OrientDB GraphDB ................................... FAILURE [01:11 min]
[INFO] OrientDB Tests ..................................... SKIPPED
[INFO] OrientDB Distributed Server ........................ SKIPPED
[INFO] OrientDB Lucene full text index .................... SKIPPED
[INFO] OrientDB JDBC Driver ............................... SKIPPED
[INFO] OrientDB ETL ....................................... SKIPPED
[INFO] OrientDB Community Distribution .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 05:43 min
[INFO] Finished at: 2016-06-15T13:25:21+05:30
[INFO] Final Memory: 134M/227M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.19.1:test (default-test) on project orientdb-graphdb: There are test failures.
[ERROR]
[ERROR] Please refer to /root/amit/rethinkDB/orientDB_dev/graphdb/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :orientdb-graphdb
Any pointers on this? I have raised a GitHub issue ticket here, but thought of asking here as well in case anyone has seen such errors.
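For what it's worth, the failing tests can be re-run in isolation (with fuller output left in the Surefire reports) via Surefire's -Dtest filter; an untested sketch, using the graphdb module named in the report path above:
$ cd graphdb
$ mvn test -Dtest=OCommandExecutorSQLDeleteVertexTest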
I am following these instructions for building Spark. When I do:
mvn -DskipTests clean package -e
My system details:
Processor: Intel® Core™ i5-5300U CPU @ 2.30GHz × 4
OS Type: 64-bit Ubuntu 15.04
JVM: OpenJDK 64-Bit Server VM (24.79-b02)
Java: version 1.7.0_79
Scala version: 2.11.4
I am getting the following error:
[INFO] Spark Project Parent POM ........................... SUCCESS [ 4.121 s]
[INFO] Spark Project Test Tags ............................ SUCCESS [ 2.811 s]
[INFO] Spark Project Sketch ............................... SUCCESS [ 11.832 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 10.080 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 8.214 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 6.070 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 16.173 s]
[INFO] Spark Project Core ................................. SUCCESS [04:26 min]
[INFO] Spark Project GraphX ............................... SUCCESS [05:40 min]
[INFO] Spark Project Streaming ............................ SUCCESS [12:02 min]
[INFO] Spark Project Catalyst ............................. FAILURE [11:01 min]
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project Docker Integration Tests ............. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Project External Akka ........................ SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External MQTT Assembly ............... SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:11 min
[INFO] Finished at: 2016-02-18T11:31:49+05:30
[INFO] Final Memory: 77M/758M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-catalyst_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. NullPointerException -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-catalyst_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:145)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
... 20 more
Caused by: java.lang.NullPointerException
at scala.tools.nsc.typechecker.Typers$Typer.typedFunction$1(Typers.scala:5280)
at scala.tools.nsc.typechecker.Typers$Typer.typedOutsidePatternMode$1(Typers.scala:5320)
at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5352)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5359)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5395)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5422)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5369)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5373)
at scala.tools.nsc.typechecker.Typers$Typer.typedArg(Typers.scala:3164)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$class.typedArgWithFormal$1(PatternTypers.scala:112)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$$anonfun$2.apply(PatternTypers.scala:115)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$$anonfun$2.apply(PatternTypers.scala:115)
at scala.runtime.Tuple2Zipped$$anonfun$map$extension$1.apply(Tuple2Zipped.scala:46)
at scala.runtime.Tuple2Zipped$$anonfun$map$extension$1.apply(Tuple2Zipped.scala:44)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.runtime.Tuple2Zipped$.map$extension(Tuple2Zipped.scala:44)
at scala.tools.nsc.typechecker.PatternTypers$PatternTyper$class.typedArgsForFormals(PatternTypers.scala:115)
at scala.tools.nsc.typechecker.Typers$Typer.typedArgsForFormals(Typers.scala:111)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$handleMonomorphicCall$1(Typers.scala:3470)
Take a look at the official docs for building Spark.
If you want to compile Spark with Scala 2.11, try the following (assuming you are in the root of the source directory):
./dev/change-scala-version.sh 2.11
./build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests -Dscala-2.11 clean package
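If you invoke your system mvn instead of the bundled ./build/mvn, the Spark building docs of that era also recommend giving Maven more memory first, roughly:
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"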
So, I am trying to build Spark 1.3.0 (standalone mode) on Windows 7 using Maven, but I'm getting a build failure. I am not sure if it is a dependency issue, or if something is wrong with Scala, or the plugin used to compile the project, or something else entirely.
java -version
java version "1.8.0_31"
Java(TM) SE Runtime Environment (build 1.8.0_31-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.31-b07, mixed mode)
Here is the command I am using to build:
mvn -DskipTests -X clean package
I get the following output:
[DEBUG] processing sourceDirectory=C:\Program Files\Apache Software Foundation\spark-1.3.0\mllib\src\main\scala encoding=null
error file=C:\Program Files\Apache Software Foundation\spark-1.3.0\mllib\src\main\scala\org\apache\spark\mllib\clustering\LDAModel.scala message=Input length = 1
Saving to outputFile=C:\Program Files\Apache Software Foundation\spark-1.3.0\mllib\scalastyle-output.xml
Processed 143 file(s)
Found 1 errors
Found 0 warnings
Found 0 infos
Finished in 1032 ms
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 5.456 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 16.592 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 9.359 s]
[INFO] Spark Project Core ................................. SUCCESS [03:48 min]
[INFO] Spark Project Bagel ................................ SUCCESS [ 20.622 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 49.076 s]
[INFO] Spark Project Streaming ............................ SUCCESS [01:22 min]
[INFO] Spark Project Catalyst ............................. SUCCESS [01:26 min]
[INFO] Spark Project SQL .................................. SUCCESS [01:47 min]
[INFO] Spark Project ML Library ........................... FAILURE [01:55 min]
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:01 min
[INFO] Finished at: 2015-03-30T15:06:01-04:00
[INFO] Final Memory: 67M/944M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalastyle:scalastyle-maven-plugin:0.4.0:check (default) on project spark-mllib_2.10: Failed during scalastyle execution: You have 1 Scalastyle violation(s). -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.scalastyle:scalastyle-maven-plugin:0.4.0:check (default) on project spark-mllib_2.10: Failed during scalastyle execution
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:355)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:216)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:160)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: Failed during scalastyle execution
at org.scalastyle.maven.plugin.ScalastyleViolationCheckMojo.performCheck(ScalastyleViolationCheckMojo.java:238)
at org.scalastyle.maven.plugin.ScalastyleViolationCheckMojo.execute(ScalastyleViolationCheckMojo.java:199)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:132)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: org.apache.maven.plugin.MojoFailureException: You have 1 Scalastyle violation(s).
at org.scalastyle.maven.plugin.ScalastyleViolationCheckMojo.performCheck(ScalastyleViolationCheckMojo.java:230)
... 22 more
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :spark-mllib_2.10
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
How do I fix this?
UPDATE: Issue has been reported and resolved: https://issues.apache.org/jira/browse/SPARK-6532
For anyone else looking for a workaround for this issue, simply modify the following plugin in the pom.xml to set inputEncoding to UTF-8:
<plugin>
  <groupId>org.scalastyle</groupId>
  <artifactId>scalastyle-maven-plugin</artifactId>
  <version>0.4.0</version>
  <configuration>
    <verbose>false</verbose>
    <failOnViolation>true</failOnViolation>
    <includeTestSourceDirectory>false</includeTestSourceDirectory>
    <failOnWarning>false</failOnWarning>
    <sourceDirectory>${basedir}/src/main/scala</sourceDirectory>
    <testSourceDirectory>${basedir}/src/test/scala</testSourceDirectory>
    <configLocation>scalastyle-config.xml</configLocation>
    <outputFile>scalastyle-output.xml</outputFile>
    <outputEncoding>UTF-8</outputEncoding>
    <inputEncoding>UTF-8</inputEncoding> <!-- add this line -->
  </configuration>
</plugin>
This only seems to be a problem for Windows users.
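An alternative that avoids editing the pom.xml might be to force the JVM default charset for the Maven run, since the underlying issue is scalastyle reading the sources with the Windows default encoding; this is an untested sketch:
set MAVEN_OPTS=-Dfile.encoding=UTF-8
mvn -DskipTests clean package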
Could someone please help me with how to use the Scala Eclipse IDE for Spark?
I came across this link - http://syndeticlogic.net/?p=311 - but I am unable to follow it.
I entered the command mvn -Phadoop2 eclipse:clean eclipse:eclipse inside the Spark directory; after a long list of downloads it gave me an error. Please help. Thanks!
Below is the error I received:
Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM .......................... SUCCESS [5:22.386s]
[INFO] Spark Project Core ................................ SUCCESS [17:20.807s]
[INFO] Spark Project Bagel ............................... FAILURE [2.159s]
[INFO] Spark Project GraphX .............................. SKIPPED
[INFO] Spark Project ML Library .......................... SKIPPED
[INFO] Spark Project Streaming ........................... SKIPPED
[INFO] Spark Project Tools ............................... SKIPPED
[INFO] Spark Project Catalyst ............................ SKIPPED
[INFO] Spark Project SQL ................................. SKIPPED
[INFO] Spark Project Hive ................................ SKIPPED
[INFO] Spark Project REPL ................................ SKIPPED
[INFO] Spark Project Assembly ............................ SKIPPED
[INFO] Spark Project External Twitter .................... SKIPPED
[INFO] Spark Project External Kafka ...................... SKIPPED
[INFO] Spark Project External Flume ...................... SKIPPED
[INFO] Spark Project External ZeroMQ ..................... SKIPPED
[INFO] Spark Project External MQTT ....................... SKIPPED
[INFO] Spark Project Examples ............................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:15.115s
[INFO] Finished at: Wed May 07 15:27:51 GMT+05:30 2014
[INFO] Final Memory: 22M/81M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop2" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (default) on project spark-bagel_2.10: Failed to resolve dependencies for one or more projects in the reactor. Reason: Missing:
[ERROR] ----------
[ERROR] 1) org.apache.spark:spark-core_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] Try downloading the file manually from the project website.
[ERROR]
[ERROR] Then, install it using the command:
[ERROR] mvn install:install-file -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file
[ERROR]
[ERROR] Alternatively, if you host your own repository you can deploy the file there:
[ERROR] mvn deploy:deploy-file -DgroupId=org.apache.spark -DartifactId=spark-core_2.10 -Dversion=1.0.0-SNAPSHOT -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
[ERROR]
[ERROR] Path to dependency:
[ERROR] 1) org.apache.spark:spark-bagel_2.10:jar:1.0.0-SNAPSHOT
[ERROR] 2) org.apache.spark:spark-core_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] ----------
[ERROR] 1 required artifact is missing.
[ERROR]
[ERROR] for artifact:
[ERROR] org.apache.spark:spark-bagel_2.10:jar:1.0.0-SNAPSHOT
[ERROR]
[ERROR] from the specified remote repositories:
[ERROR] maven-repo (http://repo.maven.apache.org/maven2, releases=true, snapshots=false),
[ERROR] apache-repo (https://repository.apache.org/content/repositories/releases, releases=true, snapshots=false),
[ERROR] jboss-repo (https://repository.jboss.org/nexus/content/repositories/releases, releases=true, snapshots=false),
[ERROR] mqtt-repo (https://repo.eclipse.org/content/repositories/paho-releases, releases=true, snapshots=false),
[ERROR] cloudera-repo (https://repository.cloudera.com/artifactory/cloudera-repos, releases=true, snapshots=false),
[ERROR] apache.snapshots (http://repository.apache.org/snapshots, releases=false, snapshots=true),
[ERROR] central (http://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :spark-bagel_2.10
This is because there is no profile called hadoop2 in the pom.xml. The closest matches are hadoop-2.2, hadoop-2.3, etc.
You can run the following:
mvn -Phadoop-2.2 eclipse:clean eclipse:eclipse
or you may run mvn help:all-profiles to list all available profiles and use one of them.
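For example, to see which Hadoop profiles your checkout actually defines before picking one (a quick sketch; the grep just narrows the output and can be omitted):
mvn help:all-profiles | grep -i hadoop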
If you want to contribute to the Apache Spark project, then:
Go to the Spark home directory and run sbt/sbt eclipse
In Scala IDE, Select File | Import | Existing Projects into Workspace.
Select root directory :MY_SPARK_HOME
Select Search for Nested Projects
Select the projects that you want
Do not select "Copy projects into workspace".
If you want to use the Spark libraries in an application of your own, then:
- You can create a jar using the sbt/sbt assembly command and then add that jar as a library to your application project (see the sketch below).
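Putting the sbt-based route together, the steps look roughly like this (a sketch; MY_SPARK_HOME stands for wherever you cloned Spark, as above):
cd MY_SPARK_HOME
sbt/sbt eclipse     # generates the Eclipse project files to import
sbt/sbt assembly    # builds the assembly jar you can add as a library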
Also refer to the Eclipse documentation here: https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-Eclipse