Play Framework 2.4 with Akka - Scala

I have a Play Framework app that throws the following error when I try to run it:
[info] Set current project to inland24 (in build file:/Users/MyUser/Desktop/MyProj/)
[info] Updating {file:/Users/MyUser/Desktop/MyProj/}root...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[warn] There may be incompatibilities among your library dependencies.
[warn] Here are some of the libraries that were evicted:
[warn] * com.typesafe.akka:akka-actor_2.11:2.3.13 -> 2.4-SNAPSHOT
[warn] Run 'evicted' to see detailed eviction warnings
--- (Running the application, auto-reloading is enabled) ---
java.lang.ClassNotFoundException: akka.event.slf4j.Slf4jLoggingFilter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(DynamicAccess.scala:67)
at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(DynamicAccess.scala:66)
at scala.util.Try$.apply(Try.scala:191)
at akka.actor.ReflectiveDynamicAccess.getClassFor(DynamicAccess.scala:66)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:612)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:143)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:127)
at play.api.libs.concurrent.ActorSystemProvider$.start(Akka.scala:291)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:205)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:61)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.core.server.DevServerStart$.mainDev(DevServerStart.scala:60)
at play.core.server.DevServerStart$.mainDevHttpMode(DevServerStart.scala:50)
at play.core.server.DevServerStart.mainDevHttpMode(DevServerStart.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at play.runsupport.Reloader$.startDevMode(Reloader.scala:223)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.devModeServer$lzycompute$1(PlayRun.scala:74)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.play$sbt$run$PlayRun$$anonfun$$anonfun$$anonfun$$devModeServer$1(PlayRun.scala:74)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:100)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:53)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) java.lang.reflect.InvocationTargetException
[error] Total time: 13 s, completed Oct 9, 2015 8:06:18 PM
Here is what I have as dependencies:
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.11" % "2.4-SNAPSHOT"
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.2"
libraryDependencies += "com.typesafe.akka" %% "akka-slf4j" % "2.3.6"
scalaVersion := "2.11.6"
Is there anything that I should add?

After adding the following resolvers:
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
resolvers += "Typesafe Snapshots" at "http://repo.typesafe.com/typesafe/snapshots/"
It worked!
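For what it's worth, the eviction warning above is also worth heeding: akka-slf4j 2.3.6 is mixed with akka-actor 2.4-SNAPSHOT, and the 2.3.x akka-slf4j jar likely predates akka.event.slf4j.Slf4jLoggingFilter, which would explain the ClassNotFoundException. A minimal sketch that keeps all Akka modules on one version, assuming the 2.4 snapshot line is what you intend:
// build.sbt -- sketch only; pin every Akka module to the same version
val akkaVersion = "2.4-SNAPSHOT"

resolvers += "Typesafe Snapshots" at "http://repo.typesafe.com/typesafe/snapshots/"

libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % akkaVersion,
  "com.typesafe.akka" %% "akka-slf4j" % akkaVersion, // keep in lockstep with akka-actor
  "com.typesafe.scala-logging" %% "scala-logging" % "3.1.0",
  "ch.qos.logback" % "logback-classic" % "1.1.2"
)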


Can we create a table using the spark.sql API from the IDE?

I'm on IntelliJ, and my Spark session looks like this:
val spark = SparkSession.builder()
  .appName("Spark SQL")
  .config("spark.master", "local")
  .config("spark.sql.warehouse.dir", "src/main/resources/warehouse") // a user-defined warehouse for storing tables
  .config("spark.network.timeout", "10000000s") // to avoid heartbeat exceptions
  .getOrCreate()
While I can create a database using
spark.sql("create database newdb")
(this creates a directory under src/main/resources/warehouse), attempting to create a table in the same manner,
spark.sql("create table testing(id int, name string)")
fails with:
Exception in thread "main" org.apache.spark.sql.AnalysisException: Hive support is required to CREATE Hive TABLE (AS SELECT);;
'CreateTable `testing`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, ErrorIfExists
I then added enableHiveSupport() when creating the Spark session, but that also leads to this exception:
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Can anyone help me with this?
EDIT
This is my build.sbt:
name := "spark-essentials"
version := "0.1"
scalaVersion := "2.12.10"
val sparkVersion = "3.0.0-preview"
val vegasVersion = "0.3.11"
val postgresVersion = "42.2.2"
resolvers ++= Seq(
  "bintray-spark-packages" at "https://dl.bintray.com/spark-packages/maven",
  "Typesafe Simple Repository" at "https://repo.typesafe.com/typesafe/simple/maven-releases",
  "MavenRepository" at "https://mvnrepository.com"
)
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-hive" % sparkVersion,
  // logging
  "org.apache.logging.log4j" % "log4j-api" % "2.4.1",
  "org.apache.logging.log4j" % "log4j-core" % "2.4.1",
  // postgres for DB connectivity
  "org.postgresql" % "postgresql" % postgresVersion
)
EDIT:
Below is the stack trace:
20:41:08 WARN ObjectStore:568 - Failed to get database default, returning NoSuchObjectException
20:41:08 WARN Hive:168 - Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:203)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:127)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:300)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:421)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:314)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:68)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:67)
at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:221)
at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:99)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:221)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:147)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:137)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:170)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:165)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$2(HiveSessionStateBuilder.scala:56)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager$lzycompute(SessionCatalog.scala:92)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager(SessionCatalog.scala:92)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupRelation(SessionCatalog.scala:741)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveRelations$$lookupTableFromCatalog(Analyzer.scala:781)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.resolveRelation(Analyzer.scala:725)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$6.applyOrElse(Analyzer.scala:765)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$6.applyOrElse(Analyzer.scala:757)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUp$3(AnalysisHelper.scala:90)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:72)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUp$1(AnalysisHelper.scala:90)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:84)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUp$2(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:376)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:214)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:374)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:327)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUp$1(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:84)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUp$2(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:376)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:214)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:374)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:327)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsUp$1(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsUp$(AnalysisHelper.scala:84)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:757)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:694)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:130)
at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
at scala.collection.immutable.List.foldLeft(List.scala:89)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:127)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:119)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:119)
at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:168)
at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:162)
at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:122)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:98)
at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:98)
at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:146)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:145)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:66)
at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:63)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:63)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:55)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:95)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:607)
at part4sql.SparkSql$.delayedEndpoint$part4sql$SparkSql$1(SparkSql.scala:29)
at part4sql.SparkSql$delayedInit$body.apply(SparkSql.scala:7)
at scala.Function0.apply$mcV$sp(Function0.scala:39)
at scala.Function0.apply$mcV$sp$(Function0.scala:39)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:17)
at scala.App.$anonfun$main$1$adapted(App.scala:80)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.App.main(App.scala:80)
at scala.App.main$(App.scala:78)
at part4sql.SparkSql$.main(SparkSql.scala:7)
at part4sql.SparkSql.main(SparkSql.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
... 94 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
... 100 more
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:src/main/resources/warehouse
The last line in the stack trace --
Caused by: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:src/main/resources/warehouse
-- hints that the proper way to set spark.sql.warehouse.dir is to supply an absolute path to the warehouse directory, something like file:///<project_root_dir>/src/main/resources/warehouse.
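A minimal sketch of the session builder with an absolute warehouse URI and Hive support enabled (the path below is hypothetical; substitute your own project root):
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("Spark SQL")
  .config("spark.master", "local")
  // absolute file URI instead of the relative "src/main/resources/warehouse"
  .config("spark.sql.warehouse.dir", "file:///home/me/spark-essentials/src/main/resources/warehouse") // hypothetical path
  .enableHiveSupport() // needed for CREATE TABLE backed by the Hive catalog
  .getOrCreate()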
Try including /usr/hdp/current/spark-client/conf/hive-site.xml (or your distribution's hive-site.xml) when you submit the Spark job.
You can also try passing the following configuration when submitting the job:
--conf "spark.sql.catalogImplementation=hive" \

Missing dependency after updating IntelliJ

I'm so close to giving up. I've had so many issues with compatibility recently - it's just utterly ridiculous how fragile applications are. I fixed an issue I had with type errors by updating to IntelliJ 2019.1.2 CE, but upon trying to compile the project I now get:
error: error while loading package, Missing dependency 'object java.lang.Object in compiler mirror', required by C:\Users\sambo\.sbt\boot\scala-2.10.6\lib\scala-library.jar(scala/package.class)
error: error while loading package, Missing dependency 'object java.lang.Object in compiler mirror', required by C:\Users\sambo\.sbt\boot\scala-2.10.6\lib\scala-library.jar(scala/runtime/package.class)
error: scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
at scala.reflect.internal.Mirrors$RootsBase.getClassByName(Mirrors.scala:99)
at scala.reflect.internal.Mirrors$RootsBase.getRequiredClass(Mirrors.scala:102)
at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass$lzycompute(Definitions.scala:264)
at scala.reflect.internal.Definitions$DefinitionsClass.ObjectClass(Definitions.scala:264)
at scala.reflect.internal.Definitions$DefinitionsClass.AnyRefClass$lzycompute(Definitions.scala:263)
at scala.reflect.internal.Definitions$DefinitionsClass.AnyRefClass(Definitions.scala:263)
at scala.reflect.internal.Definitions$DefinitionsClass.specialPolyClass(Definitions.scala:1120)
at scala.reflect.internal.Definitions$DefinitionsClass.RepeatedParamClass$lzycompute(Definitions.scala:407)
at scala.reflect.internal.Definitions$DefinitionsClass.RepeatedParamClass(Definitions.scala:407)
at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1154)
at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1152)
at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1196)
at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1196)
at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1261)
at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
at scala.tools.nsc.Driver.doCompile(Driver.scala:32)
at scala.tools.nsc.Main$.doCompile(Main.scala:79)
at scala.tools.nsc.Driver.process(Driver.scala:54)
at scala.tools.nsc.Main.process(Main.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at sbt.compiler.RawCompiler.apply(RawCompiler.scala:33)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:159)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1$$anonfun$apply$2.apply(AnalyzingCompiler.scala:155)
at sbt.IO$.withTemporaryDirectory(IO.scala:358)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:155)
at sbt.compiler.AnalyzingCompiler$$anonfun$compileSources$1.apply(AnalyzingCompiler.scala:152)
at sbt.IO$.withTemporaryDirectory(IO.scala:358)
at sbt.compiler.AnalyzingCompiler$.compileSources(AnalyzingCompiler.scala:152)
at sbt.compiler.IvyComponentCompiler$$anonfun$sbt$compiler$IvyComponentCompiler$$compileAndInstall$1$$anonfun$apply$2$$anonfun$apply$mcV$sp$1.apply(ComponentCompiler.scala:121)
at sbt.compiler.IvyComponentCompiler$$anonfun$sbt$compiler$IvyComponentCompiler$$compileAndInstall$1$$anonfun$apply$2$$anonfun$apply$mcV$sp$1.apply(ComponentCompiler.scala:118)
at sbt.IO$.withTemporaryDirectory(IO.scala:358)
at sbt.compiler.IvyComponentCompiler$$anonfun$sbt$compiler$IvyComponentCompiler$$compileAndInstall$1$$anonfun$apply$2.apply$mcV$sp(ComponentCompiler.scala:118)
at sbt.compiler.IvyComponentCompiler$$anonfun$sbt$compiler$IvyComponentCompiler$$compileAndInstall$1$$anonfun$apply$2.apply(ComponentCompiler.scala:118)
at sbt.compiler.IvyComponentCompiler$$anonfun$sbt$compiler$IvyComponentCompiler$$compileAndInstall$1$$anonfun$apply$2.apply(ComponentCompiler.scala:118)
at sbt.BufferedLogger.bufferQuietly(BufferedLogger.scala:31)
at sbt.compiler.IvyComponentCompiler$$anonfun$sbt$compiler$IvyComponentCompiler$$compileAndInstall$1.apply(ComponentCompiler.scala:116)
at sbt.compiler.IvyComponentCompiler$$anonfun$sbt$compiler$IvyComponentCompiler$$compileAndInstall$1.apply(ComponentCompiler.scala:111)
at sbt.IO$.withTemporaryDirectory(IO.scala:358)
at sbt.compiler.IvyComponentCompiler.sbt$compiler$IvyComponentCompiler$$compileAndInstall(ComponentCompiler.scala:111)
at sbt.compiler.IvyComponentCompiler$$anonfun$apply$1.apply$mcV$sp(ComponentCompiler.scala:102)
at sbt.IfMissing$Define.apply(ComponentManager.scala:75)
at sbt.ComponentManager.sbt$ComponentManager$$createAndCache$1(ComponentManager.scala:39)
at sbt.ComponentManager$$anonfun$sbt$ComponentManager$$fromGlobal$1$1.apply(ComponentManager.scala:27)
at sbt.ComponentManager$$anonfun$sbt$ComponentManager$$fromGlobal$1$1.apply(ComponentManager.scala:26)
at sbt.ComponentManager$$anon$1.call(ComponentManager.scala:50)
at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
at xsbt.boot.Using$.withResource(Using.scala:10)
at xsbt.boot.Using$.apply(Using.scala:9)
at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
at xsbt.boot.Locks$.apply0(Locks.scala:31)
at xsbt.boot.Locks$.apply(Locks.scala:28)
at sbt.ComponentManager.lock(ComponentManager.scala:50)
at sbt.ComponentManager.lockGlobalCache(ComponentManager.scala:49)
at sbt.ComponentManager.sbt$ComponentManager$$fromGlobal$1(ComponentManager.scala:25)
at sbt.ComponentManager$$anonfun$files$1$$anonfun$apply$2.apply(ComponentManager.scala:44)
at sbt.ComponentManager$$anonfun$files$1$$anonfun$apply$2.apply(ComponentManager.scala:44)
at sbt.ComponentManager.sbt$ComponentManager$$getOrElse$1(ComponentManager.scala:32)
at sbt.ComponentManager$$anonfun$files$1.apply(ComponentManager.scala:44)
at sbt.ComponentManager$$anonfun$files$1.apply(ComponentManager.scala:44)
at sbt.ComponentManager$$anon$1.call(ComponentManager.scala:50)
at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:93)
at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:78)
at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:97)
at xsbt.boot.Using$.withResource(Using.scala:10)
at xsbt.boot.Using$.apply(Using.scala:9)
at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:58)
at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:48)
at xsbt.boot.Locks$.apply0(Locks.scala:31)
at xsbt.boot.Locks$.apply(Locks.scala:28)
at sbt.ComponentManager.lock(ComponentManager.scala:50)
at sbt.ComponentManager.lockLocalCache(ComponentManager.scala:47)
at sbt.ComponentManager.files(ComponentManager.scala:44)
at sbt.ComponentManager.file(ComponentManager.scala:53)
at sbt.compiler.IvyComponentCompiler.apply(ComponentCompiler.scala:102)
at sbt.compiler.ComponentCompiler$$anon$2.apply(ComponentCompiler.scala:35)
at sbt.compiler.AnalyzingCompiler.loader(AnalyzingCompiler.scala:118)
at sbt.compiler.AnalyzingCompiler.getInterfaceClass(AnalyzingCompiler.scala:128)
at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:104)
at sbt.compiler.AnalyzingCompiler.newCachedCompiler(AnalyzingCompiler.scala:62)
at sbt.compiler.AnalyzingCompiler.newCachedCompiler(AnalyzingCompiler.scala:57)
at sbt.compiler.CompilerCache$$anon$2.apply(CompilerCache.scala:47)
at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:45)
at sbt.compiler.MixedAnalyzingCompiler$$anonfun$compileScala$1$1.apply$mcV$sp(MixedAnalyzingCompiler.scala:50)
at sbt.compiler.MixedAnalyzingCompiler$$anonfun$compileScala$1$1.apply(MixedAnalyzingCompiler.scala:50)
at sbt.compiler.MixedAnalyzingCompiler$$anonfun$compileScala$1$1.apply(MixedAnalyzingCompiler.scala:50)
at sbt.compiler.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:74)
at sbt.compiler.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:49)
at sbt.compiler.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:64)
at sbt.compiler.IC$$anonfun$compileInternal$1.apply(IncrementalCompiler.scala:160)
at sbt.compiler.IC$$anonfun$compileInternal$1.apply(IncrementalCompiler.scala:160)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:66)
at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:64)
at sbt.inc.IncrementalCommon.cycle(IncrementalCommon.scala:32)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:72)
at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:71)
at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:99)
at sbt.inc.Incremental$.compile(Incremental.scala:71)
at sbt.inc.IncrementalCompile$.apply(Compile.scala:54)
at sbt.compiler.IC$.compileInternal(IncrementalCompiler.scala:160)
at sbt.compiler.IC$.incrementalCompile(IncrementalCompiler.scala:138)
at sbt.Compiler$.compile(Compiler.scala:155)
at sbt.Compiler$.compile(Compiler.scala:141)
at sbt.Defaults$.sbt$Defaults$$compileIncrementalTaskImpl(Defaults.scala:886)
at sbt.Defaults$$anonfun$compileIncrementalTask$1.apply(Defaults.scala:877)
at sbt.Defaults$$anonfun$compileIncrementalTask$1.apply(Defaults.scala:875)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
This is my build:
Seq(
  "org.scalatestplus.play" %% "scalatestplus-play" % "3.0.0" % Test,
  "com.h2database" % "h2" % "1.4.194",
  "org.webjars" % "metisMenu" % "1.1.3",
  "com.typesafe.play" %% "play-json" % "2.6.0",
  "org.reactivemongo" %% "play2-reactivemongo" % "0.16.6-play26",
  "org.reactivemongo" %% "reactivemongo-akkastream" % "0.16.6",
  "org.webjars.bower" % "bootstrap-sass" % "3.3.6",
  "org.webjars" % "bootstrap" % "3.3.4",
  "org.webjars" % "font-awesome" % "4.7.0",
  "org.webjars" % "datatables" % "1.10.5",
  "org.webjars" % "datatables-plugins" % "1.10.5"
)
I'm using JDK 1.8 / Scala 2.12.2 / Play 2.6.2 / sbt 0.13.15. My %JAVA_HOME% points to C:\Program Files\Java\jdk1.8.0_131. I've not changed anything else, so naturally I'm losing it right now. Any help is welcome to stop the machines winning.
For me these things usually do the trick:
Just restart IntelliJ,
or delete .idea:
1. Close the project.
2. Delete the .idea folder from the project.
3. Open the project again in IntelliJ and make sure it is recognised as an sbt project.
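If it still fails after that, it can help to take the IDE out of the equation and build once from a plain terminal in the project root; if this succeeds, the problem is in the IntelliJ setup rather than in the build itself:
sbt clean compile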

Exception when using the saveToPhoenix method to load/save an RDD on HBase

I would like to use the Apache Phoenix framework.
The problem is that I keep getting an exception telling me that the class HBaseConfiguration can't be found.
Here is the code I want to use:
import org.apache.spark.SparkContext
import org.apache.spark.sql._
import org.apache.phoenix.spark._

// Load INPUT_TABLE
object MainTest2 extends App {
  val sc = new SparkContext("local", "phoenix-test")
  val sqlContext = new SQLContext(sc)
  val df = sqlContext.load("org.apache.phoenix.spark",
    Map("table" -> "INPUT_TABLE", "zkUrl" -> "localhost:3888"))
}
Here is the build.sbt I'm using:
name := "spark-to-hbase"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.3.0",
  "org.apache.phoenix" % "phoenix-core" % "4.11.0-HBase-1.3",
  "org.apache.spark" % "spark-core_2.11" % "2.1.1",
  "org.apache.spark" % "spark-sql_2.11" % "2.1.1",
  "org.apache.phoenix" % "phoenix-spark" % "4.11.0-HBase-1.3"
)
And here is the exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl$1.call(ConfigurationFactory.java:49)
at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl$1.call(ConfigurationFactory.java:46)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
at org.apache.phoenix.util.PhoenixContextExecutor.callWithoutPropagation(PhoenixContextExecutor.java:91)
at org.apache.phoenix.query.ConfigurationFactory$ConfigurationFactoryImpl.getConfiguration(ConfigurationFactory.java:46)
at org.apache.phoenix.jdbc.PhoenixDriver.initializeConnectionCache(PhoenixDriver.java:151)
at org.apache.phoenix.jdbc.PhoenixDriver.<init>(PhoenixDriver.java:142)
at org.apache.phoenix.jdbc.PhoenixDriver.<clinit>(PhoenixDriver.java:69)
at org.apache.phoenix.spark.PhoenixRDD.<init>(PhoenixRDD.scala:43)
at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:52)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:40)
at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:389)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
at org.apache.spark.sql.SQLContext.load(SQLContext.scala:965)
at MainTest2$.delayedEndpoint$MainTest2$1(MainTest2.scala:9)
at MainTest2$delayedInit$body.apply(MainTest2.scala:6)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at MainTest2$.main(MainTest2.scala:6)
at MainTest2.main(MainTest2.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 26 more
I've already tried changing the HADOOP_CLASSPATH in hadoop-env.sh, as suggested in this previous post.
What can I do to overcome this problem?
I found a solution to my problem. As the exception says, the class HBaseConfiguration can't be found at runtime. HBaseConfiguration lives in the org.apache.hadoop.hbase package, so it is needed on the classpath. I noticed that the class wasn't present in the org.apache.hadoop artifacts as I had thought. For the HBase 1.3.1 version installed on my machine, I found this class in the hbase-common-1.3.1 jar located in my HBASE_HOME/lib folder.
Then I included this dependency in my build.sbt:
"org.apache.hbase" % "hbase-common" % "1.3.1"
And the exception was gone.
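For completeness, a sketch of the resulting dependency list (the same versions as above, with the hbase-common line added):
libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-mapreduce-client-core" % "2.3.0",
  "org.apache.phoenix" % "phoenix-core" % "4.11.0-HBase-1.3",
  "org.apache.spark" % "spark-core_2.11" % "2.1.1",
  "org.apache.spark" % "spark-sql_2.11" % "2.1.1",
  "org.apache.phoenix" % "phoenix-spark" % "4.11.0-HBase-1.3",
  "org.apache.hbase" % "hbase-common" % "1.3.1" // provides org.apache.hadoop.hbase.HBaseConfiguration
)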

Cannot get uTest to see my tests

I'm trying to get uTest to work with Scala.js and sbt. sbt compiles the files and uTest runs, but it simply ignores my tests. Try as I might, I cannot find any difference between my code and the tutorial examples.
build.sbt:
enablePlugins(ScalaJSPlugin)
name := "Scala.js Stuff"
scalaVersion := "2.11.5" // or any other Scala version >= 2.10.2
scalaJSStage in Global := FastOptStage
libraryDependencies += "com.lihaoyi" %% "utest" % "0.3.0"
testFrameworks += new TestFramework("utest.runner.Framework")
src/test/scala/com/mysite/jovian/GeometryTest.scala:
package com.mysite.jovian

import utest._

object GeometryTest extends TestSuite {
  def tests = TestSuite {
    'addPoints {
      val p: Point = new Point(3,4)
      val q: Point = new Point(4,3)
      val expected: Point = new Point(8,8)
      assert(p.plus(q).equals(expected))
      throw new Exception("foo")
    }
    'fail {
      assert(1==2)
    }
  }
}
Output:
> reload
[info] Loading project definition from /Users/me/Dropbox (Personal)/mysite/flocks/project
[info] Set current project to Scala.js Stuff (in build file:/Users/me/Dropbox%20(Personal)/mysite/flocks/)
> test
[success] Total time: 1 s, completed Mar 6, 2015 7:01:41 AM
> test-only -- com.mysite.jovian.GeometryTest
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] No tests to run for test:testOnly
[success] Total time: 1 s, completed Mar 6, 2015 7:01:49 AM
If I introduce a syntax error, sbt test does see it:
> test
[info] Compiling 1 Scala source to /Users/me/Dropbox (Personal)/mysite/flocks/target/scala-2.11/test-classes...
[error] /Users/me/Dropbox (Personal)/mysite/flocks/src/test/scala/com/mysite/jovian/GeometryTest.scala:21: not found: value blablablablabla
[error] blablablablabla
[error] ^
[error] one error found
[error] (test:compile) Compilation failed
[error] Total time: 1 s, completed Mar 6, 2015 7:03:54 AM
So it's definitely seeing the code; it just doesn't seem to think that "tests" contains any tests.
Otherwise, in the non-test code, sbt + Scala.js seems to be working fine...
Thanks for any help, I am mystified.
Your mistake lies in the dependency on uTest:
libraryDependencies += "com.lihaoyi" %% "utest" % "0.3.0"
This is a JVM dependency. To use the Scala.js-enabled dependency, use %%% instead of %%, like this:
libraryDependencies += "com.lihaoyi" %%% "utest" % "0.3.0"
Additionally, you probably want this dependency only in the Test configuration, so add % "test" at the end:
libraryDependencies += "com.lihaoyi" %%% "utest" % "0.3.0" % "test"

Deploy Lift application on Tomcat

I have a problem deploying a Lift application on a Tomcat server.
At server start the log shows an error:
INFO: Deploying web application archive lift.war
Nov 08, 2013 5:47:18 PM org.apache.catalina.core.StandardContext start
SEVERE: Error filterStart
Nov 08, 2013 5:47:18 PM org.apache.catalina.core.StandardContext start
SEVERE: Context [/lift] startup failed due to previous errors
The error occurs on both Tomcat 6 and Tomcat 7, with Lift 2.6 and Scala 2.10.3.
Thanks in advance!
EDIT:
Here's the stack trace from Tomcat:
SCHWERWIEGEND: Exception starting filter LiftFilter
java.lang.NoSuchMethodError: scala.Predef$.Map()Lscala/collection/immutable/Map$;
at net.liftweb.common.BoxTrait$class.$init$(Box.scala:62)
at net.liftweb.common.Box$.<init>(Box.scala:49)
at net.liftweb.common.Box$.<clinit>(Box.scala)
at net.liftweb.util.Props$.mode$lzycompute(Props.scala:112)
at net.liftweb.util.Props$.mode(Props.scala:110)
at net.liftweb.util.Props$.devMode$lzycompute(Props.scala:204)
at net.liftweb.util.Props$.devMode(Props.scala:204)
at net.liftweb.http.LiftRules$.<init>(LiftRules.scala:79)
at net.liftweb.http.LiftRules$.<clinit>(LiftRules.scala)
at net.liftweb.http.provider.servlet.ServletFilterProvider$class.init(ServletFilterProvider.scala:38)
at net.liftweb.http.LiftFilter.init(LiftServlet.scala:928)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:277)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:258)
at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:382)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:103)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4649)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5305)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:899)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:875)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:618)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:963)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1600)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
EDIT:
Here is my build.sbt:
name := "lift_project"
version := "0.0.1"
organization := "net.liftweb"
scalaVersion := "2.10.3"
resolvers ++= Seq(
  "snapshots" at "http://oss.sonatype.org/content/repositories/snapshots",
  "staging" at "http://oss.sonatype.org/content/repositories/staging",
  "releases" at "http://oss.sonatype.org/content/repositories/releases"
)
seq(com.github.siasia.WebPlugin.webSettings :_*)
unmanagedBase <<= baseDirectory { base => base / "lib" }
unmanagedJars in Compile <<= baseDirectory map { base => (base ** "*.jar").classpath }
unmanagedResourceDirectories in Test <+= (baseDirectory) { _ / "src/main/webapp" }
scalacOptions ++= Seq("-deprecation", "-unchecked", "-feature", "-language:implicitConversions", "-language:postfixOps")
libraryDependencies ++= {
  val liftVersion = "2.6-M1"
  Seq(
    "net.liftweb" %% "lift-webkit" % liftVersion % "compile",
    "net.liftweb" %% "lift-mapper" % liftVersion % "compile",
    "net.liftweb" %% "lift-ldap" % liftVersion % "compile",
    "org.eclipse.jetty" % "jetty-webapp" % "8.1.7.v20120910" % "container,test",
    "org.eclipse.jetty.orbit" % "javax.servlet" % "3.0.0.v201112011016" % "container,test" artifacts Artifact("javax.servlet", "jar", "jar"),
    "ch.qos.logback" % "logback-classic" % "1.0.6",
    "com.h2database" % "h2" % "1.3.167",
    "mysql" % "mysql-connector-java" % "5.1.25",
    "javax.servlet" % "servlet-api" % "2.5" % "provided->default"
  )
}
net.virtualvoid.sbt.graph.Plugin.graphSettings
EDIT:
And the content of the lib folder:
activation-1.1.jar
commons-codec-1.6.jar
commons-fileupload-1.2.2.jar
h2-1.3.167.jar
htmlparser-1.4.jar
iText-2.1.5.jar
jcommon-1.0.18.jar
jfreechart-1.0.15.jar
jfreechart-1.0.15-demo.jar
jfreechart-1.0.15-experimental.jar
jfreechart-1.0.15-swt.jar
joda-convert-1.2.jar
joda-time-2.1.jar
junit.jar
lift-actor_2.10-2.6-M1.jar
lift-common_2.10-2.6-M1.jar
lift-db_2.10-2.6-M1.jar
lift-json_2.10-2.6-M1.jar
lift-ldap_2.10-2.6-M1.jar
lift-mapper_2.10-2.6-M1.jar
lift-markdown_2.10-2.6-M1.jar
lift-proto_2.10-2.6-M1.jar
lift-util_2.10-2.6-M1.jar
lift-webkit_2.10-2.6-M1.jar
logback-classic-1.0.6.jar
logback-core-1.0.6.jar
mail-1.4.4.jar
mysql-connector-java-5.1.25.jar
paranamer-2.4.1.jar
sbt-launch.jar
scala-compiler.jar
scala-library.jar
scalap-2.10.0.jar
scala-reflect-2.10.3.jar
slf4j-api-1.7.2.jar
swtgraphics2d.jar
Ok. I guess the problem lies here:
unmanagedJars in Compile <<= baseDirectory map { base => (base ** "*.jar").classpath }
This simply adds every jar under your project to the WAR. I don't know which SBT version you're using but IIRC older versions of SBT put scala-library.jar in their build path under your project root (baseDirectory), which, in your case, gets included during the build process, resulting in a jar conflict.
Try removing the suspected line, or change 'baseDirectory' to something more specific like 'unmanagedBase'.
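In this build, that one-line change would look something like the following sketch (reusing the file's existing <<= style), so that only jars under lib/ are picked up rather than everything beneath the project root:
// scan only the unmanaged lib folder, not the whole project tree
unmanagedJars in Compile <<= unmanagedBase map { base => (base ** "*.jar").classpath }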
scala-library.jar does look suspicious, as we cannot tell the version from the file name. Maybe it is an old standard library with which your Lift version does not work.
Try
unzip -p scala-library.jar META-INF/MANIFEST.MF
to see the version; you need it to be 2.10.0 or newer.
I agree with harp seal pup that the problem is probably caused by your copying of the unmanaged jars. Try not to use unmanaged jars (remove the line, let sbt manage the dependencies).
After some trial and error, I finally identified the jar causing the trouble:
sbt-launch.jar
After removing it from the lib folder, everything works fine on Tomcat (and on Jetty as well).