I added JBoss AS 7.0 to Eclipse and tried to deploy my OpenShift project on it, but the deployment fails. Below is the log:
13:11:58,393 ERROR [org.jboss.as.controller] (pool-1-thread-2) Operation ("deploy") failed - address: ([("deployment" => "panchali-1.0.war")]): java.util.NoSuchElementException: No child 'runtime-name' exists
at org.jboss.dmr.ModelValue.requireChild(ModelValue.java:362) [jboss-dmr-1.0.0.Final.jar:1.0.0.Final]
at org.jboss.dmr.ObjectModelValue.requireChild(ObjectModelValue.java:298) [jboss-dmr-1.0.0.Final.jar:1.0.0.Final]
at org.jboss.dmr.ModelNode.require(ModelNode.java:703) [jboss-dmr-1.0.0.Final.jar:1.0.0.Final]
at org.jboss.as.server.deployment.DeploymentDeployHandler.execute(DeploymentDeployHandler.java:65)
at org.jboss.as.controller.OperationContextImpl.executeStep(OperationContextImpl.java:353) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.doCompleteStep(OperationContextImpl.java:298) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.completeStep(OperationContextImpl.java:223) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.CompositeOperationHandler.execute(CompositeOperationHandler.java:83) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.executeStep(OperationContextImpl.java:353) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.doCompleteStep(OperationContextImpl.java:298) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.completeStep(OperationContextImpl.java:223) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.ModelControllerImpl$DefaultPrepareStepHandler.execute(ModelControllerImpl.java:350) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.executeStep(OperationContextImpl.java:353) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.doCompleteStep(OperationContextImpl.java:298) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.completeStep(OperationContextImpl.java:223) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.ModelControllerImpl.execute(ModelControllerImpl.java:119) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.remote.ModelControllerClientOperationHandler$ExecuteRequestHandler.doProcessRequest(ModelControllerClientOperationHandler.java:154) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.remote.ModelControllerClientOperationHandler$ExecuteRequestHandler.access$100(ModelControllerClientOperationHandler.java:85) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.remote.ModelControllerClientOperationHandler$ExecuteRequestHandler$1.call(ModelControllerClientOperationHandler.java:114) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.remote.ModelControllerClientOperationHandler$ExecuteRequestHandler$1.call(ModelControllerClientOperationHandler.java:112) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at java.util.concurrent.FutureTask.run(Unknown Source) [:1.8.0_91]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [:1.8.0_91]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [:1.8.0_91]
at java.lang.Thread.run(Unknown Source) [:1.8.0_91]
13:16:46,827 ERROR [org.jboss.as.controller] (pool-1-thread-3) Operation ("undeploy") failed - address: ([("deployment" => "panchali-1.0.war")]): java.util.NoSuchElementException: No child 'runtime-name' exists
at org.jboss.dmr.ModelValue.requireChild(ModelValue.java:362) [jboss-dmr-1.0.0.Final.jar:1.0.0.Final]
at org.jboss.dmr.ModelNode.require(ModelNode.java:703) [jboss-dmr-1.0.0.Final.jar:1.0.0.Final]
at org.jboss.as.server.deployment.DeploymentUndeployHandler.execute(DeploymentUndeployHandler.java:58)
at org.jboss.as.controller.OperationContextImpl.executeStep(OperationContextImpl.java:353) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.doCompleteStep(OperationContextImpl.java:298) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.completeStep(OperationContextImpl.java:223) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.CompositeOperationHandler.execute(CompositeOperationHandler.java:83) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.executeStep(OperationContextImpl.java:353) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.doCompleteStep(OperationContextImpl.java:298) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.completeStep(OperationContextImpl.java:223) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.ModelControllerImpl$DefaultPrepareStepHandler.execute(ModelControllerImpl.java:350) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.executeStep(OperationContextImpl.java:353) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.doCompleteStep(OperationContextImpl.java:298) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.OperationContextImpl.completeStep(OperationContextImpl.java:223) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.ModelControllerImpl.execute(ModelControllerImpl.java:119) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.remote.ModelControllerClientOperationHandler$ExecuteRequestHandler.doProcessRequest(ModelControllerClientOperationHandler.java:154) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.remote.ModelControllerClientOperationHandler$ExecuteRequestHandler.access$100(ModelControllerClientOperationHandler.java:85) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.remote.ModelControllerClientOperationHandler$ExecuteRequestHandler$1.call(ModelControllerClientOperationHandler.java:114) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at org.jboss.as.controller.remote.ModelControllerClientOperationHandler$ExecuteRequestHandler$1.call(ModelControllerClientOperationHandler.java:112) [jboss-as-controller-7.0.2.Final.jar:7.0.2.Final]
at java.util.concurrent.FutureTask.run(Unknown Source) [:1.8.0_91]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [:1.8.0_91]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [:1.8.0_91]
at java.lang.Thread.run(Unknown Source) [:1.8.0_91]
I had created a simple application using OpenShift with two gears: 1. JBoss, 2. MySQL.
Now, whenever I make changes to the project, I have to test them on PROD, which is very dangerous. So I need to create a similar dev environment for testing; only if that succeeds will we push to PROD.
Please provide detailed steps.
The CSS is also loading now. I was supposed to use a relative path like "style/style.css".
I had used "/style/style.css", but the leading "/" is treated as the site root, so the server looked for the style folder there. Removing the "/" solved the problem.
I'm using a JDBC connector for the first time and am trying to write a dataframe into a PostgreSQL database. I am doing this strictly according to:
How to use JDBC source to write and read data in (Py)Spark?
The driver is installed and the code is specified:
postgres_url = "jdbc:postgresql://<IP_ADRESS>:<PORT>/postgres"
properties = {
"user": "user",
"password": "password",
"driver": "org.postgresql.Driver"
}
df.write.jdbc(url=postgres_url, table="newtable", mode = "overwrite", properties = properties)
After executing the code, I get this error:
py4j.protocol.Py4JJavaError: An error occurred while calling o110.jdbc.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 1.0 failed 4 times, most recent failure: Lost task 1.3 in stage 1.0 (TID 49, <SERVER>, executor 9): org.postgresql.util.PSQLException: The connection attempt failed.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:331)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:223)
at org.postgresql.Driver.makeConnection(Driver.java:400)
at org.postgresql.Driver.connect(Driver.java:259)
at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect(DriverWrapper.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:610)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.postgresql.core.PGStream.createSocket(PGStream.java:241)
at org.postgresql.core.PGStream.<init>(PGStream.java:98)
at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:109)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:235)
... 22 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1889)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1877)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1876)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1876)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2110)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2059)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2048)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:935)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:933)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:933)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.saveTable(JdbcUtils.scala:834)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:82)
at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:515)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.postgresql.util.PSQLException: The connection attempt failed.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:331)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49)
at org.postgresql.jdbc.PgConnection.<init>(PgConnection.java:223)
at org.postgresql.Driver.makeConnection(Driver.java:400)
at org.postgresql.Driver.connect(Driver.java:259)
at org.apache.spark.sql.execution.datasources.jdbc.DriverWrapper.connect(DriverWrapper.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:63)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.savePartition(JdbcUtils.scala:610)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$saveTable$1.apply(JdbcUtils.scala:834)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1$$anonfun$apply$28.apply(RDD.scala:935)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
... 1 more
Caused by: java.net.SocketTimeoutException: connect timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.postgresql.core.PGStream.createSocket(PGStream.java:241)
at org.postgresql.core.PGStream.<init>(PGStream.java:98)
at org.postgresql.core.v3.ConnectionFactoryImpl.tryConnect(ConnectionFactoryImpl.java:109)
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:235)
... 22 more
The weird thing is that when I look into PostgreSQL afterwards, I can see the newly created table, but it is empty. So it seems that establishing the connection and creating the structure DOES work, but filling in the data does not. I also tried "ssl" = "false" in the properties, but that did not work either. Any suggestions would be greatly appreciated.
Best regards
The connection must be possible both between your driver and Postgres AND between your workers and Postgres. Did you check that? The table is probably created on the driver's side, but the data is written by the workers; if they cannot connect to your DB, no data arrives. Check that!
You probably have someone in your company who can help you with that, e.g. a security/network person. Otherwise, you need to log on to each worker node and test the connectivity to your DB.
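If logging on to every node is impractical, you can run the probe from Spark itself. Here is a hedged sketch (written in Scala for a spark-shell, assuming a Spark 2.x session named spark; the same mapPartitions idea works from PySpark; <IP_ADDRESS> is the question's placeholder and 5432 stands in for <PORT>):

// Try to open a TCP socket to Postgres from the executors themselves.
import java.net.{InetSocketAddress, Socket}

val results = spark.sparkContext.parallelize(1 to 8, 8).mapPartitions { _ =>
  val s = new Socket()
  try {
    s.connect(new InetSocketAddress("<IP_ADDRESS>", 5432), 5000) // 5 s timeout
    Iterator("ok")
  } catch {
    case e: Exception => Iterator("failed: " + e)
  } finally {
    s.close()
  }
}.collect()
results.foreach(println)   // any "failed: ..." entry means an executor cannot reach the DB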
I'm facing errors while writing parquet data to a temporary folder in ADLS Gen2.
My folder 'X' contains parquet files partitioned into YYYY/MM/DD subfolders. I loaded the entire data from folder 'X' into a dataframe and tried to write it to a temporary folder 'y' in ADLS Gen2 using the simple command below. But the write to folder 'y' fails with the error below. Any help/suggestions greatly appreciated.
WriteCommand:
Folder_x_df.write.mode("overwrite")
  .option("header", "true")
  .parquet(folder_y_path)
Error:
Job aborted.Caused by: Job aborted due to stage failure.Caused by: FileAlreadyExistsException: Operation failed: "The specified path already exists.", 409, PUT,https://adlsname.dfs.core.windows.net/............./y/part-00221-c0d1b33f-65f1-4474-a4c8-094df51599bd-c000.snappy.parquet?resource=file&timeout=90, PathAlreadyExists, "The specified path already exists. RequestId:5393b5b3-801f-0003-1da7-16cfb5000000 Time:2021-03-11T18:50:09.5444226Z". Caused by: AbfsRestOperationException....
I'm using cluster version: 6.4.x-scala2.11
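One workaround that might avoid the collision (a hedged sketch; it assumes nothing else needs the current contents of 'y'): delete the target folder explicitly before writing, since the log below shows the 409 arriving on a task retry (task 136.3) that re-creates a part file an earlier attempt already wrote. As an aside, the "header" option applies to CSV, not parquet.

// Hypothetical workaround: clear the destination first, then write.
dbutils.fs.rm(folder_y_path, true)   // recursive delete of the temp folder 'y'

Folder_x_df.write.mode("overwrite")
  .parquet(folder_y_path)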
Complete Error Log:
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:201)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:192)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:108)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:106)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:126)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:140)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$5.apply(SparkPlan.scala:193)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:189)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:140)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:117)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:115)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:711)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:711)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withCustomExecutionEnv$1.apply(SQLExecution.scala:113)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:243)
at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:99)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:173)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:711)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:307)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:235)
at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:601)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:26)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:176)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:178)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:180)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:182)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:184)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:186)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:188)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:190)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:192)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:194)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:196)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:198)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw.<init>(command-1978144014673122:200)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw.<init>(command-1978144014673122:202)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw.<init>(command-1978144014673122:204)
at lined41e33c74e694199a189dffb81fc63f246.$read.<init>(command-1978144014673122:206)
at lined41e33c74e694199a189dffb81fc63f246.$read$.<init>(command-1978144014673122:210)
at lined41e33c74e694199a189dffb81fc63f246.$read$.<clinit>(command-1978144014673122)
at lined41e33c74e694199a189dffb81fc63f246.$eval$.$print$lzycompute(<notebook>:7)
at lined41e33c74e694199a189dffb81fc63f246.$eval$.$print(<notebook>:6)
at lined41e33c74e694199a189dffb81fc63f246.$eval.$print(<notebook>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:396)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:373)
at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
at scala.util.Try$.apply(Try.scala:192)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 136 in stage 6.0 failed 4 times, most recent failure: Lost task 136.3 in stage 6.0 (TID 166, 10.139.64.6, executor 2): org.apache.hadoop.fs.FileAlreadyExistsException: PUT, https://adlsname.dfs.core.windows.net/............./y/part-00136-360acd3f-6ae8-4e37-8dce-eb6b47bd0407-c000.snappy.parquet?resource=file&timeout=90
StatusCode=409
StatusDescription=The specified path already exists.
ErrorCode=PathAlreadyExists
ErrorMessage=The specified path already exists.
RequestId:24504b99-d01f-0094-639e-1500eb000000
Time:2021-03-10T11:10:42.1357194Z
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.checkException(AzureBlobFileSystem.java:946)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.create(AzureBlobFileSystem.java:224)
at com.databricks.spark.metrics.FileSystemWithMetrics.create(FileSystemWithMetrics.scala:291)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$create$1$$anonfun$apply$10$$anonfun$apply$11.apply(DatabricksFileSystemV2.scala:544)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$create$1$$anonfun$apply$10$$anonfun$apply$11.apply(DatabricksFileSystemV2.scala:541)
at com.databricks.s3a.S3AExeceptionUtils$.convertAWSExceptionToJavaIOException(DatabricksStreamUtils.scala:108)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$create$1$$anonfun$apply$10.apply(DatabricksFileSystemV2.scala:541)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$create$1$$anonfun$apply$10.apply(DatabricksFileSystemV2.scala:541)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$withUserContextRecorded$1.apply(DatabricksFileSystemV2.scala:936)
at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionContext(DatabricksFileSystemV2.scala:450)
at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionTags(DatabricksFileSystemV2.scala:450)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withUserContextRecorded(DatabricksFileSystemV2.scala:909)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$create$1.apply(DatabricksFileSystemV2.scala:540)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2$$anonfun$create$1.apply(DatabricksFileSystemV2.scala:540)
at com.databricks.logging.UsageLogging$$anonfun$recordOperation$1.apply(UsageLogging.scala:428)
at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionContext(DatabricksFileSystemV2.scala:450)
at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.withAttributionTags(DatabricksFileSystemV2.scala:450)
at com.databricks.logging.UsageLogging$class.recordOperation(UsageLogging.scala:409)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.recordOperation(DatabricksFileSystemV2.scala:450)
at com.databricks.backend.daemon.data.client.DatabricksFileSystemV2.create(DatabricksFileSystemV2.scala:537)
at com.databricks.backend.daemon.data.client.DatabricksFileSystem.create(DatabricksFileSystem.scala:128)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.parquet.hadoop.util.HadoopOutputFile.create(HadoopOutputFile.java:74)
at org.apache.parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:255)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:433)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:382)
at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.<init>(ParquetOutputWriter.scala:37)
at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anon$1.newInstance(ParquetFileFormat.scala:162)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.newOutputWriter(FileFormatDataWriter.scala:120)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.<init>(FileFormatDataWriter.scala:108)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:239)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:173)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:172)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.doRunTask(Task.scala:140)
at org.apache.spark.scheduler.Task.run(Task.scala:113)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$13.apply(Executor.scala:537)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1541)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:543)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: PUT, https://adlsname.dfs.core.windows.net/............./y/part-00136-360acd3f-6ae8-4e37-8dce-eb6b47bd0407-c000.snappy.parquet?resource=file&timeout=90
StatusCode=409
StatusDescription=The specified path already exists.
ErrorCode=PathAlreadyExists
ErrorMessage=The specified path already exists.
RequestId:24504b99-d01f-0094-639e-1500eb000000
Time:2021-03-10T11:10:42.1357194Z
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.services.AbfsRestOperation.execute(AbfsRestOperation.java:134)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.services.AbfsClient.createPath(AbfsClient.java:243)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.createFile(AzureBlobFileSystemStore.java:322)
at shaded.databricks.v20180920_b33d810.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.create(AzureBlobFileSystem.java:220)
... 49 more
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:2362)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:2350)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:2349)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2349)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:1102)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:1102)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1102)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2582)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2529)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2517)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:897)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2282)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:170)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:192)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:108)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:106)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:126)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:140)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$5.apply(SparkPlan.scala:193)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:189)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:140)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:117)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:115)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:711)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:711)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withCustomExecutionEnv$1.apply(SQLExecution.scala:113)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:243)
at org.apache.spark.sql.execution.SQLExecution$.withCustomExecutionEnv(SQLExecution.scala:99)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:173)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:711)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:307)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:293)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:235)
at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:601)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:26)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:176)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:178)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:180)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:182)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:184)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:186)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:188)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:190)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:192)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:194)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:196)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw$$iw.<init>(command-1978144014673122:198)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw$$iw.<init>(command-1978144014673122:200)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw$$iw.<init>(command-1978144014673122:202)
at lined41e33c74e694199a189dffb81fc63f246.$read$$iw.<init>(command-1978144014673122:204)
at lined41e33c74e694199a189dffb81fc63f246.$read.<init>(command-1978144014673122:206)
at lined41e33c74e694199a189dffb81fc63f246.$read$.<init>(command-1978144014673122:210)
at lined41e33c74e694199a189dffb81fc63f246.$read$.<clinit>(command-1978144014673122)
at lined41e33c74e694199a189dffb81fc63f246.$eval$.$print$lzycompute(<notebook>:7)
at lined41e33c74e694199a189dffb81fc63f246.$eval$.$print(<notebook>:6)
at lined41e33c74e694199a189dffb81fc63f246.$eval.$print(<notebook>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:396)
at com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$9.apply(DriverLocal.scala:373)
at com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
at com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:275)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
at com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)
at scala.util.Try$.apply(Try.scala:192)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
at java.lang.Thread.run(Thread.java:748)
I was running some tests on my local machine and everything went well.
Now I'm running on a cluster with the same jar file, dataset, etc., and I'm getting the following exception:
java.lang.NullPointerException
at learner.MyLearner$$anonfun$18.apply(MyLearner.scala:165)
at learner.MyLearner$$anonfun$18.apply(MyLearner.scala:165)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2062)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2087)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
at learner.MyLearner$.learn(MyLearner.scala:168)
at learner.Learner.learnAndClassify(Learner.scala:123)
at learner.Learner.run(Learner.scala:27)
at Launcher$.main(Launcher.scala:56)
at Launcher.main(Launcher.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
at learner.MyLearner$$anonfun$18.apply(MyLearner.scala:165)
at learner.MyLearner$$anonfun$18.apply(MyLearner.scala:165)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at scala.collection.AbstractIterator.to(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1336)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at scala.collection.AbstractIterator.toArray(Iterator.scala:1336)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:936)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2062)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:108)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
In MyLearner.scala, line 165 is this:
val rdd2: RDD[Rule] = rdd1.map { case (id, _) => new Rule(brdcst.value.elements(id).features, 0L, 0, id.toString, variables, labels, classes) }
I don't understand the exception or what the reason could be, given that this works well in local mode. Sorry if this is an obvious question; I have tried to figure it out but couldn't.
I tried creating a single rule, in order to check the new Rule(...) part, put it in an RDD, and it worked!
Info: the Rule class extends Serializable.
Also, I did the following, just to check access to the broadcast variable, and it also works (note that after collect() this foreach runs on the driver):
rdd1.collect().foreach{case (id, _) => println(brdcst.value.elements(id).features)}
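An executor-side variant of that check (a hedged sketch reusing the same names) would ship the closure to the workers, which is where the NullPointerException is actually thrown:

// Without collect(), the foreach runs on the executors; a failing require
// surfaces in the task failure message with the offending id.
rdd1.foreach { case (id, _) =>
  val e = brdcst.value.elements(id)
  require(e != null, s"elements($id) is null on the executor")
  require(e.features != null, s"elements($id).features is null on the executor")
}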
Any suggestions or ideas?
I am installing Worklight Server 6.1 with DB2 10.5, but almost at the last step the installation fails with the error below. Can you please explain what is wrong?
OS: 64-bit Red Hat, with Java 1.6.
Below is the content of "failed-install.log":
Detected Java version: 1.6 in: /opt/IBM/InstallationManager/eclipse/jre_6.0.0.sr9_20110208_03/jre
Detected OS: Linux
BUILD FAILED
/opt/IBM/Worklight/WorklightServer/post-install.xml:335: The following error occurred while executing this line:
/opt/IBM/Worklight/WorklightServer/post-install.xml:2257: The following error occurred while executing this line:
/opt/IBM/Worklight/WorklightServer/post-install.xml:2610: The following error occurred while executing this line:
/opt/IBM/Worklight/WorklightServer/post-install.xml:5582: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-1666, SQLSTATE=42613, SQLERRMC=Generated column, DRIVER=3.66.46
at com.ibm.db2.jcc.am.dd.a(dd.java:741)
at com.ibm.db2.jcc.am.dd.a(dd.java:60)
at com.ibm.db2.jcc.am.dd.a(dd.java:127)
at com.ibm.db2.jcc.am.oo.c(oo.java:2763)
at com.ibm.db2.jcc.am.oo.d(oo.java:2751)
at com.ibm.db2.jcc.am.oo.b(oo.java:2134)
at com.ibm.db2.jcc.t4.ab.i(ab.java:226)
at com.ibm.db2.jcc.t4.ab.c(ab.java:48)
at com.ibm.db2.jcc.t4.o.b(o.java:38)
at com.ibm.db2.jcc.t4.tb.h(tb.java:124)
at com.ibm.db2.jcc.am.oo.hb(oo.java:2129)
at com.ibm.db2.jcc.am.oo.a(oo.java:3284)
at com.ibm.db2.jcc.am.oo.e(oo.java:1085)
at com.ibm.db2.jcc.am.oo.execute(oo.java:1064)
at org.apache.tools.ant.taskdefs.SQLExec.execSQL(SQLExec.java:775)
at org.apache.tools.ant.taskdefs.SQLExec.runStatements(SQLExec.java:745)
at org.apache.tools.ant.taskdefs.SQLExec$Transaction.runTransaction(SQLExec.java:1055)
at org.apache.tools.ant.taskdefs.SQLExec$Transaction.access$000(SQLExec.java:985)
at org.apache.tools.ant.taskdefs.SQLExec.execute(SQLExec.java:653)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.taskdefs.Sequential.execute(Sequential.java:68)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.taskdefs.MacroInstance.execute(MacroInstance.java:398)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.taskdefs.Sequential.execute(Sequential.java:68)
at net.sf.antcontrib.logic.IfTask.execute(IfTask.java:197)
at sun.reflect.GeneratedMethodAccessor14.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.TaskAdapter.execute(TaskAdapter.java:154)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:392)
at org.apache.tools.ant.Target.performTasks(Target.java:413)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
at org.apache.tools.ant.Project.executeTargets(Project.java:1251)
at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:442)
at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:392)
at org.apache.tools.ant.Target.performTasks(Target.java:413)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.helper.SingleCheckExecutor.executeTargets(SingleCheckExecutor.java:38)
at org.apache.tools.ant.Project.executeTargets(Project.java:1251)
at org.apache.tools.ant.taskdefs.Ant.execute(Ant.java:442)
at org.apache.tools.ant.taskdefs.CallTarget.execute(CallTarget.java:105)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:392)
at org.apache.tools.ant.Target.performTasks(Target.java:413)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
at org.apache.tools.ant.Project.executeTarget(Project.java:1368)
at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
at org.apache.tools.ant.Project.executeTargets(Project.java:1251)
at org.apache.tools.ant.Main.runBuild(Main.java:811)
at org.apache.tools.ant.Main.startAnt(Main.java:217)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:280)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:109)
Total time: 17 seconds
The tables of Application Center use the 'Generated By' feature. Have you entered a setting in your database that is incompatible with that feature?
The description of the error message you posted suggests that the DFT_TABLE_ORG setting could be the cause of the error you are experiencing:
http://pic.dhe.ibm.com/infocenter/db2luw/v10r5/index.jsp?topic=%2Fcom.ibm.db2.luw.messages.sql.doc%2Fdoc%2Fmsql01666n.html
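If that is the cause, you can check and reset the parameter from the DB2 command line. This is only a sketch, assuming the Application Center database is named APPCNTR (replace it with your actual database name):

# Show the current default table organization; a value of COLUMN can trigger SQL1666N
db2 get db cfg for APPCNTR | grep -i dft_table_org
# Reset the default to row-organized tables, then re-run the failing database setup
db2 update db cfg for APPCNTR using DFT_TABLE_ORG ROW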
I am trying to create web servers under Servers ---> Server Types ---> Web Servers ---> New
Error:
An error occurred while processing request:
Message: Missing message for key ""
javax.servlet.jsp.JspException: Missing message for key ""
at org.apache.struts.taglib.bean.MessageTag.doStartTag(Unknown Source)
at _ibmjsp.secure.layouts._stepsLayout._jspService(_stepsLayout.java:1040)
at com.ibm.ws.jsp.runtime.HttpJspBase.service(HttpJspBase.java:98)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:831)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1655)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1595)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:104)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain._doFilter(WebAppFilterChain.java:77)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.doFilter(WebAppFilterManager.java:908)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:932)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:500)
at com.ibm.ws.webcontainer.servlet.ServletWrapperImpl.handleRequest(ServletWrapperImpl.java:178)
at com.ibm.wsspi.webcontainer.servlet.GenericServletWrapper.handleRequest(GenericServletWrapper.java:121)
at com.ibm.ws.jsp.webcontainerext.AbstractJSPExtensionServletWrapper.handleRequest(AbstractJSPExtensionServletWrapper.java:239)
at com.ibm.ws.webcontainer.webapp.WebAppRequestDispatcher.forward(WebAppRequestDispatcher.java:341)
at org.apache.struts.action.RequestProcessor.doForward(Unknown Source)
at org.apache.struts.tiles.TilesRequestProcessor.doForward(Unknown Source)
at org.apache.struts.tiles.TilesRequestProcessor.processTilesDefinition(Unknown Source)
at org.apache.struts.tiles.TilesRequestProcessor.processForwardConfig(Unknown Source)
at com.ibm.isclite.container.controller.InformationController.processForwardConfig(InformationController.java:217)
at org.apache.struts.action.RequestProcessor.process(Unknown Source)
at org.apache.struts.action.ActionServlet.process(Unknown Source)
at org.apache.struts.action.ActionServlet.doPost(Unknown Source)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:738)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:831)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1655)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1595)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:104)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain._doFilter(WebAppFilterChain.java:77)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.doFilter(WebAppFilterManager.java:908)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:932)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:500)
at com.ibm.ws.webcontainer.servlet.ServletWrapperImpl.handleRequest(ServletWrapperImpl.java:178)
at com.ibm.ws.webcontainer.webapp.WebAppRequestDispatcher.forward(WebAppRequestDispatcher.java:341)
at org.apache.struts.action.RequestProcessor.doForward(Unknown Source)
at org.apache.struts.tiles.TilesRequestProcessor.doForward(Unknown Source)
at org.apache.struts.action.RequestProcessor.processForwardConfig(Unknown Source)
at org.apache.struts.tiles.TilesRequestProcessor.processForwardConfig(Unknown Source)
at com.ibm.isclite.container.controller.InformationController.processForwardConfig(InformationController.java:217)
at org.apache.struts.action.RequestProcessor.process(Unknown Source)
at org.apache.struts.action.ActionServlet.process(Unknown Source)
at org.apache.struts.action.ActionServlet.doPost(Unknown Source)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:738)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:831)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1655)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1595)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:131)
at com.ibm.ws.console.core.servlet.WSCUrlFilter.setUpCommandAssistence(WSCUrlFilter.java:927)
at com.ibm.ws.console.core.servlet.WSCUrlFilter.continueStoringTaskState(WSCUrlFilter.java:494)
at com.ibm.ws.console.core.servlet.WSCUrlFilter.doFilter(WSCUrlFilter.java:315)
at com.ibm.ws.webcontainer.filter.FilterInstanceWrapper.doFilter(FilterInstanceWrapper.java:188)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:116)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain._doFilter(WebAppFilterChain.java:77)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.doFilter(WebAppFilterManager.java:908)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:932)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:500)
at com.ibm.ws.webcontainer.servlet.ServletWrapperImpl.handleRequest(ServletWrapperImpl.java:178)
at com.ibm.ws.webcontainer.webapp.WebApp.handleRequest(WebApp.java:3826)
at com.ibm.ws.webcontainer.webapp.WebGroup.handleRequest(WebGroup.java:276)
at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:931)
at com.ibm.ws.webcontainer.WSWebContainer.handleRequest(WSWebContainer.java:1583)
at com.ibm.ws.webcontainer.channel.WCChannelLink.ready(WCChannelLink.java:186)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleDiscrimination(HttpInboundLink.java:455)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleNewInformation(HttpInboundLink.java:384)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.ready(HttpInboundLink.java:288)
at com.ibm.ws.ssl.channel.impl.SSLConnectionLink.determineNextChannel(SSLConnectionLink.java:1016)
at com.ibm.ws.ssl.channel.impl.SSLConnectionLink$MyReadCompletedCallback.complete(SSLConnectionLink.java:639)
at com.ibm.ws.ssl.channel.impl.SSLReadServiceContext$SSLReadCompletedCallback.complete(SSLReadServiceContext.java:1772)
at com.ibm.ws.tcp.channel.impl.AioReadCompletionListener.futureCompleted(AioReadCompletionListener.java:165)
at com.ibm.io.async.AbstractAsyncFuture.invokeCallback(AbstractAsyncFuture.java:217)
at com.ibm.io.async.AsyncChannelFuture.fireCompletionActions(AsyncChannelFuture.java:161)
at com.ibm.io.async.AsyncFuture.completed(AsyncFuture.java:138)
at com.ibm.io.async.ResultHandler.complete(ResultHandler.java:204)
at com.ibm.io.async.ResultHandler.runEventProcessingLoop(ResultHandler.java:775)
at com.ibm.io.async.ResultHandler$2.run(ResultHandler.java:905)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1550)
As shown below, I don't have the logging service up and running. I faced this issue when I tried to create a new server after deleting the old (incorrect) one.
Interestingly, when I try to log out, it still shows the deleted server's details, as shown below.
To be precise, after the delete WAS is looking for a file which doesn't exist (found by browsing through the directories):
/u01/profiles.7.0.0.11/appsrv01/config/cells/taspmociias304Cell01/nodes/webserver1-node/node.xml
Why did I delete it? I had named the wrong host name, i.e. instead of appsrv01 I named it webserver1-node.
A valid path does exist here:
/u01/profiles.7.0.0.11/appsrv01/config/cells/taspmociias304Cell01/nodes/appsrv01
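Before concluding it is unrecoverable, one thing you could try is cleaning up the stale node entry through wsadmin. This is only a sketch, not a confirmed fix; it assumes the leftover web server was registered as an unmanaged node named webserver1-node:

# Jython, run with: wsadmin.sh -lang jython -f cleanup.py
# List every node the cell still knows about, to confirm the stale entry is present
print AdminConfig.list('Node')
# Remove the stale unmanaged node left behind by the failed delete
AdminTask.removeUnmanagedNode('[-nodeName webserver1-node]')
# Persist the change to the master configuration
AdminConfig.save()

If the stale entry no longer appears in that list but the console still shows the deleted server, that points back to a product problem rather than leftover configuration.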
Yep! As you guys mentioned, it's a PRODUCT defect.