java.lang.ClassNotFoundException: org.apache.axis2.transport.http.AxisServlet - eclipse

When I try to open the http://localhost:8080/services link, it generates the following exception:
java.lang.ClassNotFoundException: org.apache.axis2.transport.http.AxisServlet
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1305)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1157)
at org.apache.catalina.core.DefaultInstanceManager.loadClass(DefaultInstanceManager.java:520)
at org.apache.catalina.core.DefaultInstanceManager.loadClassMaybePrivileged(DefaultInstanceManager.java:501)
at org.apache.catalina.core.DefaultInstanceManager.newInstance(DefaultInstanceManager.java:120)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1095)
at org.apache.catalina.core.StandardWrapper.allocate(StandardWrapper.java:817)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:135)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:610)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:516)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1086)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:659)
at org.apache.coyote.http11.Http11NioProtocol$Http11ConnectionHandler.process(Http11NioProtocol.java:223)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1558)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1515)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
I'm using Apache Tomcat 8.0.22 and Axis2 1.6.2.
Please guide me on how to solve this exception.

I had missed updating the build.xml tasks that copy the libraries into the web app; some slf4j and commons jars were not being put on the classpath.
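For reference, the kind of copy target involved looks roughly like this (a hypothetical Ant snippet; the property names and jar patterns are placeholders, not taken from the original build):

<target name="copy-libs">
    <!-- Hypothetical: copy the Axis2 runtime plus slf4j/commons jars into the web app -->
    <copy todir="${webapp.dir}/WEB-INF/lib">
        <fileset dir="${lib.dir}" includes="axis2-*.jar,slf4j-*.jar,commons-*.jar"/>
    </copy>
</target>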

Related

org.zkoss.zk.ui.http.DHtmlLayoutServlet cannot be cast to javax.servlet.Servlet

I followed this tutorial to deploy ZK on JBoss 6, but on server startup I am getting the error below:
ERROR [org.apache.catalina.core.ContainerBase.[jboss.web].[default-host].[/stream-zk]] (ServerService Thread Pool -- 92) JBWEB000289: Servlet zkLoader threw load() exception: java.lang.ClassCastException: org.zkoss.zk.ui.http.DHtmlLayoutServlet cannot be cast to javax.servlet.Servlet
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1154) [jbossweb-7.5.7.Final-redhat-1.jar:7.5.7.Final-redhat-1]
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:1100) [jbossweb-7.5.7.Final-redhat-1.jar:7.5.7.Final-redhat-1]
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:3593) [jbossweb-7.5.7.Final-redhat-1.jar:7.5.7.Final-redhat-1]
at org.apache.catalina.core.StandardContext.start(StandardContext.java:3802) [jbossweb-7.5.7.Final-redhat-1.jar:7.5.7.Final-redhat-1]
at org.jboss.as.web.deployment.WebDeploymentService.doStart(WebDeploymentService.java:163) [jboss-as-web-7.5.0.Final-redhat-21.jar:7.5.0.Final-redhat-21]
at org.jboss.as.web.deployment.WebDeploymentService.access$000(WebDeploymentService.java:61) [jboss-as-web-7.5.0.Final-redhat-21.jar:7.5.0.Final-redhat-21]
at org.jboss.as.web.deployment.WebDeploymentService$1.run(WebDeploymentService.java:96) [jboss-as-web-7.5.0.Final-redhat-21.jar:7.5.0.Final-redhat-21]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [rt.jar:1.8.0_261]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [rt.jar:1.8.0_261]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [rt.jar:1.8.0_261]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [rt.jar:1.8.0_261]
at java.lang.Thread.run(Thread.java:748) [rt.jar:1.8.0_261]
at org.jboss.threads.JBossThread.run(JBossThread.java:122)
Any help is appreciated.
Such an error ("...cannot be cast to javax.servlet.Servlet...") is usually caused by a class that doesn't extend HttpServlet, but ZK's DHtmlLayoutServlet really does extend HttpServlet.
So the root cause probably lies in JBoss.
According to this thread, you might be loading two different versions of the Servlet API jar.
According to the documentation:
This is a flat namespace and there should not be multiple instances of a class in different deployment JARs. If there are, only the first loaded will be used and the results may not be as expected
Please check this first.
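If it's unclear which copy of the Servlet API is actually winning, a quick diagnostic (just a sketch; run it inside the deployment so the container's classloaders are in play) is to log where the two classes were loaded from:

public class ClassOriginCheck {
    // Call this from a servlet or ServletContextListener inside the WAR.
    public static void logOrigins() {
        // Two different servlet-api locations here would confirm the duplicate-jar theory.
        // Note: getCodeSource() may return null for bootstrap-loaded classes.
        System.out.println(javax.servlet.Servlet.class
                .getProtectionDomain().getCodeSource());
        System.out.println(org.zkoss.zk.ui.http.DHtmlLayoutServlet.class
                .getProtectionDomain().getCodeSource());
    }
}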

Spark GPUenabler ClassNotFoundException: CacheGPU

I am using the IBMSparkGPU/GPUenabler package. I use sbt assembly to package all the dependencies into one single jar file and submit it to the Spark standalone cluster manager. However, the following error message appears:
org.apache.spark.SparkException: Error sending message [message = CacheGPUDS(a99176e95cf37ba4e5e46b9b172369ac_-1728716590,false)]
at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:119)
at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:78)
at com.ibm.gpuenabler.GPUMemoryManagerMasterEndPoint.com$ibm$gpuenabler$GPUMemoryManagerMasterEndPoint$$tell(GPUMemoryManager.scala:172)
at com.ibm.gpuenabler.GPUMemoryManagerMasterEndPoint$$anonfun$registerGPUMemoryManager$2.apply(GPUMemoryManager.scala:64)
at com.ibm.gpuenabler.GPUMemoryManagerMasterEndPoint$$anonfun$registerGPUMemoryManager$2.apply(GPUMemoryManager.scala:64)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:45)
at com.ibm.gpuenabler.GPUMemoryManagerMasterEndPoint.registerGPUMemoryManager(GPUMemoryManager.scala:64)
at com.ibm.gpuenabler.GPUMemoryManagerMasterEndPoint$$anonfun$receiveAndReply$1.applyOrElse(GPUMemoryManager.scala:143)
at org.apache.spark.rpc.netty.Inbox$$anonfun$process$1.apply$mcV$sp(Inbox.scala:105)
at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:205)
at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:101)
at org.apache.spark.rpc.netty.Dispatcher$MessageLoop.run(Dispatcher.scala:213)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:77)
at org.apache.spark.rpc.RpcTimeout$$anonfun$1.applyOrElse(RpcTimeout.scala:75)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:59)
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:167)
at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:83)
at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102)
... 16 more
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: com.ibm.gpuenabler.CacheGPUDS
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1866)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1749)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2040)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:259)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:308)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:258)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:257)
at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:577)
at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:562)
at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:159)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:107)
at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:119)
at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
... more
This error message does not appear and the program runs if I submit with local[*] as master. I can also eliminate the error by disabling spark.gpuenabler.autocache. However, is there another way to properly fix the issue?
I am using Ubuntu 17.04, JRE 1.8.0, Scala 2.11 and Spark 2.1.0.
It turned out that adding the option
--conf "spark.executor.extraClassPath=file://path/to/jar" solves the problem. Another thing is that I needed to copy the jar file to every machine at the same path; otherwise the workers cannot find the jar file.

custom datasource in spark-sql with mist

I have a custom data source which I use via sqlContext.read.format(....sparkjobs.fileloader.DataSource). This works well via spark-submit in local and yarn modes.
But when I invoke the same job via Mist, it throws an exception:
Failed to find data source. Please find packages at http://spark-packages.org
My fat jar contains the DataSource class. Also, at the beginning of the code, I logged all the jars available on the classpath, and I can see my jar.
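A stronger check than listing jars is to resolve the provider class through the thread's context classloader, which is roughly the loader Spark's lookupDataSource uses. A sketch (the class name is a placeholder):

public class DataSourceVisibilityCheck {
    public static void main(String[] args) throws Exception {
        // Run the equivalent inside the Mist job; a standalone main() only
        // exercises the application classloader, not Mist's job classloader.
        Class.forName("your.pkg.sparkjobs.fileloader.DataSource", // placeholder
                false, Thread.currentThread().getContextClassLoader());
        System.out.println("provider class is visible");
    }
}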
Error Log:
java.lang.ClassNotFoundException: Failed to find data source: c.p.b.f.sparkjobs.fileloader.DataSource. Please find packages at http://spark-packages.org
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:102)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:109)
at c.p.b.f.sparkjobs.fileloader.m.McParser.processFile(McParser.scala:44)
at c.p.b.f.sparkjobs.fileloader.FileProcessor$$anonfun$processFilesSingle$1.apply(FileProcessor.scala:73)
at c.p.b.f.sparkjobs.fileloader.FileProcessor$$anonfun$processFilesSingle$1.apply(FileProcessor.scala:70)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at c.p.b.f.sparkjobs.fileloader.FileProcessor.processFilesSingle(FileProcessor.scala:70)
at c.p.b.f.sparkjobs.fileloader.FileProcessor$$anonfun$processFilesBulk$1.apply(FileProcessor.scala:58)
at c.p.b.f.sparkjobs.fileloader.FileProcessor$$anonfun$processFilesBulk$1.apply(FileProcessor.scala:49)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.MapLike$DefaultKeySet.foreach(MapLike.scala:174)
at c.p.b.f.sparkjobs.fileloader.FileProcessor.processFilesBulk(FileProcessor.scala:49)
at c.p.b.f.sparkjobs.fileloader.FileProcessor.init(FileProcessor.scala:33)
at c.p.b.f.sparkjobs.fileloader.Loader.execute(Loader.scala:162)
at c.p.b.f.util.LoaderApp$.execute(LoaderApp.scala:40)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.hydrosphere.mist.jobs.jar.JobInstance.io$hydrosphere$mist$jobs$jar$JobInstance$$invokeMethod(JobInstance.scala:40)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1$$anonfun$apply$3$$anonfun$apply$4.apply(JobInstance.scala:24)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1$$anonfun$apply$3$$anonfun$apply$4.apply(JobInstance.scala:24)
at cats.syntax.CatchOnlyPartiallyApplied.apply(either.scala:294)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1$$anonfun$apply$3.apply(JobInstance.scala:24)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1$$anonfun$apply$3.apply(JobInstance.scala:23)
at cats.syntax.EitherOps$.flatMap$extension(either.scala:129)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1.apply(JobInstance.scala:23)
at io.hydrosphere.mist.jobs.jar.JobInstance$$anonfun$run$1.apply(JobInstance.scala:22)
at cats.syntax.EitherOps$.flatMap$extension(either.scala:129)
at io.hydrosphere.mist.jobs.jar.JobInstance.run(JobInstance.scala:22)
at io.hydrosphere.mist.worker.runners.scala.ScalaRunner$$anonfun$run$1.apply(ScalaRunner.scala:26)
at io.hydrosphere.mist.worker.runners.scala.ScalaRunner$$anonfun$run$1.apply(ScalaRunner.scala:25)
at cats.syntax.EitherOps$.flatMap$extension(either.scala:129)
at io.hydrosphere.mist.worker.runners.scala.ScalaRunner.run(ScalaRunner.scala:25)
at io.hydrosphere.mist.worker.runners.MistJobRunner$.run(MistJobRunner.scala:18)
at io.hydrosphere.mist.worker.WorkerActor$$anonfun$2.apply(WorkerActor.scala:78)
at io.hydrosphere.mist.worker.WorkerActor$$anonfun$2.apply(WorkerActor.scala:76)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: c.p.b.f.sparkjobs.fileloader.DataSource.DefaultSource
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4$$anonfun$apply$1.apply(ResolvedDataSource.scala:62)
at scala.util.Try$.apply(Try.scala:161)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$$anonfun$4.apply(ResolvedDataSource.scala:62)
at scala.util.Try.orElse(Try.scala:82)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:62)
... 47 more

Zookeeper Configuration in SolrJ Throwing Read Timed-out Exception

I am using SolrJ to insert/query Solr data [Solr Cloud 6.5, 3 machines]. Earlier I used the code below to create the SolrClient:
HttpSolrClient server;
server = new HttpSolrClient.Builder("").build();
server.setSoTimeout(20000);
server.setConnectionTimeout(20000);
server.setDefaultMaxConnectionsPerHost(200);
server.setMaxTotalConnections(200);
server.setFollowRedirects(false);
server.setAllowCompression(true);
Now I have configured 3 Zookeeper servers for this Solr Cloud, and the code to create the Solr client became:
CloudSolrClient server;
String serverURL="zkapp1,zkapp2,zkapp3";
ArrayList<String> zkHosts = new ArrayList<>(Arrays.asList(serverURL.split(",")));
server = new CloudSolrClient.Builder().withZkHost(zkHosts).build();
server.setSoTimeout(20000);
server.setZkConnectTimeout(20000);
server.setDefaultCollection("testSolr");
I want to know where I need to specify the other properties that were present before [DefaultMaxConnectionsPerHost, MaxTotalConnections, FollowRedirects, AllowCompression].
Please guide me on how to configure Zookeeper, because with these settings the exception below is observed after some time [after some successful hits (reads/writes)]:
org.apache.solr.client.solrj.SolrServerException: Timeout occured while waiting response from server at: http://solr6CLoudMachine:8983/solr/testSolr
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:621)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:279)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:268)
at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:149)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:484)
at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:463)
at com.til.damcore.repository.DAMCoreSolrManager.insertContent(DAMCoreSolrManager.java:288)
at com.til.damcore.services.CMSContentUploadServiceSolrOnly.uploadCMSContent(CMSContentUploadServiceSolrOnly.java:227)
at com.til.damapi.service.InsertContentService.insertSolrOnlyContent(InsertContentService.java:136)
at com.til.damapi.controller.insert.InsertionController.insertDataSolr6(InsertionController.java:175)
at sun.reflect.GeneratedMethodAccessor155.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:222)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:137)
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:110)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:814)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:737)
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:969)
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:871)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:845)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:728)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:305)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:121)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:99)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:953)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1023)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:589)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:312)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:170)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
at org.apache.http.impl.io.AbstractSessionInputBuffer.fillBuffer(AbstractSessionInputBuffer.java:160)
at org.apache.http.impl.io.SocketInputBuffer.fillBuffer(SocketInputBuffer.java:84)
at org.apache.http.impl.io.AbstractSessionInputBuffer.readLine(AbstractSessionInputBuffer.java:273)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:140)
at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:57)
at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:261)
at org.apache.http.impl.AbstractHttpClientConnection.receiveResponseHeader(AbstractHttpClientConnection.java:283)
at org.apache.http.impl.conn.DefaultClientConnection.receiveResponseHeader(DefaultClientConnection.java:251)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.receiveResponseHeader(ManagedClientConnectionImpl.java:197)
at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:272)
at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:124)
at org.apache.http.impl.client.DefaultRequestDirector.tryExecute(DefaultRequestDirector.java:685)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:487)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:882)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:55)
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:515)
... 45 more
Can anybody please suggest the recommended production settings for Zookeeper?
I am completely stuck.
Any help is greatly appreciated.
Thanks,
Vibhav
When you're using SolrJ in conjunction with SolrCloud, Zookeeper returns the name or the IP of the Solr instance you have to connect to; looking at your exception, that instance is solr6CLoudMachine.
The first thing you have to check is whether you can reach solr6CLoudMachine from the host where you're running the SolrJ client.
For example, first see whether you can resolve the address of solr6CLoudMachine from a shell:
ping solr6CLoudMachine
Then check whether you can reach the port:
curl http://solr6CLoudMachine:8983/solr/
And again, given that you are dealing with a SolrCloud cluster, you very likely have more than one Solr instance to reach.
So, from the host where you're running your SolrJ client, you should be able to reach all the Solr instances where your target collections are located.
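As for the tuning knobs that used to live on HttpSolrClient, one way to carry them over is to build the underlying HttpClient yourself and hand it to the builder. This is only a sketch against SolrJ 6.x's HttpClientUtil properties; double-check the constants and builder methods against your exact SolrJ version:

import java.util.Arrays;
import java.util.List;
import org.apache.http.client.HttpClient;
import org.apache.solr.client.solrj.impl.CloudSolrClient;
import org.apache.solr.client.solrj.impl.HttpClientUtil;
import org.apache.solr.common.params.ModifiableSolrParams;

public class CloudClientFactory {
    public static CloudSolrClient create() {
        // Recreate the old per-host/total connection limits on the HttpClient
        // that CloudSolrClient will use internally.
        ModifiableSolrParams params = new ModifiableSolrParams();
        params.set(HttpClientUtil.PROP_MAX_CONNECTIONS_PER_HOST, 200);
        params.set(HttpClientUtil.PROP_MAX_CONNECTIONS, 200);
        params.set(HttpClientUtil.PROP_FOLLOW_REDIRECTS, false);
        params.set(HttpClientUtil.PROP_ALLOW_COMPRESSION, true);
        HttpClient httpClient = HttpClientUtil.createClient(params);

        List<String> zkHosts = Arrays.asList("zkapp1", "zkapp2", "zkapp3");
        CloudSolrClient client = new CloudSolrClient.Builder()
                .withZkHost(zkHosts)
                .withHttpClient(httpClient)
                .build();
        client.setDefaultCollection("testSolr");
        return client;
    }
}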

Cassandra cluster is running but not able to connect from Spark app

Version: Cassandra version 3.6 , Spark version 1.5.2, Spark-Cassandra-Connector_2.11 version 1.5.0-RC1
cassandra.yaml settings: listen_address:<node_ip> , rpc_address:0.0.0.0 , broadcast_rpc_address:<node_ip> , start_rpc: true , start_native_transport: true , native_transport_port:9042 , rpc_port: 9160 , seeds: 192.168.0.52
Scenario: I have a cassandra cluster with two nodes and one of them is set as seed.
192.168.0.52 (seed node)
192.168.0.55
I am trying to run a web app on another machine, 192.168.0.60. This machine currently uses Spark locally, but I also tried standalone mode [I get the same error either way, so I am currently running the app locally]. I have set the CASSANDRA_DB_IP as sparkConf.set("spark.cassandra.connection.host", "192.168.0.52") in the Spark app.
The app can't communicate with the Cassandra DB.
Note: if I run a single file from the app (like a .scala file with a main() method in an object, as an individual program), it runs perfectly and fetches data from the Cassandra DB normally. But it throws the following exception when I try to run the project/app.
16/06/21 12:53:13 ERROR DefaultErrorHandler:
java.io.IOException: Failed to open native connection to Cassandra at {192.168.0.52}:9042
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:162)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
at com.datastax.spark.connector.rdd.partitioner.CassandraRDDPartitioner$.getTokenFactory(CassandraRDDPartitioner.scala:176)
at org.apache.spark.sql.cassandra.CassandraSourceRelation$.apply(CassandraSourceRelation.scala:212)
at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:57)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
at com.system.tableManager.TableCommon.getTableDataframe(TableCommon.scala:56)
at com.system.user.UserManagement.<init>(UserManagement.scala:51)
at com.analytics.UI.views.userUI.Login.<init>(Login.java:58)
at com.analytics.UI.AnalyticsUI.init(AnalyticsUI.java:177)
at com.vaadin.ui.UI.doInit(UI.java:682)
at com.vaadin.server.communication.UIInitHandler.getBrowserDetailsUI(UIInitHandler.java:214)
at com.vaadin.server.communication.UIInitHandler.synchronizedHandleRequest(UIInitHandler.java:74)
at com.vaadin.server.SynchronizedRequestHandler.handleRequest(SynchronizedRequestHandler.java:41)
at com.vaadin.server.VaadinService.handleRequest(VaadinService.java:1409)
at com.vaadin.server.VaadinServlet.service(VaadinServlet.java:364)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:617)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1521)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1478)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /192.168.0.52:9042 (com.datastax.driver.core.exceptions.TransportException: [/192.168.0.52] Connection has been closed))
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:231)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:77)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1382)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:393)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
... 43 more
16/06/21 12:53:13 ERROR AnalyticsUI$Servlet]: Servlet.service() for servlet [com.analytics.UI.AnalyticsUI$Servlet] in context with path [/Sa_UI] threw exception [com.vaadin.server.ServiceException: java.io.IOException: Failed to open native connection to Cassandra at {192.168.0.52}:9042] with root cause
com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: /192.168.0.52:9042 (com.datastax.driver.core.exceptions.TransportException: [/192.168.0.52] Connection has been closed))
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:231)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:77)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1382)
at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:393)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
at com.datastax.spark.connector.rdd.partitioner.CassandraRDDPartitioner$.getTokenFactory(CassandraRDDPartitioner.scala:176)
at org.apache.spark.sql.cassandra.CassandraSourceRelation$.apply(CassandraSourceRelation.scala:212)
at org.apache.spark.sql.cassandra.DefaultSource.createRelation(DefaultSource.scala:57)
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:125)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:114)
at com.system.tableManager.TableCommon.getTableDataframe(TableCommon.scala:56)
at com.system.user.UserManagement.<init>(UserManagement.scala:51)
at com.analytics.UI.views.userUI.Login.<init>(Login.java:58)
at com.analytics.UI.AnalyticsUI.init(AnalyticsUI.java:177)
at com.vaadin.ui.UI.doInit(UI.java:682)
at com.vaadin.server.communication.UIInitHandler.getBrowserDetailsUI(UIInitHandler.java:214)
at com.vaadin.server.communication.UIInitHandler.synchronizedHandleRequest(UIInitHandler.java:74)
at com.vaadin.server.SynchronizedRequestHandler.handleRequest(SynchronizedRequestHandler.java:41)
at com.vaadin.server.VaadinService.handleRequest(VaadinService.java:1409)
at com.vaadin.server.VaadinServlet.service(VaadinServlet.java:364)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:617)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1521)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1478)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Make an entry for each Cassandra node (IP address and hostname) in /etc/hosts on the machine where you are running Spark.
Also, the Spark 1.5.2 binary is built with Scala 2.10, but the Spark-Cassandra connector you are using (spark-cassandra-connector_2.11) is built for Scala 2.11; use a Spark build for Scala 2.11 or the _2.10 connector artifact so the Scala versions match.
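For example, an /etc/hosts sketch on 192.168.0.60 (the hostnames here are placeholders; use the hostnames the Cassandra nodes actually advertise):

# /etc/hosts on the machine running the Spark app (192.168.0.60)
192.168.0.52   cassandra-seed    # seed node; placeholder hostname
192.168.0.55   cassandra-node2   # placeholder hostname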