MongoDB Spring connection lost overnight

I'm using Azure Cosmos DB with the Mongo API (Spring Data, MongoRepository).
Each morning, the first request to fetch data from Mongo throws an exception; the following requests succeed without me doing anything.
Any idea what might be causing this?
Is there a way to have Spring automatically recover the connection without failing the request?
Thanks.
The exception:
org.springframework.data.mongodb.UncategorizedMongoDbException: Exception sending message; nested exception is com.mongodb.MongoSocketWriteException: Exception sending message
at org.springframework.data.mongodb.core.MongoExceptionTranslator.translateExceptionIfPossible(MongoExceptionTranslator.java:107)
at org.springframework.data.mongodb.core.MongoTemplate.potentiallyConvertRuntimeException(MongoTemplate.java:2135)
at org.springframework.data.mongodb.core.MongoTemplate.executeFindMultiInternal(MongoTemplate.java:1978)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1784)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1767)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:641)
at org.springframework.data.mongodb.repository.query.MongoQueryExecution$CollectionExecution.execute(MongoQueryExecution.java:79)
at org.springframework.data.mongodb.repository.query.MongoQueryExecution$ResultProcessingExecution.execute(MongoQueryExecution.java:411)
at org.springframework.data.mongodb.repository.query.AbstractMongoQuery.execute(AbstractMongoQuery.java:94)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:483)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:461)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.projection.DefaultMethodInvokingMethodInterceptor.invoke(DefaultMethodInvokingMethodInterceptor.java:56)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.data.repository.core.support.SurroundingTransactionDetectorMethodInterceptor.invoke(SurroundingTransactionDetectorMethodInterceptor.java:57)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213)
...
Caused by: com.mongodb.MongoSocketWriteException: Exception sending message
at com.mongodb.connection.InternalStreamConnection.translateWriteException(InternalStreamConnection.java:465)
at com.mongodb.connection.InternalStreamConnection.sendMessage(InternalStreamConnection.java:208)
at com.mongodb.connection.UsageTrackingInternalConnection.sendMessage(UsageTrackingInternalConnection.java:90)
at com.mongodb.connection.DefaultConnectionPool$PooledConnection.sendMessage(DefaultConnectionPool.java:429)
at com.mongodb.connection.CommandProtocol.sendMessage(CommandProtocol.java:189)
at com.mongodb.connection.CommandProtocol.execute(CommandProtocol.java:111)
at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:168)
at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:289)
at com.mongodb.connection.DefaultServerConnection.command(DefaultServerConnection.java:176)
at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:216)
at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:207)
at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:113)
at com.mongodb.operation.FindOperation$1.call(FindOperation.java:516)
at com.mongodb.operation.FindOperation$1.call(FindOperation.java:510)
at com.mongodb.operation.OperationHelper.withConnectionSource(OperationHelper.java:431)
at com.mongodb.operation.OperationHelper.withConnection(OperationHelper.java:404)
at com.mongodb.operation.FindOperation.execute(FindOperation.java:510)
at com.mongodb.operation.FindOperation.execute(FindOperation.java:81)
at com.mongodb.Mongo.execute(Mongo.java:836)
at com.mongodb.Mongo$2.execute(Mongo.java:823)
at com.mongodb.DBCursor.initializeCursor(DBCursor.java:870)
at com.mongodb.DBCursor.hasNext(DBCursor.java:142)
at org.springframework.data.mongodb.core.MongoTemplate.executeFindMultiInternal(MongoTemplate.java:1964)
... 129 common frames omitted
Caused by: java.net.SocketException: Broken pipe (Write failed)
at java.net.SocketOutputStream.socketWrite0(Native Method)
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:111)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at sun.security.ssl.OutputRecord.writeBuffer(OutputRecord.java:431)
at sun.security.ssl.OutputRecord.write(OutputRecord.java:417)
at sun.security.ssl.SSLSocketImpl.writeRecordInternal(SSLSocketImpl.java:886)
at sun.security.ssl.SSLSocketImpl.writeRecord(SSLSocketImpl.java:857)
at sun.security.ssl.AppOutputStream.write(AppOutputStream.java:123)
at com.mongodb.connection.SocketStream.write(SocketStream.java:75)
at com.mongodb.connection.InternalStreamConnection.sendMessage(InternalStreamConnection.java:204)
... 150 common frames omitted

The connection is closed because it has exceeded the maximum connection idle time.
You will need to set this property according to your requirements:
maxConnectionIdleTime
It can be set either in your Mongo configuration or in your application profile.
Good luck.
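For example, a minimal sketch of the Java configuration with the 3.x driver's MongoClientOptions builder (the 60-second value is just an example, and mongoUri stands in for your own connection string property):
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
    MongoClientOptions.Builder optionsBuilder = MongoClientOptions.builder()
            // drop pooled connections that have been idle for more than 60 seconds,
            // before Cosmos DB or an intermediate firewall silently closes them
            .maxConnectionIdleTime(60_000);
    return new SimpleMongoDbFactory(new MongoClientURI(mongoUri, optionsBuilder));
}
If you prefer setting it in the application profile instead, the equivalent connection-string option is maxIdleTimeMS (for example ...&maxIdleTimeMS=60000).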

You can set keep-alive like this:
@Bean
public MongoDbFactory mongoDbFactory() throws Exception {
    MongoClientOptions.Builder optionsBuilder = MongoClientOptions.builder();
    // keep the TCP connection alive so idle sockets are not silently dropped
    optionsBuilder.socketKeepAlive(true);
    return new SimpleMongoDbFactory(new MongoClientURI(mongoUri, optionsBuilder));
}
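Note that in more recent versions of the MongoDB Java driver socketKeepAlive is deprecated and enabled by default, so the maxConnectionIdleTime setting mentioned above is usually the more relevant knob for this kind of overnight disconnect.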

Related

Kafka Consumer Connecting to MongoDB

I have a Kafka consumer service:
@KafkaListener(
        topics = "topic1",
        groupId = "cluster1",
        containerFactory = "KafkaListenerContainerFactory")
public void consume(Message message) {
    logger.info(String.format("Message received -> %s", message.getMsg()));
    Long id = message.getId();
    RepoDetail repoDetail = testRepo.findByID(id);
    logger.info(String.format("Message -> %s", repoDetail.getMessage()));
}
but when it tries to hit the Mongo repository, I get the following error:
No thread-bound request found: Are you referring to request attributes outside of an actual web request, or processing a request outside of the originally receiving thread? If you are actually operating within a web request and still receive this message, your code is probably running outside of DispatcherServlet: In this case, use RequestContextListener or RequestContextFilter to expose the current request.
Is there any way I can call MongoDB from a Kafka consumer?
full log stack:
org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method 'public void com.kafka.api.service.ConsumerService.consume(...)' threw exception; nested exception is java.lang.IllegalStateException: No thread-bound request found: Are you referring to request attributes outside of an actual web request, or processing a request outside of the originally receiving thread? If you are actually operating within a web request and still receive this message, your code is probably running outside of DispatcherServlet: In this case, use RequestContextListener or RequestContextFilter to expose the current request.; nested exception is java.lang.IllegalStateException: No thread-bound request found: Are you referring to request attributes outside of an actual web request, or processing a request outside of the originally receiving thread? If you are actually operating within a web request and still receive this message, your code is probably running outside of DispatcherServlet: In this case, use RequestContextListener or RequestContextFilter to expose the current request.
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2188) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:2159) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:2120) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2039) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:1967) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:1853) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:1543) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1190) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1087) ~[spring-kafka-2.6.9.jar:2.6.9]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_212]
at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266) ~[?:1.8.0_212]
at java.util.concurrent.FutureTask.run(FutureTask.java) ~[?:1.8.0_212]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_212]
Caused by: java.lang.IllegalStateException: No thread-bound request found: Are you referring to request attributes outside of an actual web request, or processing a request outside of the originally receiving thread? If you are actually operating within a web request and still receive this message, your code is probably running outside of DispatcherServlet: In this case, use RequestContextListener or RequestContextFilter to expose the current request.
at org.springframework.web.context.request.RequestContextHolder.currentRequestAttributes(RequestContextHolder.java:131) ~[spring-web-5.3.8.jar:5.3.8]
at org.springframework.web.context.support.WebApplicationContextUtils.currentRequestAttributes(WebApplicationContextUtils.java:313) ~[spring-web-5.3.8.jar:5.3.8]
at org.springframework.web.context.support.WebApplicationContextUtils.access$400(WebApplicationContextUtils.java:66) ~[spring-web-5.3.8.jar:5.3.8]
at org.springframework.web.context.support.WebApplicationContextUtils$RequestObjectFactory.getObject(WebApplicationContextUtils.java:329) ~[spring-web-5.3.8.jar:5.3.8]
at org.springframework.web.context.support.WebApplicationContextUtils$RequestObjectFactory.getObject(WebApplicationContextUtils.java:324) ~[spring-web-5.3.8.jar:5.3.8]
at org.springframework.beans.factory.support.AutowireUtils$ObjectFactoryDelegatingInvocationHandler.invoke(AutowireUtils.java:292) ~[spring-beans-5.3.8.jar:5.3.8]
at com.sun.proxy.$Proxy156.getHeader(Unknown Source) ~[?:?]
at com.repository.BaseRepository.getTemplate(BaseRepository.java:57) ~[repo-5.3.008.jar:?]
at com.kafka.api.repository.impl.ProductDetailRepositoryImpl.findByID(ProductDetailRepositoryImpl.java:21) ~[classes/:?]
at com.kafka.api.repository.impl.ProductDetailRepositoryImpl$$FastClassBySpringCGLIB$$824634c2.invoke(<generated>) ~[classes/:?]
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.3.8.jar:5.3.8]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:779) ~[spring-aop-5.3.8.jar:5.3.8]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.3.8.jar:5.3.8]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:750) ~[spring-aop-5.3.8.jar:5.3.8]
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:137) ~[spring-tx-5.3.8.jar:5.3.8]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.3.8.jar:5.3.8]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:750) ~[spring-aop-5.3.8.jar:5.3.8]
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:692) ~[spring-aop-5.3.8.jar:5.3.8]
at com.kafka.api.repository.impl.ProductDetailRepositoryImpl$$EnhancerBySpringCGLIB$$ee673ffc.findByID(<generated>) ~[classes/:?]
at com.kafka.api.service.ConsumerService.consume(ConsumerService.java:35) ~[classes/:?]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_212]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_212]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_212]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_212]
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:171) ~[spring-messaging-5.3.8.jar:5.3.8]
at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:120) ~[spring-messaging-5.3.8.jar:5.3.8]
at org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java:48) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:330) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:87) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:52) ~[spring-kafka-2.6.9.jar:2.6.9]
at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:2139) ~[spring-kafka-2.6.9.jar:2.6.9]
... 11 more
Yes, you can. Using a BsonDocument and an appropriate write model, you can upsert data into MongoDB through a bulk write, which returns a BulkWriteResult:
BulkWriteResult result = getMongoClient()
        .getDatabase(config.getNamespace().getDatabaseName())
        .getCollection(config.getNamespace().getCollectionName(), BsonDocument.class)
        .bulkWrite(writeModels, BULK_WRITE_OPTIONS);
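Here writeModels and BULK_WRITE_OPTIONS are placeholders for your own write models and options; as a rough sketch (assuming a 3.7+ Java driver and the com.mongodb.client.model classes), they might be built along these lines:
// hypothetical: one upsert per document, matched on _id;
// 'documents' is whatever list of BsonDocument you built from the Kafka messages
List<WriteModel<BsonDocument>> writeModels = new ArrayList<>();
for (BsonDocument doc : documents) {
    writeModels.add(new ReplaceOneModel<>(
            Filters.eq("_id", doc.get("_id")),
            doc,
            new ReplaceOptions().upsert(true)));
}
BulkWriteOptions BULK_WRITE_OPTIONS = new BulkWriteOptions().ordered(false);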
Also, declare the bean below in your config class:
@Bean
public RequestContextListener requestContextListener() {
    return new RequestContextListener();
}

Postgres 11: receiving broken pipe if the connection stays inactive

My Spring application is occasionally logging a broken pipe. To deal with stale connections, the application is already using:
testWhileIdle=true and validationQuery=SELECT 1
Stack trace:
Caused by: org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend.
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:333)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:441)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:365)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:155)
at org.postgresql.jdbc.PgPreparedStatement.executeQuery(PgPreparedStatement.java:118)
at sun.reflect.GeneratedMethodAccessor89.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.jdbc.pool.StatementFacade$StatementProxy.invoke(StatementFacade.java:114)
at com.sun.proxy.$Proxy137.executeQuery(Unknown Source)
at org.hibernate.engine.jdbc.internal.ResultSetReturnImpl.extract(ResultSetReturnImpl.java:70)
... 133 more
Caused by: java.net.SocketException: Connection reset
at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:115)
at java.net.SocketOutputStream.write(SocketOutputStream.java:155)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at org.postgresql.core.PGStream.flush(PGStream.java:514)
at org.postgresql.core.v3.QueryExecutorImpl.sendSync(QueryExecutorImpl.java:1363)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:304)
... 143 more
Caused by java.net.SocketException: Broken pipe (Write failed)
I have a vague feeling that if the application stays inactive for a while, it then has trouble connecting to the backend (or vice versa). But since I cannot reproduce this issue locally, it is hard to pin down.
You can try setting the idle-connection timeout higher:
idle_in_transaction_session_timeout = 30000
in postgresql.conf.
But your error seems strange to me. I would also check that you don't have way too many connections.
On the Spring Boot side you could try adding the following properties:
spring.datasource.test-on-borrow=true
spring.datasource.validation-query=SELECT 1
spring.datasource.validation-interval=30000
You could try reducing the validation interval; I set it to the default value above.
Hope it helps.
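For context, those properties map onto the Tomcat JDBC pool that appears in your stack trace (org.apache.tomcat.jdbc.pool): test-on-borrow validates a connection with the validation query before handing it out, and validation-interval limits how often that validation actually runs, so a smaller interval catches stale connections sooner at the cost of extra validation queries.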

Bluemix dashDB couchDB(?) error

After an attempt to create a new dashDB instance, a distinctly non-Netezza/DB2 error is thrown when trying to "manage" this newly purchased instance.
Exception thrown by application class 'org.lightcouch.CouchDbClientBase.executeRequest:-1'
org.lightcouch.CouchDbException: Error executing request.
at org.lightcouch.CouchDbClientBase.executeRequest(Unknown Source)
at org.lightcouch.CouchDbClientBase.get(Unknown Source)
at org.lightcouch.CouchDbClientBase.get(Unknown Source)
at org.lightcouch.CouchDbClientBase.get(Unknown Source)
at org.lightcouch.CouchDatabaseBase.find(Unknown Source)
at com.cloudant.client.api.Database.find(Unknown Source)
at com.ibm.datatools.dsweb.repository.CloudantRepo.getProvisionedServiceInstance(CloudantRepo.java:382)
at com.ibm.datatools.dsweb.controller.BluShiftHTTPController.getInstanceStatus(BluShiftHTTPController.java:870)
at com.ibm.datatools.dsweb.controller.RestEndPoint.launchDashboard(RestEndPoint.java:513)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.wink.server.internal.handlers.InvokeMethodHandler.handleRequest(InvokeMethodHandler.java:63)
at org.apache.wink.server.handlers.AbstractHandler.handleRequest(AbstractHandler.java:33)
at org.apache.wink.server.handlers.RequestHandlersChain.handle(RequestHandlersChain.java:26)
--- clipped for your sanity ---
at org.apache.wink.server.internal.RequestProcessor.handleRequestWithoutFaultBarrier(RequestProcessor.java:207)
at org.apache.wink.server.internal.RequestProcessor.handleRequest(RequestProcessor.java:154)
at org.apache.wink.server.internal.servlet.RestServlet.service(RestServlet.java:124)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:668)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1287)
at [internal classes]
Caused by: java.net.SocketTimeoutException: Read timed out
... 72 more
I'm not quite sure what CouchDB has to do with dashDB, but in any event: another day, another ungracefully handled exception.
I'll just try again tomorrow; that usually fixes it.
Per your description, you got the error above when trying to launch the console to manage the dashDB instance you just created.
That is consistent with the exception you are seeing, specifically this line:
com.ibm.datatools.dsweb.controller.RestEndPoint.launchDashboard(RestEndPoint.java:513)
The dashDB console is a web UI whose backend is built on Cloudant NoSQL DB, which is based on CouchDB; hence the CouchDB exception you are seeing.
The Cloudant NoSQL DB was probably offline at the moment you tried to launch the console, but I agree that the exception should be handled more gracefully. I will create an internal defect to have the dashDB team provide a fix for this.

Spring Batch partitioned step stopped hours after a non-skippable exception occurred

I want to verify a behaviour of Spring Batch.
When running a partitioned step of a job, I got this exception:
org.springframework.batch.core.JobExecutionException: Partition handler returned an unsuccessful step
at org.springframework.batch.core.partition.support.PartitionStep.doExecute(PartitionStep.java:111)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:195)
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:137)
at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:64)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:60)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:152)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:131)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:135)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:301)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:134)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:127)
Only this, with no previous exceptions that might have triggered it, and then a FAILED result for my job.
When searching the logs from the previous hours and days, I noticed these exceptions (3 of them, in different partitioned steps):
06/05/2014 21:50:51.996 [Step3TaskExecutor-12] [] ERROR AbstractStep - Line (222) Encountered an error executing the step
org.springframework.retry.RetryException: Non-skippable exception in recoverer while processing; nested exception is java.io.FileNotFoundException: Source 'blabla....pdf' does not exist
at org.springframework.batch.core.step.item.FaultTolerantChunkProcessor$2.recover(FaultTolerantChunkProcessor.java:281)
at org.springframework.retry.support.RetryTemplate.handleRetryExhausted(RetryTemplate.java:435)
at org.springframework.retry.support.RetryTemplate.doExecute(RetryTemplate.java:304)
at org.springframework.retry.support.RetryTemplate.execute(RetryTemplate.java:188)
at org.springframework.batch.core.step.item.BatchRetryTemplate.execute(BatchRetryTemplate.java:217)
at org.springframework.batch.core.step.item.FaultTolerantChunkProcessor.transform(FaultTolerantChunkProcessor.java:290)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:192)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:75)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:395)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:267)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:77)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:368)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:253)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:195)
at org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler$1.call(TaskExecutorPartitionHandler.java:139)
at org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler$1.call(TaskExecutorPartitionHandler.java:136)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
Caused by: java.io.FileNotFoundException: Source 'blabla....pdf' does not exist
It seemed weird to me that the job continued to run after those exceptions, so I'm thinking that only the slave steps in which this exception occurred failed, and the master step waited for the rest of the slave steps to finish before returning the first error mentioned above.
Can someone verify that this is what happened? It's been driving me crazy for days.
That is the correct behavior for Spring Batch's partitioning. The PartitionHandler in the master step evaluates the results of all slave steps at once, when they have all returned (or timed out). As for what happened in the slaves, those logged errors would be the leading cause in my view. However, the definitive answer should be in the job repository (assuming you're using a database-backed implementation): when a step fails (even a partitioned slave), the exception is stored there.
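As an illustrative sketch (not part of the answer above), assuming a database-backed job repository and an injected JobExplorer, you could list what each slave step recorded for a given execution:
// look up the failed job execution by id and print each step's outcome
JobExecution jobExecution = jobExplorer.getJobExecution(jobExecutionId);
for (StepExecution stepExecution : jobExecution.getStepExecutions()) {
    System.out.println(stepExecution.getStepName()
            + " -> " + stepExecution.getStatus()
            + ": " + stepExecution.getExitStatus().getExitDescription());
}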
I got this error when my CPU utilization was very high.
When I added this bean to the configuration, it worked for me:
@Bean
public TaskExecutor asyncTaskExecutor() {
    SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
    // cap the number of concurrent partition threads at the number of CPU cores
    taskExecutor.setConcurrencyLimit(numberOfCores);
    return taskExecutor;
}
I use it here:
@Bean
public Step masterStep() throws Exception {
    return stepBuilderFactory.get("masterStep")
            .partitioner(slaveStep().getName(), partitioner())
            .step(slaveStep())
            .gridSize(gridSize)
            .taskExecutor(asyncTaskExecutor())
            .build();
}
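For what it's worth, capping the SimpleAsyncTaskExecutor's concurrency at the number of cores means only that many slave partitions run at once; without a limit it starts a new thread for every partition, which is probably what drove the CPU so high in the first place.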

Spring MongoDB TCP connection ends then does a query

I am having a very strange problem with Spring 3.1 and MongoDB, using the Spring Data Mongo libraries. The call is pretty simple: it just pulls some data and inserts some new data. But while it is in the loop pulling new data, the connection is killed. From a packet capture I can see it pull data, insert data, then issue another query, but after a two-second delay the client sends a FIN packet to the server. A little later the server tries to send data back to the client, but by then the client responds with RSTs because it has already closed the connection.
Please let me know what other information I can provide to help understand this issue. I can also provide a download link for the capture file.
Thank you for your help.
Trace output
org.springframework.dao.DataAccessResourceFailureException: can't call something : id561la.ytel.com/172.31.214.55:27017/cdrstat; nested exception is com.mongodb.MongoException$Network: can't call something : id561la.ytel.com/172.31.214.55:27017/cdrstat
at org.springframework.data.mongodb.core.MongoExceptionTranslator.translateExceptionIfPossible(MongoExceptionTranslator.java:56)
at org.springframework.data.mongodb.core.MongoTemplate.potentiallyConvertRuntimeException(MongoTemplate.java:1665)
at org.springframework.data.mongodb.core.MongoTemplate.executeFindMultiInternal(MongoTemplate.java:1548)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1336)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1322)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:495)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:486)
at dao.BaseMongoSQLDAO.queryForList(BaseMongoSQLDAO.java:35)
at dao.PrefixStatSqlMapDAO.list(PrefixStatSqlMapDAO.java:59)
at services.CDRReaderService.getPrefixStat(CDRReaderService.java:705)
at services.CDRReaderService.processFile(CDRReaderService.java:659)
at services.CDRReaderService.access$100(CDRReaderService.java:42)
at services.CDRReaderService$1.run(CDRReaderService.java:150)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1146)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:679)
Caused by: com.mongodb.MongoException$Network: can't call something : id561la.ytel.com/172.31.214.55:27017/cdrstat
at com.mongodb.DBTCPConnector.innerCall(DBTCPConnector.java:295)
at com.mongodb.DBTCPConnector.call(DBTCPConnector.java:257)
at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:310)
at com.mongodb.DBApiLayer$MyCollection.__find(DBApiLayer.java:295)
at com.mongodb.DBCursor._check(DBCursor.java:368)
at com.mongodb.DBCursor._hasNext(DBCursor.java:459)
at com.mongodb.DBCursor.hasNext(DBCursor.java:484)
at org.springframework.data.mongodb.core.MongoTemplate.executeFindMultiInternal(MongoTemplate.java:1534)
... 13 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:146)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
at org.bson.io.Bits.readFully(Bits.java:46)
at org.bson.io.Bits.readFully(Bits.java:33)
at org.bson.io.Bits.readFully(Bits.java:28)
at com.mongodb.Response.<init>(Response.java:40)
at com.mongodb.DBPort.go(DBPort.java:124)
at com.mongodb.DBPort.call(DBPort.java:74)
at com.mongodb.DBTCPConnector.innerCall(DBTCPConnector.java:286)
... 20 more