Camel-SQL route ServiceUnavailableException when using header value as parameter - PostgreSQL

In Apache ServiceMix 7.0.0 I have defined the following routes using Blueprint:
<reference id="dataSource" interface="javax.sql.DataSource" filter="(dataSourceName=connectuserdata)" />
<bean id="sql" class="org.apache.camel.component.sql.SqlComponent">
<property name="dataSource" ref="dataSource" />
</bean>
<camelContext xmlns="http://camel.apache.org/schema/blueprint">
<package>com.focuscura</package>
<dataFormats>
<!-- here we define a Json data format with the id jack and that it should use the TestPojo as the class type when
doing unmarshal. The unmarshalTypeName is optional, if not provided Camel will use a Map as the type -->
<json id="userdata" library="Jackson" />
</dataFormats>
<route id="connect.userdata_create">
<from uri="jetty:http://localhost:8881/userdata?httpMethodRestrict=POST"/>
<unmarshal ref="userdata"/>
<!--<process ref="scalaUserDataProcessor"/>-->
<log message="Received new userdata" />
<to uri="sql:INSERT INTO public."UserData" (lastname, firstname) VALUES (:#lastname , :#firstname)"/>
</route>
<route id="connect.userdata_get2">
<from uri="jetty:http://localhost:8881/userdata2?httpMethodRestrict=GET"/>
<to uri="sql:SELECT * FROM public."UserData" WHERE id = :#id"/>
<!--<process ref="scalaUserDataProcessor"/>-->
<marshal ref="userdata"/>
</route>
</camelContext>
The DataSource is installed as a separate service, following the instructions described in this post: How can I install postgresqljdbc to work in Karaf OSGi?
This works fine: I can POST JSON to the Jetty URL and it is inserted into the database.
But when I try to GET data from this endpoint I get the following error:
org.osgi.service.blueprint.container.ServiceUnavailableException: The Blueprint container is being or has been destroyed: (&(dataSourceName=connectuserdata)(objectClass=javax.sql.DataSource))
at org.apache.aries.blueprint.container.ReferenceRecipe.getService(ReferenceRecipe.java:241)
at org.apache.aries.blueprint.container.ReferenceRecipe.access$000(ReferenceRecipe.java:56)
at org.apache.aries.blueprint.container.ReferenceRecipe$ServiceDispatcher.call(ReferenceRecipe.java:306)
at Proxyb6abdd30_6f59_4e89_a419_c4ff0558aa62.equals(Unknown Source)
at java.util.WeakHashMap.eq(WeakHashMap.java:287)
at java.util.WeakHashMap.get(WeakHashMap.java:401)
at org.springframework.jdbc.support.SQLErrorCodesFactory.getErrorCodes(SQLErrorCodesFactory.java:204)
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.setDataSource(SQLErrorCodeSQLExceptionTranslator.java:140)
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.<init>(SQLErrorCodeSQLExceptionTranslator.java:103)
at org.springframework.jdbc.support.JdbcAccessor.getExceptionTranslator(JdbcAccessor.java:99)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:605)
at org.apache.camel.component.sql.SqlProducer.process(SqlProducer.java:100)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:145)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:460)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:121)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:83)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.component.jetty.CamelContinuationServlet.service(CamelContinuationServlet.java:191)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1669)
at org.eclipse.jetty.servlets.MultiPartFilter.doFilter(MultiPartFilter.java:146)
at org.apache.camel.component.jetty.CamelFilterWrapper.doFilter(CamelFilterWrapper.java:43)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
The weird thing is that when I substitute <to uri="sql:SELECT * FROM public.&quot;UserData&quot; WHERE id = :#id"/> with <to uri="sql:SELECT * FROM public.&quot;UserData&quot; WHERE id = 2"/>, it works fine again and I get a nice little JSON of user nr 2 returned.
Any hints as to how to solve this?

The problem wasn't really in my route or in Camel; it was caused by PostgreSQL. Because I use a named parameter in my route, I needed to provide an explicit type cast.
To get this working I had to add ::bigint to the parameter in my route: sql:SELECT lastname FROM public."UserData" WHERE id = :#id::bigint?dataSource=dataSource&allowNamedParameters=true
For some reason the error from PostgreSQL was swallowed by camel-sql, and I got the weird message about the DataSource being destroyed.
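For reference, a rough Camel Java-DSL equivalent of the fixed route (a sketch only; the class name is illustrative and the endpoint options are the ones from the URI above):

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.JsonLibrary;

public class UserDataGetRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("jetty:http://localhost:8881/userdata2?httpMethodRestrict=GET")
            // the ::bigint cast gives PostgreSQL an explicit type for the :#id named
            // parameter (resolved from the message), which otherwise seems to be bound as text
            .to("sql:SELECT lastname FROM public.\"UserData\" WHERE id = :#id::bigint"
                + "?dataSource=dataSource&allowNamedParameters=true")
            .marshal().json(JsonLibrary.Jackson);
    }
}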
Thanks everybody for your helpful comments and questions!

Related

Apache Camel invoke SOAP service cast problem

I am new to Apache Camel and I use Red Hat CodeReady Studio 12.16.0.GA. I want to invoke a SOAP web service. I have used this example: https://tomd.xyz/camel-consume-soap-service/
This is my Camel context file:
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:camel-cxf="http://camel.apache.org/schema/cxf"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation=" http://www.springframework.org/schema/beans https://www.springframework.org/schema/beans/spring-beans.xsd http://camel.apache.org/schema/spring https://camel.apache.org/schema/spring/camel-spring.xsd http://camel.apache.org/schema/cxf http://camel.apache.org/schema/cxf/camel-cxf.xsd">
<bean class="org.apache.cxf.transport.common.gzip.GZIPInInterceptor" id="gZipInInterceptor"/>
<bean
class="org.apache.cxf.transport.common.gzip.GZIPOutInterceptor" id="gZipOutInterceptor"/>
<camel-cxf:cxfEndpoint
address="http://webservices.oorsprong.org/websamples.countryinfo/CountryInfoService.wso"
id="fullCountryInfoResponseClient" serviceClass="org.oorsprong.websamples_countryinfo.CountryInfoServiceSoapType">
<camel-cxf:inInterceptors>
<ref bean="gZipInInterceptor"/>
</camel-cxf:inInterceptors>
<camel-cxf:outInterceptors>
<ref bean="gZipOutInterceptor"/>
</camel-cxf:outInterceptors>
</camel-cxf:cxfEndpoint>
<bean
class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer" id="bean-66d2672d-c6c0-4984-bc31-90bc30bfaaef"/>
<camelContext id="camel"
xmlns="http://camel.apache.org/schema/spring" xmlns:order="http://fabric8.com/examples/order/v7">
<route id="simple-route">
<from id="_to2" uri="timer:timerName?delay=0&repeatCount=1"/>
<setBody id="_setBody2">
<constant>"US"</constant>
</setBody>
<bean beanType="com.example.GetFullCountryInfoBuilder"
id="_bean1" method="getFullCountryInfo"/>
<setHeader headerName="operationNamespace" id="_setHeader1">
<constant>http://www.oorsprong.org/websamples.countryinfo</constant>
</setHeader>
<setHeader headerName="operationName" id="_setHeader2">
<constant>FullCountryInfo</constant>
</setHeader>
<to id="_to1" uri="cxf:bean:fullCountryInfoResponseClient"/>
<bean beanType="com.example.GetFullCountryInfoBuilder"
id="_bean2" method="getFullCountryInfoOutput"/>
<log id="_log1" message=">>>${body}"/>
</route>
</camelContext>
</beans>
This is my input bean:
public class GetFullCountryInfoBuilder {

    public GetFullCountryInfoBuilder() {}

    @Bean
    public FullCountryInfo getFullCountryInfo(@Body String id) {
        FullCountryInfo request = new FullCountryInfo();
        request.setSCountryISOCode(id);
        return request;
    }

    @Bean
    public String getFullCountryInfoOutput(@Body FullCountryInfoResponse response) {
        String ret = response.getFullCountryInfoResult().getSName() + " - "
                + response.getFullCountryInfoResult().getSCapitalCity() + " - "
                + response.getFullCountryInfoResult().getSCurrencyISOCode();
        return ret;
    }
}
I still get the error org.apache.cxf.interceptor.Fault: org.oorsprong.websamples.FullCountryInfo cannot be cast to java.lang.String
It looks like CXF does not handle the FullCountryInfo object but expects a String, and Camel tries to convert it.
When I change the return type of getFullCountryInfo to String, this exception disappears but a couple of other ones come in:
Caused by: org.apache.camel.InvalidPayloadException: No body available of type: org.oorsprong.websamples.FullCountryInfoResponse but has value: [org.oorsprong.websamples.TCountryInfo@3c5110df] of type: org.apache.cxf.message.MessageContentsList on: Message[].
Caused by: [org.apache.camel.NoTypeConversionAvailableException - No type converter available to convert from type: org.apache.cxf.message.MessageContentsList to the required type: org.oorsprong.websamples.FullCountryInfoResponse with value [org.oorsprong.websamples.TCountryInfo@3c5110df]] Caused by: No type converter available to convert from type: org.apache.cxf.message.MessageContentsList to the required type: org.oorsprong.websamples.FullCountryInfoResponse with value [org.oorsprong.websamples.TCountryInfo@3c5110df]. Exchange[ID-sw70-1599555257341-0-1].
So the input to CXF is not an object, as described in the example, but a String.
The output of CXF is an org.apache.cxf.message.MessageContentsList that you have to convert to a String to log it. I have used the getFullCountryInfoOutput bean in this case.
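A minimal sketch of what the output bean could look like if it accepts the raw MessageContentsList instead (the class and getter names are taken from the stack trace and the original bean, so treat them as assumptions, not a verified contract):

import org.apache.camel.Body;
import org.apache.cxf.message.MessageContentsList;
import org.oorsprong.websamples.TCountryInfo;

public class GetFullCountryInfoOutputBuilder {

    public String getFullCountryInfoOutput(@Body MessageContentsList result) {
        // The CXF endpoint returns the operation's out parameters as a list;
        // the first entry is the TCountryInfo payload seen in the exception message.
        TCountryInfo info = (TCountryInfo) result.get(0);
        return info.getSName() + " - " + info.getSCapitalCity() + " - " + info.getSCurrencyISOCode();
    }
}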

How can we differentiate between String and Long in job parameters passed to a Spring Batch job through the command line

I'm using the vulnScanner job example and tried to add two additional parameters. Here is the URL I used to run the job:
curl -d jobParameters=ipAddress=74.54.219.210,outputFile=logs/tb.xml,country=USA,randomId=1245873 http://localhost:8080/partitioningJobs/jobs/vulnScannerJob.json
When I used VARCHAR as the datatype for country and BIGINT as the datatype for randomId, the example threw the following exception in the scanPorts step:
Caused by: org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback; uncategorized SQLException for SQL [UPDATE BATCH_STEP_EXECUTION_CONTEXT SET SHORT_CONTEXT = ?, SERIALIZED_CONTEXT = ? WHERE STEP_EXECUTION_ID = ?]; SQL state [25P02]; error code [0]; ERROR: current transaction is aborted, commands ignored until end of transaction block; nested exception is org.postgresql.util.PSQLException: ERROR: current transaction is aborted, commands ignored until end of transaction block
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:84)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:660)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:909)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:970)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao.persistSerializedContext(JdbcExecutionContextDao.java:233)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao.updateExecutionContext(JdbcExecutionContextDao.java:161)
at org.springframework.batch.core.repository.support.SimpleJobRepository.updateExecutionContext(SimpleJobRepository.java:205)
at sun.reflect.GeneratedMethodAccessor110.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:96)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:260)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:94)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy18.updateExecutionContext(Unknown Source)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:451)
... 50 more
Caused by: org.postgresql.util.PSQLException: ERROR: current transaction is aborted, commands ignored until end of transaction block
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2458)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2158)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:291)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:432)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:358)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:171)
at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:138)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:102)
at org.springframework.jdbc.core.JdbcTemplate$2.doInPreparedStatement(JdbcTemplate.java:916)
at org.springframework.jdbc.core.JdbcTemplate$2.doInPreparedStatement(JdbcTemplate.java:909)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:644)
... 68 more
Caused by: org.postgresql.util.PSQLException: ERROR: operator does not exist: bigint = character varying
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 67
I changed both datatypes to VARCHAR and it ran fine. But I wanted to know whether Spring Batch recognizes all of its job parameters as Strings only, or other datatypes too.
FYI, I have used the randomId job parameter in the ItemReader and I got this error. I have made all the necessary changes in the Target and TargetRowMapper classes to handle these additional parameters.
<bean id="targetItemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
<property name="dataSource" ref="dataSource" />
<property name="queryProvider">
<bean
class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="selectClause" value="ID, RANDOMID, IP, PORT, CONNECTED, BANNER" />
<property name="fromClause" value="FROM TARGET" />
<property name="whereClause" value="RANDOMID = :qid AND ID >= :minId AND ID <= :maxId AND CONNECTED IS NULL"/>
<property name="sortKey" value="ID" />
</bean>
</property>
<property name="pageSize" value="10" />
<property name="parameterValues">
<map>
<entry key="minId" value="#{stepExecutionContext[minValue]}"/>
<entry key="maxId" value="#{stepExecutionContext[maxValue]}"/>
<entry key="qid" value="#{jobParameters[randomId]}"/>
</map>
</property>
<property name="rowMapper">
<bean class="com.michaelminella.springbatch.domain.TargetRowMapper"/>
</property>
</bean>
As I understand it, #{jobParameters} is an unmodifiable Map (java.util.Collections.unmodifiableMap) of type Map<String, JobParameter>. It can contain values of any of the supported types, for example:
JobExecution jobExecution = jobLauncher.run(this.exampleJob, new JobParametersBuilder()
        .addDate("now", new Date())
        .addString("paramString", "stringAsParam")
        .addLong("paramLong", 1234L)
        .toJobParameters());
In your case you passed all the params as Strings, which is why you are getting the error at the query level.
To fix it, I would change
<entry key="qid" value="#{jobParameters[randomId]}"/>
to
<entry key="qid" value="#{T(java.lang.Long).parseLong(jobParameters[randomId])}"/>

WSO2 ESB: Custom URL

I have created my proxy with a custom URL, based on:
http://wso2.com/library/knowledge-base/2011/01/custom-urls-wso2-esb-proxy-services/
Calling this custom URL with my SOAP message results in an error; I can still use the original URL.
custom: /services/wss/PlanningOphaalServiceProxy_v1
original: /services/PlanningOphaalServiceProxy_v1
The error:
TID: [0] [ESB] [2015-08-19 15:47:05,039] ERROR {org.apache.axis2.engine.AxisEngine} - InvalidSecurity {org.apache.axis2.engine.AxisEngine}
org.apache.axis2.AxisFault: InvalidSecurity
at org.apache.rampart.handler.PostDispatchVerificationHandler.invoke(PostDispatchVerificationHandler.java:151)
at org.apache.axis2.engine.Phase.invokeHandler(Phase.java:340)
at org.apache.axis2.engine.Phase.invoke(Phase.java:313)
at org.apache.axis2.engine.AxisEngine.invoke(AxisEngine.java:261)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:167)
at org.apache.synapse.transport.passthru.ServerWorker.processEntityEnclosingRequest(ServerWorker.java:411)
at org.apache.synapse.transport.passthru.ServerWorker.run(ServerWorker.java:183)
at org.apache.axis2.transport.base.threads.NativeWorkerPool$1.run(NativeWorkerPool.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
TID: [0] [ESB] [2015-08-19 15:47:05,041] ERROR {org.apache.synapse.transport.passthru.ServerWorker} - Error processing POST request for : /services/wss/PlanningOphaalServiceProxy_v1 {org.apache.synapse.transport.passthru.ServerWorker}
org.apache.axis2.AxisFault: InvalidSecurity
Solved: this is not possible. A custom URI does not work in combination with WS-Security, according to WSO2 Support.
I tried a custom URI and it works fine after adding a custom dispatcher in axis2.xml.
Below is the URI for my proxy:
http://localhost:8280/GenericProxy
and the proxy code is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<proxy xmlns="http://ws.apache.org/ns/synapse"
name="CustomProxy"
transports="https,http"
statistics="disable"
trace="disable"
startOnLoad="true">
<target>
<inSequence>
<property name="messageType"
value="application/xml"
scope="default"
type="STRING"/>
<log level="full">
<property name="################## Body - In Seq###################"
expression="$body"/>
</log>
<send>
<endpoint>
<address uri="http://localhost:8090/services/OracleStoredProcedure.SOAP12Endpoint/"/>
</endpoint>
</send>
</inSequence>
<outSequence>
<log level="full">
<property name="############## Res Seq ############" value="Response"/>
</log>
</outSequence>
</target>
<parameter name="ServiceURI">/GenericProxy</parameter>
<description/>
</proxy>
And the logs are as follows:
[2015-08-21 16:35:25,173] INFO - ProxyService Successfully created the Axis2 service for Proxy service : CustomProxy
[2015-08-21 16:35:41,405] INFO - LogMediator To: /GenericProxy, MessageID: urn:uuid:8e35439e-4d28-95be-4e25eb385843, Direction: request, ################## Body - In Seq################### = <soapenv:Body xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"><emp>
<ID>sample</ID>
</emp></soapenv:Body>, Envelope: <?xml version="1.0" encoding="utf-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"><soapenv:Body><emp>
<ID>sample</ID>
</emp></soapenv:Body></soapenv:Envelope>
[2015-08-21 16:35:42,200] INFO - LogMediator To: http://www.w3.org/2005/08/addressing/anonymous, WSAction: , SOAPAction: , MessageID: urn:uuid:0431fb5f-4b03-4741-727bdb3637d8, Direction: response, ############## Res Seq ############ = Response, Envelope: <?xml version="1.0" encoding="utf-8"?><soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"><soapenv:Body><soapenv:Envelope>

Can the same Spring Batch job be invoked with different job parameters from multiple threads at the same time?

Our requirement is to write multiple files at the same time. We are using Spring Batch to write the files, and we are launching Spring Batch from different threads. Each thread has its own application context, so we can ensure that the singleton beans will not be shared across multiple threads. Below is my code snippet.
Spring Batch config:
<bean id="reportDataReader" class="com.test.ist.batch2.rrm.batch.readers.RRMItmeReader"
scope="step">
<property name="verifyCursorPosition" value="false" />
<property name="dataSource" ref="dataSource" />
<property name="sql" value="#{jobParameters['sqlquery']}" />
<property name="rowMapper" ref="valueMapper" />
<property name="fetchSize" value="5000" />
</bean>
<bean id="valueMapper" class="com.test.ist.batch2.rrm.batch.mappers.DBValueMapper" scope="step"></bean>
<bean id="velocityFileWritter"
class="com.test.ist.batch2.rrm.batch.writers.RRMVelocityFileWriter"
scope="step">
</bean>
<bean id="velocityEngine"
class="org.springframework.ui.velocity.VelocityEngineFactoryBean">
<property name="velocityProperties">
<value>
resource.loader = class
class.resource.loader.class = org.apache.velocity.runtime.resource.loader.ClasspathResourceLoader
class.resource.loader.cache = true
class.resource.loader.modificationCheckInterval = 0
</value>
</property>
</bean>
<batch:job id="rrmReportGenJob">
<batch:step id="rrmReportGenStep">
<batch:tasklet>
<batch:chunk reader="reportDataReader" writer="velocityFileWritter"
commit-interval="${reportData.reader.commit-interval}">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
This is how we are invoking Spring Batch:
ThreadPoolExecutor tpe = new ThreadPoolExecutor(10, 10, 1000000, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());
PetReportGenerator rrg = new PetReportGenerator(null);
ThreadTest tt  = new ThreadTest(new PetReportGenerator(null), "161");
ThreadTest tt2 = new ThreadTest(new PetReportGenerator(null), "162");
ThreadTest tt3 = new ThreadTest(new PetReportGenerator(null), "163");
ThreadTest tt4 = new ThreadTest(new PetReportGenerator(null), "165");
tpe.execute(tt);
tpe.execute(tt2);
tpe.execute(tt3);
tpe.execute(tt4);
In the constructor of PetReportGenerator we initialize the bean configuration. Below is the code snippet:
private ApplicationContext appContext;

public PetReportGenerator(ApplicationContext reportContext) {
    if (null == reportContext) {
        //if (null == appContext) {
        appContext = new ClassPathXmlApplicationContext("spring-batch-jobs.xml");
        //}
    } else {
        setAppContext(reportContext);
    }
}
Below is a code extract of how we invoke the Spring Batch job:
Job jobToExecute = (Job) SpringUtils.getBean(jobName);
JobParametersBuilder paramsBuilder = new JobParametersBuilder();
// By default add the date/time. This makes it possible to launch the same job again with the same parameters.
paramsBuilder.addLong("JOB_TIME", System.currentTimeMillis());
if (!jobParams.isEmpty()) {
    // Validate input fields.
    String sqlToUse = validator.validateInput(jobParams);
    for (Map.Entry<String, String> entry : jobParams.entrySet()) {
        paramsBuilder.addString(entry.getKey(), entry.getValue());
    }
} else {
    throw new ReportGenerationException("Job input parameter is Empty");
}
jobexe = jobLauncher.run(jobToExecute, paramsBuilder.toJobParameters());
If it is run in a single thread it works fine.
When it is invoked by multiple threads we get the error below:
09:09:26,742 ERROR pool-1-thread-3 job.AbstractJob:329 - Encountered fatal error executing job
java.lang.NullPointerException
at org.springframework.batch.core.repository.dao.MapJobExecutionDao.synchronizeStatus(MapJobExecutionDao.java:158)
at org.springframework.batch.core.repository.support.SimpleJobRepository.update(SimpleJobRepository.java:161)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:98)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:262)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:95)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy14.update(Unknown Source)
at org.springframework.batch.core.job.AbstractJob.updateStatus(AbstractJob.java:416)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:299)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:135)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:128)
Can anyone please help me understand what could be wrong?
The MapJobRepository is NOT intended for production use. It is NOT thread-safe. If you need the performance of an in-memory job repository (losing restartability, etc.), use an in-memory database like HSQLDB.
That note aside, if you are using thread-safe components, there is no reason you can't launch multiple job instances with multiple threads.
Are you sure that the MapJobExecutionDao is thread-safe in all aspects? I see that a ConcurrentMap is used inside MapJobExecutionDao, but I'm not sure that is enough. I once got a NullPointerException from a Map that was accessed from different threads: one thread caused a rehash, and when the second thread accessed the map at that very moment, it received the null pointer.
Are you sure that the combination of your identifying job parameters is unique? I see that you add a parameter JOB_TIME with System.currentTimeMillis(), but do you know whether that really resolves to a unique timestamp?
Have you tried using the table-based versions of JobExecutionDao and so on?
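If you do switch away from the map-based repository, here is a minimal sketch (assuming HSQLDB and spring-jdbc are on the classpath; the class and method names are just illustrative) of creating an embedded database whose BATCH_* tables can back the job repository:

import javax.sql.DataSource;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

public class InMemoryBatchDataSource {

    public static DataSource create() {
        // Builds an in-memory HSQLDB instance and creates the standard BATCH_* tables
        // from the schema script shipped inside spring-batch-core.
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.HSQL)
                .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                .build();
    }
}

You would then point a <batch:job-repository data-source="..."/> (or a JobRepositoryFactoryBean) at this DataSource instead of relying on the MapJobRepository.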

Mule HTTP java.io.IOException: Attempted read on closed stream

When I use two HTTP endpoints in a flow, Mule throws this exception. I've found a way to deal with this problem (use the second HTTP endpoint in asynchronous mode), but it is not a good solution.
ERROR DefaultSystemExceptionStrategy [[testdemo].connector.http.mule.default.receiver.03]: Caught exception in Exception Strategy: Attempted read on closed stream.
java.io.IOException: Attempted read on closed stream.
at org.apache.commons.httpclient.AutoCloseInputStream.isReadAllowed(AutoCloseInputStream.java:183)
at org.apache.commons.httpclient.AutoCloseInputStream.read(AutoCloseInputStream.java:126)
at org.mule.model.streaming.DelegatingInputStream.read(DelegatingInputStream.java:58)
at org.apache.commons.io.IOUtils.copyLarge(IOUtils.java:1025)
at org.mule.transformer.simple.ObjectToOutputHandler$3.write(ObjectToOutputHandler.java:76)
at org.mule.transport.http.HttpServerConnection.writeResponse(HttpServerConnection.java:315)
at org.mule.transport.http.HttpMessageReceiver$HttpWorker.run(HttpMessageReceiver.java:164)
at org.mule.work.WorkerContext.run(WorkerContext.java:311)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
My code:
<flow doc:name="httpPost" name="httpPost">
<http:inbound-endpoint doc:name="HTTP" keep-alive="true"
exchange-pattern="request-response" host="localhost" path="service"
port="8082" />
<http:body-to-parameter-map-transformer />
<!-- This component is just here to show the message accepted from the request -->
<scripting:component>
<scripting:script engine="groovy">
println payload['name']
return payload['name']+'123'
</scripting:script>
</scripting:component>
</flow>
<flow doc:name="httpPost" name="client">
<http:inbound-endpoint doc:name="HTTP" name="httpClient"
exchange-pattern="request-response" host="localhost" path="client"
port="8082" encoding="UTF-8" />
<http:body-to-parameter-map-transformer />
<scripting:component>
<scripting:script engine="groovy">
payload['name'] = 'opasso'
def paramstr = ""
for( param in payload){
paramstr = paramstr + "&" + param.key+ "=" + param.value
}
println "querystr:$paramstr"
return paramstr.substring(1)
</scripting:script>
</scripting:component>
<http:outbound-endpoint address="http://localhost:8082/service"
exchange-pattern="request-response" contentType="application/x-www-form-urlencoded"
method="POST" encoding="UTF-8" />
<!-- This component is just here to show the message accepted from the request -->
<scripting:component>
<scripting:script engine="groovy">
def msg = "return payload:$payload;".toString()
println msg
return payload
</scripting:script>
</scripting:component>
</flow>
The problem is related to the funky business you're doing in the scripting:component with the streaming message payload generated by the http:outbound-endpoint. The script probably consumes the input stream, leaving it in a state where it can't be used anymore.
Try adding an <object-to-string-transformer /> right after the http:outbound-endpoint to deserialize the stream into a string, so the payload can be used both in the scripting:component and by Mule.