Need to read values from multiple nodes: ORA-19025: EXTRACTVALUE returns value of only one node - Oracle 12c

I can successfully read from the XML file DataTransfer_HH_TWWholesale_001_004_12142020113003.xml, which has only one MeterReading node.
But when I try to read multiple nodes from the XML file DataTransfer_HH_TWWholesale_001_009_09282020103349.xml, I get the error ORA-19025: EXTRACTVALUE returns value of only one node.
How can I read values from multiple nodes?
Below are my files and code:
File: DataTransfer_HH_TWWholesale_001_004_12142020113003.xml
<MeterReadsReplyMessage xmlns="http://www.emeter.com/energyip/amiinterface">
<Header>
<verb>create</verb>
<noun>DTSMeterReads</noun>
<revision>2</revision>
<source>EIP</source>
</Header>
<payload>
<MeterReading>
<ServiceDeliveryPoint>
<mRID>901291331_0001</mRID>
<idType>SDP_X_UDC_ASSET_ID</idType>
</ServiceDeliveryPoint>
<Meter>
<mRID>SIE_640 C_310149563</mRID>
<idType>METER_X_UDC_ASSET_ID</idType>
</Meter>
<IntervalBlock>
<readingTypeId>LREG</readingTypeId>
<ReadingType>
<measurementType>Register</measurementType>
<touBinNumber>0</touBinNumber>
<unit>L</unit>
<channelNumber>1</channelNumber>
<direction>Delivered</direction>
</ReadingType>
<IReading>
<endTime>2020-10-08T00:00:00.000Z</endTime>
<value>0.0</value>
<quality>
<validationStatus>EST</validationStatus>
<locked>false</locked>
</quality>
</IReading>
</IntervalBlock>
</MeterReading>
</payload>
</MeterReadsReplyMessage>
Code:
SELECT EXTRACTVALUE (VALUE (a1), '/Header/verb', 'xmlns="http://www.emeter.com/energyip/amiinterface"') verb,
       EXTRACTVALUE (VALUE (a1), '/Header/noun', 'xmlns="http://www.emeter.com/energyip/amiinterface"') noun,
       EXTRACTVALUE (VALUE (a1), '/Header/source', 'xmlns="http://www.emeter.com/energyip/amiinterface"') source
FROM   xml_tab,
       TABLE (XMLSEQUENCE (EXTRACT (xml_data,
                '/MeterReadsReplyMessage/Header',
                'xmlns="http://www.emeter.com/energyip/amiinterface"'))) a1
WHERE  file_name = 'DataTransfer_HH_TWWholesale_001_004_12142020113003.xml';
File: DataTransfer_HH_TWWholesale_001_009_09282020103349.xml
<MeterReadsReplyMessage xmlns="http://www.emeter.com/energyip/amiinterface">
<Header>
<verb>create</verb>
<noun>DTSMeterReads</noun>
<revision>2</revision>
<source>EIP</source>
</Header>
<payload>
<MeterReading>
<ServiceDeliveryPoint>
<mRID>901291331_0001</mRID>
<idType>SDP_X_UDC_ASSET_ID</idType>
</ServiceDeliveryPoint>
<Meter>
<mRID>SIE_640 C_310149563</mRID>
<idType>METER_X_UDC_ASSET_ID</idType>
</Meter>
<IntervalBlock>
<readingTypeId>LREG</readingTypeId>
<ReadingType>
<measurementType>Register</measurementType>
<touBinNumber>0</touBinNumber>
<unit>L</unit>
<channelNumber>1</channelNumber>
<direction>Delivered</direction>
</ReadingType>
<IReading>
<endTime>2020-10-08T00:00:00.000Z</endTime>
<value>0.0</value>
<quality>
<validationStatus>EST</validationStatus>
<locked>false</locked>
</quality>
</IReading>
</IntervalBlock>
</MeterReading>
<MeterReading>
<ServiceDeliveryPoint>
<mRID>112448526_0001</mRID>
<idType>SDP_X_UDC_ASSET_ID</idType>
</ServiceDeliveryPoint>
<Meter>
<mRID>SCE_640 Concentric_310037947</mRID>
<idType>METER_X_UDC_ASSET_ID</idType>
</Meter>
<IntervalBlock>
<readingTypeId>LREG</readingTypeId>
<ReadingType>
<measurementType>Register</measurementType>
<touBinNumber>0</touBinNumber>
<unit>L</unit>
<channelNumber>1</channelNumber>
<direction>Delivered</direction>
</ReadingType>
<IReading>
<endTime>2015-12-21T01:00:00.000Z</endTime>
<value>0.0</value>
<flags>0</flags>
<quality>
<validationStatus>VAL</validationStatus>
<locked>false</locked>
</quality>
</IReading>
</IntervalBlock>
</MeterReading>
</payload>
</MeterReadsReplyMessage>
Thanks for reading my question.
Expected result:
mRID           | mfg_serial_num               | readingTypeId | measurementType | Read_time
:------------- | :--------------------------- | :------------ | :-------------- | :-----------------------
901291331_0001 | SIE_640 C_310149563          | LREG          | Register        | 2020-10-08T00:00:00.000Z
112448526_0001 | SCE_640 Concentric_310037947 | LREG          | Register        | 2015-12-21T01:00:00.000Z

EXTRACTVALUE raises ORA-19025 as soon as its XPath expression matches more than one node, which is exactly what happens once the payload contains more than one MeterReading element. The EXTRACT and EXTRACTVALUE XML functions are also deprecated; use XMLTABLE instead:
SELECT t.file_name,
       x.*
FROM   xml_tab t
       CROSS APPLY XMLTABLE(
         XMLNAMESPACES( DEFAULT 'http://www.emeter.com/energyip/amiinterface' ),
         '/MeterReadsReplyMessage/payload/MeterReading'
         PASSING t.xml_data
         COLUMNS
           mRID             VARCHAR2(20) PATH './ServiceDeliveryPoint/mRID',
           mfg_serial_num   VARCHAR2(30) PATH './Meter/mRID',
           readingTypeID    VARCHAR2(10) PATH './IntervalBlock/readingTypeId',
           measurement_type VARCHAR2(10) PATH './IntervalBlock/ReadingType/measurementType',
           reading_time     TIMESTAMP WITH TIME ZONE PATH './IntervalBlock/IReading/endTime'
       ) x;
Which, for your sample data:
CREATE TABLE xml_tab ( file_name, xml_data ) AS
SELECT 'DataTransfer_HH_TWWholesale_001_004_12142020113003.xml',
XMLTYPE(
'<MeterReadsReplyMessage xmlns="http://www.emeter.com/energyip/amiinterface">
<Header>
<verb>create</verb>
<noun>DTSMeterReads</noun>
<revision>2</revision>
<source>EIP</source>
</Header>
<payload>
<MeterReading>
<ServiceDeliveryPoint>
<mRID>901291331_0001</mRID>
<idType>SDP_X_UDC_ASSET_ID</idType>
</ServiceDeliveryPoint>
<Meter>
<mRID>SIE_640 C_310149563</mRID>
<idType>METER_X_UDC_ASSET_ID</idType>
</Meter>
<IntervalBlock>
<readingTypeId>LREG</readingTypeId>
<ReadingType>
<measurementType>Register</measurementType>
<touBinNumber>0</touBinNumber>
<unit>L</unit>
<channelNumber>1</channelNumber>
<direction>Delivered</direction>
</ReadingType>
<IReading>
<endTime>2020-10-08T00:00:00.000Z</endTime>
<value>0.0</value>
<quality>
<validationStatus>EST</validationStatus>
<locked>false</locked>
</quality>
</IReading>
</IntervalBlock>
</MeterReading>
</payload>
</MeterReadsReplyMessage>'
) FROM DUAL UNION ALL
SELECT 'DataTransfer_HH_TWWholesale_001_009_00000000000000.xml',
XMLType( '<MeterReadsReplyMessage xmlns="http://www.emeter.com/energyip/amiinterface">
<Header>
<verb>create</verb>
<noun>DTSMeterReads</noun>
<revision>2</revision>
<source>EIP</source>
</Header>
<payload>
<MeterReading>
<ServiceDeliveryPoint>
<mRID>901291331_0001</mRID>
<idType>SDP_X_UDC_ASSET_ID</idType>
</ServiceDeliveryPoint>
<Meter>
<mRID>SIE_640 C_310149563</mRID>
<idType>METER_X_UDC_ASSET_ID</idType>
</Meter>
<IntervalBlock>
<readingTypeId>LREG</readingTypeId>
<ReadingType>
<measurementType>Register</measurementType>
<touBinNumber>0</touBinNumber>
<unit>L</unit>
<channelNumber>1</channelNumber>
<direction>Delivered</direction>
</ReadingType>
<IReading>
<endTime>2020-10-08T00:00:00.000Z</endTime>
<value>0.0</value>
<quality>
<validationStatus>EST</validationStatus>
<locked>false</locked>
</quality>
</IReading>
</IntervalBlock>
</MeterReading>
<MeterReading>
<ServiceDeliveryPoint>
<mRID>112448526_0001</mRID>
<idType>SDP_X_UDC_ASSET_ID</idType>
</ServiceDeliveryPoint>
<Meter>
<mRID>SCE_640 Concentric_310037947</mRID>
<idType>METER_X_UDC_ASSET_ID</idType>
</Meter>
<IntervalBlock>
<readingTypeId>LREG</readingTypeId>
<ReadingType>
<measurementType>Register</measurementType>
<touBinNumber>0</touBinNumber>
<unit>L</unit>
<channelNumber>1</channelNumber>
<direction>Delivered</direction>
</ReadingType>
<IReading>
<endTime>2015-12-21T01:00:00.000Z</endTime>
<value>0.0</value>
<flags>0</flags>
<quality>
<validationStatus>VAL</validationStatus>
<locked>false</locked>
</quality>
</IReading>
</IntervalBlock>
</MeterReading>
</payload>
</MeterReadsReplyMessage>'
) FROM DUAL;
Outputs:
FILE_NAME                                               | MRID           | MFG_SERIAL_NUM               | READINGTYPEID | MEASUREMENT_TYPE | READING_TIME
:------------------------------------------------------ | :------------- | :--------------------------- | :------------ | :--------------- | :----------------------------------
DataTransfer_HH_TWWholesale_001_004_12142020113003.xml  | 901291331_0001 | SIE_640 C_310149563          | LREG          | Register         | 2020-10-08T00:00:00.000000000+00:00
DataTransfer_HH_TWWholesale_001_009_00000000000000.xml  | 901291331_0001 | SIE_640 C_310149563          | LREG          | Register         | 2020-10-08T00:00:00.000000000+00:00
DataTransfer_HH_TWWholesale_001_009_00000000000000.xml  | 112448526_0001 | SCE_640 Concentric_310037947 | LREG          | Register         | 2015-12-21T01:00:00.000000000+00:00
db<>fiddle here
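If you have to stay on the legacy functions for now, the same idea still applies: iterate over each MeterReading element with XMLSEQUENCE so that EXTRACTVALUE only ever sees a single node per row. A minimal sketch, reusing the namespace and paths from the question:
SELECT EXTRACTVALUE (VALUE (m), '/MeterReading/ServiceDeliveryPoint/mRID',
         'xmlns="http://www.emeter.com/energyip/amiinterface"') mRID,
       EXTRACTVALUE (VALUE (m), '/MeterReading/Meter/mRID',
         'xmlns="http://www.emeter.com/energyip/amiinterface"') mfg_serial_num,
       EXTRACTVALUE (VALUE (m), '/MeterReading/IntervalBlock/readingTypeId',
         'xmlns="http://www.emeter.com/energyip/amiinterface"') readingTypeId
FROM   xml_tab t,
       TABLE (XMLSEQUENCE (EXTRACT (t.xml_data,
                '/MeterReadsReplyMessage/payload/MeterReading',
                'xmlns="http://www.emeter.com/energyip/amiinterface"'))) m;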

Related

org.postgresql.util.PSQLException: Large Objects may not be used in auto-commit mode Error with JBPM 4.4

We are using JBPM 4.4 as our third-party Business Process Management tool with Java 6.x. So far we have used it with an Oracle DB and it worked well, but now we want to run it against a PostgreSQL 12.x database.
So we integrated postgresql-42.2.19.jre6.jar (the JDBC driver) and tried to run it.
We encountered the error below at runtime.
Can anyone suggest what needs to be done to resolve the issue, especially with JBPM 4.4?
We have already set
<prop key="hibernate.connection.autocommit">false</prop>
but that did not resolve our issue.
2021-05-05 06:41:57,670 ERROR [o-8443-exec-154] .AbstractFlushingEventListener portaladmin#10.100.250.41 - Could not synchronize database state with session
org.hibernate.exception.GenericJDBCException: could not insert: [org.jbpm.pvm.internal.lob.Lob]
at org.hibernate.exception.SQLStateConverter.handledNonSpecificException(SQLStateConverter.java:126) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:114) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:66) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2295) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2688) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.action.EntityInsertAction.execute(EntityInsertAction.java:79) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:279) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:263) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:167) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.event.def.DefaultAutoFlushEventListener.onAutoFlush(DefaultAutoFlushEventListener.java:64) [hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.impl.SessionImpl.autoFlushIfRequired(SessionImpl.java:996) [hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1141) [hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.impl.QueryImpl.list(QueryImpl.java:102) [hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.jbpm.pvm.internal.query.AbstractQuery.execute(AbstractQuery.java:93) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.query.ProcessDefinitionQueryImpl.execute(ProcessDefinitionQueryImpl.java:67) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.query.AbstractQuery.untypedList(AbstractQuery.java:67) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.query.ProcessDefinitionQueryImpl.list(ProcessDefinitionQueryImpl.java:157) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.repository.ProcessDeployer.checkKey(ProcessDeployer.java:133) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.repository.ProcessDeployer.deploy(ProcessDeployer.java:92) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.repository.DeployerManager.deploy(DeployerManager.java:46) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.repository.RepositorySessionImpl.deploy(RepositorySessionImpl.java:62) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.cmd.DeployCmd.execute(DeployCmd.java:47) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.cmd.DeployCmd.execute(DeployCmd.java:33) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.svc.DefaultCommandService.execute(DefaultCommandService.java:42) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.tx.SpringCommandCallback.doInTransaction(SpringCommandCallback.java:45) [jbpm-pvm-4.4.jar:4.4]
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:130) [spring-tx-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.jbpm.pvm.internal.tx.SpringTransactionInterceptor.execute(SpringTransactionInterceptor.java:49) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.svc.EnvironmentInterceptor.executeInNewEnvironment(EnvironmentInterceptor.java:53) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.svc.EnvironmentInterceptor.execute(EnvironmentInterceptor.java:40) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.svc.RetryInterceptor.execute(RetryInterceptor.java:56) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.svc.SkipInterceptor.execute(SkipInterceptor.java:43) [jbpm-pvm-4.4.jar:4.4]
at org.jbpm.pvm.internal.repository.DeploymentImpl.deploy(DeploymentImpl.java:90) [jbpm-pvm-4.4.jar:4.4]
at com.abc.def.portal.processes.jbpm.JbpmProcessDefinitionRepository.deployProcess_aroundBody18(JbpmProcessDefinitionRepository.java:108) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessDefinitionRepository.deployProcess_aroundBody19$advice(JbpmProcessDefinitionRepository.java:92) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessDefinitionRepository.deployProcess_aroundBody20(JbpmProcessDefinitionRepository.java:1) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessDefinitionRepository.deployProcess_aroundBody22(JbpmProcessDefinitionRepository.java:106) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessDefinitionRepository.deployProcess_aroundBody23$advice(JbpmProcessDefinitionRepository.java:80) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessDefinitionRepository.deployProcess(JbpmProcessDefinitionRepository.java:1) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessService.deployProcess_aroundBody46(JbpmProcessService.java:178) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessService.deployProcess_aroundBody47$advice(JbpmProcessService.java:92) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessService.deployProcess_aroundBody48(JbpmProcessService.java:1) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessService.deployProcess_aroundBody50(JbpmProcessService.java:178) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessService.deployProcess_aroundBody51$advice(JbpmProcessService.java:80) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessService.deployProcess_aroundBody52(JbpmProcessService.java:1) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessService.deployProcess_aroundBody53$advice(JbpmProcessService.java:61) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.processes.jbpm.JbpmProcessService.deployProcess(JbpmProcessService.java:1) [com.abc.def.portal.processes-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody128(TaskController.java:611) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody129$advice(TaskController.java:58) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody130(TaskController.java:1) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody131$advice(TaskController.java:92) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody132(TaskController.java:1) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody134(TaskController.java:605) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody135$advice(TaskController.java:102) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody136(TaskController.java:1) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload_aroundBody137$advice(TaskController.java:55) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController.handleFormUpload(TaskController.java:1) [TaskController.class:na]
at com.abc.def.portal.partner.client.task.TaskController$$FastClassByCGLIB$$2349406.invoke(<generated>) [cglib-nodep-2.1_3.jar:na]
at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:149) [cglib-nodep-2.1_3.jar:na]
at org.springframework.aop.framework.Cglib2AopProxy$CglibMethodInvocation.invokeJoinpoint(Cglib2AopProxy.java:689) [spring-aop-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150) [spring-aop-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.security.access.intercept.aopalliance.MethodSecurityInterceptor.invoke(MethodSecurityInterceptor.java:67) [spring-security-core-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172) [spring-aop-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:622) [spring-aop-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at com.abc.def.portal.partner.client.task.TaskController$$EnhancerByCGLIB$$4f295537.handleFormUpload(<generated>) [cglib-nodep-2.1_3.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.6.0_45]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39) ~[na:1.6.0_45]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) ~[na:1.6.0_45]
at java.lang.reflect.Method.invoke(Method.java:597) ~[na:1.6.0_45]
at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:213) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:126) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:96) [spring-webmvc-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:617) [spring-webmvc-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:578) [spring-webmvc-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:80) [spring-webmvc-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:923) [spring-webmvc-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:852) [spring-webmvc-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:882) [spring-webmvc-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:789) [spring-webmvc-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:646) [servlet-api.jar:na]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727) [servlet-api.jar:na]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:77) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:369) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at com.abc.def.portal.partner.client.security.IncompleteUserProfileFilter.doFilterInternal_aroundBody4(IncompleteUserProfileFilter.java:108) [IncompleteUserProfileFilter.class:na]
at com.abc.def.portal.partner.client.security.IncompleteUserProfileFilter.doFilterInternal(IncompleteUserProfileFilter.java:89) [IncompleteUserProfileFilter.class:na]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:109) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:83) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:97) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:100) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:78) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:54) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:35) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:187) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at com.abc.def.portal.ui.servlet.SsoRequestHeaderAuthenticationFilter.doFilter_aroundBody2(SsoRequestHeaderAuthenticationFilter.java:63) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.ui.servlet.SsoRequestHeaderAuthenticationFilter.doFilter(SsoRequestHeaderAuthenticationFilter.java:58) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:105) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:79) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.session.ConcurrentSessionFilter.doFilter(ConcurrentSessionFilter.java:109) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:381) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:168) [spring-security-web-3.0.7.RELEASE.jar:3.0.7.RELEASE]
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:346) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:259) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at com.abc.def.portal.partner.client.security.SSOAutoLoginFilter.doFilterInternal_aroundBody0(SSOAutoLoginFilter.java:67) [SSOAutoLoginFilter.class:na]
at com.abc.def.portal.partner.client.security.SSOAutoLoginFilter.doFilterInternal(SSOAutoLoginFilter.java:63) [SSOAutoLoginFilter.class:na]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at com.abc.def.portal.ui.csrf.CsrfFilter.doFilterInternal_aroundBody0(CsrfFilter.java:86) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.ui.csrf.CsrfFilter.doFilterInternal(CsrfFilter.java:57) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at com.abc.def.portal.ui.csrf.AjaxTimeoutFilter.doFilterInternal_aroundBody0(AjaxTimeoutFilter.java:45) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.ui.csrf.AjaxTimeoutFilter.doFilterInternal(AjaxTimeoutFilter.java:31) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at com.abc.def.portal.ui.timing.TimingServletFilter.doFilter_aroundBody2(TimingServletFilter.java:71) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.ui.timing.TimingServletFilter.doFilter(TimingServletFilter.java:63) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at com.abc.def.portal.ui.servlet.XFilter.doFilterInternal_aroundBody0(XFilter.java:56) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.ui.servlet.XFilter.doFilterInternal_aroundBody1$advice(XFilter.java:64) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at com.abc.def.portal.ui.servlet.XFilter.doFilterInternal(XFilter.java:51) [com.abc.def.portal.ui-2.1.NOPSE19C.1.jar:na]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:76) [spring-web-3.1.2.RELEASE.jar:3.1.2.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241) [catalina.jar:7.0.53]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208) [catalina.jar:7.0.53]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220) [catalina.jar:7.0.53]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122) [catalina.jar:7.0.53]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:610) [catalina.jar:7.0.53]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170) [catalina.jar:7.0.53]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98) [catalina.jar:7.0.53]
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950) [catalina.jar:7.0.53]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116) [catalina.jar:7.0.53]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408) [catalina.jar:7.0.53]
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1040) [tomcat-coyote.jar:7.0.53]
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:607) [tomcat-coyote.jar:7.0.53]
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:313) [tomcat-coyote.jar:7.0.53]
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895) [na:1.6.0_45]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918) [na:1.6.0_45]
at java.lang.Thread.run(Thread.java:662) [na:1.6.0_45]
Caused by: org.postgresql.util.PSQLException: Large Objects may not be used in auto-commit mode.
at org.postgresql.largeobject.LargeObjectManager.createLO(LargeObjectManager.java:284) ~[postgresql-42.2.19.jre6.jar:42.2.19.jre6]
at org.postgresql.largeobject.LargeObjectManager.createLO(LargeObjectManager.java:272) ~[postgresql-42.2.19.jre6.jar:42.2.19.jre6]
at org.postgresql.jdbc.PgPreparedStatement.createBlob(PgPreparedStatement.java:1159) ~[postgresql-42.2.19.jre6.jar:42.2.19.jre6]
at org.postgresql.jdbc.PgPreparedStatement.setBlob(PgPreparedStatement.java:1200) ~[postgresql-42.2.19.jre6.jar:42.2.19.jre6]
at com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.setBlob(NewProxyPreparedStatement.java:495) ~[c3p0-0.9.1.2.jar:0.9.1.2]
at org.hibernate.type.BlobType.set(BlobType.java:72) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.type.BlobType.nullSafeSet(BlobType.java:140) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.persister.entity.AbstractEntityPersister.dehydrate(AbstractEntityPersister.java:2025) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2271) ~[hibernate-core-3.3.1.GA.jar:3.3.1.GA]
... 160 common frames omitted
Though JBPM 4.4 is a very old version (the current release is 7.54), try to update your schema to use the bytea type for PostgreSQL large objects.
If you're using a JTA datasource, the auto-commit setting is always true and cannot be changed; try switching to an XA datasource.
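As a rough illustration of the schema change (the table and column names below are assumed, not taken from the actual JBPM 4.4 schema, and lo_get requires PostgreSQL 9.4+):
-- assumes blob_value_ is currently an oid column pointing at a large object
ALTER TABLE jbpm4_lob
  ALTER COLUMN blob_value_ TYPE bytea
  USING lo_get(blob_value_);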
Solution:
I was able to find a solution for this.
I changed the DB column to the bytea type in PostgreSQL and changed the JBPM 4.4 implementation to use byte[] instead of java.sql.Blob (in the org.jbpm.pvm.internal.lob.Lob class).
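For illustration, the change to the Lob class amounts to something like the sketch below; the field and accessor names are assumptions, not the exact JBPM source.
// Sketch of org.jbpm.pvm.internal.lob.Lob after the change (names assumed)
public class Lob {
  protected long dbid;
  // was: protected java.sql.Blob blobValue;  (mapped as a PostgreSQL large object)
  protected byte[] blobValue;                 // now maps to a plain bytea column

  public byte[] getBytes() {
    return blobValue;
  }

  public void setBytes(byte[] bytes) {
    this.blobValue = bytes;
  }
}
The corresponding Hibernate mapping entry would then use a binary type (byte[]) instead of blob, so no large-object calls happen at insert time.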

Apache spark inner join 2 dataframes getting TreeNodeException

I'm very new to Apache Spark and I'm trying to inner join a Book table with a table that has book_id along with read_count. The goal is to generate a table of book names and their corresponding read counts.
To start with, I have a table booksRead which contains records of users reading books; I group it by book_id and order by read frequency:
booksReadDF.groupBy($"book_id").agg(count("book_id").as("read_count"))
.orderBy($"read_count".desc)
.show()
+-------+----------+
|book_id|read_count|
+-------+----------+
| 8611| 565|
| 14| 436|
| 11850| 394|
| 15| 357|
| 11803| 324|
+-------+----------+
only showing top 5 rows
and I'm trying to inner join the table books which looks like this:
+------+--------------------+--------------------+--------------------+-------------+----------+--------------------+--------------------+--------------------+---------+----+--------------+------------+--------------------+--------------------+--------+--------+-----------------+---------------+----------------+--------+----------+
| id| created_at| updated_at| title| isbn_13| isbn_10| image_url| description| publisher|author_id|year|overall_rating|audible_link| google_url| query_title|category|language|number_of_reviews|waterstone_link|amazon_available|is_ebook|page_count|
+------+--------------------+--------------------+--------------------+-------------+----------+--------------------+--------------------+--------------------+---------+----+--------------+------------+--------------------+--------------------+--------+--------+-----------------+---------------+----------------+--------+----------+
|115442|2018-07-25 00:59:...|2018-07-25 00:59:...|Representation of...|9781361479278|1361479272|http://books.goog...|This dissertation...|Open Dissertation...| 62130|2017| null| null|http://books.goog...|representation of...| | en| 0| null| true| false| null|
|115450|2018-07-25 00:59:...|2018-07-25 00:59:...|Imag(in)ing the W...|9789004182981|9004182985|http://books.goog...|This study examin...| BRILL| 73131|2010| null| null|http://books.goog...|imagining the war...| | en| 0| null| true| false| null|
|218332|2018-08-19 14:48:...|2018-08-19 14:48:...|My Life With Tibe...|9781462802357|1462802354|http://books.goog...|Your child is a m...| Xlibris Corporation| 118091|2008| null| null|https://play.goog...|my life with tibe...| | en| 0| null| true| false| null|
|186991|2018-08-11 11:08:...|2018-08-11 11:08:...| NOT "Just Friends"|9781416586401|1416586407|http://books.goog...|One of the world’...| Simon and Schuster| 7687|2007| null| null|https://play.goog...| not just friends| | en| 0| null| true| false| null|
|247317|2018-09-06 08:23:...|2018-09-06 08:23:...|OCR AS and A Leve...|9781910523056|1910523054|https://images-eu...|A complete course...| PG Online Limited| 128220|2016| null| null| null|ocr as and a leve...| null| English| null| null| true| false| null|
+------+--------------------+--------------------+--------------------+-------------+----------+--------------------+--------------------+--------------------+---------+----+--------------+------------+--------------------+--------------------+--------+--------+-----------------+---------------+----------------+--------+----------+
only showing top 5 rows
where book_id joins to id on the books table, using this command:
booksReadDF.groupBy($"book_id").agg(count("book_id").as("read_count"))
.orderBy($"read_count".desc)
.join(booksDF, booksReadDF.col("book_id") === booksDF.col("id"), "inner")
.show()
but I'm getting this error:
Exception in thread "main" org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange hashpartitioning(book_id#4, 200)
+- *(3) Sort [read_count#67L DESC NULLS LAST], true, 0
+- Exchange rangepartitioning(read_count#67L DESC NULLS LAST, 200)
+- *(2) HashAggregate(keys=[book_id#4], functions=[count(book_id#4)], output=[book_id#4, read_count#67L])
+- Exchange hashpartitioning(book_id#4, 200)
+- *(1) HashAggregate(keys=[book_id#4], functions=[partial_count(book_id#4)], output=[book_id#4, count#215L])
+- *(1) Scan JDBCRelation(books_readbook) [numPartitions=1] [book_id#4] PushedFilters: [*IsNotNull(book_id)], ReadSchema: struct<book_id:int>
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:391)
at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:627)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.doExecute(WholeStageCodegenExec.scala:383)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.joins.SortMergeJoinExec.inputRDDs(SortMergeJoinExec.scala:386)
at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:41)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:627)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:339)
at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3383)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2544)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2544)
at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3364)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3363)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2544)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2758)
at org.apache.spark.sql.Dataset.getRows(Dataset.scala:254)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:291)
at org.apache.spark.sql.Dataset.show(Dataset.scala:745)
at org.apache.spark.sql.Dataset.show(Dataset.scala:704)
at org.apache.spark.sql.Dataset.show(Dataset.scala:713)
at DBConn$.main(DBConn.scala:36)
at DBConn.main(DBConn.scala)
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange rangepartitioning(read_count#67L DESC NULLS LAST, 200)
+- *(2) HashAggregate(keys=[book_id#4], functions=[count(book_id#4)], output=[book_id#4, read_count#67L])
+- Exchange hashpartitioning(book_id#4, 200)
+- *(1) HashAggregate(keys=[book_id#4], functions=[partial_count(book_id#4)], output=[book_id#4, count#215L])
+- *(1) Scan JDBCRelation(books_readbook) [numPartitions=1] [book_id#4] PushedFilters: [*IsNotNull(book_id)], ReadSchema: struct<book_id:int>
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.doExecute(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:391)
at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:627)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:92)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
... 52 more
Caused by: java.lang.IllegalArgumentException: Unsupported class file major version 56
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:166)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:148)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:136)
at org.apache.xbean.asm6.ClassReader.<init>(ClassReader.java:237)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:49)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:517)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:500)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:134)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:134)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:500)
at org.apache.xbean.asm6.ClassReader.readCode(ClassReader.java:2175)
at org.apache.xbean.asm6.ClassReader.readMethod(ClassReader.java:1238)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:631)
at org.apache.xbean.asm6.ClassReader.accept(ClassReader.java:355)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:307)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:306)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:306)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2100)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:945)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.collect(RDD.scala:944)
at org.apache.spark.RangePartitioner$.sketch(Partitioner.scala:309)
at org.apache.spark.RangePartitioner.<init>(Partitioner.scala:171)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$.prepareShuffleDependency(ShuffleExchangeExec.scala:224)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec.prepareShuffleDependency(ShuffleExchangeExec.scala:91)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:128)
at org.apache.spark.sql.execution.exchange.ShuffleExchangeExec$$anonfun$doExecute$1.apply(ShuffleExchangeExec.scala:119)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
... 72 more
19/06/17 16:21:47 INFO SparkContext: Invoking stop() from shutdown hook
19/06/17 16:21:47 INFO SparkUI: Stopped Spark web UI at http://10.245.65.12:4040
19/06/17 16:21:47 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/06/17 16:21:47 INFO MemoryStore: MemoryStore cleared
19/06/17 16:21:47 INFO BlockManager: BlockManager stopped
19/06/17 16:21:48 INFO BlockManagerMaster: BlockManagerMaster stopped
19/06/17 16:21:48 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/06/17 16:21:48 INFO SparkContext: Successfully stopped SparkContext
19/06/17 16:21:48 INFO ShutdownHookManager: Shutdown hook called
19/06/17 16:21:48 INFO ShutdownHookManager: Deleting directory /private/var/folders/ql/dpk0v2gs15z83pvwt_g3n7lh0000gn/T/spark-9368b8cb-0cf6-45a5-9548-a9c1975dab46
Your core exception is
java.lang.IllegalArgumentException: Unsupported class file major version 56
which indicates that at some point you are attempting to run bytecode compiled for a newer version of Java than your runtime supports (class file major version 56 corresponds to Java 12). Make sure that you are running Spark on a Java 8 JRE, and make sure that any dependencies (e.g. your Postgres JDBC driver) are also built for Java 8.
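If the project is built with sbt, one way to keep everything on Java 8 bytecode is to pin the compiler targets; a minimal build.sbt sketch (assuming Scala 2.11/2.12 with a JDK 8 install available):
// emit Java 8 compatible bytecode so Spark's ASM-based closure cleaner
// can read the compiled classes
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
scalacOptions += "-target:jvm-1.8"
Running the application itself on a Java 8 JRE (e.g. by pointing JAVA_HOME at a JDK 8) is still required, since the failure happens when Spark inspects classes at runtime.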

How to create a xml file per records in spark scala

I have a file that has records like below
1_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745_App.xml|<CAudit><ai2aiinst nT="LevFcf#A0" auNdSTy="Analytics" auNdTy="Identifier" ndNo="1" aId="1" conDes="Levered Free Cash Flow" conCd="LevFcf" aiaGUId="1_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745" aiaId="1" aiKey="2990569588" aiId="14" pEndDt="2013-Dec-31" perCd="A" isYr2Dt="False" ><AudNode aId="1" ndNo="2" auNdTy="Operation" auNdSTy="-" nV="2626287569.000000000000000" ><AudNode aId="1" ndNo="3" auNdTy="Operation" auNdSTy="-" nV="2825849069.000000000000000" ><AudNode aId="1" ndNo="4" auNdTy="Identifier" auNdSTy="Standardized" nT="STD.SEBITDA#A0" nV="3130019939.000000000000000" ><ai2si nV="3130019939.00000" nT="STD.SEBITDA#A0" auNdSTy="Standardized" auNdTy="Identifier" ndNo="4" aId="1" inId="1035" conDes="Earnings before Interest, Taxes, Depreciation & Amortization (EBITDA)" conCd="SEBITDA" stdaGUId="841_107570667_STD_2C68EF2F-AB17-40EF-9095-387DE1D5D745" stdIaId="841" siKey="12004131416271429" siId="413" sLiCurIso="KRW" sCurIso="KRW" stCurIso="KRW" stTyCd="INC" sId="1" pEndDt="2013-Dec-31" pId="2" fId="192730348494" fbId="1" /></AudNode><AudNode aId="1" ndNo="5" auNdTy="Identifier" auNdSTy="Standardized" nT="STD.STAX#A0" nV="304170870.000000000000000" ><ai2si nV="304170870.00000" nT="STD.STAX#A0" auNdSTy="Standardized" auNdTy="Identifier" ndNo="5" aId="1" inId="968" conDes="Income Taxes" conCd="STAX" stdaGUId="807_107570667_STD_2C68EF2F-AB17-40EF-9095-387DE1D5D745" stdIaId="807" siKey="120038112041962629" siId="381" sLiCurIso="KRW" sCurIso="KRW" stCurIso="KRW" stTyCd="INC" sId="1" pEndDt="2013-Dec-31" pId="2" fId="192730348494" fbId="1" /></AudNode></AudNode><AudNode aId="1" ndNo="6" auNdTy="Operation" auNdSTy="SUM" nV="199561500.000000000000000" ><AudNode aId="1" ndNo="7" auNdTy="Identifier" auNdSTy="Standardized" nT="STD.SCEX#A0" nV="199561500.000000000000000" ><ai2si nV="199561500.00000" nT="STD.SCEX#A0" auNdSTy="Standardized" auNdTy="Identifier" ndNo="7" aId="1" inId="888" conDes="Capital Expenditures - Total" conCd="SCEX" stdaGUId="704_107570667_STD_2C68EF2F-AB17-40EF-9095-387DE1D5D745" stdIaId="704" siKey="12002771860094347" siId="277" sLiCurIso="KRW" sCurIso="KRW" stCurIso="KRW" stTyCd="CAS" sId="1" pEndDt="2013-Dec-31" pId="2" fId="192730348494" fbId="1" /></AudNode><AudNode aId="1" ndNo="8" auNdTy="Constant" nV="0.000000000000000" /></AudNode></AudNode></ai2aiinst></CAudit>
3_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745_App.xml|<CAudit><ai2aiinst nT="ExcessCashMargin#A0" auNdSTy="Analytics" auNdTy="Identifier" ndNo="1" aId="3" conDes="Excess Cash Margin - %" conCd="ExcessCashMargin" aiaGUId="3_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745" aiaId="3" aiKey="2990569579" aiId="5" pEndDt="2013-Dec-31" perCd="A" isYr2Dt="False" ><AudNode aId="3" ndNo="2" auNdTy="Operation" auNdSTy="*" nV="2.257160458878393" ><AudNode aId="3" ndNo="8" auNdTy="Identifier" auNdSTy="PseudoFinancialConcept" nT="PERCENTSCALE#A0" nV="100.000000000000000" /><AudNode aId="3" ndNo="3" auNdTy="Operation" auNdSTy="//" nV="0.022571604588784" ><AudNode aId="3" ndNo="7" auNdTy="Identifier" auNdSTy="Standardized" nT="STD.STLR#A0" nV="68201182151.000000000000000" ><ai2si nV="68201182151.00000" nT="STD.STLR#A0" auNdSTy="Standardized" auNdTy="Identifier" ndNo="7" aId="3" inId="990" conDes="Revenue from Business Activities - Total" conCd="STLR" stdaGUId="813_107570667_STD_2C68EF2F-AB17-40EF-9095-387DE1D5D745" stdIaId="813" siKey="12003871970759396" siId="387" sLiCurIso="KRW" sCurIso="KRW" stCurIso="KRW" stTyCd="INC" sId="1" pEndDt="2013-Dec-31" pId="2" fId="192730348494" fbId="1" /></AudNode><AudNode aId="3" ndNo="4" auNdTy="Operation" auNdSTy="-" nV="1539410116.000000000000000" ><AudNode aId="3" ndNo="6" auNdTy="Identifier" auNdSTy="Standardized" nT="STD.SNIC#A0" nV="438846856.000000000000000" ><ai2si nV="438846856.00000" nT="STD.SNIC#A0" auNdSTy="Standardized" auNdTy="Identifier" ndNo="6" aId="3" inId="1055" conDes="Net Income after Minority Interest" conCd="SNIC" stdaGUId="856_107570667_STD_2C68EF2F-AB17-40EF-9095-387DE1D5D745" stdIaId="856" siKey="120043012135950005" siId="430" sLiCurIso="KRW" sCurIso="KRW" stCurIso="KRW" stTyCd="INC" sId="1" pEndDt="2013-Dec-31" pId="2" fId="192730348494" fbId="1" /></AudNode><AudNode aId="3" ndNo="5" auNdTy="Identifier" auNdSTy="Standardized" nT="STD.STLO#A0" nV="1978256972.000000000000000" ><ai2si nV="1978256972.00000" nT="STD.STLO#A0" auNdSTy="Standardized" auNdTy="Identifier" ndNo="5" aId="3" inId="924" conDes="Net Cash Flow from Operating Activities" conCd="STLO" stdaGUId="719_107570667_STD_2C68EF2F-AB17-40EF-9095-387DE1D5D745" stdIaId="719" siKey="12002951348701451" siId="295" sLiCurIso="KRW" sCurIso="KRW" stCurIso="KRW" stTyCd="CAS" sId="1" pEndDt="2013-Dec-31" pId="2" fId="192730348494" fbId="1" /></AudNode></AudNode></AudNode></AudNode></ai2aiinst></CAudit>
5_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745_App.xml|<CAudit><ai2aiinst nT="Cf#A0" auNdSTy="Analytics" auNdTy="Identifier" ndNo="1" aId="5" conDes="Cash Flow" conCd="Cf" aiaGUId="5_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745" aiaId="5" aiKey="2990569577" aiId="3" pEndDt="2013-Dec-31" perCd="A" isYr2Dt="False" ><AudNode aId="5" ndNo="2" auNdTy="Operation" auNdSTy="-" nV="898935497.000000000000000" ><AudNode aId="5" ndNo="6" auNdTy="Constant" nV="0.000000000000000" /><AudNode aId="5" ndNo="3" auNdTy="Operation" auNdSTy="+" nV="898935497.000000000000000" ><AudNode aId="5" ndNo="5" auNdTy="Identifier" auNdSTy="Standardized" nT="STD.STDAE#A0" nV="460088641.000000000000000" ><ai2si nV="460088641.00000" nT="STD.STDAE#A0" auNdSTy="Standardized" auNdTy="Identifier" ndNo="5" aId="5" inId="956" conDes="Depreciation, Depletion & Amortization - Total" conCd="STDAE" stdaGUId="796_107570667_STD_2C68EF2F-AB17-40EF-9095-387DE1D5D745" stdIaId="796" siKey="120036611860540497" siId="366" sLiCurIso="KRW" sCurIso="KRW" stCurIso="KRW" stTyCd="INC" sId="1" pEndDt="2013-Dec-31" pId="2" fId="192730348494" fbId="1" /></AudNode><AudNode aId="5" ndNo="4" auNdTy="Identifier" auNdSTy="Standardized" nT="STD.SIAT#A0" nV="438846856.000000000000000" ><ai2si nV="438846856.00000" nT="STD.SIAT#A0" auNdSTy="Standardized" auNdTy="Identifier" ndNo="4" aId="5" inId="1018" conDes="Net Income after Tax" conCd="SIAT" stdaGUId="831_107570667_STD_2C68EF2F-AB17-40EF-9095-387DE1D5D745" stdIaId="831" siKey="120040511473155197" siId="405" sLiCurIso="KRW" sCurIso="KRW" stCurIso="KRW" stTyCd="INC" sId="1" pEndDt="2013-Dec-31" pId="2" fId="192730348494" fbId="1" /></AudNode></AudNode></AudNode></ai2aiinst></CAudit>
I need to make an XML file for each row.
The name of the XML file would be the first column, before the |.
So in this case I will have 3 XML files, like below:
1_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745_App.xml
3_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745_App.xml
5_107570667_ANA_2C68EF2F-AB17-40EF-9095-387DE1D5D745_App.xml
Each XML file will contain the record after the |.
I will have 500000 rows like this and need to create an XML file for each row.
First, you need to create a paired RDD containing the file name and file content as a tuple, and then use that paired RDD to write the individual files to disk/HDFS.
You can have a look at the following code snippet:
val input = sparkSession.sparkContext.textFile("<your_input_file>")

// split each row on | into (file name, file content)
val pairedRDD = input.map(row => {
  val split = row.split("\\|")
  val fileName = split(0)
  val fileContent = split(1)
  (fileName, fileContent)
})

import org.apache.hadoop.io.NullWritable
import org.apache.spark.HashPartitioner
import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat

// write each key's value to its own file, named after the key,
// and drop the key from the file content
class RddMultiTextOutputFormat extends MultipleTextOutputFormat[Any, Any] {
  override def generateActualKey(key: Any, value: Any): Any = NullWritable.get()
  override def generateFileNameForKeyValue(key: Any, value: Any, name: String): String = key.asInstanceOf[String]
}

pairedRDD.partitionBy(new HashPartitioner(1000)).saveAsHadoopFile("<output_path>", classOf[String], classOf[String], classOf[RddMultiTextOutputFormat])

How to load a directed graphML file in NetLogo?

I have used nw:load-graphml "filename.graphml" to load a directed graph in NetLogo, but it loads an undirected graph. Is there another command to load a directed GraphML file in NetLogo?
Below is the code I have used to load the GraphML file; I have tried both the load-graph and load-graph1 procedures given below.
I have also added a directed="true" attribute to the edges in the GraphML file, for example:
<edge directed="true" id="2" source="14341" target="8312"/>
However, the loaded network is still undirected.
to load-graph
  let filename user-file
  if (filename != false) [
    nw:load-graphml filename [
      set shape "circle"
      set size 1
    ]
    nw:set-context turtles links
  ]
end

to load-graph1
  nw:load-graphml "myfile.graphml"
end
GraphML file:
<?xml version="1.0" encoding="UTF-8"?><graphml xmlns="http://graphml.graphdrawing.org/xmlns">
<key attr.name="label" attr.type="string" for="node" id="label"/>
<key attr.name="Edge Label" attr.type="string" for="edge" id="edgelabel"/>
<key attr.name="weight" attr.type="double" for="edge" id="weight"/>
<key attr.name="r" attr.type="int" for="node" id="r"/>
<key attr.name="g" attr.type="int" for="node" id="g"/>
<key attr.name="b" attr.type="int" for="node" id="b"/>
<key attr.name="x" attr.type="float" for="node" id="x"/>
<key attr.name="y" attr.type="float" for="node" id="y"/>
<key attr.name="size" attr.type="float" for="node" id="size"/>
<graph edgedefault="directed">
<node id="16">
<data key="label">v16</data>
<data key="size">100.0</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">4.917384E-7</data>
<data key="y">48.0</data>
</node>
<node id="15">
<data key="label">v15</data>
<data key="size">97.648</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">14.832003</data>
<data key="y">45.648003</data>
</node>
<node id="17">
<data key="label">v17</data>
<data key="size">97.648</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-14.832001</data>
<data key="y">45.648003</data>
</node>
<node id="14">
<data key="label">v14</data>
<data key="size">90.832</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">28.211998</data>
<data key="y">38.832</data>
</node>
<node id="18">
<data key="label">v18</data>
<data key="size">90.832</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-28.212002</data>
<data key="y">38.832</data>
</node>
<node id="13">
<data key="label">v13</data>
<data key="size">80.212</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">38.832</data>
<data key="y">28.211998</data>
</node>
<node id="19">
<data key="label">v19</data>
<data key="size">80.212</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-38.832</data>
<data key="y">28.211998</data>
</node>
<node id="12">
<data key="label">v12</data>
<data key="size">66.832</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">45.648003</data>
<data key="y">14.832003</data>
</node>
<node id="20">
<data key="label">v20</data>
<data key="size">66.832</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-45.648003</data>
<data key="y">14.832003</data>
</node>
<node id="1">
<data key="label">v1</data>
<data key="size">52.000004</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-48.000004</data>
<data key="y">4.917384E-7</data>
</node>
<node id="11">
<data key="label">v11</data>
<data key="size">52.000004</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">48.0</data>
<data key="y">4.917384E-7</data>
</node>
<node id="2">
<data key="label">v2</data>
<data key="size">37.168003</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-45.648003</data>
<data key="y">-14.832001</data>
</node>
<node id="10">
<data key="label">v10</data>
<data key="size">37.168003</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">45.648003</data>
<data key="y">-14.832001</data>
</node>
<node id="3">
<data key="label">v3</data>
<data key="size">23.788002</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-38.832</data>
<data key="y">-28.212002</data>
</node>
<node id="9">
<data key="label">v9</data>
<data key="size">23.788002</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">38.832</data>
<data key="y">-28.212002</data>
</node>
<node id="4">
<data key="label">v4</data>
<data key="size">13.168001</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-28.212002</data>
<data key="y">-38.832</data>
</node>
<node id="8">
<data key="label">v8</data>
<data key="size">13.168001</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">28.211998</data>
<data key="y">-38.832</data>
</node>
<node id="5">
<data key="label">v5</data>
<data key="size">6.3519998</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">-14.832001</data>
<data key="y">-45.648003</data>
</node>
<node id="7">
<data key="label">v7</data>
<data key="size">6.3519998</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">14.832003</data>
<data key="y">-45.648003</data>
</node>
<node id="6">
<data key="label">v6</data>
<data key="size">4.0</data>
<data key="r">0</data>
<data key="g">0</data>
<data key="b">0</data>
<data key="x">4.917384E-7</data>
<data key="y">-48.000004</data>
</node>
<edge id="0" source="20" target="9">
<data key="weight">1.0</data>
</edge>
<edge id="1" source="17" target="8">
<data key="weight">1.0</data>
</edge>
<edge id="2" source="18" target="2">
<data key="weight">1.0</data>
</edge>
<edge id="3" source="7" target="11">
<data key="weight">1.0</data>
</edge>
<edge id="4" source="16" target="4">
<data key="weight">1.0</data>
</edge>
<edge id="5" source="12" target="9">
<data key="weight">1.0</data>
</edge>
<edge id="6" source="5" target="16">
<data key="weight">1.0</data>
</edge>
<edge id="7" source="9" target="1">
<data key="weight">1.0</data>
</edge>
<edge id="8" source="10" target="4">
<data key="weight">1.0</data>
</edge>
<edge id="9" source="2" target="1">
<data key="weight">1.0</data>
</edge>
<edge id="10" source="18" target="15">
<data key="weight">1.0</data>
</edge>
<edge id="11" source="4" target="11">
<data key="weight">1.0</data>
</edge>
<edge id="12" source="4" target="2">
<data key="weight">1.0</data>
</edge>
<edge id="13" source="9" target="8">
<data key="weight">1.0</data>
</edge>
<edge id="14" source="6" target="8">
<data key="weight">1.0</data>
</edge>
<edge id="15" source="4" target="1">
<data key="weight">1.0</data>
</edge>
<edge id="16" source="2" target="18">
<data key="weight">1.0</data>
</edge>
<edge id="17" source="19" target="6">
<data key="weight">1.0</data>
</edge>
<edge id="18" source="1" target="6">
<data key="weight">1.0</data>
</edge>
<edge id="19" source="8" target="9">
<data key="weight">1.0</data>
</edge>
<edge id="20" source="10" target="3">
<data key="weight">1.0</data>
</edge>
<edge id="21" source="12" target="19">
<data key="weight">1.0</data>
</edge>
<edge id="22" source="11" target="4">
<data key="weight">1.0</data>
</edge>
<edge id="23" source="3" target="18">
<data key="weight">1.0</data>
</edge>
<edge id="24" source="13" target="14">
<data key="weight">1.0</data>
</edge>
<edge id="25" source="10" target="2">
<data key="weight">1.0</data>
</edge>
<edge id="26" source="17" target="6">
<data key="weight">1.0</data>
</edge>
<edge id="27" source="6" target="7">
<data key="weight">1.0</data>
</edge>
<edge id="28" source="12" target="16">
<data key="weight">1.0</data>
</edge>
<edge id="29" source="20" target="3">
<data key="weight">1.0</data>
</edge>
<edge id="30" source="13" target="5">
<data key="weight">1.0</data>
</edge>
<edge id="31" source="20" target="11">
<data key="weight">1.0</data>
</edge>
<edge id="32" source="11" target="15">
<data key="weight">1.0</data>
</edge>
<edge id="33" source="20" target="15">
<data key="weight">1.0</data>
</edge>
<edge id="34" source="15" target="17">
<data key="weight">1.0</data>
</edge>
<edge id="35" source="10" target="8">
<data key="weight">1.0</data>
</edge>
<edge id="36" source="14" target="7">
<data key="weight">1.0</data>
</edge>
<edge id="37" source="8" target="2">
<data key="weight">1.0</data>
</edge>
<edge id="38" source="16" target="7">
<data key="weight">1.0</data>
</edge>
<edge id="39" source="1" target="19">
<data key="weight">1.0</data>
</edge>
<edge id="40" source="5" target="13">
<data key="weight">1.0</data>
</edge>
<edge id="41" source="1" target="20">
<data key="weight">1.0</data>
</edge>
<edge id="42" source="17" target="4">
<data key="weight">1.0</data>
</edge>
<edge id="43" source="13" target="9">
<data key="weight">1.0</data>
</edge>
<edge id="44" source="14" target="12">
<data key="weight">1.0</data>
</edge>
<edge id="45" source="8" target="13">
<data key="weight">1.0</data>
</edge>
<edge id="46" source="8" target="1">
<data key="weight">1.0</data>
</edge>
<edge id="47" source="9" target="2">
<data key="weight">1.0</data>
</edge>
<edge id="48" source="5" target="14">
<data key="weight">1.0</data>
</edge>
<edge id="49" source="16" target="11">
<data key="weight">1.0</data>
</edge>
<edge id="50" source="4" target="9">
<data key="weight">1.0</data>
</edge>
<edge id="51" source="1" target="8">
<data key="weight">1.0</data>
</edge>
<edge id="52" source="7" target="12">
<data key="weight">1.0</data>
</edge>
<edge id="53" source="20" target="16">
<data key="weight">1.0</data>
</edge>
</graph>
</graphml>
nw:load-graphml can load directed graphs.
Make sure edgedefault="directed" is set on the <graph> element in your GraphML file. Also make sure that, if you use a link breed, that breed is defined as directed in NetLogo.
If that still doesn't work, you will have to show us a sample GraphML file and the exact code you use to load it...
Edit
I have investigated further, and I think this might be a bug in the way nw loads a GraphML file with unbreeded links.
The elegant way to get around it would be to create a breed of directed links and use a breed attribute inside your GraphML file to set the links' breed.
A much quicker, more hackish way to get around the problem is to add something like this before you load your network:
let dummies []
create-turtles 2 [
  create-links-to other turtles
  set dummies fput self dummies
]
And then, after you have loaded your network:
foreach dummies [ t -> ask t [ die ] ]
This relies on the fact that NetLogo's unbreeded links can be either directed or undirected, but not both. By creating dummy directed links before the network is loaded, we force NetLogo to create all other links as directed.
I have opened an issue on GitHub about the problem and will investigate further. Until then, that hack should save you.
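For convenience, here is a minimal sketch that folds the workaround into a single loading procedure; it assumes the nw extension is loaded via extensions [ nw ], that your links are unbreeded, and that it runs in a world with no pre-existing turtles:
to load-directed-graph
  ;; dummy turtles joined by directed links force all subsequently
  ;; created unbreeded links to be directed as well
  let dummies []
  create-turtles 2 [
    create-links-to other turtles
    set dummies fput self dummies
  ]
  nw:load-graphml "myfile.graphml"
  ;; remove the dummies; their links die with them
  foreach dummies [ t -> ask t [ die ] ]
  nw:set-context turtles links
end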

Weird behaviour editing tt_news_template

I am developing my webpage with the tt_news plugin, and I edited tt_news_v3_template.html so that it shows the first 3 news items on my main page. I want them in this format, so I updated the ###TEMPLATE_LIST3### subpart. This is the new code:
<!-- ###TEMPLATE_LIST3### begin
This is the template for the list of news in the archive or news page or search
-->
<div class="news-list3-container">
###NEWS_CATEGORY_ROOTLINE###
<!-- ###CONTENT### begin
This is the part of the template substituted with the list of news:
-->
<!-- ###NEWS### begin -->
<article class="box">
<div class="image-border">
<a href="#" class="image image-left">
<!--###LINK_ITEM###-->###NEWS_IMAGE###<!--###LINK_ITEM###-->
</a>
</div>
<h2><!--###LINK_ITEM###-->###NEWS_TITLE###<!--###LINK_ITEM###--></h2>
<p class="subtitle">###NEWS_SUBHEADER###</p>
<p>###NEWS_CONTENT###</p>
<p>###CATWRAP_B### ###TEXT_CAT### ###NEWS_CATEGORY### ###NEWS_CATEGORY_IMAGE### ###CATWRAP_E###</p>
<p>
Ver Más
</p>
</article>
<!-- ###NEWS### end-->
<!-- ###NEWS_1### begin -->
<div class="row" style="margin-top:15px; display: inline-block; width:100%">
<section class="box" style="width:49%; float:left">
<h2><!--###LINK_ITEM###-->###NEWS_TITLE###<!--###LINK_ITEM###--></h2>
<p class="subtitle">###NEWS_SUBHEADER###</p>
<div class="image-border">
<a href="#" class="image image-full">
<!--###LINK_ITEM###-->###NEWS_IMAGE###<!--###LINK_ITEM###-->
</a>
</div>
<p>###NEWS_CONTENT###</p>
<p>###CATWRAP_B### ###TEXT_CAT### ###NEWS_CATEGORY### ###NEWS_CATEGORY_IMAGE### ###CATWRAP_E###</p>
<p>
Ver Más
</p>
</section>
<!-- ###NEWS_1### end-->
<!-- ###NEWS_2### begin -->
<section class="box" style="width:49%; float:left">
<h2><!--###LINK_ITEM###-->###NEWS_TITLE###<!--###LINK_ITEM###--></h2>
<p class="subtitle">###NEWS_SUBHEADER###</p>
<div class="image-border">
<a href="#" class="image image-full">
<!--###LINK_ITEM###-->###NEWS_IMAGE###<!--###LINK_ITEM###-->
</a>
</div>
<p>###NEWS_CONTENT###</p>
<p>###CATWRAP_B### ###TEXT_CAT### ###NEWS_CATEGORY### ###NEWS_CATEGORY_IMAGE### ###CATWRAP_E###</p>
<p>
Ver Más
</p>
</section>
<!-- ###NEWS_2### end-->
<!-- ###CONTENT### end -->
</div>
</div>
but when I view the page, the first two news items are displayed well, with the divs and the labels that I assigned, but the last one duplicates the structure of the first one instead of using its own. This is the generated code:
<div class="news-list3-container">
<article class="box">
<div class="image-border">
<a href="#" class="image image-left">
</a>
</div>
<h2>Noticia 3</h2>
<p class="subtitle"></p><p>subtitle noticia 3<span class="news-list-morelink">[Leer más]</span></p><p></p>
<p>texto noticia 3</p>
<p></p><div class="news-list-category"> Categoría: News </div><p></p>
<p>
Ver Más
</p>
</article>
<div class="row" style="margin-top:15px; display: inline-block; width:100%">
<section class="box" style="width:49%; float:left">
<h2>dsadasdasd</h2>
<p class="subtitle"></p><p>dasdasdasdas<span class="news-list-morelink">[Leer más]</span></p><p></p>
<div class="image-border">
<a href="#" class="image image-full">
</a>
</div>
<p><span style="font-family: sans-serif; line-height: 17.265625px; background-color: rgb(249, 249, 249); ">Lorem ipsum ad his scripta blandit partiendo, eum fastidii accumsan euripidis in, eum liber hendrerit an. Qui ut wisi vocibus suscipiantur, quo dicit ridens inciderint id. Quo mundi lobortis reformidans eu, legimus senserit definiebas an eos. Eu sit tincidunt incorrupte definitionem, vis mutat affert percipit cu, eirmod consectetuer signiferumque eu per. In usu latine equidem dolores. Quo no falli viris intellegam, ut fugit veritus placerat per. Ius id vidit volumus mandamus, vide veritus democritum te nec, ei eos debet libris consulatu. No mei ferri graeco dicunt, ad cum veri accommodare. Sed at malis omnesque delicata, usu et iusto zzril meliore. Dicunt maiorum eloquentiam cum cu, sit summo dolor essent te. Ne quodsi nusquam legendos has, ea dicit voluptua eloquentiam pro, ad sit quas qualisque. Eos vocibus deserunt quaestio ei. Blandit incorrupte quaerendum in quo, nibh impedit id vis, vel no nullam semper audiam. Ei populo graeci consulatu mei, has ea stet modus phaedrum. Inani oblique ne has, duo et veritus detraxit. Tota ludus oratio ea mel, offendit persequeris ei vim. Eos dicat oratio partem ut, id cum ignota senserit intellegat. Sit inani ubique graecis ad, quando graecis liberavisse et cum, dicit option eruditi at duo. Homero salutatus suscipiantur eum id, tamquam voluptaria expetendis ad sed, nobis feugiat similique usu ex. Eum hinc argumentum te, no sit percipit adversarium, ne qui feugiat persecuti. Odio omnes scripserit ad est, ut vidit lorem maiestatis his, putent mandamus gloriatur ne pro. Oratio iriure rationibus ne his, ad est corrumpit splendide. Ad duo appareat moderatius, ei falli tollit denique eos. Dicant evertitur mei in, ne his deserunt perpetua sententiae, ea sea omnes similique vituperatoribus. Ex mel errem intellegebat comprehensam, vel ad tantas antiopam delicatissimi, tota ferri affert eu nec. Legere expetenda pertinacia ne pro, et pro impetus persius assueverit. Ea mei nullam facete, omnis oratio offendit ius cu. Doming takimata repudiandae usu an, mei dicant takimata id, pri eleifend inimicus euripidis at. His vero singulis ea, quem euripidis abhorreant mei ut, et populo iriure vix. Usu ludus affert voluptaria ei, vix ea error definitiones, movet fastidii signiferumque in qui. Vis prodesset adolescens adipiscing te, usu mazim perfecto recteque at, assum putant erroribus mea in. Vel facete imperdiet id, cum an libris luptatum perfecto, vel fabellas inciderint ut. Veri facete debitis ea vis, ut eos oratio erroribus. Sint facete perfecto no vel, vim id omnium insolens. Vel dolores perfecto pertinacia ut, te mel meis ullum dicam, eos assum facilis corpora in. Mea te unum viderer dolores, nostrum detracto nec in, vis no partem definiebas constituam. Dicant utinam philosophia has cu, hendrerit prodesset at nam, eos an bonorum dissentiet. Has ad placerat intellegam consectetuer, no adipisci mandamus senserit pro, torquatos similique percipitur est ex. Pro ex putant deleniti repudiare, vel an aperiam sensibus suavitate. Ad vel epicurei convenire, ea soluta aliquid deserunt ius, pri in errem putant feugiat. Sed iusto nihil populo an, ex pro novum homero cotidieque. Te utamur civibus eleifend qui, nam ei brute doming concludaturque, modo aliquam facilisi nec no. Vidisse maiestatis constituam eu his, esse pertinacia intellegam ius cu. Eos ei odio veniam, eu sumo altera adipisci eam, mea audiam prodesset persequeris ea. 
Ad vitae dictas vituperata sed, eum posse labore postulant id. Te eligendi principes dignissim sit, te vel dicant officiis repudiandae. Id vel sensibus honestatis omittantur, vel cu nobis commune patrioque. In accusata definiebas qui, id tale malorum dolorem sed, solum clita phaedrum ne his. Eos mutat ullum forensibus ex, wisi perfecto urbanitas cu eam, no vis dicunt impetus. Assum novum in pri, vix an suavitate moderatius, id has reformidans referrentur. Elit inciderint omittantur duo ut, dicit democritum signiferumque eu est, ad suscipit delectus mandamus duo. An harum equidem maiestatis nec. At has veri feugait placerat, in semper offendit praesent his. Omnium impetus facilis sed at, ex viris tincidunt ius. Unum eirmod dignissim id quo. Sit te atomorum quaerendum neglegentur, his primis tamquam et. Eu quo quot veri alienum, ea eos nullam luptatum accusamus. Ea mel causae phaedrum reprimique, at vidisse dolores ocurreret nam.</span></p>
<p></p><div class="news-list-category"> Categoría: News </div><p></p>
<p>
Ver Más
</p>
</section>
<article class="box">
<div class="image-border">
<a href="#" class="image image-left">
</a>
</div>
<h2>Noticia 2</h2>
<p class="subtitle"></p><p>subtitulo de la noticia 2<span class="news-list-morelink">[Leer más]</span></p><p></p>
<p>Contenido de la noticia 2</p>
<p></p><div class="news-list-category"> Categoría: News, Blog </div><p></p>
<p>
Ver Más
</p>
</article>
</div>
</div>
Note that the third news item is wrapped in an article class="box" like the first one, instead of using its own structure. I googled a lot and checked the markers, but I have no clue how to solve it. Can someone give me some advice or tips?
I am using TYPO3 4.5 and tt_news 3.0.1.
Thanks!
You have to set
plugin.tt_news.alternatingLayouts = 3
in your TypoScript setup. By default tt_news alternates between only 2 layouts (###NEWS### and ###NEWS_1###), so the third item wraps around to the ###NEWS### subpart instead of using ###NEWS_2###.
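For context, a minimal sketch of the relevant TypoScript setup (the templateFile path is a hypothetical placeholder for wherever your edited template lives):
plugin.tt_news {
  # point tt_news at the edited template (hypothetical path)
  templateFile = fileadmin/templates/tt_news_v3_template.html
  # cycle through ###NEWS###, ###NEWS_1### and ###NEWS_2### before repeating
  alternatingLayouts = 3
}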