ArrayIndexOutOfBounds on xls rules - Drools

ArrayOutOfBound Error
I have an xls rule [1 & 2]. When I run it as a JUnit test it works fine, but when I run it as a Maven test an ArrayIndexOutOfBounds [3] error appears, and I couldn't find any explanation. Any hint?
========================================================
The pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.ibm.cio.cloud.cost</groupId>
<artifactId>bluecost</artifactId>
<packaging>jar</packaging>
<version>1.0-SNAPSHOT</version>
<name>bluecost</name>
<url>http://maven.apache.org</url>
<profiles>
<profile>
<id>dev</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<properties>
<build.profile.id>dev</build.profile.id>
</properties>
</profile>
<profile>
<id>test</id>
<properties>
<build.profile.id>test</build.profile.id>
</properties>
<build>
<plugins>
<plugin>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<phase>install</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
<profile>
<id>prod</id>
<properties>
<build.profile.id>prod</build.profile.id>
</properties>
</profile>
</profiles>
<properties>
<jdk.version>1.8</jdk.version>
<spring.version>4.3.9.RELEASE</spring.version>
<spring.batch.version>3.0.8.RELEASE</spring.batch.version>
<mysql.driver.version>5.1.44</mysql.driver.version>
<db2.driver.version>10.1.0</db2.driver.version>
</properties>
<dependencies>
<!-- Spring Core -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${spring.version}</version>
</dependency>
<!-- Spring Core -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
<version>${spring.version}</version>
</dependency>
<!-- Spring Test -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${spring.version}</version>
</dependency>
<!-- Spring Batch dependencies -->
<dependency>
<groupId>org.springframework.batch</groupId>
<artifactId>spring-batch-core</artifactId>
<version>${spring.batch.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.batch</groupId>
<artifactId>spring-batch-infrastructure</artifactId>
<version>${spring.batch.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.batch</groupId>
<artifactId>spring-batch-test</artifactId>
<version>${spring.batch.version}</version>
<scope>test</scope>
</dependency>
<!-- MySQL database driver -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>${mysql.driver.version}</version>
</dependency>
<dependency>
<groupId>com.ibm.db2.jcc</groupId>
<artifactId>db2jcc_license_cisuz</artifactId>
<version>${db2.driver.version}</version>
</dependency>
<dependency>
<groupId>com.ibm.db2.jcc</groupId>
<artifactId>db2jcc</artifactId>
<version>${db2.driver.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<version>1.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<version>2.4.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>commons-pool</groupId>
<artifactId>commons-pool</artifactId>
<version>1.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.drools</groupId>
<artifactId>drools-spring</artifactId>
<version>5.4.0.Final</version>
</dependency>
<dependency>
<groupId>org.eclipse.jdt</groupId>
<artifactId>org.eclipse.jdt.core</artifactId>
<version>3.7.1</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.14</version>
</dependency>
</dependencies>
<build>
<finalName>bluecost-batch</finalName>
<filters>
<filter>profiles/${build.profile.id}/config.properties</filter>
</filters>
<resources>
<resource>
<filtering>true</filtering>
<directory>src/main/resources</directory>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-eclipse-plugin</artifactId>
<version>2.9</version>
<configuration>
<downloadSources>true</downloadSources>
<downloadJavadocs>false</downloadJavadocs>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>${jdk.version}</source>
<target>${jdk.version}</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.20.1</version>
<configuration>
<excludes>
<exclude></exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
<organization>
<name>IBM Corporation</name>
</organization>
</project>
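One detail in this pom worth flagging for the error that follows: <filtering>true</filtering> on src/main/resources runs Maven's property filtering over every file in that directory, including binary ones, and filtering is known to corrupt binary files such as .xls workbooks. That would explain why the decision table loads fine when run from the IDE but fails once Maven has copied it to target/classes. A hedged sketch that keeps filtering for text resources while copying spreadsheets verbatim (assuming the .xls lives under src/main/resources):
<resources>
  <resource>
    <directory>src/main/resources</directory>
    <filtering>true</filtering>
    <excludes>
      <exclude>**/*.xls</exclude>
    </excludes>
  </resource>
  <!-- copy binary spreadsheets unfiltered so they are not corrupted -->
  <resource>
    <directory>src/main/resources</directory>
    <filtering>false</filtering>
    <includes>
      <include>**/*.xls</include>
    </includes>
  </resource>
</resources>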
========================================================
And the JUnit I created...
@Before
public void setUp() {
final Logger logger = Logger.getLogger("setup()");
logger.info("-----------------------------");
}
@Test
public void testDefaultValuesbyChannelName(){
BlueReport testBlueReport = new BlueReport();
testBlueReport.setAccount_id("29302"); // Arbitrary value, but must exist
// this should be "SOFTLYER"
testBlueReport.setOrg_co("SOFTLYER");
SSCData testSSCdataOutput = new SSCData();
testSSCdataOutput.setORIG_LOC_CD(null); // values will be filled by the rule
testSSCdataOutput.setSERVICE_CD(null); // this too will be filled by the rule
// values will be filled by the rules
String expectedOrigLocCd = "SLR";
//SSCDataOutput.setORIG_LOC_CD("SLR");
String expectedServiceTypCd = "SLR";
//SSCDataOutput.setSERVICE_TYP_CD("SLR");
String expectedServiceCd = "SLIC";
//SSCDataOutput.setSERVICE_CD("SLIC");
String expectedServiceGroupId = "BASE";
//SSCDataOutput.setSERVICE_GROUP_ID("BASE");
String expectedRateClasCd = "OGS";
//SSCDataOutput.setRATECLAS_CD("OGS");
String expectedLocalField1 = "INVCE ID";
//SSCDataOutput.setLOCAL_FIELD_1("INVCE ID");
//SSCDataOutput.setLOCAL_FIELD_2(null);//Not anymore at the Invoice level. can be Nulled.
String expectedLocalField3 = "ACCT ID";
//SSCDataOutput.setLOCAL_FIELD_3("ACCT ID");
//SSCDataOutput.setLOCAL_FIELD_5(null);
//SSCDataOutput.setLOCAL_FIELD_6(null);
final Logger logger = Logger.getLogger("testDefaultValuesbyChannelName");
dtSession.setGlobal("logger", logger); // globals must be set before execute()
dtSession.execute( Arrays.asList(new Object[] { testSSCdataOutput, testBlueReport }) );
logger.info("Rules that activated: " + testSSCdataOutput.getRuleAudit());
logger.info("Expecting ORIG_LOC_CD to be SLR");
assertTrue("I expected SLR", testSSCdataOutput.getORIG_LOC_CD().contains(expectedOrigLocCd));
logger.info("Expecting SERVICE_CD to be SLIC");
assertTrue("I expected SLIC", testSSCdataOutput.getSERVICE_CD().contains(expectedServiceCd));
logger.info("Expecting SERVICE_TYPE_CD to be SLR");
assertTrue("I expected SLR", testSSCdataOutput.getSERVICE_TYP_CD().contains(expectedServiceTypCd));
logger.info("Expecting SERVICE_GROUP_ID to be BASE");
assertTrue("I expected BASE", testSSCdataOutput.getSERVICE_GROUP_ID().contains(expectedServiceGroupId));
logger.info("Expecting RATECLAS_CD to be OGS");
assertTrue("I expected OGS", testSSCdataOutput.getRATECLAS_CD().contains(expectedRateClasCd));
logger.info("Expecting LOCAL_FIELD1 to be INVCE ID");
assertTrue("I expected INVCE ID", testSSCdataOutput.getLOCAL_FIELD_1().contains(expectedLocalField1));
logger.info("Expecting LOCAL_FIELD3 to be ACCT ID");
assertTrue("I expected ACCCT ID", testSSCdataOutput.getLOCAL_FIELD_3().contains(expectedLocalField3));
And this is the error on Run As ... 'Maven test'
======================================================
Running com.ibm.cio.cloud.cost.unit.DroolsDefaultValuesByChannelNameTest
[ERROR] TestContextManager - Caught exception while allowing TestExecutionListener [org.springframework.test.context.support.DependencyInjectionTestExecutionListener@1de6f05] to prepare test instance [com.ibm.cio.cloud.cost.unit.DroolsDefaultValuesByChannelNameTest@114e780] <java.lang.IllegalStateException: Failed to load ApplicationContext>java.lang.IllegalStateException: Failed to load ApplicationContext
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContext(DefaultCacheAwareContextLoaderDelegate.java:124)
at org.springframework.test.context.support.DefaultTestContext.getApplicationContext(DefaultTestContext.java:83)
at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:117)
at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:83)
at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:230)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:228)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:287)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:289)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:247)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:94)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:70)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:191)
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:369)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:275)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:239)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:160)
at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:373)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:334)
at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:119)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:407)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dtBase': Invocation of init method failed; nested exception is java.lang.ArrayIndexOutOfBoundsException: 38851
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1628)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:483)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:306)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:197)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:742)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:760)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482)
at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:128)
at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:60)
at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.delegateLoading(AbstractDelegatingSmartContextLoader.java:108)
at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.loadContext(AbstractDelegatingSmartContextLoader.java:251)
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContextInternal(DefaultCacheAwareContextLoaderDelegate.java:98)
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContext(DefaultCacheAwareContextLoaderDelegate.java:116)
... 27 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 38851
at jxl.read.biff.Record.<init>(Record.java:79)
at jxl.read.biff.File.next(File.java:181)
at jxl.read.biff.WorkbookParser.parse(WorkbookParser.java:569)
at jxl.Workbook.getWorkbook(Workbook.java:271)
at org.drools.decisiontable.parser.xls.ExcelParser.parseFile(ExcelParser.java:77)
at org.drools.decisiontable.SpreadsheetCompiler.compile(SpreadsheetCompiler.java:89)
at org.drools.decisiontable.SpreadsheetCompiler.compile(SpreadsheetCompiler.java:68)
at org.drools.decisiontable.DecisionTableProviderImpl.compileStream(DecisionTableProviderImpl.java:37)
at org.drools.decisiontable.DecisionTableProviderImpl.loadFromInputStream(DecisionTableProviderImpl.java:20)
at org.drools.compiler.DecisionTableFactory.loadFromInputStream(DecisionTableFactory.java:15)
at org.drools.compiler.PackageBuilder.decisionTableToPackageDescr(PackageBuilder.java:454)
at org.drools.compiler.PackageBuilder.addPackageFromDecisionTable(PackageBuilder.java:448)
at org.drools.compiler.PackageBuilder.addKnowledgeResource(PackageBuilder.java:690)
at org.drools.builder.impl.KnowledgeBuilderImpl.add(KnowledgeBuilderImpl.java:45)
at org.drools.builder.impl.KnowledgeBuilderImpl.add(KnowledgeBuilderImpl.java:34)
at org.drools.container.spring.beans.KnowledgeBaseBeanFactory.afterPropertiesSet(KnowledgeBaseBeanFactory.java:110)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1687)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1624)
... 42 more
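The jxl frames at the bottom of the trace show the workbook parser failing while reading the spreadsheet bytes, which points at the packaged copy of the .xls rather than at the rules themselves. A hedged way to verify that, bypassing Spring entirely (the classpath location /rules/cost-rules.xls is a hypothetical placeholder; use whatever path the dtBase bean references):
import java.io.InputStream;
import org.drools.decisiontable.InputType;
import org.drools.decisiontable.SpreadsheetCompiler;

public class DecisionTableCheck {
    public static void main(String[] args) {
        // Load the spreadsheet exactly as the Maven-built classpath sees it.
        InputStream xls = DecisionTableCheck.class
                .getResourceAsStream("/rules/cost-rules.xls"); // hypothetical path
        // Compiling the decision table to DRL reads the whole workbook; a
        // corrupted .xls fails here with the same jxl ArrayIndexOutOfBoundsException.
        String drl = new SpreadsheetCompiler().compile(xls, InputType.XLS);
        System.out.println(drl);
    }
}
If this succeeds against the file in src/main/resources but fails against the copy in target/classes, the build is damaging the file, not Drools.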

Related

Using Java 1.8, JSF 8.0, WildFly 13 and Maven 3.3.3. The issue is java.lang.IllegalArgumentException from jersey.repackaged.org.objectweb.asm.ClassReader.

We are working on jar updates and removing unwanted jars from the project. As you can see, it's a very old project and many people have added unwanted jars based on their requirements.
I have been getting the error below for the past week, even if I revert my code to the last production code. I removed the jars below from the project because we are getting SCA scan errors on them. There are no compile issues, but I get the exception below at runtime.
From WEB-INF/lib:
jackcess-1.2.10-SNAPSHOT.jar
gson-1.6.jar
xmlbeans-2.3.0.jar
And the following from a folder that is not used in the project:
jsp-api-2.1-6.0.0.jar
jsr311-api-1.1.1.jar
json-20090211.jar
servlet-2.3.jar
Help would be much appreciated...
07:50:24,097 INFO [org.primefaces.webapp.PostConstructApplicationEventListener] (ServerService Thread Pool -- 99) Running on PrimeFaces 8.0
07:50:24,167 INFO [com.sun.jersey.api.core.servlet.WebAppResourceConfig] (ServerService Thread Pool -- 99) Scanning for root resource and provider classes in the Web app resource paths:
/WEB-INF/lib
/WEB-INF/classes
07:50:24,842 ERROR [org.jboss.msc.service.fail] (ServerService Thread Pool -- 99) MSC000001: Failed to start service jboss.undertow.deployment.default-server.default-host./CactusServer: org.jboss.msc.service.StartException in service jboss.undertow.deployment.default-server.default-host./CactusServer: java.lang.IllegalArgumentException
at org.wildfly.extension.undertow.deployment.UndertowDeploymentService$1.run(UndertowDeploymentService.java:81)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.jboss.threads.ContextClassLoaderSavingRunnable.run(ContextClassLoaderSavingRunnable.java:35)
at org.jboss.threads.EnhancedQueueExecutor.safeRun(EnhancedQueueExecutor.java:1985)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.doRunTask(EnhancedQueueExecutor.java:1487)
at org.jboss.threads.EnhancedQueueExecutor$ThreadBody.run(EnhancedQueueExecutor.java:1378)
at java.lang.Thread.run(Thread.java:750)
at org.jboss.threads.JBossThread.run(JBossThread.java:485)
Caused by: java.lang.IllegalArgumentException
at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:170)
at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:153)
at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:424)
at com.sun.jersey.spi.scanning.AnnotationScannerListener.onProcess(AnnotationScannerListener.java:138)
at com.sun.jersey.core.spi.scanning.JarFileScanner.scan(JarFileScanner.java:97)
at com.sun.jersey.spi.scanning.servlet.WebAppResourcesScanner$1.f(WebAppResourcesScanner.java:94)
at com.sun.jersey.core.util.Closing.f(Closing.java:71)
at com.sun.jersey.spi.scanning.servlet.WebAppResourcesScanner.scan(WebAppResourcesScanner.java:92)
at com.sun.jersey.spi.scanning.servlet.WebAppResourcesScanner.scan(WebAppResourcesScanner.java:79)
at com.sun.jersey.api.core.ScanningResourceConfig.init(ScanningResourceConfig.java:80)
at com.sun.jersey.api.core.servlet.WebAppResourceConfig.init(WebAppResourceConfig.java:102)
at com.sun.jersey.api.core.servlet.WebAppResourceConfig.<init>(WebAppResourceConfig.java:89)
at com.sun.jersey.api.core.servlet.WebAppResourceConfig.<init>(WebAppResourceConfig.java:74)
at com.sun.jersey.spi.container.servlet.WebComponent.getWebAppResourceConfig(WebComponent.java:668)
at com.sun.jersey.spi.container.servlet.ServletContainer.getDefaultResourceConfig(ServletContainer.java:435)
at com.sun.jersey.spi.container.servlet.ServletContainer.getDefaultResourceConfig(ServletContainer.java:602)
at com.sun.jersey.spi.container.servlet.WebServletConfig.getDefaultResourceConfig(WebServletConfig.java:87)
at com.sun.jersey.spi.container.servlet.WebComponent.createResourceConfig(WebComponent.java:699)
at com.sun.jersey.spi.container.servlet.WebComponent.createResourceConfig(WebComponent.java:674)
at com.sun.jersey.spi.container.servlet.WebComponent.init(WebComponent.java:205)
at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:394)
at com.sun.jersey.spi.container.servlet.ServletContainer.init(ServletContainer.java:577)
at javax.servlet.GenericServlet.init(GenericServlet.java:244)
at io.undertow.servlet.core.LifecyleInterceptorInvocation.proceed(LifecyleInterceptorInvocation.java:117)
at org.wildfly.extension.undertow.security.RunAsLifecycleInterceptor.init(RunAsLifecycleInterceptor.java:78)
at io.undertow.servlet.core.LifecyleInterceptorInvocation.proceed(LifecyleInterceptorInvocation.java:103)
at io.undertow.servlet.core.ManagedServlet$DefaultInstanceStrategy.start(ManagedServlet.java:300)
at io.undertow.servlet.core.ManagedServlet.createServlet(ManagedServlet.java:140)
at io.undertow.servlet.core.DeploymentManagerImpl$2.call(DeploymentManagerImpl.java:584)
at io.undertow.servlet.core.DeploymentManagerImpl$2.call(DeploymentManagerImpl.java:555)
at io.undertow.servlet.core.ServletRequestContextThreadSetupAction$1.call(ServletRequestContextThreadSetupAction.java:42)
at io.undertow.servlet.core.ContextClassLoaderSetupAction$1.call(ContextClassLoaderSetupAction.java:43)
at org.wildfly.extension.undertow.security.SecurityContextThreadSetupAction.lambda$create$0(SecurityContextThreadSetupAction.java:105)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1514)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1514)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1514)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentInfoService$UndertowThreadSetupAction.lambda$create$0(UndertowDeploymentInfoService.java:1514)
at io.undertow.servlet.core.DeploymentManagerImpl.start(DeploymentManagerImpl.java:597)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentService.startContext(UndertowDeploymentService.java:97)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentService$1.run(UndertowDeploymentService.java:78)
... 8 more
07:50:24,868 ERROR [org.jboss.as.controller.management-operation] (DeploymentScanner-threads - 2) WFLYCTL0013: Operation ("deploy") failed - address: ([("deployment" => "CactusServer.war")]) - failure description: {
"WFLYCTL0080: Failed services" => {"jboss.undertow.deployment.default-server.default-host./CactusServer" => "java.lang.IllegalArgumentException
Caused by: java.lang.IllegalArgumentException"},
"WFLYCTL0288: One or more services were unable to start due to one or more indirect dependencies not being available." => {
"Services that were unable to start:" => ["jboss.deployment.unit.\"CactusServer.war\".deploymentCompleteService"],
"Services that may be the cause:" => [
"jboss.clustering.web.route.default-server",
"jboss.iiop-openjdk.poa-service.rootpoa",
"jboss.txn.service.remote",
"jboss.xts.handlers",
"org.wildfly.clustering.cache.default-service-provider-registry.ejb",
"org.wildfly.clustering.cache.default-service-provider-registry.web",
"org.wildfly.clustering.cache.group.ejb.passivation",
"org.wildfly.clustering.cache.group.hibernate.entity",
"org.wildfly.clustering.cache.group.hibernate.local-query",
"org.wildfly.clustering.cache.group.hibernate.timestamps",
"org.wildfly.clustering.cache.group.server.client-mappings",
"org.wildfly.clustering.cache.group.server.default",
"org.wildfly.clustering.cache.group.web.client-mappings",
"org.wildfly.clustering.cache.group.web.default-server",
"org.wildfly.clustering.cache.group.web.passivation",
"org.wildfly.clustering.cache.registry.ejb.passivation",
"org.wildfly.clustering.cache.registry.server.default",
"org.wildfly.clustering.cache.registry.web.passivation",
"org.wildfly.clustering.cache.registry-entry.ejb.passivation",
"org.wildfly.clustering.cache.registry-entry.hibernate.entity",
"org.wildfly.clustering.cache.registry-entry.hibernate.local-query",
"org.wildfly.clustering.cache.registry-entry.hibernate.timestamps",
"org.wildfly.clustering.cache.registry-entry.server.client-mappings",
"org.wildfly.clustering.cache.registry-entry.server.default",
"org.wildfly.clustering.cache.registry-entry.web.client-mappings",
"org.wildfly.clustering.cache.registry-entry.web.default-server",
"org.wildfly.clustering.cache.registry-entry.web.passivation",
"org.wildfly.clustering.cache.registry-factory.ejb.passivation",
"org.wildfly.clustering.cache.registry-factory.hibernate.entity",
"org.wildfly.clustering.cache.registry-factory.hibernate.local-query",
"org.wildfly.clustering.cache.registry-factory.hibernate.timestamps",
"org.wildfly.clustering.cache.registry-factory.server.client-mappings",
"org.wildfly.clustering.cache.registry-factory.server.default",
"org.wildfly.clustering.cache.registry-factory.web.client-mappings",
"org.wildfly.clustering.cache.registry-factory.web.default-server",
"org.wildfly.clustering.cache.registry-factory.web.passivation",
"org.wildfly.clustering.cache.service-provider-registry.ejb.client-mappings",
"org.wildfly.clustering.cache.service-provider-registry.ejb.passivation",
"org.wildfly.clustering.cache.service-provider-registry.hibernate.entity",
"org.wildfly.clustering.cache.service-provider-registry.hibernate.local-query",
"org.wildfly.clustering.cache.service-provider-registry.hibernate.timestamps",
"org.wildfly.clustering.cache.service-provider-registry.server.client-mappings",
"org.wildfly.clustering.cache.service-provider-registry.server.default",
"org.wildfly.clustering.cache.service-provider-registry.web.client-mappings",
"org.wildfly.clustering.cache.service-provider-registry.web.default-server",
"org.wildfly.clustering.cache.service-provider-registry.web.passivation",
"org.wildfly.clustering.command-dispatcher-factory.ejb",
"org.wildfly.clustering.command-dispatcher-factory.hibernate",
"org.wildfly.clustering.command-dispatcher-factory.local",
"org.wildfly.clustering.command-dispatcher-factory.server",
"org.wildfly.clustering.command-dispatcher-factory.web",
"org.wildfly.clustering.default-command-dispatcher-factory",
"org.wildfly.clustering.group.hibernate",
"org.wildfly.clustering.group.server",
"org.wildfly.clustering.group.web",
"org.wildfly.clustering.infinispan.cache.ejb.passivation",
"org.wildfly.clustering.infinispan.cache.hibernate.entity",
"org.wildfly.clustering.infinispan.cache.hibernate.local-query",
"org.wildfly.clustering.infinispan.cache.hibernate.timestamps",
"org.wildfly.clustering.infinispan.cache.server.client-mappings",
"org.wildfly.clustering.infinispan.cache.server.default",
"org.wildfly.clustering.infinispan.cache.store.hibernate.entity",
"org.wildfly.clustering.infinispan.cache.store.hibernate.local-query",
"org.wildfly.clustering.infinispan.cache.store.hibernate.timestamps",
"org.wildfly.clustering.infinispan.cache.store.server.default",
"org.wildfly.clustering.infinispan.cache.store.web.passivation",
"org.wildfly.clustering.infinispan.cache.web.client-mappings",
"org.wildfly.clustering.infinispan.cache.web.default-server",
"org.wildfly.clustering.infinispan.cache.web.passivation",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.entity",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.entity.expiration",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.entity.locking",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.entity.memory",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.entity.transaction",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.local-query",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.local-query.expiration",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.local-query.locking",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.local-query.memory",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.local-query.transaction",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.timestamps",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.timestamps.expiration",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.timestamps.locking",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.timestamps.memory",
"org.wildfly.clustering.infinispan.cache-configuration.hibernate.timestamps.transaction",
"org.wildfly.clustering.infinispan.cache-configuration.server.client-mappings",
"org.wildfly.clustering.infinispan.cache-configuration.server.default",
"org.wildfly.clustering.infinispan.cache-configuration.server.default.expiration",
"org.wildfly.clustering.infinispan.cache-configuration.server.default.locking",
"org.wildfly.clustering.infinispan.cache-configuration.server.default.memory",
"org.wildfly.clustering.infinispan.cache-configuration.server.default.transaction",
"org.wildfly.clustering.infinispan.cache-configuration.web.client-mappings",
"org.wildfly.clustering.infinispan.cache-configuration.web.default-server",
"org.wildfly.clustering.infinispan.cache-configuration.web.passivation",
"org.wildfly.clustering.infinispan.cache-configuration.web.passivation.expiration",
"org.wildfly.clustering.infinispan.cache-configuration.web.passivation.locking",
"org.wildfly.clustering.infinispan.cache-configuration.web.passivation.memory",
"org.wildfly.clustering.infinispan.cache-configuration.web.passivation.transaction",
"org.wildfly.clustering.infinispan.cache-configuration.web.passivation.write",
"org.wildfly.clustering.infinispan.cache-container.hibernate",
"org.wildfly.clustering.infinispan.cache-container.server",
"org.wildfly.clustering.infinispan.cache-container.web",
"org.wildfly.clustering.infinispan.cache-container-configuration.hibernate",
"org.wildfly.clustering.infinispan.cache-container-configuration.hibernate.transport",
"org.wildfly.clustering.infinispan.cache-container-configuration.server",
"org.wildfly.clustering.infinispan.cache-container-configuration.server.transport",
"org.wildfly.clustering.infinispan.cache-container-configuration.web",
"org.wildfly.clustering.infinispan.cache-container-configuration.web.transport",
"org.wildfly.clustering.infinispan.default-cache.ejb",
"org.wildfly.clustering.infinispan.default-cache.web",
"org.wildfly.clustering.infinispan.default-cache-configuration.server",
"org.wildfly.clustering.infinispan.default-cache-configuration.web"
]
}
}
POM file:
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<artifactId>CactusServer</artifactId>
<packaging>war</packaging>
<name>Cactus Server Project</name>
<parent>
<groupId>com.xorail</groupId>
<artifactId>com.xorail.talon.pom</artifactId>
<relativePath>../com.xorail.talon.pom</relativePath>
<version>${app-version}</version>
</parent>
<repositories>
<repository>
<id>prime-repo</id>
<name>PrimeFaces Maven Repository</name>
<url>https://repository.primefaces.org</url>
<layout>default</layout>
</repository>
<repository>
<id>Jboss 3rd party</id>
<url>https://repository.jboss.org/nexus/content/repositories/thirdparty-releases/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>com.xorail</groupId>
<artifactId>CactusShared</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.xorail</groupId>
<artifactId>com.xorail.util.property</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>com.xorail</groupId>
<artifactId>com.xorail.qaqc</artifactId>
<version>${project.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.richfaces.framework/richfaces-api
<dependency> <groupId>org.richfaces.framework</groupId> <artifactId>richfaces-api</artifactId>
<version>3.3.3.Final</version> </dependency> -->
<!-- https://mvnrepository.com/artifact/org.richfaces.framework/richfaces-impl
<dependency> <groupId>org.richfaces.framework</groupId> <artifactId>richfaces-impl</artifactId>
<version>3.3.3.Final</version> </dependency> -->
<!-- https://mvnrepository.com/artifact/org.richfaces.framework/richfaces-impl-jsf2
<dependency> <groupId>org.richfaces.framework</groupId> <artifactId>richfaces-impl-jsf2</artifactId>
<version>3.3.3.Final</version> <scope>provided</scope> </dependency> -->
<!-- https://mvnrepository.com/artifact/org.richfaces.ui/richfaces-ui <dependency>
<groupId>org.richfaces.ui</groupId> <artifactId>richfaces-ui</artifactId>
<version>3.3.3.Final</version> </dependency> -->
<!-- https://mvnrepository.com/artifact/org.apache.axis/axis -->
<dependency>
<groupId>org.apache.axis</groupId>
<artifactId>axis</artifactId>
<version>1.4</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.axis2/axis2 -->
<dependency>
<groupId>org.apache.axis2</groupId>
<artifactId>axis2</artifactId>
<version>1.6.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/commons-httpclient/commons-httpclient -->
<dependency>
<groupId>commons-httpclient</groupId>
<artifactId>commons-httpclient</artifactId>
<version>3.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.quartz-scheduler/quartz -->
<dependency>
<groupId>org.quartz-scheduler</groupId>
<artifactId>quartz</artifactId>
<version>1.8.6</version>
</dependency>
<!-- cron-utils -->
<dependency>
<groupId>com.cronutils</groupId>
<artifactId>cron-utils</artifactId>
<version>9.0.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/net.sf.dozer/dozer -->
<dependency>
<groupId>net.sf.dozer</groupId>
<artifactId>dozer</artifactId>
<version>5.4.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/commons-net/commons-net -->
<dependency>
<groupId>commons-net</groupId>
<artifactId>commons-net</artifactId>
<version>3.0.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jboss.spec.javax.faces/jboss-jsf-api_2.2_spec -->
<dependency>
<groupId>org.jboss.spec.javax.faces</groupId>
<artifactId>jboss-jsf-api_2.2_spec</artifactId>
<version>2.2.12</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.myfaces.tomahawk/tomahawk21 -->
<dependency>
<groupId>org.apache.myfaces.tomahawk</groupId>
<artifactId>tomahawk21</artifactId>
<version>1.1.14</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-compress -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId>
<version>1.16.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.sun.facelets/jsf-facelets <dependency>
<groupId>com.sun.facelets</groupId> <artifactId>jsf-facelets</artifactId>
<version>1.1.15</version> </dependency> -->
<dependency>
<groupId>org.jboss.spec.javax.servlet</groupId>
<artifactId>jboss-servlet-api_3.1_spec</artifactId>
<version>1.0.0.Final</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.jboss.spec.javax.servlet.jsp</groupId>
<artifactId>jboss-jsp-api_2.3_spec</artifactId>
<version>1.0.1.Final</version>
<scope>provided</scope>
</dependency>
<!-- Redmine related references -->
<!-- https://mvnrepository.com/artifact/asm/asm -->
<dependency>
<groupId>asm</groupId>
<artifactId>asm</artifactId>
<version>3.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.httpcomponents/httpclient -->
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.taskadapter/redmine-java-api -->
<dependency>
<groupId>com.taskadapter</groupId>
<artifactId>redmine-java-api</artifactId>
<version>2.6.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.primefaces/primefaces -->
<dependency>
<groupId>org.primefaces</groupId>
<artifactId>primefaces</artifactId>
<version>8.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.primefaces.themes/casablanca -->
<dependency>
<groupId>org.primefaces.themes</groupId>
<artifactId>bootstrap</artifactId>
<version>1.0.10</version>
</dependency>
<dependency>
<groupId>commons-discovery</groupId>
<artifactId>commons-discovery</artifactId>
<version>0.5</version>
</dependency>
<dependency>
<groupId>wsdl4j</groupId>
<artifactId>wsdl4j</artifactId>
<version>1.6.2</version>
</dependency>
<dependency>
<groupId>com.nimbusds</groupId>
<artifactId>oauth2-oidc-sdk</artifactId>
<version>5.24.1</version>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20090211</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>3.0.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/commons-codec/commons-codec -->
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>1.13</version>
</dependency>
</dependencies>
<build>
<sourceDirectory>src</sourceDirectory>
<finalName>${project.artifactId}</finalName>
<resources>
<resource>
<directory>src</directory>
<includes>
<include>**/*.properties</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>2.6</version>
<configuration>
<warSourceDirectory>WebContent</warSourceDirectory>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>${java-version}</source>
<target>${java-version}</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
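A note on the ClassReader failure, offered as a hypothesis rather than a confirmed fix: Jersey 1.x bundles a repackaged ASM 3.x, whose ClassReader throws IllegalArgumentException when it encounters a class file compiled for a newer bytecode level than it understands. Since the log shows the servlet scanning all of /WEB-INF/lib and /WEB-INF/classes, any jar containing newer class files can trigger it. One common workaround is to stop Jersey from scanning jars and point it at your own resource packages in web.xml (the package name com.example.rest below is a placeholder):
<servlet>
  <servlet-name>jersey</servlet-name>
  <servlet-class>com.sun.jersey.spi.container.servlet.ServletContainer</servlet-class>
  <!-- scan only our own packages instead of every jar under /WEB-INF/lib -->
  <init-param>
    <param-name>com.sun.jersey.config.property.packages</param-name>
    <param-value>com.example.rest</param-value>
  </init-param>
  <load-on-startup>1</load-on-startup>
</servlet>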

Read CSV file in Spark and insert it to HBase using created RDD

I am able to insert data into an HBase table using the Put method. But now I want to read data from a CSV file and write it to the HBase table.
I wrote this code:
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory, Put}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object Read_CSV_Method2 {
  case class HotelBooking(hotelName: String, is_cancelled: String, reservation_status: String,
      reservation_status_date: String)

  def main(args: Array[String]) {
    Logger.getLogger("org").setLevel(Level.ERROR)
    val sparkConf = new SparkConf().setAppName("Spark_Hbase_Connection").setMaster("local[*]")
    val sc = new SparkContext(sparkConf)
    val conf = HBaseConfiguration.create()
    // Driver-side connection; unused below, since each partition opens its own.
    val connection: Connection = ConnectionFactory.createConnection(conf)
    // Table on which different commands have to be run.
    val tableName = connection.getTable(TableName.valueOf("hotelTable"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._
    val textRDD = sc.textFile("/Users/sukritmehta/Documents/hotel_bookings.csv")
    // Caveat: this counter is captured per partition, so it restarts at 1 in
    // every partition and row keys can collide across partitions.
    var c = 1
    textRDD.foreachPartition { iter =>
      // HBase connections are not serializable, so each partition creates its own.
      val hbaseConf = HBaseConfiguration.create()
      val connection: Connection = ConnectionFactory.createConnection(hbaseConf)
      val myTable = connection.getTable(TableName.valueOf("hotelTable"))
      iter.foreach { f =>
        val col = f.split(",")
        val insertData = new Put(Bytes.toBytes(c.toString))
        insertData.addColumn(Bytes.toBytes("hotelFam"), Bytes.toBytes("hotelName"), Bytes.toBytes(col(0)))
        insertData.addColumn(Bytes.toBytes("hotelFam"), Bytes.toBytes("is_canceled"), Bytes.toBytes(col(1)))
        insertData.addColumn(Bytes.toBytes("hotelFam"), Bytes.toBytes("reservation_status"), Bytes.toBytes(col(30)))
        insertData.addColumn(Bytes.toBytes("hotelFam"), Bytes.toBytes("reservation_status_date"), Bytes.toBytes(col(31)))
        c = c + 1
        myTable.put(insertData)
      }
      connection.close()
    }
  }
}
But after running this code, I am getting the following error:
Exception in thread "dag-scheduler-event-loop" java.lang.NoClassDefFoundError: org/apache/hadoop/mapred/InputSplitWithLocationInfo
at org.apache.spark.rdd.HadoopRDD.getPreferredLocations(HadoopRDD.scala:356)
at org.apache.spark.rdd.RDD$$anonfun$preferredLocations$2.apply(RDD.scala:275)
at org.apache.spark.rdd.RDD$$anonfun$preferredLocations$2.apply(RDD.scala:275)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.rdd.RDD.preferredLocations(RDD.scala:274)
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal(DAGScheduler.scala:2003)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2$$anonfun$apply$1.apply$mcVI$sp(DAGScheduler.scala:2014)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2$$anonfun$apply$1.apply(DAGScheduler.scala:2013)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2$$anonfun$apply$1.apply(DAGScheduler.scala:2013)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2.apply(DAGScheduler.scala:2013)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal$2.apply(DAGScheduler.scala:2011)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$getPreferredLocsInternal(DAGScheduler.scala:2011)
at org.apache.spark.scheduler.DAGScheduler.getPreferredLocs(DAGScheduler.scala:1977)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$17.apply(DAGScheduler.scala:1112)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$17.apply(DAGScheduler.scala:1110)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1110)
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:1069)
at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:1013)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2065)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapred.InputSplitWithLocationInfo
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
... 32 more
pom.xml file
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>Spark_Hbase</groupId>
<artifactId>Spark_Hbase</artifactId>
<version>0.0.1-SNAPSHOT</version>
<build>
<sourceDirectory>src</sourceDirectory>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.6.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<!-- Scala and Spark dependencies -->
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.11</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-avro_2.11</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.0</version>
<scope>provided</scope>
</dependency>
<!-- <dependency> <groupId>org.codehaus.janino</groupId> <artifactId>commons-compiler</artifactId>
<version>3.0.7</version> </dependency> <dependency> <groupId>org.apache.spark</groupId>
<artifactId>spark-network-common_2.11</artifactId> <version>2.1.1</version>
</dependency> -->
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9.9</version>
</dependency>
<!-- <dependency> <groupId>org.mongodb.spark</groupId> <artifactId>mongo-spark-connector_2.10</artifactId>
<version>2.1.1</version> </dependency> -->
<!-- <dependency> <groupId>org.apache.spark</groupId> <artifactId>spark-mllib_2.10</artifactId>
<version>2.1.1</version> </dependency> -->
<!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase -->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>2.2.4</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-spark</artifactId>
<version>2.0.0-alpha4</version> <!-- Hortonworks Latest -->
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hbase/hbase-mapreduce -->
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-mapreduce</artifactId>
<version>2.2.4</version>
</dependency>
</dependencies>
</project>
It would be great if anyone could help me out in resolving this issue.
I think you are missing:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>2.6.5</version>
<scope>provided</scope>
</dependency>
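For context on the answer above: org.apache.hadoop.mapred.InputSplitWithLocationInfo lives in the Hadoop 2.x MapReduce client jars; the monolithic hadoop-core 1.2.1 in this pom predates it, which is why Spark's HadoopRDD cannot find the class at runtime.
Separately, and unrelated to this error: the var c counter in the question restarts at 1 in every partition, so row keys repeat across partitions and later puts silently overwrite earlier rows. A hedged sketch (same table and column family as the question) that derives a unique row key per line instead:
// zipWithIndex assigns each line a unique, stable index across all partitions.
textRDD.zipWithIndex().foreachPartition { iter =>
  val hbaseConf = HBaseConfiguration.create()
  val connection = ConnectionFactory.createConnection(hbaseConf)
  val myTable = connection.getTable(TableName.valueOf("hotelTable"))
  iter.foreach { case (line, idx) =>
    val col = line.split(",")
    val insertData = new Put(Bytes.toBytes(idx.toString)) // unique key per line
    insertData.addColumn(Bytes.toBytes("hotelFam"), Bytes.toBytes("hotelName"), Bytes.toBytes(col(0)))
    myTable.put(insertData)
  }
  connection.close()
}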

Spark + cassandra - Just one record getting inserted

I am trying to get data from Hive and insert it into Cassandra using Spark. Very surprisingly, I see only one record inserted into Cassandra in spite of the DataFrame having 4000+ records.
import org.apache.spark.sql.SparkSession
import com.datastax.spark.connector.cql.CassandraConnector
import com.typesafe.config.ConfigFactory
import org.apache.spark.sql.cassandra._
import com.datastax.spark.connector._
import java.math.BigDecimal
case class sales(wk_nbr: Int,
store_nbr: Int,
sales_amt: BigDecimal)
object HiveConnector extends App {
val cassandraConfig = ConfigFactory.load("cassandra.conf")
println("cassandraConfig loaded = " + cassandraConfig)
val spark = SparkSession.builder().appName("HiveConnector")
.config("spark.sql.warehouse.dir", "file:/data/raw/historical/tables")
.config("hive.exec.dynamic.partition.mode", "nonstrict")
.config("mapred.input.dir.recursive","true")
.config("mapreduce.input.fileinputformat.input.dir.recursive","true")
.config("spark.cassandra.connection.host", "***********")
.config("spark.cassandra.auth.username", "*****un****")
.config("spark.cassandra.auth.password", "******pw*****")
.enableHiveSupport()
.master("yarn").getOrCreate()
import spark.implicits._
val query = "select wk_nbr,store_nbr,sum(sales_amt) as sales_amt from scan where visit_dt between '2018-05-08' and '2018-05-11' group by wm_yr_wk_nbr,store_nbr"
val resDF = spark.sql(query)
resDF.persist()
println("RESDF size = " + resDF.count()) //prints the record count
println("RESDF sample rec = " + resDF.show(2)) //see 2 records in the log
CassandraConnector(spark.sparkContext).withSessionDo { session => // renamed from "spark" to avoid shadowing the SparkSession
session.execute("CREATE TABLE raaspoc.sales_data (wk_nbr INT PRIMARY KEY, store_nbr INT, sales_amt DOUBLE)")
}
/*
None of the following saveToCassandra variants works - each inserts only one record instead of all of them
*/
resDF.map { x => sales.apply(x.get(0).asInstanceOf[Int], x.get(1).asInstanceOf[Int],x.get(2).asInstanceOf[BigDecimal])
}.rdd.saveToCassandra("raaspoc","sales_data") // Not working
resDF.rdd.saveToCassandra("raaspoc","sales_data") // Not working
resDF.write.format("org.apache.spark.sql.cassandra").options(Map("table" -> "sales_data", "keyspace" -> "raaspoc")).save() // Not working
resDF.write.cassandraFormat("sales_data","raaspoc").save() // Not working
/*
When the data frame is written to HDFS, I see all 4000+ records in sales.csv
*/
resDF.write.format("csv").save("hdfs:/dev/test/sales.csv")
println("RESDF size after write to cassandra = " + resDF.count()) //prints 4732 (record count)
spark.close()
}
I don't see any errors in the log, and spark-submit completes without any errors but inserts only one record. The following is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.test.raas</groupId>
<artifactId>RaasDataPipelines</artifactId>
<version>1.0-SNAPSHOT</version>
<inceptionYear>2008</inceptionYear>
<properties>
<scala.version>2.11.0</scala.version>
<spark.version>2.2.0</spark.version>
</properties>
<repositories>
<repository>
<id>scala-tools.org</id>
<name>Scala-Tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>scala-tools.org</id>
<name>Scala-Tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</pluginRepository>
</pluginRepositories>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>${spark.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.specs</groupId>
<artifactId>specs</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.11</artifactId>
<version>2.0.7</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<version>2.11</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.10</artifactId>
<version>1.0.0</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-core</artifactId>
<version>3.5.0</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-mapping</artifactId>
<version>3.5.0</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.datastax.cassandra</groupId>
<artifactId>cassandra-driver-extras</artifactId>
<version>3.5.0</version>
<scope>compile</scope>
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
<args>
<arg>-target:jvm-1.5</arg>
</args>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
<configuration>
<minimizeJar>true</minimizeJar>
<shadedArtifactAttached>true</shadedArtifactAttached>
<shadedClassifierName>fat</shadedClassifierName>
<relocations>
<relocation>
<pattern>com.google</pattern>
<shadedPattern>shaded.guava</shadedPattern>
<includes>
<include>com.google.**</include>
</includes>
<excludes>
<exclude>com.google.common.base.Optional</exclude>
<exclude>com.google.common.base.Absent</exclude>
<exclude>com.google.common.base.Present</exclude>
</excludes>
</relocation>
</relocations>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
</plugin>
</plugins>
</build>
<reporting>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
</configuration>
</plugin>
</plugins>
</reporting>
</project>
Is there a possibility that your primary key (wk_nbr) is the same on all of the 4000+ rows? Cassandra writes are upserts, so every insert that reuses an existing primary key overwrites the previous row instead of adding a new one.
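If that is the case, here is a hedged sketch of a schema that keeps one row per (week, store) pair instead of one row per week; whether store_nbr belongs in the primary key is an assumption about the data model:
CassandraConnector(spark.sparkContext).withSessionDo { session =>
  // Composite primary key: wk_nbr as partition key, store_nbr as clustering
  // column, so rows that share a week no longer overwrite one another.
  session.execute(
    "CREATE TABLE raaspoc.sales_data (" +
      "wk_nbr INT, store_nbr INT, sales_amt DOUBLE, " +
      "PRIMARY KEY (wk_nbr, store_nbr))")
}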

Flink with Kafka connection

Environment:
Ubuntu 16.04.1 LTS
Flink 1.1.3
Kafka 0.10.1.1
I'm trying to connect Flink with Kafka (Flink 1.1.3, Kafka 0.10.1.1).
I have already tried all the fixes I could find, but none of them works.
pom.xml :
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>ux</groupId>
<artifactId>logs</artifactId>
<version>1.3-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Flink Quickstart Job</name>
<url>http://www.myorganization.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.3-SNAPSHOT</flink.version>
<slf4j.version>1.7.7</slf4j.version>
<log4j.version>1.2.17</log4j.version>
</properties>
<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.10_2.10</artifactId>
<version>1.3-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
</dependencies>
<profiles>
<profile>
<id>build-jar</id>
<activation>
<activeByDefault>false</activeByDefault>
</activation>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes combine.self="override"></excludes>
</artifactSet>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude>org.apache.flink:flink-annotations</exclude>
<exclude>org.apache.flink:flink-shaded-hadoop2</exclude>
<exclude>org.apache.flink:flink-shaded-curator-recipes</exclude>
<exclude>org.apache.flink:flink-core</exclude>
<exclude>org.apache.flink:flink-scala_2.11</exclude>
<exclude>org.apache.flink:flink-runtime_2.11</exclude>
<exclude>org.apache.flink:flink-optimizer_2.11</exclude>
<exclude>org.apache.flink:flink-clients_2.11</exclude>
<exclude>org.apache.flink:flink-avro_2.11</exclude>
<exclude>org.apache.flink:flink-examples-batch_2.11</exclude>
<exclude>org.apache.flink:flink-examples-streaming_2.11</exclude>
<exclude>org.apache.flink:flink-streaming-scala_2.11</exclude>
<exclude>org.apache.flink:flink-scala-shell_2.11</exclude>
<exclude>org.apache.flink:flink-python</exclude>
<exclude>org.apache.flink:flink-metrics-core</exclude>
<exclude>org.apache.flink:flink-metrics-jmx</exclude>
<exclude>org.apache.flink:flink-statebackend-rocksdb_2.11</exclude>
<exclude>log4j:log4j</exclude>
<exclude>org.scala-lang:scala-library</exclude>
<exclude>org.scala-lang:scala-compiler</exclude>
<exclude>org.scala-lang:scala-reflect</exclude>
<exclude>com.data-artisans:flakka-actor_*</exclude>
<exclude>com.data-artisans:flakka-remote_*</exclude>
<exclude>com.data-artisans:flakka-slf4j_*</exclude>
<exclude>io.netty:netty-all</exclude>
<exclude>io.netty:netty</exclude>
<exclude>commons-fileupload:commons-fileupload</exclude>
<exclude>org.apache.avro:avro</exclude>
<exclude>commons-collections:commons-collections</exclude>
<exclude>org.codehaus.jackson:jackson-core-asl</exclude>
<exclude>org.codehaus.jackson:jackson-mapper-asl</exclude>
<exclude>com.thoughtworks.paranamer:paranamer</exclude>
<exclude>org.xerial.snappy:snappy-java</exclude>
<exclude>org.apache.commons:commons-compress</exclude>
<exclude>org.tukaani:xz</exclude>
<exclude>com.esotericsoftware.kryo:kryo</exclude>
<exclude>com.esotericsoftware.minlog:minlog</exclude>
<exclude>org.objenesis:objenesis</exclude>
<exclude>com.twitter:chill_*</exclude>
<exclude>com.twitter:chill-java</exclude>
<exclude>commons-lang:commons-lang</exclude>
<exclude>junit:junit</exclude>
<exclude>org.apache.commons:commons-lang3</exclude>
<exclude>org.slf4j:slf4j-api</exclude>
<exclude>org.slf4j:slf4j-log4j12</exclude>
<exclude>org.apache.commons:commons-math</exclude>
<exclude>org.apache.sling:org.apache.sling.commons.json</exclude>
<exclude>commons-logging:commons-logging</exclude>
<exclude>commons-codec:commons-codec</exclude>
<exclude>com.fasterxml.jackson.core:jackson-core</exclude>
<exclude>com.fasterxml.jackson.core:jackson-databind</exclude>
<exclude>com.fasterxml.jackson.core:jackson-annotations</exclude>
<exclude>stax:stax-api</exclude>
<exclude>com.typesafe:config</exclude>
<exclude>org.uncommons.maths:uncommons-maths</exclude>
<exclude>com.github.scopt:scopt_*</exclude>
<exclude>commons-io:commons-io</exclude>
<exclude>commons-cli:commons-cli</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<artifact>org.apache.flink:*</artifact>
<excludes>
<exclude>org/apache/flink/shaded/com/**</exclude>
<exclude>web-docs/**</exclude>
</excludes>
</filter>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
My Java code:
import java.util.Properties;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
public class App
{
public static void main(String[] args) throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "localhost:9092");
properties.setProperty("zookeeper.connect", "localhost:2181");
properties.setProperty("group.id", "flink_consumer");
DataStream<String> messageStream = env.addSource(new FlinkKafkaConsumer010<>("ux_logs", new SimpleStringSchema(), properties));
messageStream.rebalance().map(new MapFunction<String, String>() {
private static final long serialVersionUID = -6867736771747690202L;
public String map(String value) throws Exception {
return "Kafka and Flink says: " + value;
}
}).print();
env.execute();
}
}
But I get this error:
java.lang.NoClassDefFoundError: org/apache/flink/streaming/api/checkpoint/CheckpointedFunction
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at ux.App.main(App.java:27)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:509)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:320)
at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:777)
at org.apache.flink.client.CliFrontend.run(CliFrontend.java:253)
at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1005)
at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1048)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.api.checkpoint.CheckpointedFunction
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Do I need to remove my Kafka and run an older version?
Is my Flink Kafka connector wrong?
I tried to use this plugin, but it didn't work: https://ci.apache.org/projects/flink/flink-docs-release-1.3/dev/linking.html
Thanks.
You have to downgrade your connector:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.8_2.10</artifactId>
<version>1.1.2</version>
</dependency>
Here is a similar question: https://stackoverflow.com/a/40037895/1252056
If you want to connect to Kafka 0.10+, you will have to move to Flink 1.2; otherwise, as @streetturte mentioned, you will have to downgrade your Kafka connector.
Have a look at the reference table here:
https://ci.apache.org/projects/flink/flink-docs-release-1.2/dev/connectors/kafka.html
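For example, a connector entry matching Flink 1.2 with Kafka 0.10 might look like the following sketch; the version 1.2.0 is illustrative, and the _2.11 suffix assumes the rest of your Flink dependencies also use Scala 2.11:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.10_2.11</artifactId>
<version>1.2.0</version>
</dependency>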
Kafka 0.9 and newer versions don't need ZooKeeper. You need to upgrade your Flink version or downgrade your Kafka version. With the 0.9 connector you only pass the bootstrap servers and a group id, for example:
--topic myTopic --bootstrap.servers 10.123.34.56:9092 --group.id myGroup
FlinkKafkaConsumer09<CarDB> consumerBinden = new FlinkKafkaConsumer09<CarDB>(
kafkaTopic,
new ClassClassSchema(),
parameterTool.getProperties());
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.9_2.10</artifactId>
<version>${flink.version}</version>
</dependency>
Perhaps you are using the Flink Kafka consumer dependency for Scala in your pom file, whereas your code is in Java. Try the Flink Kafka consumer for Java in your pom. Hope it helps.
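Along those lines, note that the suffix on the artifactId (_2.10 vs. _2.11) is the Scala version of the Flink build, and it has to match across all your Flink dependencies. The pom above mixes flink-streaming-java_2.11 with flink-connector-kafka-0.10_2.10, and that kind of mismatch can lead to NoClassDefFoundError-style failures at runtime. A sketch of an aligned entry, assuming you stay on Scala 2.11:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.10_2.11</artifactId>
<version>${flink.version}</version>
</dependency>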

Cannot create session bean after switching to SDN4.1

I am happy that SDN 4.1 is now available, because version 4 was rather unstable. I upgraded my project as soon as possible, but I have a strange problem that appears whenever a method using the Neo4j database connection is called. I see this exception:
SEVERE: Servlet.service() for servlet [appServlet] in context with path [/tsg] threw exception [Request processing failed; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'scopedTarget.getSession' defined in package.config.ApplicationConfig: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.neo4j.ogm.session.Session]: Factory method 'getSession' threw exception; nested exception is org.neo4j.ogm.exception.ServiceNotFoundException: org.neo4j.ogm.drivers.http.driver.HttpDriver] with root cause
org.neo4j.ogm.exception.ServiceNotFoundException: org.neo4j.ogm.drivers.http.driver.HttpDriver
at org.neo4j.ogm.service.DriverService.load(DriverService.java:51)
at org.neo4j.ogm.service.DriverService.load(DriverService.java:63)
at org.neo4j.ogm.service.Components.loadDriver(Components.java:128)
at org.neo4j.ogm.service.Components.driver(Components.java:86)
at org.neo4j.ogm.session.SessionFactory.openSession(SessionFactory.java:79)
at org.springframework.data.neo4j.config.Neo4jConfiguration.getSession(Neo4jConfiguration.java:56)
at package.config.ApplicationConfig.getSession(ApplicationConfig.java:255)
at package.config.ApplicationConfig$$EnhancerBySpringCGLIB$$5bc6b7bf.CGLIB$getSession$15(<generated>)
I believe that this is a configuration problem, so here is my config:
package package.config;
import java.util.HashMap;
import java.util.Map;
import javax.sql.DataSource;
import org.neo4j.ogm.config.Configuration;
import org.apache.commons.dbcp2.BasicDataSource;
import org.hibernate.jpa.HibernatePersistenceProvider;
import org.neo4j.ogm.session.Session;
import org.neo4j.ogm.session.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.ComponentScan.Filter;
import org.springframework.context.annotation.FilterType;
import org.springframework.context.annotation.Import;
import org.springframework.context.annotation.PropertySource;
import org.springframework.context.annotation.Scope;
import org.springframework.context.annotation.ScopedProxyMode;
import org.springframework.core.annotation.Order;
import org.springframework.core.env.Environment;
import org.springframework.core.io.ClassPathResource;
import org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.data.neo4j.config.Neo4jConfiguration;
import org.springframework.data.neo4j.repository.config.EnableNeo4jRepositories;
import org.springframework.data.neo4j.server.Neo4jServer;
import org.springframework.data.neo4j.server.RemoteServer;
import org.springframework.data.neo4j.transaction.Neo4jTransactionManager;
import org.springframework.data.transaction.ChainedTransactionManager;
import org.springframework.jdbc.datasource.init.DataSourceInitializer;
import org.springframework.jdbc.datasource.init.DatabasePopulator;
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.orm.jpa.vendor.HibernateJpaDialect;
import org.springframework.retry.annotation.EnableRetry;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.transaction.jta.JtaTransactionManager;
import org.springframework.web.multipart.commons.CommonsMultipartResolver;
import org.springframework.web.servlet.config.annotation.EnableWebMvc;
import org.springframework.web.servlet.config.annotation.ResourceHandlerRegistry;
import package.controller.DeviceController;
import package.controller.PersonController;
@org.springframework.context.annotation.Configuration
@ComponentScan(basePackages = "package", excludeFilters = {
@Filter(type = FilterType.ASSIGNABLE_TYPE, value = { DeviceController.class,
PersonController.class }) })
@EnableTransactionManagement
@PropertySource("classpath:application.properties")
@EnableJpaRepositories(basePackages = "package.persistence")
@EnableNeo4jRepositories("package.neo4jrepository")
@Import({ WebSpringConfig.class })
public class ApplicationConfig extends Neo4jConfiguration {
@Autowired
Environment env;
public ApplicationConfig() {
}
@Bean
public Configuration getConfiguration() {
Configuration config = new Configuration();
config.driverConfiguration()
.setDriverClassName("org.neo4j.ogm.drivers.http.driver.HttpDriver")
.setURI("http://xxx:yyy@localhost:7474");
return config;
}
@Bean
public DataSource dataSource() {
BasicDataSource dataSource = new BasicDataSource();
dataSource.setDriverClassName("com.mysql.jdbc.Driver");
dataSource.setUrl(env.getProperty("database.url"));
dataSource.setUsername(env.getProperty("database.user"));
dataSource.setPassword(env.getProperty("database.password"));
return dataSource;
}
@Bean
public DataSourceInitializer initializeDatabase() {
DataSourceInitializer dbInitializer = new DataSourceInitializer();
dbInitializer.setDataSource(dataSource());
dbInitializer.setDatabasePopulator(databasePopulator());
dbInitializer.setEnabled(true);
return dbInitializer;
}
@Bean
public DatabasePopulator databasePopulator() {
ResourceDatabasePopulator resDatabasePopulator = new ResourceDatabasePopulator();
resDatabasePopulator.setContinueOnError(true);
resDatabasePopulator.addScript(new ClassPathResource("importagreements.sql"));
resDatabasePopulator.setSqlScriptEncoding("UTF-8");
return resDatabasePopulator;
}
@Bean
@Autowired
public LocalContainerEntityManagerFactoryBean entityManagerFactory(DataSource dataSource) {
LocalContainerEntityManagerFactoryBean entityManagerFactory = new LocalContainerEntityManagerFactoryBean();
entityManagerFactory.setDataSource(dataSource);
entityManagerFactory.setPackagesToScan(new String[] { "package.persistence.domain" });
entityManagerFactory.setJpaDialect(new HibernateJpaDialect());
Map<String, String> jpaProperties = new HashMap<String, String>();
jpaProperties.put("hibernate.connection.charSet", "UTF-8");
jpaProperties.put("spring.jpa.hibernate.ddl-auto", "update");
jpaProperties.put("hibernate.ejb.naming_strategy", "org.hibernate.cfg.EJB3NamingStrategy");
jpaProperties.put("hibernate.bytecode.provider", "javassist");
jpaProperties.put("hibernate.dialect", "org.hibernate.dialect.MySQL5InnoDBDialect");
jpaProperties.put("hibernate.hbm2ddl.auto", "update");
jpaProperties.put("hibernate.order_inserts", "true");
jpaProperties.put("hibernate.jdbc.batch_size", "50");
entityManagerFactory.setJpaPropertyMap(jpaProperties);
entityManagerFactory.setPersistenceProvider(new HibernatePersistenceProvider());
return entityManagerFactory;
}
@Bean(name = "transactionManager")
@Autowired
public PlatformTransactionManager neo4jTransactionManager(Neo4jTransactionManager neoTransactionManager,
JpaTransactionManager mysqlTransactioNmanager) throws Exception {
return new ChainedTransactionManager(mysqlTransactioNmanager, neoTransactionManager);
}
@Bean(name = "neo4jTransactionManager")
@Autowired
public Neo4jTransactionManager neo4jTransactionManager(Session session) throws Exception {
return new Neo4jTransactionManager(session);
}
@Bean(name = "mysqlTransactionManager")
@Autowired
public JpaTransactionManager mysqlTransactionManager(LocalContainerEntityManagerFactoryBean entityManagerFactory)
throws Exception {
JpaTransactionManager mysqlTransactioNmanager = new JpaTransactionManager(entityManagerFactory.getObject());
return mysqlTransactioNmanager;
}
@Bean
public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
return new PersistenceExceptionTranslationPostProcessor();
}
@Bean(name = "multipartResolver")
public CommonsMultipartResolver commonsMultipartResolver() {
CommonsMultipartResolver commonsMultipartResolver = new CommonsMultipartResolver();
commonsMultipartResolver.setDefaultEncoding("utf-8");
commonsMultipartResolver.setMaxUploadSize(50000000);
return commonsMultipartResolver;
}
@Override
public SessionFactory getSessionFactory() {
return new SessionFactory(getConfiguration(),"package.domain", "package.relation", "package.util",
"package.httpBody");
}
@Bean
@Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS)
public Session getSession() throws Exception {
return super.getSession();
}
}
Here is my POM:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>project</groupId>
<artifactId>tsg</artifactId>
<name>tsg</name>
<packaging>war</packaging>
<version>1.0.0-BUILD-SNAPSHOT</version>
<properties>
<!-- Generic properties -->
<java-version>1.7</java-version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<javax.inject.version>1</javax.inject.version>
<cglib.version>2.2.2</cglib.version>
<!-- Web -->
<javax.servlet.version>2.5</javax.servlet.version>
<javax.servlet.jsp.version>2.1</javax.servlet.jsp.version>
<javax.servlet.jstl.version>1.2</javax.servlet.jstl.version>
<jersey.version>1.8</jersey.version>
<com.google.code.gson.version>2.2.4</com.google.code.gson.version>
<javax.ws.rs-api.version>2.0</javax.ws.rs-api.version>
<!-- Spring -->
<org.springframework-version>4.1.9.RELEASE</org.springframework-version>
<spring-security.version>3.2.4.RELEASE</spring-security.version>
<spring-data-commons.version>1.12.0.M1</spring-data-commons.version>
<spring-data-neo4j.version>4.1.0.M1</spring-data-neo4j.version>
<spring-data-jpa.version>1.9.2.RELEASE</spring-data-jpa.version>
<!-- Aspects -->
<org.aspectj-version>1.7.4</org.aspectj-version>
<!-- Neo4j -->
<org.neo4j.app.version>2.0.3</org.neo4j.app.version>
<neo4j.ogm.version>2.0.0-M2</neo4j.ogm.version>
<!-- Hibernate / JPA -->
<org.hibernate.version>4.3.11.Final</org.hibernate.version>
<org.hibernate-validator.version>5.1.1.Final</org.hibernate-validator.version>
<!-- MySQL -->
<mysql-connector-java.version>5.1.34</mysql-connector-java.version>
<!-- Logging -->
<org.slf4j-version>1.6.6</org.slf4j-version>
<log4j.version>1.2.15</log4j.version>
<!-- Testing -->
<junit.version>4.7</junit.version>
<!-- Maven plugin -->
<maven-eclipse-plugin.version>2.9</maven-eclipse-plugin.version>
<maven-compiler-plugin.version>2.5.1</maven-compiler-plugin.version>
<exec-maven-plugin.version>1.2.1</exec-maven-plugin.version>
<!-- Smack -->
<smack.version>4.1.0</smack.version>
</properties>
<repositories>
<repository>
<id>spring-snapshots</id>
<name>Spring Snapshots</name>
<url>http://repo.spring.io/snapshot</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>spring-maven-snapshot</id>
<snapshots><enabled>true</enabled></snapshots>
<name>Springframework Maven MILESTONE Repository</name>
<url>http://maven.springframework.org/milestone</url>
</repository>
<repository>
<id>jcenter-snapshots</id>
<name>jcenter</name>
<url>https://jcenter.bintray.com/</url>
</repository>
<repository>
<id>neo4j-release-repository</id>
<name>Neo4j Maven 2 release repository</name>
<url>http://m2.neo4j.org/content/repositories/releases/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<dependencies>
<!-- Apache Commons FileUpload -->
<dependency>
<groupId>commons-fileupload</groupId>
<artifactId>commons-fileupload</artifactId>
<version>1.3.1</version>
</dependency>
<!-- Apache Commons IO -->
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.4</version>
</dependency>
<!-- dependency> <groupId>javax.validation</groupId> <artifactId>validation-api</artifactId>
<version>1.1.0.Final</version> </dependency -->
<!-- Smack -->
<dependency>
<groupId>org.igniterealtime.smack</groupId>
<artifactId>smack-java7</artifactId>
<version>${smack.version}</version>
</dependency>
<dependency>
<groupId>org.igniterealtime.smack</groupId>
<artifactId>smack-tcp</artifactId>
<version>${smack.version}</version>
</dependency>
<dependency>
<groupId>org.igniterealtime.smack</groupId>
<artifactId>smack-im</artifactId>
<version>${smack.version}</version>
</dependency>
<dependency>
<groupId>org.igniterealtime.smack</groupId>
<artifactId>smack-extensions</artifactId>
<version>${smack.version}</version>
</dependency>
<!-- Spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${org.springframework-version}</version>
<exclusions>
<!-- Exclude Commons Logging in favor of SLF4j -->
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<!-- Spring Data -->
<!-- <dependency> <groupId>org.springframework.data</groupId> <artifactId>spring-data-commons</artifactId>
<version>${spring-data-commons.version}</version> </dependency> -->
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-neo4j</artifactId>
<version>${spring-data-neo4j.version}</version>
<exclusions>
<exclusion>
<artifactId>jersey-server</artifactId>
<groupId>com.sun.jersey</groupId>
</exclusion>
<exclusion>
<artifactId>jersey-servlet</artifactId>
<groupId>com.sun.jersey</groupId>
</exclusion>
<exclusion>
<artifactId>jersey-multipart</artifactId>
<groupId>com.sun.jersey.contribs</groupId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.retry</groupId>
<artifactId>spring-retry</artifactId>
<version>1.1.2.RELEASE</version>
</dependency>
<!-- Spring Security -->
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-web</artifactId>
<version>${spring-security.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-core</artifactId>
<version>${spring-security.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-config</artifactId>
<version>${spring-security.version}</version>
</dependency>
<!-- AspectJ -->
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>${org.aspectj-version}</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>1.8.5</version>
</dependency>
<!-- Logging -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${org.slf4j-version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
<version>${org.slf4j-version}</version>
<scope>runtime</scope>
</dependency>
<!-- <dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${org.slf4j-version}</version>
<scope>runtime</scope>
</dependency>-->
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
<exclusions>
<exclusion>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
</exclusion>
<exclusion>
<groupId>javax.jms</groupId>
<artifactId>jms</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jdmk</groupId>
<artifactId>jmxtools</artifactId>
</exclusion>
<exclusion>
<groupId>com.sun.jmx</groupId>
<artifactId>jmxri</artifactId>
</exclusion>
</exclusions>
<scope>runtime</scope>
</dependency>
<!-- @Inject -->
<dependency>
<groupId>javax.inject</groupId>
<artifactId>javax.inject</artifactId>
<version>${javax.inject.version}</version>
</dependency>
<!-- Servlet -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>${javax.servlet.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.servlet.jsp</groupId>
<artifactId>jsp-api</artifactId>
<version>${javax.servlet.jsp.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
<version>${javax.servlet.jstl.version}</version>
</dependency>
<!-- Test -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<!-- ******* JPA/Hibernate ******** -->
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>${org.hibernate.version}</version>
</dependency>
<dependency>
<groupId>org.hibernate.javax.persistence</groupId>
<artifactId>hibernate-jpa-2.1-api</artifactId>
<version>1.0.0.Final</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-entitymanager</artifactId>
<version>${org.hibernate.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-jpa</artifactId>
<version>${spring-data-jpa.version}</version>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-validator</artifactId>
<version>${org.hibernate-validator.version}</version>
</dependency>
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>${mysql-connector-java.version}</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-dbcp2</artifactId>
<version>2.0</version>
</dependency>
<!-- additional libraries -->
<!-- <dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>${cglib.version}</version>
</dependency> -->
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>${com.google.code.gson.version}</version>
</dependency>
<dependency>
<groupId>org.apache.activemq</groupId>
<artifactId>activemq-client</artifactId>
<version>5.9.1</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.6.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.6.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml</groupId>
<artifactId>classmate</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-xml</artifactId>
<version>2.6.3</version>
</dependency>
<dependency>
<groupId>joda-time</groupId>
<artifactId>joda-time</artifactId>
<version>2.9</version>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>2.1.2</version>
</dependency>
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger-ui</artifactId>
<version>2.1.2</version>
</dependency>
<dependency>
<groupId>org.webjars</groupId>
<artifactId>bootstrap</artifactId>
<version>3.3.5</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-eclipse-plugin</artifactId>
<version>${maven-eclipse-plugin.version}</version>
<configuration>
<additionalProjectnatures>
<projectnature>org.springframework.ide.eclipse.core.springnature</projectnature>
</additionalProjectnatures>
<additionalBuildcommands>
<buildcommand>org.springframework.ide.eclipse.core.springbuilder</buildcommand>
</additionalBuildcommands>
<downloadSources>true</downloadSources>
<downloadJavadocs>true</downloadJavadocs>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven-compiler-plugin.version}</version>
<configuration>
<source>${java-version}</source>
<target>${java-version}</target>
<compilerArgument>-Xlint:all</compilerArgument>
<showWarnings>true</showWarnings>
<showDeprecation>true</showDeprecation>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>${exec-maven-plugin.version}</version>
<configuration>
<mainClass>org.test.int1.Main</mainClass>
</configuration>
</plugin>
</plugins>
<resources>
<!-- Placeholders that are found from the files located in the configured
resource directories are replaced with the property values found from the
profile specific configuration file. -->
<resource>
<filtering>true</filtering>
<directory>src/main/resources</directory>
</resource>
</resources>
<filters>
<!-- Ensures that the config.properties file is always loaded from the
configuration directory of the active Maven profile. -->
<filter>src/main/resources/profiles/${build.profile.id}/config.properties</filter>
</filters>
</build>
</project>
The config looks correct. I had the same issue upgrading over the weekend. I think the ServiceNotFoundException means you are probably still importing a 1.x.x version of the neo4j-ogm jar. You should explicitly remove it from your pom:
<!-- <dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j-ogm</artifactId>
<version>${neo4j.ogm.version}</version>
</dependency>
-->
Make sure you are bringing in the new 2.x.x neo4j-ogm-api and neo4j-ogm-core jars. Also make sure you are using spring-data-commons 1.12.0.M1 from the Hopper release train.
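For reference, a sketch of those explicit entries, reusing the ${neo4j.ogm.version} property (2.0.0-M2) declared in the pom above; note that in OGM 2.x the HTTP driver ships as its own artifact, which may be what the ServiceNotFoundException for HttpDriver is pointing at:
<dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j-ogm-api</artifactId>
<version>${neo4j.ogm.version}</version>
</dependency>
<dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j-ogm-core</artifactId>
<version>${neo4j.ogm.version}</version>
</dependency>
<dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j-ogm-http-driver</artifactId>
<version>${neo4j.ogm.version}</version>
</dependency>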
Here is what my spring-data-neo4j-4.1.0.M1 dependency hierarchy looks like (and it is 100% working).
This example pom (not neo4j specific) helped me get up and running.
https://github.com/spring-projects/spring-data-examples/blob/master/pom.xml#L35
also see:
https://spring.io/blog/2016/02/12/spring-data-release-train-hopper-m1-released
UPDATE: if you are using Spring Boot, something like this (a bare-minimum pom) should get you up and running. By specifying spring-data-releasetrain.version with a value of Hopper-M1 in your pom properties, the Spring Boot Maven plugin will automatically bring down all the correct dependencies for spring-data-neo4j-4.1.0.M1:
<!-- Inherit defaults from Spring Boot -->
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.3.2.RELEASE</version>
</parent>
<!-- omitted project info -->
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.8</java.version>
<spring-data-releasetrain.version>Hopper-M1</spring-data-releasetrain.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-neo4j</artifactId>
</dependency>
</dependencies>
<repositories>
<repository>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>spring-libs-snapshot</id>
<url>https://repo.spring.io/libs-snapshot</url>
</pluginRepository>
</pluginRepositories>