Running Drools On Spark Cluster - Getting Null Pointer at org.kie.internal.builder.KnowledgeBuilderFactory.newKnowledgeBuilder - scala

I am using Drools with Spark to execute a set of rules. I have written a method that loads a .drl file and instantiates an InternalKnowledgeBase.
The code below works in local mode, but when I run it on a cluster (EMR) I get the exception shown below. I am attaching the code and the exception stack trace.
def loadDrl(drlFilePath: String): Option[InternalKnowledgeBase] = {
  try {
    val resource = ResourceFactory.newClassPathResource(drlFilePath)
    val kbuilder = KnowledgeBuilderFactory.newKnowledgeBuilder() // Exception occurs here
    kbuilder.add(resource, ResourceType.DRL)
    if (kbuilder.hasErrors()) {
      throw new RuntimeException(kbuilder.getErrors().toString())
    }
    val kbase = KnowledgeBaseFactory.newKnowledgeBase()
    kbase.addPackages(kbuilder.getKnowledgePackages())
    Some(kbase)
  } catch {
    case e: Exception =>
      e.printStackTrace()
      None
  }
}
Please find the Exception Stack Trace.
java.lang.NullPointerException
at org.kie.internal.builder.KnowledgeBuilderFactory.newKnowledgeBuilder(KnowledgeBuilderFactory.java:48)
at $line27.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.loadDrl(<pastie>:44)
at $line33.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:39)
at $line33.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:44)
at $line33.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:46)
at $line33.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:48)
at $line33.$read$$iw$$iw$$iw$$iw.<init>(<console>:50)
at $line33.$read$$iw$$iw$$iw.<init>(<console>:52)
at $line33.$read$$iw$$iw.<init>(<console>:54)
at $line33.$read$$iw.<init>(<console>:56)
at $line33.$read.<init>(<console>:58)
at $line33.$read$.<init>(<console>:62)
at $line33.$read$.<clinit>(<console>)
at $line33.$eval$.$print$lzycompute(<console>:7)
at $line33.$eval$.$print(<console>:6)
at $line33.$eval.$print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:819)
at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:691)
at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:404)
at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:425)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:285)
at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
at org.apache.spark.repl.Main$.doMain(Main.scala:78)
at org.apache.spark.repl.Main$.main(Main.scala:58)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Appreciate your help! Thanks.

I had a similar issue a while ago. Mine was related to building a jar with the shade plugin. I resolved it after finding one of these questions (I don't remember which one exactly):
Drools 7.4.1 kieservices.factory.get() returns null
how to use drools in storm topology
I had to add this transformer to my pom.xml:
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
  <resource>META-INF/kie.conf</resource>
</transformer>
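For context, Drools/KIE discovers its service implementations through META-INF/kie.conf entries. When several KIE jars are merged into one uber-jar without this transformer, only one kie.conf survives, so factory lookups can come back null, which is consistent with the NullPointerException in the question. Below is a minimal sketch of where the transformer sits inside a maven-shade-plugin configuration (the plugin version shown is an assumption; adjust it to your build):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Append the META-INF/kie.conf files from all KIE jars instead of keeping only one -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>META-INF/kie.conf</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>
The AppendingTransformer concatenates same-named resources from all merged jars rather than letting the last jar win, which is what the KIE service registry needs.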

Related

Rest-Assured giving 404 error for GET request containing path parameter

We are using Rest Assured with TestNG to execute REST services. The following GET request fails with a 404 error, although the same REST URL works fine in Postman. We are not sure whether this is because of special characters in the URL (colon, hyphen, dot, etc.).
Here is the TestNG test method:
@Test
public void listOfStepsTest() {
    try {
        RestAssured.given()
            .header("Access-Key", "a06669527b7bUCCzd4vx4JbYbKYLdV8rqr4DG-ejTnY9J_4_936QrnHcoeJtCsFhQGkNLJeb2wwu")
        .when()
            .get("http://xyz:8080/rest/runtime/XYZ111:XYZProject1:1.0-SNAPSHOT/process/XYZProject1.TaskExample")
        .then()
            .assertThat().statusCode(200);
    } catch (Exception e) {
        e.printStackTrace();
    }
}
Error log trace:
FAILED: listOfStepsTest
java.lang.AssertionError: 1 expectation failed.
Expected status code <200> but was <404>.
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:83)
at org.codehaus.groovy.reflection.CachedConstructor.doConstructorInvoke(CachedConstructor.java:77)
at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrap.callConstructor(ConstructorSite.java:84)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:238)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:250)
at io.restassured.internal.ResponseSpecificationImpl$HamcrestAssertionClosure.validate(ResponseSpecificationImpl.groovy:494)
at io.restassured.internal.ResponseSpecificationImpl$HamcrestAssertionClosure$validate$1.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:128)
at io.restassured.internal.ResponseSpecificationImpl.validateResponseIfRequired(ResponseSpecificationImpl.groovy:656)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:210)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:59)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:51)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:157)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:169)
at io.restassured.internal.ResponseSpecificationImpl.statusCode(ResponseSpecificationImpl.groovy:125)
at io.restassured.specification.ResponseSpecification$statusCode$0.callCurrent(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:51)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:157)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:169)
at io.restassured.internal.ResponseSpecificationImpl.statusCode(ResponseSpecificationImpl.groovy:133)
at io.restassured.internal.ValidatableResponseOptionsImpl.statusCode(ValidatableResponseOptionsImpl.java:119)
at com.cbpm.bpm_rest_automation.BPM_REST_Automation.listOfStepsTest(BPM_REST_Automation.java:213)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:80)
at org.testng.internal.Invoker.invokeMethod(Invoker.java:714)
at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:901)
at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1231)
at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:127)
at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:111)
at org.testng.TestRunner.privateRun(TestRunner.java:767)
at org.testng.TestRunner.run(TestRunner.java:617)
at org.testng.SuiteRunner.runTest(SuiteRunner.java:334)
at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:329)
at org.testng.SuiteRunner.privateRun(SuiteRunner.java:291)
at org.testng.SuiteRunner.run(SuiteRunner.java:240)
at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
at org.testng.TestNG.runSuitesSequentially(TestNG.java:1198)
at org.testng.TestNG.runSuitesLocally(TestNG.java:1123)
at org.testng.TestNG.run(TestNG.java:1031)
at org.testng.remote.AbstractRemoteTestNG.run(AbstractRemoteTestNG.java:115)
at org.testng.remote.RemoteTestNG.initAndRun(RemoteTestNG.java:251)
at org.testng.remote.RemoteTestNG.main(RemoteTestNG.java:77)
Disable URL encoding in Rest-Assured, so the colons and dots in the path are sent as-is. Here's an example:
RestAssured
    .given()
        .urlEncodingEnabled(false)
    .when()
        .get("http://example.com/rest/runtime/XYZ111:XYZProject1:1.0-SNAPSHOT/process/XYZProject1.TaskExample");

Error when launching the graph (DSEGraphFrame)

I have a DSE graph in a validation/prod environment.
The problem occurs when I try to launch a DSEGraphFrame query using Spark in Scala.
val graph = spark.dseGraph("my_graph")
generates the following exception:
Exception in thread "main" com.datastax.driver.core.exceptions.InvalidQueryException: The method DseGraphRpc.getSchemaBlob does not exist. Make sure that the required component for that method is active/enabled
at com.datastax.driver.core.exceptions.InvalidQueryException.copy(InvalidQueryException.java:40)
at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:26)
at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:284)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:37)
at com.sun.proxy.$Proxy27.execute(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:37)
at com.sun.proxy.$Proxy28.execute(Unknown Source)
at com.datastax.bdp.util.rpc.RpcUtil.callInternal(RpcUtil.java:57)
at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:40)
at com.datastax.bdp.graph.spark.DseGraphRpc.callGetSchema(DseGraphRpc.java:45)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$$anonfun$getSchemaFromServer$1.apply(DseGraphFrame.scala:586)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$$anonfun$getSchemaFromServer$1.apply(DseGraphFrame.scala:586)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:115)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:114)
at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:158)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:114)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$.getSchemaFromServer(DseGraphFrame.scala:586)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrameBuilder$.apply(DseGraphFrameBuilder.scala:257)
at com.datastax.bdp.graph.spark.graphframe.SparkSessionFunctions.dseGraph(SparkSessionFunctions.scala:20)
What could I do to run DSEGraphFrame properly?
The problem comes from a node in the DSE cluster on which graph isn't activated; enable the graph workload on that node.

Error when converting docx to PDF with Docx4j and FOP

I'm trying to convert a Word file into PDF using the Docx4j community package provided on their site. (http://www.docx4java.org/docx4j/docx4j-community-3.3.1.zip)
It looks like there is a version incompatibility between docx4j and FOP in that package. I wonder if anybody has hit this problem before and knows which versions of the libraries would make this work.
Here is my code:
FOSettings foSettings = Docx4J.createFOSettings();
String inputfilepath = "path/to/file.docx";
String outputfilepath = "path/to/file.pdf";
WordprocessingMLPackage wordMLPackage = WordprocessingMLPackage.load(new java.io.File(inputfilepath));
FileOutputStream os = new java.io.FileOutputStream(outputfilepath);
foSettings.setFoDumpFile(new java.io.File(inputfilepath + ".fo"));
foSettings.setWmlPackage(wordMLPackage);
Docx4J.toFO(foSettings, os, Docx4J.FLAG_EXPORT_PREFER_XSL);
I copied the stack trace below; the error is complaining that FopFactory does not have a no-argument newInstance() method.
It is being called from FORendererApacheFOP.java. I can see that older versions of FOP used to have that method; I tried replacing the FOP version in the package, but that breaks other dependencies.
The package contains docx4j 3.3.1 and FOP 2.1.
Thanks for any help.
java.lang.NoSuchMethodException: org.apache.fop.apps.FopFactory.newInstance()
at java.lang.Class.getDeclaredMethod(Class.java:2130)
at org.docx4j.convert.out.fo.renderers.FORendererApacheFOP.createFopFactory(FORendererApacheFOP.java:329)
at org.docx4j.convert.out.fo.renderers.FORendererApacheFOP.getFopFactory(FORendererApacheFOP.java:253)
at org.docx4j.convert.out.fo.renderers.FORendererApacheFOP.render(FORendererApacheFOP.java:119)
at org.docx4j.convert.out.fo.AbstractFOExporter.postprocess(AbstractFOExporter.java:168)
at org.docx4j.convert.out.fo.AbstractFOExporter.postprocess(AbstractFOExporter.java:47)
at org.docx4j.convert.out.common.AbstractExporter.export(AbstractExporter.java:82)
at org.docx4j.Docx4J.toFO(Docx4J.java:568)
at org.docx4j.convert.out.fo.FOPAreaTreeHelper.getAreaTreeViaFOP(FOPAreaTreeHelper.java:191)
at org.docx4j.convert.out.fo.LayoutMasterSetBuilder.fixExtents(LayoutMasterSetBuilder.java:138)
at org.docx4j.convert.out.fo.LayoutMasterSetBuilder.getLayoutMasterSetFragment(LayoutMasterSetBuilder.java:97)
at org.docx4j.convert.out.fo.XsltFOFunctions.getLayoutMasterSetFragment(XsltFOFunctions.java:81)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.xalan.extensions.ExtensionHandlerJavaPackage.callFunction(ExtensionHandlerJavaPackage.java:343)
at org.apache.xalan.extensions.ExtensionHandlerJavaPackage.callFunction(ExtensionHandlerJavaPackage.java:440)
at org.apache.xalan.extensions.ExtensionsTable.extFunction(ExtensionsTable.java:222)
at org.apache.xalan.transformer.TransformerImpl.extFunction(TransformerImpl.java:475)
at org.apache.xpath.functions.FuncExtFunction.execute(FuncExtFunction.java:208)
at org.apache.xpath.XPath.execute(XPath.java:337)
at org.apache.xalan.templates.ElemCopyOf.execute(ElemCopyOf.java:134)
at org.apache.xalan.transformer.TransformerImpl.executeChildTemplates(TransformerImpl.java:2402)
at org.apache.xalan.templates.ElemLiteralResult.execute(ElemLiteralResult.java:1376)
at org.apache.xalan.templates.ElemApplyTemplates.transformSelectedNodes(ElemApplyTemplates.java:395)
at org.apache.xalan.templates.ElemApplyTemplates.execute(ElemApplyTemplates.java:178)
at org.apache.xalan.transformer.TransformerImpl.executeChildTemplates(TransformerImpl.java:2402)
at org.apache.xalan.transformer.TransformerImpl.applyTemplateToNode(TransformerImpl.java:2272)
at org.apache.xalan.transformer.TransformerImpl.transformNode(TransformerImpl.java:1358)
at org.apache.xalan.transformer.TransformerImpl.transform(TransformerImpl.java:711)
at org.apache.xalan.transformer.TransformerImpl.transform(TransformerImpl.java:1275)
at org.apache.xalan.transformer.TransformerImpl.transform(TransformerImpl.java:1253)
at org.docx4j.XmlUtils.transform(XmlUtils.java:1275)
at org.docx4j.XmlUtils.transform(XmlUtils.java:1100)
at org.docx4j.convert.out.common.AbstractXsltExporterDelegate.process(AbstractXsltExporterDelegate.java:66)
at org.docx4j.convert.out.common.AbstractWmlExporter.process(AbstractWmlExporter.java:63)
at org.docx4j.convert.out.common.AbstractWmlExporter.process(AbstractWmlExporter.java:32)
at org.docx4j.convert.out.common.AbstractExporter.export(AbstractExporter.java:79)
at org.docx4j.Docx4J.toFO(Docx4J.java:568)
at PDFConversion.main(PDFConversion.java:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
java.lang.NoSuchMethodException: org.apache.fop.apps.FopFactory.newInstance()
at java.lang.Class.getDeclaredMethod(Class.java:2130)
at org.docx4j.convert.out.fo.renderers.FORendererApacheFOP.createFopFactory(FORendererApacheFOP.java:329)
at org.docx4j.convert.out.fo.renderers.FORendererApacheFOP.getFopFactory(FORendererApacheFOP.java:253)
at org.docx4j.convert.out.fo.renderers.FORendererApacheFOP.render(FORendererApacheFOP.java:119)
at org.docx4j.convert.out.fo.AbstractFOExporter.postprocess(AbstractFOExporter.java:168)
at org.docx4j.convert.out.fo.AbstractFOExporter.postprocess(AbstractFOExporter.java:47)
at org.docx4j.convert.out.common.AbstractExporter.export(AbstractExporter.java:82)
at org.docx4j.Docx4J.toFO(Docx4J.java:568)
at PDFConversion.main(PDFConversion.java:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
org.docx4j.openpackaging.exceptions.Docx4JException: Exception exporting package
at org.docx4j.convert.out.common.AbstractExporter.export(AbstractExporter.java:109)
at org.docx4j.Docx4J.toFO(Docx4J.java:568)
at PDFConversion.main(PDFConversion.java:26)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.NullPointerException
at org.docx4j.convert.out.fo.renderers.FORendererApacheFOP.render(FORendererApacheFOP.java:199)
at org.docx4j.convert.out.fo.renderers.FORendererApacheFOP.render(FORendererApacheFOP.java:159)
at org.docx4j.convert.out.fo.AbstractFOExporter.postprocess(AbstractFOExporter.java:168)
at org.docx4j.convert.out.fo.AbstractFOExporter.postprocess(AbstractFOExporter.java:47)
at org.docx4j.convert.out.common.AbstractExporter.export(AbstractExporter.java:82)
... 7 more
The stack trace you are seeing occurs after an exception has been caught (docx4j is unable to initialise FOP v2.1, so it falls back to a FOP 1.0/1.1-style configuration, which doesn't work since you have FOP 2.1).
Why can't FOP 2.1 initialise? The diagnostics we need to see are at https://github.com/plutext/docx4j-export-FO/blob/master/src/main/java/org/docx4j/convert/out/fo/renderers/FORendererApacheFOP.java#L320
} catch (Exception e) {
    log.warn("Can't set up FOP svn; " + e.getMessage());
    log.debug(e.getMessage(), e);
}
So please turn on DEBUG level logging for class org.docx4j.convert.out.fo.renderers.FORendererApacheFOP
To do that, please see the comments at https://github.com/plutext/docx4j/blob/master/src/samples/_resources/log4j.xml
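For illustration, assuming a log4j 1.x log4j.xml like the sample linked above (adjust the syntax if your project uses a different logging setup), the relevant logger entry would look something like this:
<!-- Log the FOP initialisation failure at DEBUG level so the root cause is visible -->
<logger name="org.docx4j.convert.out.fo.renderers.FORendererApacheFOP">
    <level value="DEBUG"/>
</logger>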
This issue is under active discussion at http://www.docx4java.org/forums/pdf-output-f27/error-on-convert-pdf-with-fo-on-dox4j-3-3-1-t2446.html
It is now also being tracked as https://github.com/plutext/docx4j-export-FO/issues/1
Does your FOP 2.1 come from Maven, or somewhere else?

How do I configure the YARN address for yarn-client mode in spark?

From a remote Scala program using Spark 1.3, how do I initialise the SparkContext so that I can connect to Spark running on YARN? That is, where do I put the address of the YARN node(s)?
Currently my program contains:
val conf = new SparkConf().setMaster("yarn-client").setAppName("MyApp")
val sc = new SparkContext(conf)
and it yields
[error] (run-main-0) java.lang.ExceptionInInitializerError
java.lang.ExceptionInInitializerError
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1959)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:179)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:310)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at SparkExampleLocalDriver$.main(SparkExample.scala:9)
at SparkExampleLocalDriver.main(SparkExample.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: org.apache.spark.SparkException: Unable to load YARN support
at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:217)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:212)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1959)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:179)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:310)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at SparkExampleLocalDriver$.main(SparkExample.scala:9)
at SparkExampleLocalDriver.main(SparkExample.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:195)
at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:213)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:212)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1959)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:179)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:310)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at SparkExampleLocalDriver$.main(SparkExample.scala:9)
at SparkExampleLocalDriver.main(SparkExample.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Your Spark binary doesn't contain the YARN-related classes.
Use a pre-built binary for Hadoop:
http://www.apache.org/dyn/closer.cgi/spark/spark-1.3.1/spark-1.3.1-bin-hadoop2.4.tgz
If you are compiling from source, include the yarn and hadoop profiles:
./make-distribution.sh --tgz --skip-java-test -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0

GWT Autobeans / NoClassDefFoundError: What am I doing wrong?

I am trying to serialize a POJO to JSON in GWT using AutoBeans, and I keep receiving a NoClassDefFoundError and a ClassNotFoundException, both looking for org.json.JSONObject. This is in the context of tokenizing a Place.
I created a JUnit test to isolate the problem. The test is:
@Test
public void testGetToken() {
    SearchCriterion searchCriterion = new JavaSearchCriterion("a", "b", ImmutableList.of("c"));
    ResourcesPlace place = new ResourcesPlace(ImmutableList.of(searchCriterion));
    String token = new Tokenizer().getToken(place);
    assertNotNull(token);
}
And my stack trace is
java.lang.NoClassDefFoundError: org/json/JSONObject
at com.google.web.bindery.autobean.shared.impl.StringQuoter.quote(StringQuoter.java:69)
at com.google.web.bindery.autobean.shared.impl.AutoBeanCodexImpl$PropertyGetter.encodeProperty(AutoBeanCodexImpl.java:411)
at com.google.web.bindery.autobean.shared.impl.AutoBeanCodexImpl$PropertyGetter.visitValueProperty(AutoBeanCodexImpl.java:397)
at com.google.web.bindery.autobean.vm.impl.ProxyAutoBean.traverseProperties(ProxyAutoBean.java:260)
at com.google.web.bindery.autobean.shared.impl.AbstractAutoBean.traverse(AbstractAutoBean.java:166)
at com.google.web.bindery.autobean.shared.impl.AbstractAutoBean.accept(AbstractAutoBean.java:101)
at com.google.web.bindery.autobean.shared.impl.AutoBeanCodexImpl.doEncode(AutoBeanCodexImpl.java:558)
at com.google.web.bindery.autobean.shared.AutoBeanCodex.encode(AutoBeanCodex.java:83)
at com.redacted.client.place.ResourcesPlace$Tokenizer.getToken(ResourcesPlace.java:62)
at com.redacted.client.place.ResourcesPlaceTokenizerTest.testGetToken(ResourcesPlaceTokenizerTest.java:19)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:49)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
Caused by: java.lang.ClassNotFoundException: org.json.JSONObject
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
... 32 more
And here is the relevant bit of the Tokenizer:
private MyFactory factory = AutoBeanFactorySource.create(MyFactory.class);

interface MyFactory extends AutoBeanFactory {
    // Factory method for a simple AutoBean
    AutoBean<SearchCriterion> person();
    // Factory method for a non-simple type or to wrap an existing instance
    AutoBean<SearchCriterion> person(SearchCriterion toWrap);
}

@Override
public String getToken(ResourcesPlace place) {
    AutoBean<SearchCriterion> placeBean = factory.create(SearchCriterion.class, place.getSearchCriteria().get(0));
    return AutoBeanCodex.encode(placeBean).getPayload();
}
I've added the Autobeans include in my .gwt.xml, but I still seem to be missing something.
How are you launching the test runner? Those classes originate from gwt/tools/redist/json/r2_20080312/json-1.5.jar in the GWT repo and should be packed into the gwt-servlet-deps.jar. Try adding that jar to the classpath of your test run configuration to see if that clears it up.