Error when launching the graph (DSEGraphFrame) - Scala

I have a DSE graph in a validation/prod environment.
The problem occurs when I try to launch a DSEGraphFrame query using Spark in Scala.
val graph = spark.dseGraph("my_graph")
generates the following exception:
Exception in thread "main"
com.datastax.driver.core.exceptions.InvalidQueryException: The method
DseGraphRpc.getSchemaBlob does not exist. Make sure that the required
component for that method is active/enabled
at com.datastax.driver.core.exceptions.InvalidQueryException.copy(InvalidQueryException.java:40)
at com.datastax.driver.core.DriverThrowables.propagateCause(DriverThrowables.java:26)
at com.datastax.driver.core.DefaultResultSetFuture.getUninterruptibly(DefaultResultSetFuture.java:284)
at com.datastax.driver.core.AbstractSession.execute(AbstractSession.java:49)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:37)
at com.sun.proxy.$Proxy27.execute(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.datastax.spark.connector.cql.SessionProxy.invoke(SessionProxy.scala:37)
at com.sun.proxy.$Proxy28.execute(Unknown Source)
at com.datastax.bdp.util.rpc.RpcUtil.callInternal(RpcUtil.java:57)
at com.datastax.bdp.util.rpc.RpcUtil.call(RpcUtil.java:40)
at com.datastax.bdp.graph.spark.DseGraphRpc.callGetSchema(DseGraphRpc.java:45)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$$anonfun$getSchemaFromServer$1.apply(DseGraphFrame.scala:586)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$$anonfun$getSchemaFromServer$1.apply(DseGraphFrame.scala:586)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:115)
at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:114)
at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:158)
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:114)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrame$.getSchemaFromServer(DseGraphFrame.scala:586)
at com.datastax.bdp.graph.spark.graphframe.DseGraphFrameBuilder$.apply(DseGraphFrameBuilder.scala:257)
at com.datastax.bdp.graph.spark.graphframe.SparkSessionFunctions.dseGraph(SparkSessionFunctions.scala:20)
What could I do to run DSEGraphFrame properly?

The problem came from a node in the DSE cluster on which the Graph workload wasn't enabled.
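
For reference, a minimal sketch of the call that triggers the schema fetch, assuming the standard DSE GraphFrames implicits (the import path is taken from the package names in the stack trace; the app name is arbitrary). Once the Graph workload is enabled on every node the connector contacts, the same call should succeed:

import org.apache.spark.sql.SparkSession
// implicits that add dseGraph(...) to SparkSession (package as seen in the stack trace)
import com.datastax.bdp.graph.spark.graphframe._

val spark = SparkSession.builder()
  .appName("dse-graphframe-check")
  .getOrCreate()

// Throws InvalidQueryException (DseGraphRpc.getSchemaBlob does not exist) if any contacted
// node lacks the Graph workload; succeeds once Graph is active cluster-wide.
val graph = spark.dseGraph("my_graph")
// graph.V() and graph.E() then expose vertices and edges for further Spark queries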

Related

Mongo-Hadoop connector issue

I run a Word Count jar file but get this error:
NoClassDefFoundError: org/apache/commons/lang/StringUtils
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/lang/StringUtils
at com.mongodb.hadoop.util.MongoConfigUtil.getMongoURIs(MongoConfigUtil.java:409)
at com.mongodb.hadoop.util.MongoConfigUtil.getOutputURIs(MongoConfigUtil.java:598)
at com.mongodb.hadoop.MongoOutputFormat.checkOutputSpecs(MongoOutputFormat.java:32)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:277)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1571)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1568)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1878)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1568)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1589)
at WordCount.main(WordCount.java:55)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

Using fastutil ObjectArrayList in a Spark application throws an exception

I am using a fastutil ObjectArrayList[Object] instead of an ArrayBuffer in my Scala Spark Streaming application (inside a mapWithState function).
I am using the latest version, added via the sbt dependency below:
libraryDependencies += "it.unimi.dsi" % "fastutil" % "7.0.13"
However, it throws an exception:
yarn.ApplicationMaster: User class threw exception: java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/objects/ObjectArrayList
java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/objects/ObjectArrayList
at MapRRecoverableConsumer$$anonfun$5.apply(MapRRecoverableConsumer.scala:756)
at MapRRecoverableConsumer$$anonfun$5.apply(MapRRecoverableConsumer.scala:293)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:864)
at MapRRecoverableConsumer$.main(MapRRecoverableConsumer.scala:863)
at MapRRecoverableConsumer.main(MapRRecoverableConsumer.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:542)
Caused by: java.lang.ClassNotFoundException: it.unimi.dsi.fastutil.objects.ObjectArrayList
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
OK, I just added the jar via the --jars parameter in spark-submit and that resolved it.
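For anyone hitting the same thing, that fix looks roughly like the following (the application class is the one from the stack trace; the jar paths are placeholders for your own locations):
spark-submit --class MapRRecoverableConsumer --jars /path/to/fastutil-7.0.13.jar my-streaming-app.jar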

How do I configure the YARN address for yarn-client mode in Spark?

From a remote Scala program, using Spark 1.3, how do I initialize the SparkContext so that I can connect to Spark running on YARN? That is, where do I put the address of the YARN node(s)?
Currently my program contains:
val conf = new SparkConf().setMaster("yarn-client").setAppName("MyApp")
val sc = new SparkContext(conf)
and it yields
[error] (run-main-0) java.lang.ExceptionInInitializerError
java.lang.ExceptionInInitializerError
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1959)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:179)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:310)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at SparkExampleLocalDriver$.main(SparkExample.scala:9)
at SparkExampleLocalDriver.main(SparkExample.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: org.apache.spark.SparkException: Unable to load YARN support
at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:217)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:212)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1959)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:179)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:310)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at SparkExampleLocalDriver$.main(SparkExample.scala:9)
at SparkExampleLocalDriver.main(SparkExample.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.YarnSparkHadoopUtil
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:195)
at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:213)
at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:212)
at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1959)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:179)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:310)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:269)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:272)
at SparkExampleLocalDriver$.main(SparkExample.scala:9)
at SparkExampleLocalDriver.main(SparkExample.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Your Spark binary doesn't contain the YARN-related classes.
Use a pre-built binary for Hadoop:
http://www.apache.org/dyn/closer.cgi/spark/spark-1.3.1/spark-1.3.1-bin-hadoop2.4.tgz
If you are compiling from source, include the yarn and hadoop profiles:
./make-distribution.sh --tgz --skip-java-test -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0
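Once a YARN-enabled build is on the classpath, the driver code can stay essentially as in the question. As far as I can tell, in yarn-client mode Spark does not take the ResourceManager address from setMaster; it reads it from the Hadoop configuration, so HADOOP_CONF_DIR (or YARN_CONF_DIR) needs to point at the directory containing the cluster's yarn-site.xml. A sketch under that assumption:

import org.apache.spark.{SparkConf, SparkContext}

// Assumes HADOOP_CONF_DIR or YARN_CONF_DIR is exported in the driver's environment and
// points at the directory holding the cluster's yarn-site.xml / core-site.xml.
val conf = new SparkConf()
  .setMaster("yarn-client")   // "yarn-client" carries no address by itself
  .setAppName("MyApp")
val sc = new SparkContext(conf)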

TinkerPop Gremlin console not working

I have installed TinkerPop 3.0 M4 and the Gremlin console, both at 3.0 M4.
I am not able to do either of the following:
1) load an already existing database,
2) create the classic DB.
The database works fine in Java through TinkerPop 3; it is just the Gremlin console that doesn't.
When opening the DB I have the exact same issue with Gremlin 2.6 from the OrientDB install, so it is not due to a version incompatibility between OrientDB 2.0 M2 and Gremlin 2.6 or 3.0 M4 (nor M5).
I can't figure out what the problem is.
1) This is the command I give in Gremlin to open the Database:
g = new OrientGraph("local:/opt/orientdb/databases/dbname", "user", "password");
This is the exception I get:
groovysh_evaluate: 3: unable to resolve class OrintGraph
# line 3, column 5.
g = new OrintGraph("local:/opt/orientdb/databases/circletask");
^
1 error
Display stack trace? [yN] y
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
groovysh_evaluate: 3: unable to resolve class OrintGraph
# line 3, column 5.
g = new OrintGraph("local:/opt/orientdb/databases/circletask");
^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:309)
at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:938)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:585)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:534)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:286)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:259)
at groovy.lang.GroovyShell.parseClass(GroovyShell.java:674)
at groovy.lang.GroovyShell.parse(GroovyShell.java:686)
at groovy.lang.GroovyShell.parse(GroovyShell.java:722)
at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:62)
at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:159)
at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:121)
at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:93)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy)
at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:123)
at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:83)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at com.tinkerpop.gremlin.console.Console.<init>(Console.groovy:99)
at java.lang.invoke.MethodHandle.invokeWithArguments(MethodHandle.java:636)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at com.tinkerpop.gremlin.console.Console.main(Console.groovy:230)
2) This is the command to create the Tutorial DB:
gremlin> Graph g = TinkerFactory.createClassic();
This is the exception I get:
No such property: TinkerFactory for class: groovysh_evaluate
groovy.lang.MissingPropertyException: No such property: TinkerFactory for class: groovysh_evaluate
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:51)
at org.codehaus.groovy.runtime.callsite.PogoGetPropertySite.getProperty(PogoGetPropertySite.java:49)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callGroovyObjectGetProperty(AbstractCallSite.java:231)
at groovysh_evaluate.run(groovysh_evaluate:3)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:68)
at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:159)
at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:121)
at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:93)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy)
at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:123)
at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:324)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1207)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:130)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:150)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:83)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at com.tinkerpop.gremlin.console.Console.<init>(Console.groovy:99)
at java.lang.invoke.MethodHandle.invokeWithArguments(MethodHandle.java:636)
at org.codehaus.groovy.vmplugin.v7.IndyInterface.selectMethod(IndyInterface.java:215)
at com.tinkerpop.gremlin.console.Console.main(Console.groovy:230)
I found that the plugins were not active by default as expected:
gremlin> :plugin list
==>tinkerpop.server
==>tinkerpop.gephi
==>tinkerpop.sugar
==>tinkerpop.utilities
==>tinkerpop.tinkergraph
Activating them solved the problem:
:plugin use tinkerpop.server
:plugin use tinkerpop.tinkergraph
:plugin use tinkerpop.utilities
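With tinkerpop.tinkergraph active in the session, the factory class from the original attempt should resolve:
gremlin> g = TinkerFactory.createClassic()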
The Error says:
groovysh_evaluate: 3: unable to resolve class OrintGraph
# line 3, column 5.
g = new OrintGraph("local:/opt/orientdb/databases/circletask");
So it seems you misspelled the class OrientGraph as OrintGraph? Try using OrientGraph.
There is no OrientDB implementation for TinkerPop3 at this time.

Titan with ES and Cassandra - java.lang.IllegalArgumentException: Index is unknown or not configured: search

I am using ES with Cassandra. I am trying to load the default graph but I get the error below. How do I fix it? I am assuming I have to create an ES index? If so, which index?
curl -XPUT 'http://localhost:9200/WHAT_INDEX???/'
gremlin> g = GraphOfTheGodsFactory.create('/tmp/titan')
==>titangraph[local:/tmp/titan]
gremlin> g = TitanFactory.open('conf/titan-cassandra-es.properties')
==>titangraph[local:/home/ubuntu/conf/titan-cassandra-es.properties]
gremlin> GraphOfTheGodsFactory.load(g)
Index is unknown or not configured: search
Display stack trace? [yN] y
java.lang.IllegalArgumentException: Index is unknown or not configured: search
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:119)
at com.thinkaurelius.titan.graphdb.database.IndexSerializer.supports(IndexSerializer.java:73)
at com.thinkaurelius.titan.graphdb.types.StandardKeyMaker.make(StandardKeyMaker.java:149)
at com.thinkaurelius.titan.example.GraphOfTheGodsFactory.load(GraphOfTheGodsFactory.java:52)
at com.thinkaurelius.titan.example.GraphOfTheGodsFactory$load.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:108)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at groovysh_evaluate.run(groovysh_evaluate:56)
at groovysh_evaluate$run.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:42)
at groovysh_evaluate$run.call(Unknown Source)
at org.codehaus.groovy.tools.shell.Interpreter.evaluate(Interpreter.groovy:67)
at org.codehaus.groovy.tools.shell.Interpreter$evaluate.call(Unknown Source)
at org.codehaus.groovy.tools.shell.Groovysh.execute(Groovysh.groovy:152)
at org.codehaus.groovy.tools.shell.Shell.leftShift(Shell.groovy:114)
at org.codehaus.groovy.tools.shell.Shell$leftShift$0.call(Unknown Source)
at org.codehaus.groovy.tools.shell.ShellRunner.work(ShellRunner.groovy:88)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$work(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1079)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.work(InteractiveShellRunner.groovy:100)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:272)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:137)
at org.codehaus.groovy.tools.shell.ShellRunner.run(ShellRunner.groovy:57)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.super$2$run(InteractiveShellRunner.groovy)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:90)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:233)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1079)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuperN(ScriptBytecodeAdapter.java:128)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.invokeMethodOnSuper0(ScriptBytecodeAdapter.java:148)
at org.codehaus.groovy.tools.shell.InteractiveShellRunner.run(InteractiveShellRunner.groovy:66)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.<init>(Console.java:57)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.<init>(Console.java:70)
at com.thinkaurelius.titan.tinkerpop.gremlin.Console.main(Console.java:96)
Not sure I'm following what you're trying to do. You've already initialized g with
g = GraphOfTheGodsFactory.create('/tmp/titan')
With that you've created the Graph of the Gods with BerkeleyDB. Why are you then re-initializing it with Cassandra?
If I first start an instance of Cassandra and then execute:
gremlin> g = TitanFactory.open('conf/titan-cassandra-es.properties')
==>titangraph[cassandrathrift:127.0.0.1]
gremlin> GraphOfTheGodsFactory.load(g)
==>null
Note the difference as compared to your gremlin session:
gremlin> g = TitanFactory.open('conf/titan-cassandra-es.properties')
==>titangraph[local:/home/ubuntu/conf/titan-cassandra-es.properties]
It actually looks like it's not finding the configuration file at conf/titan-cassandra-es.properties and is then creating a BerkeleyDB instance. You will likely find a directory at /home/ubuntu/conf/titan-cassandra-es.properties with BerkeleyDB data files in it. Check the location of the actual conf/titan-cassandra-es.properties and make sure it is there.
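
For reference, the stock conf/titan-cassandra-es.properties is what defines the index named "search" that GraphOfTheGodsFactory.load(g) expects. It looks roughly like the following (key names are from memory and differ slightly between Titan versions, so treat this as a sketch rather than the exact shipped file):

# Cassandra storage backend
storage.backend=cassandrathrift
storage.hostname=127.0.0.1

# mixed index named "search", backed by Elasticsearch
index.search.backend=elasticsearch
index.search.hostname=127.0.0.1

If the file cannot be found, none of this is applied, which is why the graph falls back to BerkeleyDB and the "search" index is reported as unknown.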