Why does database initialization fail with "ERROR: improper qualified name (too many dotted names)"? - postgresql

I'm using Scala 2.10 and Slick 1.0.1 (slick_2.10-1.0.1) with plain queries.
I'm trying to initialise a lazily evaluated database connection under Tomcat on localhost. For query evaluation I use PostgreSQL on port 5432.
When I compiled and ran the application, I got the following error message:
ERROR org.quartz.core.JobRunShell - Job DEFAULT.MissionLifecycleManager threw an unhandled Exception: org.postgresql.util.PSQLException: ERROR: improper qualified name (too many dotted names)
Position: 16
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2102) ~[postgresql-9.1-901.jdbc4.jar:na]
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1835) ~[postgresql-9.1-901.jdbc4.jar:na]
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:257) ~[postgresql-9.1-901.jdbc4.jar:na]
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:500) ~[postgresql-9.1-901.jdbc4.jar:na]
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:388) ~[postgresql-9.1-901.jdbc4.jar:na]
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:381) ~[postgresql-9.1-901.jdbc4.jar:na]
at scala.slick.jdbc.StatementInvoker.results(StatementInvoker.scala:34) ~[slick_2.10-1.0.1.jar:1.0.1]
at scala.slick.jdbc.StatementInvoker.elementsTo(StatementInvoker.scala:17) ~[slick_2.10-1.0.1.jar:1.0.1]
....
This is my initialisation code:
import com.weiglewilczek.slf4s.Logging
import scala.slick.driver.PostgresDriver._
import scala.slick.session.Database
import Database.threadLocalSession
import scala.slick.jdbc.{GetResult, StaticQuery => Q}
import scala.slick.driver.PostgresDriver.simple._

object SQLUtilities extends Logging with ServiceInjector {
  lazy val db = init()

  private def init() = {
    info("Connecting to postgres database at localhost") // writes in a log file
    val qe = Database.forURL("jdbc:postgresql://localhost:5432", "user", "pass", driver = "org.postgresql.Driver")
    info("Connected to database")
    qe
  }
}
Obviously something went wrong, so I think my database initialisation is not correct. Have I forgotten some parameters? Are my parameters correct at all?
Another, less critical, question: if I want to log something at the beginning and at the end of a method body (let's say always the same log messages, but different bodies) as a sign that I entered and left the method, is there a better way to do this than the example here in init()?

Specify a database in your connection string "jdbc:postgresql://localhost:5432/somedatabase".
See http://jdbc.postgresql.org/documentation/head/connect.html
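Applied to the init() in the question, the fix is a one-line change to the URL (a sketch; somedatabase stands in for the actual database name):

  private def init() = {
    info("Connecting to postgres database at localhost")
    // name the target database in the connection URL, per the answer above
    val qe = Database.forURL("jdbc:postgresql://localhost:5432/somedatabase",
      "user", "pass", driver = "org.postgresql.Driver")
    info("Connected to database")
    qe
  }

As for the side question about entry/exit logging: a common Scala pattern, not specific to this answer, is a small higher-order helper so that each method body is wrapped exactly once (a sketch, assuming the info(...) call from the question's Logging trait):

  private def logged[A](name: String)(body: => A): A = {
    info("Entering " + name)
    try body
    finally info("Leaving " + name)
  }

  // usage: the body is passed by name, so the two log lines bracket it
  private def init() = logged("init") {
    Database.forURL("jdbc:postgresql://localhost:5432/somedatabase",
      "user", "pass", driver = "org.postgresql.Driver")
  }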

Related

MongoTimeoutException: Error While Using MongoDB with Trino

I'm passing mongodb.properties as
connector.name=mongodb
mongodb.seeds=127.0.0.1:27017
mongodb.credentials=username:password@database
But when I run a query against the catalog, it gives this error:
Query 20210312_110147_00003_zxyd4 failed: com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches com.mongodb.client.internal.MongoClientDelegate$1@469d4507. Client view of cluster state is {type=REPLICA_SET, servers=[{address=hostname:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
Can someone help? Am I doing something wrong with the connection?
Note: I'm running Trino locally on a single machine for testing (it functions as both coordinator and worker). MongoDB is on another server, but I'm routing localhost to the desired server using SSH.
It works with this Java code. Use these two jars, mongodb-driver-core-3.7.1-javadoc.jar and mongo-java-driver-3.3.0.jar, to compile and run the code below:
import com.mongodb.MongoClient;
import com.mongodb.MongoClientOptions;
import com.mongodb.MongoCredential;
import com.mongodb.ServerAddress;
import java.util.Collections;
import java.util.List;
import org.bson.Document;

public class MongoSession {
    public static void main(String[] args) {
        ServerAddress seed = new ServerAddress("127.0.0.1:27017");
        MongoCredential credential = MongoCredential.createCredential("user", "database", "password".toCharArray());
        MongoClient client = new MongoClient(seed, Collections.singletonList(credential), MongoClientOptions.builder().build());

        client.getDatabase("database").runCommand(new Document("ping", 1));
        for (String name : client.getDatabase("database").listCollectionNames()) {
            System.out.println(name);
        }
    }
}
Thanks to the MongoDB community for such a quick response.
Jira:- https://jira.mongodb.org/browse/JAVA-4076?focusedCommentId=3673048&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-3673048

Invalid mongo configuration, either uri or host/port/credentials must be specified

I'm getting this exception:
Caused by: java.lang.IllegalStateException: Invalid mongo configuration, either uri or host/port/credentials must be specified
at org.springframework.boot.autoconfigure.mongo.MongoProperties.createMongoClient(MongoProperties.java:207)
at org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration.mongo(MongoAutoConfiguration.java:73)
at org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration$$EnhancerBySpringCGLIB$$15f9b896.CGLIB$mongo$1(<generated>)
at org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration$$EnhancerBySpringCGLIB$$15f9b896$$FastClassBySpringCGLIB$$c0338f6a.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:356)
at org.springframework.boot.autoconfigure.mongo.MongoAutoConfiguration$$EnhancerBySpringCGLIB$$15f9b896.mongo(<generated>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:162)
... 25 common frames omitted
Here is my application.yml content:
spring:
  data:
    mongodb:
      uri: mongodb://develop:d3VeL0p$@<my_host>:27017/SHAM
Here is my configuration class:
package com.me.service.testservice.config;

import com.mongodb.MongoClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.repository.config.EnableMongoRepositories;

@Configuration
@EnableMongoRepositories(basePackages = {"com.me.service.testservice.repository"}, considerNestedRepositories = true)
public class SpringMongoConfiguration {

    @Bean
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(new MongoClient("<my_host>"), "SHAM");
    }
}
Now the application starts without failing, but I'm getting this stack trace; it looks like the user develop doesn't have the right to connect:
Caused by: com.mongodb.MongoCommandException: Command failed with error 18: 'Authentication failed.' on server phelbwlabect003.karmalab.net:27017. The full response is { "ok" : 0.0, "errmsg" : "Authentication failed.", "code" : 18, "codeName" : "AuthenticationFailed" }
at com.mongodb.connection.CommandHelper.createCommandFailureException(CommandHelper.java:170)
at com.mongodb.connection.CommandHelper.receiveCommandResult(CommandHelper.java:123)
at com.mongodb.connection.CommandHelper.executeCommand(CommandHelper.java:32)
at com.mongodb.connection.SaslAuthenticator.sendSaslStart(SaslAuthenticator.java:117)
at com.mongodb.connection.SaslAuthenticator.access$000(SaslAuthenticator.java:37)
at com.mongodb.connection.SaslAuthenticator$1.run(SaslAuthenticator.java:50)
... 9 common frames omitted
You are mixing the uri-style connection settings with the individual-properties-style settings.
Either use
spring:
  data:
    mongodb:
      host: localhost
      port: 27017
      database: SHAM_TEST
      username: develop
      password: pass
Or
spring:
  data:
    mongodb:
      uri: mongodb://develop:pass@localhost:27017/SHAM_TEST
If your project has more than one application.properties/application.yml file, they can conflict when one has the URI and another has host/port/credentials.
This results in the error "Invalid mongo configuration, either uri or host/port/credentials must be specified".
To avoid the conflict, use either the URI or host/port/credentials consistently in all of them.
The issue was that the authentication-database was missing.
Here is the working configuration:
spring:
  data:
    mongodb:
      host: <my_host>
      username: develop
      password: d3VeL0p$
      port: 27017
      database: SHAM
      repositories:
        enabled: true
      authentication-database: admin
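The same settings can also be expressed in uri style; authSource is the standard MongoDB connection-string option naming the authentication database (a sketch derived from the configuration above):

spring:
  data:
    mongodb:
      uri: mongodb://develop:d3VeL0p$@<my_host>:27017/SHAM?authSource=admin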
If you are using an application.properties file to store your configuration, use the following structure:
spring.data.mongodb.host = localhost
spring.data.mongodb.port = 27017
spring.data.mongodb.database = SHAM_TEST
spring.data.mongodb.username = develop
spring.data.mongodb.password = pass
This issue appears with Mongo version 4.x: if your Spring/Spring Boot version supports only Mongo 2.x, this error appears when you try to connect to Mongo 4.
Downgrade your Mongo from 4.x to 3.x or lower.
If not, then change these placeholders. Change from:
spring.data.mongodb.host=localhost
spring.data.mongodb.port=27017
spring.data.mongodb.database=your db name
spring.data.mongodb.username=something
spring.data.mongodb.password=***
to:
mongo.replica.hosts=localhost:27017
mongo.dbname=your db name
mongo.username=something
mongo.password=***

Talend - The import org.apache cannot be resolved

I've created a custom Talend component which, at a certain step, connects to an external HTTP service. For that I'm using org.apache.commons.httpclient through javajet imports. I've seen that the modules already exist in the Modules view. Nevertheless, when running a job the console outputs:
Execution failed : Failed to generate code.
[----------
1. ERROR in /Users/frb/Downloads/TOS_DI-20160510_1709-V6.2.0/workspace/.JETEmitters/src/org/talend/designer/codegen/translators/ngsi/orion/TOrionAppendBeginJava.java (at line 14)
import org.apache.commons.httpclient.*;
^^^^^^^^^^
The import org.apache cannot be resolved
----------
2. ERROR in /Users/frb/Downloads/TOS_DI-20160510_1709-V6.2.0/workspace/.JETEmitters/src/org/talend/designer/codegen/translators/ngsi/orion/TOrionAppendBeginJava.java (at line 15)
import org.apache.commons.httpclient.methods.*;
^^^^^^^^^^
The import org.apache cannot be resolved
----------
3. ERROR in /Users/frb/Downloads/TOS_DI-20160510_1709-V6.2.0/workspace/.JETEmitters/src/org/talend/designer/codegen/translators/ngsi/orion/TOrionAppendBeginJava.java (at line 16)
import org.apache.commons.httpclient.params.HttpMethodParams;;
^^^^^^^^^^
The import org.apache cannot be resolved
----------
3 problems (3 errors)
]
Any hints about how to fix this issue? My Talend version is 6.2.0.
EDIT 1
This is my begin code:
<%@ jet
    imports="
        org.talend.core.model.process.INode
        org.talend.core.model.process.ElementParameterParser
        org.talend.core.model.metadata.IMetadataTable
        org.talend.core.model.metadata.IMetadataColumn
        org.talend.core.model.process.IConnection
        org.talend.core.model.process.IConnectionCategory
        org.talend.designer.codegen.config.CodeGeneratorArgument
        org.talend.core.model.metadata.types.JavaTypesManager
        org.talend.core.model.metadata.types.JavaType
        java.util.List
        java.util.Map
        org.apache.commons.httpclient.*
        org.apache.commons.httpclient.methods.*
        org.apache.commons.httpclient.params.HttpMethodParams
    "
%>
<%
    // Get the CID
    CodeGeneratorArgument codeGenArgument = (CodeGeneratorArgument) argument;
    INode node = (INode) codeGenArgument.getArgument();
    String cid = node.getUniqueName();

    // Get the component parameters
    String orionEndpoint = ElementParameterParser.getValue(node, "__ORION_ENDPOINT__");
    String authEndpoint = ElementParameterParser.getValue(node, "__AUTH_ENDPOINT__");
    String authUsername = ElementParameterParser.getValue(node, "__AUTH_USERNAME__");
    String authPassword = ElementParameterParser.getValue(node, "__AUTH_PASSWORD__");
    String entityIdField = ElementParameterParser.getValue(node, "__ENTITY_ID_FIELD__");
    String entityTypeField = ElementParameterParser.getValue(node, "__ENTITY_TYPE_FIELD__");
    String defaultEntityType = ElementParameterParser.getValue(node, "__DEFAULT_ENTITY_TYPE__");
    String ignoredFilds = ElementParameterParser.getValue(node, "__IGNORED_FIELDS__");
%>
System.out.println("I am the begin section");
HttpClient client = new HttpClient();
PostMethod method = new PostMethod(<%=authEndpoint%>);
method.setRequestHeader(new Header("Content-Type", "application/json"));
method.setRequestBody("{\"username\":\"" + <%=authUsername%> + "\",\"password\":\"" + <%=authPassword%> + "\"}");
try {
int statusCode = client.executeMethod(method);
if (statusCode != HttpStatus.SC_OK) {
System.err.println("Method failed: " + method.getStatusLine());
} // if
byte[] responseBody = method.getResponseBody();
System.out.println(new String(responseBody));
} catch (HttpException e) {
System.err.println("Fatal protocol violation: " + e.getMessage());
e.printStackTrace();
} catch (IOException e) {
System.err.println("Fatal transport error: " + e.getMessage());
e.printStackTrace();
} finally {
method.releaseConnection();
} // try
EDIT 2
I've added the following to my Component Descriptor file:
<IMPORTS>
    <IMPORT
        NAME="commons-httpclient"
        MODULE="commons-httpclient-3.1.jar"
        REQUIRED="true"
    />
</IMPORTS>
Now the module shows up in the Modules view. Sadly, the component outputs the same errors.
EDIT 3
After removing the imports and using fully qualified names, as suggested by @Balazs Gunics, the code is generated. Nevertheless, some other errors related to commons-httpclient arise at runtime:
Starting job job_tOrionAppend at 08:20 21/06/2016.
[statistics] connecting to socket on port 3916
[statistics] connected
I am the begin section
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.commons.httpclient.HttpClient.<clinit>(HttpClient.java:66)
at iotp_talend_connectors.job_torionappend_0_1.job_tOrionAppend.tMysqlInput_1Process(job_tOrionAppend.java:854)
at iotp_talend_connectors.job_torionappend_0_1.job_tOrionAppend.tMysqlConnection_1Process(job_tOrionAppend.java:422)
at iotp_talend_connectors.job_torionappend_0_1.job_tOrionAppend.runJobInTOS(job_tOrionAppend.java:1355)
at iotp_talend_connectors.job_torionappend_0_1.job_tOrionAppend.main(job_tOrionAppend.java:1212)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
[statistics] disconnected
[statistics] disconnected
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Job job_tOrionAppend ended at 08:20 21/06/2016. [exit code=1]
So, in your begin.javajet code, the following will only import these libraries for the code generation itself, but you need them in the generated code.
Generating Java using Java makes this easy to overlook.
<%@ jet
    imports="
        org.apache.commons.httpclient.*
        org.apache.commons.httpclient.methods.*
        org.apache.commons.httpclient.params.HttpMethodParams
So what you need is to have these imports added to the generated code. Well, that is not really possible :( https://www.talendforge.org/forum/viewtopic.php?id=3670 Instead, you need to modify the XML descriptor for your component.
So your imports are right. All you have to do is make sure you use fully qualified names, i.e. this piece of code:
System.out.println("I am the begin section");
HttpClient client = new HttpClient();
PostMethod method = new PostMethod(<%=authEndpoint%>);
has to be rewritten to look like this:
System.out.println("I am the begin section");
org.apache.commons.httpclient.HttpClient client =
new org.apache.commons.httpclient.HttpClient();
org.apache.commons.httpclient.methods.PostMethod method =
new org.apache.commons.httpclient.methods.PostMethod(<%=authEndpoint%>);
Yes, it would be way more elegant if we could import the classes and use them.
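Regarding the NoClassDefFoundError in EDIT 3: commons-httpclient 3.x depends on commons-logging at runtime, so the component descriptor presumably needs that jar as well. A sketch of the extended IMPORTS section (the exact commons-logging jar name/version is an assumption):

<IMPORTS>
    <IMPORT
        NAME="commons-httpclient"
        MODULE="commons-httpclient-3.1.jar"
        REQUIRED="true"
    />
    <!-- assumption: any commons-logging 1.x jar known to Talend's Modules view should work -->
    <IMPORT
        NAME="commons-logging"
        MODULE="commons-logging-1.1.1.jar"
        REQUIRED="true"
    />
</IMPORTS>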

slick 3.0.0 with HikariCP driver not loaded - IllegalAccessException: AbstractHikariConfig can not access a member with modifiers "private"

I am trying to use tminglei/slick-pg v0.9.0 with Slick 3.0.0 and am getting an IllegalAccessException:
akka.actor.ActorInitializationException: exception during creation
at akka.actor.ActorInitializationException$.apply(Actor.scala:166) ~[akka-actor_2.11-2.3.11.jar:na]
...
Caused by: java.lang.RuntimeException: driverClassName specified class 'com.github.tminglei.MyPostgresDriver$' could not be loaded
at com.zaxxer.hikari.AbstractHikariConfig.setDriverClassName(AbstractHikariConfig.java:370) ~[HikariCP-java6-2.3.8.jar:na]
at slick.jdbc.HikariCPJdbcDataSource$$anonfun$forConfig$18.apply(JdbcDataSource.scala:145) ~[slick_2.11-3.0.0.jar:na]
at slick.jdbc.HikariCPJdbcDataSource$$anonfun$forConfig$18.apply(JdbcDataSource.scala:145) ~[slick_2.11-3.0.0.jar:na]
at scala.Option.map(Option.scala:146) ~[scala-library-2.11.7.jar:na]
at slick.jdbc.HikariCPJdbcDataSource$.forConfig(JdbcDataSource.scala:145) ~[slick_2.11-3.0.0.jar:na]
at slick.jdbc.HikariCPJdbcDataSource$.forConfig(JdbcDataSource.scala:135) ~[slick_2.11-3.0.0.jar:na]
at slick.jdbc.JdbcDataSource$.forConfig(JdbcDataSource.scala:35) ~[slick_2.11-3.0.0.jar:na]
at slick.jdbc.JdbcBackend$DatabaseFactoryDef$class.forConfig(JdbcBackend.scala:223) ~[slick_2.11-3.0.0.jar:na]
at slick.jdbc.JdbcBackend$$anon$3.forConfig(JdbcBackend.scala:33) ~[slick_2.11-3.0.0.jar:na]
...
Caused by: java.lang.IllegalAccessException: Class com.zaxxer.hikari.AbstractHikariConfig can not access a member of class com.github.tminglei.MyPostgresDriver$ with modifiers "private"
at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:109) ~[na:1.7.0_79]
at java.lang.Class.newInstance(Class.java:373) ~[na:1.7.0_79]
at com.zaxxer.hikari.AbstractHikariConfig.setDriverClassName(AbstractHikariConfig.java:366) ~[HikariCP-java6-2.3.8.jar:na]
... 43 common frames omitted
HikariCP is the default connection pool in slick 3.0.0
I have defined the driver class much like in the example:
trait MyPostgresDriver extends ExPostgresDriver with PgArraySupport
    with PgEnumSupport
    with PgRangeSupport
    with PgHStoreSupport
    with PgSearchSupport {

  override val api = new MyAPI {}

  //////
  trait MyAPI extends API
    with ArrayImplicits
    with RangeImplicits
    with HStoreImplicits
    with SearchImplicits
    with SearchAssistants
}

object MyPostgresDriver extends MyPostgresDriver
My database config is pretty straightforward [excerpt of typesafe config follows]:
slick.dbs.default {
  driver = "com.github.tminglei.MyPostgresDriver$"
  db {
    driver = "org.postgresql.Driver"
    url = "jdbc:postgresql://hostname:port/dbname"
    user = user
    password = "pass"
  }
}
It does not seem as if it should fail, and yet it does...
Should I change my driver class somehow? Is it something else?
Note: as can be seen in the stacktrace I am using
Java 1.7.0_79
Scala 2.11.7
akka 2.3.11 (I share the config instance for slick and akka)
slick 3.0.0
HikariCP-java6 2.3.8
tminglei's slick-pg_core 0.9.0
Lastly, when debugging through the JDK code at Class.class (decompiled line 143)
Constructor tmpConstructor1 = this.cachedConstructor;
I get the following (toString'ed) value (as shown by IntelliJ):
private com.github.tminglei.MyPostgresDriver$()
Could this be indicative of the problem? If so, how should I fix it?
EDIT
I have replaced the custom driver configuration with the stock PostgresDriver like so:
slick.dbs.default {
  driver = "slick.driver.PostgresDriver$"
  db {
    driver = "org.postgresql.Driver"
    url = "jdbc:postgresql://hostname:port/dbname"
    user = user
    password = "pass"
  }
}
The error is the same:
akka.actor.ActorInitializationException: exception during creation
...
Caused by: java.lang.RuntimeException: driverClassName specified class 'slick.driver.PostgresDriver$' could not be loaded
...
Caused by: java.lang.IllegalAccessException: Class com.zaxxer.hikari.AbstractHikariConfig can not access a member of class slick.driver.PostgresDriver$ with modifiers "private"
I had a similar problem.
I think you are using Database.forConfig("slick.dbs.default") but your config file is in DatabaseConfig format.
Instead, try using:
val dbConfig: DatabaseConfig[PostgresDriver] = DatabaseConfig.forConfig("slick.dbs.default")
val db = dbConfig.db
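Spelled out with imports, and using the custom driver from the question, the pattern looks roughly like this (a sketch: slick.backend.DatabaseConfig is the Slick 3.0.x location, and the MyPostgresDriver import assumes the package given in your config):

import slick.backend.DatabaseConfig
import com.github.tminglei.MyPostgresDriver

// DatabaseConfig reads slick.dbs.default.driver as a Slick driver object
// ("com.github.tminglei.MyPostgresDriver$") and slick.dbs.default.db.driver
// as the JDBC driver class. Database.forConfig, by contrast, hands the Slick
// driver name to HikariCP as a JDBC driverClassName, which is what produces
// the IllegalAccessException in the stack trace above.
val dbConfig: DatabaseConfig[MyPostgresDriver] =
  DatabaseConfig.forConfig("slick.dbs.default")
val db = dbConfig.db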

shark/spark throws NPE when querying a table

The development part of the shark/spark wiki is really brief, so I tried to put together some code to query a table programmatically. Here it is:
object Test extends App {
  val master = "spark://localhost.localdomain:8084"
  val jobName = "scratch"
  val sparkHome = "/home/shengc/Downloads/software/spark-0.6.1"
  val executorEnvVars = Map[String, String](
    "SPARK_MEM" -> "1g",
    "SPARK_CLASSPATH" -> "",
    "HADOOP_HOME" -> "/home/shengc/Downloads/software/hadoop-0.20.205.0",
    "JAVA_HOME" -> "/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64",
    "HIVE_HOME" -> "/home/shengc/Downloads/software/hive-0.9.0-bin"
  )

  val sc = new shark.SharkContext(master, jobName, sparkHome, Nil, executorEnvVars)
  sc.sql2console("create table src")
  sc.sql2console("load data local inpath '/home/shengc/Downloads/software/hive-0.9.0-bin/examples/files/kv1.txt' into table src")
  sc.sql2console("select count(1) from src")
}
I can create table src and load data into it fine, but the last query threw an NPE and failed; here is the output:
13/01/06 17:33:20 INFO execution.SparkTask: Executing shark.execution.SparkTask
13/01/06 17:33:20 INFO shark.SharkEnv: Initializing SharkEnv
13/01/06 17:33:20 INFO execution.SparkTask: Adding jar file:///home/shengc/workspace/shark/hive/lib/hive-builtins-0.9.0.jar
java.lang.NullPointerException
at shark.execution.SparkTask$$anonfun$execute$5.apply(SparkTask.scala:58)
at shark.execution.SparkTask$$anonfun$execute$5.apply(SparkTask.scala:55)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:34)
at scala.collection.mutable.ArrayOps.foreach(ArrayOps.scala:38)
at shark.execution.SparkTask.execute(SparkTask.scala:55)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1326)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1118)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
at shark.SharkContext.sql(SharkContext.scala:58)
at shark.SharkContext.sql2console(SharkContext.scala:84)
at Test$delayedInit$body.apply(Test.scala:20)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:60)
at scala.App$$anonfun$main$1.apply(App.scala:60)
at scala.collection.LinearSeqOptimized$class.foreach(LinearSeqOptimized.scala:59)
at scala.collection.immutable.List.foreach(List.scala:76)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:30)
at scala.App$class.main(App.scala:60)
at Test$.main(Test.scala:4)
at Test.main(Test.scala)
FAILED: Execution Error, return code -101 from shark.execution.SparkTask
13/01/06 17:33:20 ERROR ql.Driver: FAILED: Execution Error, return code -101 from shark.execution.SparkTask
13/01/06 17:33:20 INFO ql.Driver: </PERFLOG method=Driver.execute start=1357511600030 end=1357511600054 duration=24>
13/01/06 17:33:20 INFO ql.Driver: <PERFLOG method=releaseLocks>
13/01/06 17:33:20 INFO ql.Driver: </PERFLOG method=releaseLocks start=1357511600054 end=1357511600054 duration=0>
However, I can query the src table by typing select * from src in the shell invoked by bin/shark-withinfo.
You might ask how about trying that SQL in the shell triggered by bin/shark-shell. Well, I cannot get into that shell. Here is the error I came across:
https://groups.google.com/forum/?fromgroups=#!topic/shark-users/glZzrUfabGc
[EDIT 1]: this NPE seems to result from SharkEnv.sc not having been set, so I added
shark.SharkEnv.sc = sc
right before any sql2console operations are executed. It then complained about a ClassNotFoundException for scala.tools.nsc, so I manually put scala-compiler on the classpath. After that, the code complained about another ClassNotFoundException, which I cannot figure out how to fix, since I did put the shark jar on the classpath.
13/01/06 18:09:34 INFO cluster.TaskSetManager: Lost TID 1 (task 1.0:1)
13/01/06 18:09:34 INFO cluster.TaskSetManager: Loss was due to java.lang.ClassNotFoundException: shark.execution.TableScanOperator$$anonfun$preprocessRdd$3
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
[EDIT 2]: OK, I figured out another piece of code that does what I want, by following exactly how shark's source code initializes the interactive REPL.
System.setProperty("MASTER", "spark://localhost.localdomain:8084")
System.setProperty("SPARK_MEM", "1g")
System.setProperty("SPARK_CLASSPATH", "")
System.setProperty("HADOOP_HOME", "/home/shengc/Downloads/software/hadoop-0.20.205.0")
System.setProperty("JAVA_HOME", "/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64")
System.setProperty("HIVE_HOME", "/home/shengc/Downloads/software/hive-0.9.0-bin")
System.setProperty("SCALA_HOME", "/home/shengc/Downloads/software/scala-2.9.2")
shark.SharkEnv.initWithSharkContext("scratch")
val sc = shark.SharkEnv.sc.asInstanceOf[shark.SharkContext]
sc.sql2console("select * from src")
This is ugly, but at least it works. Any comments on how to write a more robust piece of code are welcome!
For whoever wishes to operate on shark programmatically, note that all hive and shark jars must be on your CLASSPATH, and the Scala compiler has to be on your classpath too. The other important thing is that hadoop's conf directory should be on the classpath as well.
I believe the issue is your SharkEnv is not initialized.
I'm using shark 0.9.0 (but I believe you have to initialize SharkEnv in 0.6.1 too), and my SharkEnv is initialized in the following way:
// SharkContext
val sc = new SharkContext(master,
  jobName,
  System.getenv("SPARK_HOME"),
  Nil,
  executorEnvVar)

// Initialize SharkEnv
SharkEnv.sc = sc

// create and populate table
sc.runSql("CREATE TABLE src(key INT, value STRING)")
sc.runSql("LOAD DATA LOCAL INPATH '${env:HIVE_HOME}/examples/files/kv1.txt' INTO TABLE src")

// print result to stdout
println(sc.runSql("select * from src"))
println(sc.runSql("select count(*) from src"))
Also, try to query data from the src table without aggregate functions (comment out the line with "select count(*) ..."). I had a similar issue where a plain data query was OK but count(*) threw an exception; in my case it was fixed by adding mysql-connector-java.jar to yarn.application.classpath.