Trying to do integration testing using dockerized Postgres
12:49:19.647 [main] ERROR org.testcontainers.dockerclient.EnvironmentAndSystemPropertyClientProviderStrategy - ping failed with configuration Environment variables, system properties and defaults. Resolved:
dockerHost=unix:///var/run/docker.sock
apiVersion='{UNKNOWN_VERSION}'
registryUrl='https://index.docker.io/v1/'
registryUsername='aequalis'
registryPassword='null'
registryEmail='null'
dockerConfig='DefaultDockerClientConfig[dockerHost=unix:///var/run/docker.sock,registryUsername=aequalis,registryPassword=<null>,registryEmail=<null>,registryUrl=https://index.docker.io/v1/,dockerConfigPath=/home/aequalis/.docker,sslConfig=<null>,apiVersion={UNKNOWN_VERSION},dockerConfig=<null>]'
due to org.rnorth.ducttape.TimeoutException: Timeout waiting for result with exception
org.rnorth.ducttape.TimeoutException: Timeout waiting for result with exception
at org.rnorth.ducttape.unreliables.Unreliables.retryUntilSuccess(Unreliables.java:51)
at org.testcontainers.dockerclient.DockerClientProviderStrategy.ping(DockerClientProviderStrategy.java:190)
at org.testcontainers.dockerclient.EnvironmentAndSystemPropertyClientProviderStrategy.test(EnvironmentAndSystemPropertyClientProviderStrategy.java:42)
at org.testcontainers.dockerclient.DockerClientProviderStrategy.lambda$getFirstValidStrategy$2(DockerClientProviderStrategy.java:113)
at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
at org.testcontainers.dockerclient.DockerClientProviderStrategy.getFirstValidStrategy(DockerClientProviderStrategy.java:148)
at org.testcontainers.DockerClientFactory.client(DockerClientFactory.java:105)
at org.testcontainers.containers.GenericContainer.<init>(GenericContainer.java:142)
at org.testcontainers.containers.JdbcDatabaseContainer.<init>(JdbcDatabaseContainer.java:45)
at org.testcontainers.containers.PostgreSQLContainer.<init>(PostgreSQLContainer.java:30)
at com.lava.configuration.management.activity.AbstractIntegrationTest.<clinit>(AbstractIntegrationTest.java:21)
at sun.misc.Unsafe.ensureClassInitialized(Native Method)
at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:156)
at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1088)
at java.lang.reflect.Field.getFieldAccessor(Field.java:1069)
at java.lang.reflect.Field.get(Field.java:393)
at org.junit.runners.model.FrameworkField.get(FrameworkField.java:73)
at org.junit.runners.model.TestClass.getAnnotatedFieldValues(TestClass.java:230)
at org.junit.runners.ParentRunner.classRules(ParentRunner.java:255)
at org.junit.runners.ParentRunner.withClassRules(ParentRunner.java:244)
at org.junit.runners.ParentRunner.classBlock(ParentRunner.java:194)
at org.junit.runners.ParentRunner.run(ParentRunner.java:362)
at org.rnorth.ducttape.unreliables.Unreliables.lambda$retryUntilSuccess$0(Unreliables.java:41)
at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
at java.util.concurrent.FutureTask.run(FutureTask.java)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.sun.jna.LastErrorException: [13] Permission denied
at org.testcontainers.shaded.org.scalasbt.ipcsocket.UnixDomainSocketLibrary.connect(Native Method)
at org.testcontainers.shaded.org.scalasbt.ipcsocket.UnixDomainSocket.<init>(UnixDomainSocket.java:57)
... 36 common frames omitted
The above error is thrown while connecting to the dockerized Postgres for an integration test. Below is the configuration code used to connect. It seems like a permission issue when connecting to the Docker socket.
@RunWith(SpringJUnit4ClassRunner.class)
@SpringBootTest(classes = LibraryConfigurationApplication.class, webEnvironment = WebEnvironment.RANDOM_PORT)
@ContextConfiguration(initializers = AbstractIntegrationTest.Initializer.class)
public abstract class AbstractIntegrationTest {

    @ClassRule
    public static PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer("kartoza/postgis:12.0")
            .withDatabaseName("integration-tests-db")
            .withUsername("docker")
            .withPassword("docker");

    static class Initializer
            implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        public void initialize(ConfigurableApplicationContext configurableApplicationContext) {
            TestPropertyValues.of(
                    "spring.datasource.url=" + postgreSQLContainer.getJdbcUrl(),
                    "spring.datasource.username=" + postgreSQLContainer.getUsername(),
                    "spring.datasource.password=" + postgreSQLContainer.getPassword()
            ).applyTo(configurableApplicationContext.getEnvironment());
        }
    }
}
Please help me fix this issue.
You can try the steps linked below; I think they should solve your problem, as it is most likely a missing permission on the Docker socket:
https://docs.docker.com/engine/install/linux-postinstall/
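In short, the post-install steps on that page add your user to the docker group so that non-root processes can reach /var/run/docker.sock. A minimal sketch of those commands (you need to log out and back in, or run newgrp, for the group change to take effect):

sudo groupadd docker            # create the docker group if it does not already exist
sudo usermod -aG docker $USER   # add your user to the docker group
newgrp docker                   # activate the new group membership in this shell
docker run hello-world          # verify docker now works without sudo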
I'm running Hasura GraphQL on a Docker instance (Windows machine), exposed at
http://192.168.99.100:8080/v1/graphql
I want to perform a subscription in my verticle but am getting the following exception:
com.apollographql.apollo.exception.ApolloNetworkException: Subscription failed
at com.apollographql.apollo.internal.RealApolloSubscriptionCall$SubscriptionManagerCallback.onNetworkError(RealApolloSubscriptionCall.java:246)
at com.apollographql.apollo.internal.subscription.RealSubscriptionManager$SubscriptionRecord.notifyOnNetworkError(RealSubscriptionManager.java:524)
at com.apollographql.apollo.internal.subscription.RealSubscriptionManager.onTransportFailure(RealSubscriptionManager.java:296)
at com.apollographql.apollo.internal.subscription.RealSubscriptionManager$SubscriptionTransportCallback$2.run(RealSubscriptionManager.java:556)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
at java.net.SocketInputStream.read(SocketInputStream.java:170)
at java.net.SocketInputStream.read(SocketInputStream.java:141)
Here's my sample code for the Apollo GraphQL client:
public class MyFirstVerticle extends AbstractVerticle {
    @Override
    public void start(Future<Void> fut) {
        OkHttpClient okHttp = new OkHttpClient().newBuilder().build();
        ApolloClient subscribe = ApolloClient.builder()
                .serverUrl("http://192.168.99.100:8080/v1/graphql")
                .subscriptionTransportFactory(new WebSocketSubscriptionTransport.Factory(
                        "wss://192.168.99.100:8080/v1/graphql", okHttp))
                .build();
        subscribe.subscribe(new MySubscription(1)).execute(new ApolloSubscriptionCall.Callback<Optional<MySubscription.Data>>() {
            .....
            ....
I found the solution; apparently it has to do with the type of connection you make. I'm not an expert on this, but have you tried changing wss:// to ws://? Your endpoint is served over plain HTTP, so the WebSocket handshake should use the non-TLS ws:// scheme.
:-)
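For reference, here is a minimal sketch of the corrected client setup, identical to the code in the question except for the subscription URL scheme:

OkHttpClient okHttp = new OkHttpClient().newBuilder().build();
ApolloClient subscribe = ApolloClient.builder()
        .serverUrl("http://192.168.99.100:8080/v1/graphql")
        // ws:// rather than wss://, to match the plain-HTTP endpoint
        .subscriptionTransportFactory(new WebSocketSubscriptionTransport.Factory(
                "ws://192.168.99.100:8080/v1/graphql", okHttp))
        .build();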
I am trying to use the ConfigurableMongoDbMessageStore as a message-store in my spring-integration aggregator component. For some reason the following exception is thrown:
Caused by: org.springframework.data.mapping.model.MappingInstantiationException: Failed to instantiate org.springframework.messaging.support.GenericMessage using constructor NO_CONSTRUCTOR with arguments
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:67)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:84)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:272)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:245)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.readValue(MappingMongoConverter.java:1491)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$MongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1389)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$AssociationAwareMongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1438)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter$AssociationAwareMongoDbPropertyValueProvider.getPropertyValue(MappingMongoConverter.java:1401)
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:71)
at org.springframework.data.mapping.model.SpELExpressionParameterValueProvider.getParameterValue(SpELExpressionParameterValueProvider.java:49)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator$EntityInstantiatorAdapter.extractInvocationArguments(ClassGeneratingEntityInstantiator.java:250)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator$EntityInstantiatorAdapter.createInstance(ClassGeneratingEntityInstantiator.java:223)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:84)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:272)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:245)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:194)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:190)
at org.springframework.data.mongodb.core.convert.MappingMongoConverter.read(MappingMongoConverter.java:78)
at org.springframework.data.mongodb.core.MongoTemplate$ReadDocumentCallback.doWith(MongoTemplate.java:3017)
at org.springframework.data.mongodb.core.MongoTemplate.executeFindMultiInternal(MongoTemplate.java:2673)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:2404)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:2387)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:823)
at org.springframework.data.mongodb.core.MongoTemplate.findOne(MongoTemplate.java:772)
at org.springframework.integration.mongodb.store.ConfigurableMongoDbMessageStore.getMessageGroup(ConfigurableMongoDbMessageStore.java:115)
at org.springframework.integration.mongodb.store.ConfigurableMongoDbMessageStore.addMessageToGroup(ConfigurableMongoDbMessageStore.java:138)
at org.springframework.integration.aggregator.AbstractCorrelatingMessageHandler.store(AbstractCorrelatingMessageHandler.java:757)
at org.springframework.integration.aggregator.AbstractCorrelatingMessageHandler.handleMessageInternal(AbstractCorrelatingMessageHandler.java:479)
at org.springframework.integration.handler.AbstractMessageHandler.handleMessage(AbstractMessageHandler.java:162)
... 83 common frames omitted
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.messaging.support.GenericMessage]: No default constructor found; nested exception is java.lang.NoSuchMethodException: org.springframework.messaging.support.GenericMessage.<init>()
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:129)
at org.springframework.data.convert.ReflectionEntityInstantiator.createInstance(ReflectionEntityInstantiator.java:64)
... 111 common frames omitted
Caused by: java.lang.NoSuchMethodException: org.springframework.messaging.support.GenericMessage.<init>()
at java.base/java.lang.Class.getConstructor0(Class.java:3350)
at java.base/java.lang.Class.getDeclaredConstructor(Class.java:2554)
at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:122)
... 112 common frames omitted
The bean is created and used as follows:
In SpringIntegrationBeans.java
@Autowired
private MongoTemplate mongoTemplate;

@Bean(name = "configurableMongoDbMessageStore")
public ConfigurableMongoDbMessageStore configurableMongoDbMessageStore() {
    return new ConfigurableMongoDbMessageStore(mongoTemplate);
}
In spring-integration.xml
<int:aggregator id="myAggregator"
ref="testingAggregator"
message-store="configurableMongoDbMessageStore"/>
I am using the following spring-integration-mongodb package:
<dependency>
<groupId>org.springframework.integration</groupId>
<artifactId>spring-integration-mongodb</artifactId>
<version>5.1.2.RELEASE</version>
</dependency>
I found a similar Q&A in Spring Integration Aggregator with MongoDbMessageStore: Failed to instantiate GenericMessage: No default constructor found, but that seems to apply to MongoDbMessageStore rather than ConfigurableMongoDbMessageStore.
I would appreciate any help/advice.
The MongoTemplate needs a suitably configured converter; GenericMessage has no default constructor (as your stack trace shows), so the default mapping converter cannot instantiate it. Inject the db factory instead and the store will create an appropriately configured template.
@Autowired
private MongoDbFactory dbFactory;

@Bean(name = "configurableMongoDbMessageStore")
public ConfigurableMongoDbMessageStore configurableMongoDbMessageStore() {
    return new ConfigurableMongoDbMessageStore(dbFactory);
}
When I run my pipeline from my local machine, I can update the table that resides in the Cloud SQL instance. But when I moved it to run using the DataflowRunner, it fails with the exception below.
To connect from Eclipse, I created the data source configuration as
.create("com.mysql.jdbc.Driver", "jdbc:mysql://<ip of sql instance>:3306/mydb")
which I changed to
.create("com.mysql.jdbc.GoogleDriver", "jdbc:google:mysql://<project-id>:<instance-name>/my-db")
when running through the Dataflow runner.
Should I prefix the zone information of the instance to the instance name?
The exception I get when I run this is given below:
Jun 22, 2017 6:53:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2017-06-22T13:23:51.583Z: (840be37ab35d3d0d): Starting 2 workers in us-central1-f...
Jun 22, 2017 6:53:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2017-06-22T13:23:51.634Z: (dabfae1dc9365d10): Executing operation JdbcIO.Read/Create.Values/Read(CreateSource)+JdbcIO.Read/ParDo(Read)+JdbcIO.Read/ParDo(Anonymous)+JdbcIO.Read/GroupByKey/Reify+JdbcIO.Read/GroupByKey/Write
Jun 22, 2017 6:54:49 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2017-06-22T13:24:44.762Z: (21395b94f8bf7f61): Workers have started successfully.
SEVERE: 2017-06-22T13:25:30.214Z: (3b988386f963503e): java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot load JDBC driver class 'com.mysql.jdbc.GoogleDriver'
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:289)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:261)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:55)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:43)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:78)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:152)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.doWork(DataflowWorker.java:272)
at com.google.cloud.dataflow.worker.runners.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:244)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:125)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:105)
at com.google.cloud.dataflow.worker.runners.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:92)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: java.sql.SQLException: Cannot load JDBC driver class 'com.mysql.jdbc.GoogleDriver'
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at org.apache.beam.sdk.io.jdbc.JdbcIO$Read$ReadFn$auxiliary$M7MKjX9p.invokeSetup(Unknown Source)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:65)
at com.google.cloud.dataflow.worker.runners.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:47)
at com.google.cloud.dataflow.worker.runners.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:100)
at com.google.cloud.dataflow.worker.runners.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:70)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:365)
at com.google.cloud.dataflow.worker.runners.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:278)
... 14 more
Any help fixing this is really appreciated. This is my first attempt at running a Beam pipeline as a Dataflow job.
PipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
((DataflowPipelineOptions) options).setNumWorkers(2);
((DataflowPipelineOptions) options).setProject("xxxxx");
((DataflowPipelineOptions) options).setStagingLocation("gs://xxxx/staging");
((DataflowPipelineOptions) options).setRunner(DataflowRunner.class);
((DataflowPipelineOptions) options).setStreaming(false);
options.setTempLocation("gs://xxxx/tempbucket");
options.setJobName("sqlpipeline");

PCollection<Account> collection = dataflowPipeline.apply(JdbcIO.<Account>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration
                .create("com.mysql.jdbc.GoogleDriver", "jdbc:google:mysql://project-id:testdb/db")
                .withUsername("root").withPassword("root"))
        .withQuery(
                "select account_id,account_parent,account_description,account_type,account_rollup,Custom_Members from account")
        .withCoder(AvroCoder.of(Account.class))
        .withStatementPreparator(new JdbcIO.StatementPreparator() {
            public void setParameters(PreparedStatement preparedStatement) throws Exception {
                preparedStatement.setFetchSize(1);
                preparedStatement.setFetchDirection(ResultSet.FETCH_FORWARD);
            }
        })
        .withRowMapper(new JdbcIO.RowMapper<Account>() {
            public Account mapRow(ResultSet resultSet) throws Exception {
                Account account = new Account();
                account.setAccount_id(resultSet.getInt("account_id"));
                account.setAccount_parent(resultSet.getInt("account_parent"));
                account.setAccount_description(resultSet.getString("account_description"));
                account.setAccount_type(resultSet.getString("account_type"));
                // read these from the result set rather than assigning the literal column names
                account.setAccount_rollup(resultSet.getString("account_rollup"));
                account.setCustom_Members(resultSet.getString("Custom_Members"));
                return account;
            }
        }));
Have you properly pulled in the com.google.cloud.sql/mysql-socket-factory Maven dependency? It looks like you are failing to load the class.
https://cloud.google.com/appengine/docs/standard/java/cloud-sql/#Java_Connect_to_your_database
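If it is missing, the dependency looks roughly like this (the version here is illustrative; check for the current release):

<dependency>
    <groupId>com.google.cloud.sql</groupId>
    <artifactId>mysql-socket-factory</artifactId>
    <!-- illustrative version; use the latest release -->
    <version>1.0.0</version>
</dependency>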
Hi, I think it's better to go with "com.mysql.jdbc.Driver", because the Google driver is intended for App Engine deployments.
This is what my pipeline configuration looks like, and it works perfectly fine for me:
PCollection<KV<Double, Double>> exchangeRates = p.apply(JdbcIO.<KV<Double, Double>>read()
        .withDataSourceConfiguration(JdbcIO.DataSourceConfiguration.create("com.mysql.jdbc.Driver",
                "jdbc:mysql://ip:3306/dbname?user=root&password=root&useUnicode=true&characterEncoding=UTF-8"))
        .withQuery(
                "SELECT PERIOD_YEAR, PERIOD_YEAR FROM SALE")
        .withCoder(KvCoder.of(DoubleCoder.of(), DoubleCoder.of()))
        .withRowMapper(new JdbcIO.RowMapper<KV<Double, Double>>() {
            @Override
            public KV<Double, Double> mapRow(java.sql.ResultSet resultSet) throws Exception {
                LOG.info(resultSet.getDouble(1) + " Came");
                return KV.of(resultSet.getDouble(1), resultSet.getDouble(2));
            }
        }));
Hope it will help
I want to load data into Titan, whose backend is HBase, so I changed the example from https://github.com/thinkaurelius/titan/blob/titan10/titan-core/src/main/java/com/thinkaurelius/titan/example/GraphOfTheGodsFactory.java
a little. But many exceptions occurred when I connected.
Exception in thread "main" com.thinkaurelius.titan.core.TitanException: Could not open global configuration
at com.thinkaurelius.titan.diskstorage.Backend.getStandaloneGlobalConfiguration(Backend.java:451)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1322)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:94)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:84)
at com.thinkaurelius.titan.core.TitanFactory$Builder.open(TitanFactory.java:139)
at titantest.TitanLoad.main(TitanLoad.java:106)
Caused by: com.thinkaurelius.titan.diskstorage.TemporaryBackendException: Temporary failure in storage backend
at com.thinkaurelius.titan.diskstorage.hbase.HBaseStoreManager.ensureTableExists(HBaseStoreManager.java:759)
at com.thinkaurelius.titan.diskstorage.hbase.HBaseStoreManager.ensureColumnFamilyExists(HBaseStoreManager.java:831)
at com.thinkaurelius.titan.diskstorage.hbase.HBaseStoreManager.openDatabase(HBaseStoreManager.java:456)
at com.thinkaurelius.titan.diskstorage.keycolumnvalue.KeyColumnValueStoreManager.openDatabase(KeyColumnValueStoreManager.java:29)
at com.thinkaurelius.titan.diskstorage.Backend.getStandaloneGlobalConfiguration(Backend.java:449)
... 5 more
Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.<init>()V from class org.apache.hadoop.hbase.zookeeper.MetaTableLocator
at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:229)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:202)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:299)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:278)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:140)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:135)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:845)
at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:600)
at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:364)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:281)
at com.thinkaurelius.titan.diskstorage.hbase.HBaseAdmin1_0.tableExists(HBaseAdmin1_0.java:70)
at com.thinkaurelius.titan.diskstorage.hbase.HBaseStoreManager.ensureTableExists(HBaseStoreManager.java:753)
... 9 more
Caused by: java.lang.IllegalAccessError: tried to access method com.google.common.base.Stopwatch.<init>()V from class org.apache.hadoop.hbase.zookeeper.MetaTableLocator
at org.apache.hadoop.hbase.zookeeper.MetaTableLocator.blockUntilAvailable(MetaTableLocator.java:434)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getMetaRegionLocation(ZooKeeperRegistry.java:60)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateMeta(ConnectionManager.java:1139)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1106)
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:293)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:147)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:56)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
... 19 more
Here is the code:
TitanGraph g = TitanFactory.build()
        .set("storage.backend", "hbase")
        .set("storage.hostname", "192.168.200.121,192.168.200.115")
        .set("storage.hbase.table", "mytitan")
        .set("storage.hbase.ext.zookeeper.znode.parent", "/hbase-unsecure")
        .set("storage.port", 2181)
        .open();
BTW, I can connect to the HBase cluster directly; it looks fine.
I am using Java 8, Hibernate 5.1, and Eclipse. It works fine from plain Java code, but when I set a proxy in Java code, or pass it as a VM argument in Eclipse, to connect to the AWS cloud, the Hibernate connection fails.
Java code in which I set the proxy:
System.setProperty("java.net.useSystemProxies", "true");
System.setProperty("http.proxyHost", "comproxy.net");
System.setProperty("http.proxyPort", "80");
System.setProperty("http.nonProxyHosts", "localhost|127.0.0.1");
System.setProperty("http.proxyUser", "tm_user");
System.setProperty("http.proxyPassword", "password#135");
Or as VM arguments:
-Djava.net.useSystemProxies=true
-Dhttp.proxyHost=comproxy.is
-Dhttp.proxyPort=80
-Dhttp.nonProxyHosts=localhost
-Dhttp.proxyUser=tm_user
-Dhttp.proxyPassword=pwd#135
Exception
INFO: HHH000115: Hibernate connection pool size: 10 (min=1)
Exception in thread "main" org.hibernate.service.spi.ServiceException: Unable to create requested service [org.hibernate.engine.jdbc.env.spi.JdbcEnvironment]
at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:244)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:208)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:189)
at org.hibernate.engine.jdbc.internal.JdbcServicesImpl.configure(JdbcServicesImpl.java:51)
at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.configureService(StandardServiceRegistryImpl.java:94)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:217)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:189)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.handleTypes(MetadataBuildingProcess.java:352)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.complete(MetadataBuildingProcess.java:111)
at org.hibernate.boot.model.process.spi.MetadataBuildingProcess.build(MetadataBuildingProcess.java:83)
at org.hibernate.boot.internal.MetadataBuilderImpl.build(MetadataBuilderImpl.java:418)
at org.hibernate.boot.internal.MetadataBuilderImpl.build(MetadataBuilderImpl.java:87)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:692)
at com.singtel.mi.database.MIQueueDaoImpl.getSessionFactory(MIQueueDaoImpl.java:67)
at com.singtel.mi.database.MIQueueDaoImpl.openCurrentSession(MIQueueDaoImpl.java:44)
at com.singtel.mi.service.MessageInboxService.findAllCampaign(MessageInboxService.java:66)
at com.singtel.mi.service.APICallService.postJSONDatatoAWS(APICallService.java:25)
at com.singtel.mi.service.TestMessageInbox.main(TestMessageInbox.java:8)
Caused by: org.hibernate.exception.JDBCConnectionException: Error calling Driver#connect
at org.hibernate.engine.jdbc.connections.internal.BasicConnectionCreator$1$1.convert(BasicConnectionCreator.java:105)
at org.hibernate.engine.jdbc.connections.internal.BasicConnectionCreator.convertSqlException(BasicConnectionCreator.java:123)
at org.hibernate.engine.jdbc.connections.internal.DriverConnectionCreator.makeConnection(DriverConnectionCreator.java:41)
at org.hibernate.engine.jdbc.connections.internal.BasicConnectionCreator.createConnection(BasicConnectionCreator.java:58)
at org.hibernate.engine.jdbc.connections.internal.PooledConnections.addConnections(PooledConnections.java:106)
at org.hibernate.engine.jdbc.connections.internal.PooledConnections.<init>(PooledConnections.java:40)
at org.hibernate.engine.jdbc.connections.internal.PooledConnections.<init>(PooledConnections.java:19)
at org.hibernate.engine.jdbc.connections.internal.PooledConnections$Builder.build(PooledConnections.java:138)
at org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl.buildPool(DriverManagerConnectionProviderImpl.java:110)
at org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl.configure(DriverManagerConnectionProviderImpl.java:74)
at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.configureService(StandardServiceRegistryImpl.java:94)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.initializeService(AbstractServiceRegistryImpl.java:217)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.getService(AbstractServiceRegistryImpl.java:189)
at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.buildJdbcConnectionAccess(JdbcEnvironmentInitiator.java:145)
at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:66)
at org.hibernate.engine.jdbc.env.internal.JdbcEnvironmentInitiator.initiateService(JdbcEnvironmentInitiator.java:35)
at org.hibernate.boot.registry.internal.StandardServiceRegistryImpl.initiateService(StandardServiceRegistryImpl.java:88)
at org.hibernate.service.internal.AbstractServiceRegistryImpl.createService(AbstractServiceRegistryImpl.java:234)
... 17 more
Caused by: java.sql.SQLException: Io exception: The Network Adapter could not establish the connection
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:124)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:161)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:273)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:318)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:343)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:147)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:31)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:545)
at org.hibernate.engine.jdbc.connections.internal.DriverConnectionCreator.makeConnection(DriverConnectionCreator.java:38)
... 32 more
Is it failing because of the proxy settings, because of -Djava.net.useSystemProxies=true, or because of the cloud?
public class HibernateUtil {

    private final static SessionFactory sessionFactory;
    //private static final SessionFactory sessionFactory = buildSessionFactory();

    static {
        try {
            sessionFactory = new Configuration().configure("/com/singtel/mi/utils/hibernate.cfg.xml").buildSessionFactory();
        } catch (Exception e) {
            System.err.println("Session Factory creation failed ---> " + e);
            throw new ExceptionInInitializerError();
        }
    }

    public static SessionFactory getSessionFactory() {
        return sessionFactory;
    }
}
I am not able to find the root cause of this.