I have an application on GlassFish v2 (on Mac OS 10.5.8 and Java 1.6) that uses the JavaDB and TopLink that come with the GlassFish bundle. Everything works fine.
I have installed PostgreSQL 8.4 and the JDBC 4 driver. Both GlassFish and the Postgres server run on localhost. From NetBeans, I create a connection to a database on the Postgres server, and it works fine; I can manually create and delete tables.
I create a connection pool, resource, and persistence unit for this connection to the Postgres server. When I deploy, I get the following error:
ADM1041:Sent the event to instance:
[ResourceDeployEvent -- reference-added jdbc/jdbc/MyDatasource]
CORE5004: Resource Deployed: [jdbc:jdbc/MyDatasource].
TopLink, version: Oracle TopLink Essentials - 2.1 (Build b60e-fcs (12/23/2008))
Server: unknown
RAR5038:Unexpected exception while creating resource for pool MyConnectionPool.
Exception : Connection could not be allocated because:
FATAL: database "null" does not exist
I read that with Postgres 8.4, localhost requests are accepted by default, so I haven't changed anything in postgresql.conf.
I am missing something, but I can't see what.
Thanks in advance for any hint.
Tart
First, ensure that Mac OS X/GlassFish is really using the specified Java version (test with java -version). Then try the following:
asadmin create-jdbc-connection-pool
--datasourceclassname org.postgresql.ds.PGSimpleDataSource
--restype javax.sql.DataSource --property portNumber=5432:password=secret:user=postgres:serverName=localhost:databaseName=postgres
test-pool
and
asadmin create-jdbc-resource --connectionpoolid test-pool jdbc/Postgres
Remember to change the username, password, server, port, and database to reflect your setup. Then test the data source using:
asadmin ping-connection-pool test-pool
If this does not work, then you have misconfigured your data source.
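Once ping-connection-pool succeeds, one way to sanity-check the resource from the application side is a plain JNDI lookup. This is only a hedged sketch: it assumes the jdbc/Postgres resource created above, it must run inside code deployed to GlassFish (for example a servlet), and it avoids Java 7+ syntax since the question mentions Java 1.6.

import java.sql.Connection;
import java.sql.SQLException;
import javax.naming.InitialContext;
import javax.naming.NamingException;
import javax.sql.DataSource;

public class PoolSanityCheck {
    // Call this from a servlet or other component deployed on the server.
    public static void check() throws NamingException, SQLException {
        InitialContext ctx = new InitialContext();
        // Look up the resource created with create-jdbc-resource above.
        DataSource ds = (DataSource) ctx.lookup("jdbc/Postgres");
        Connection con = ds.getConnection();
        try {
            System.out.println("Connected to: " + con.getMetaData().getURL());
        } finally {
            con.close();
        }
    }
}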
I don't know the stack, but it sounds like you haven't specified the database name in the connection. See http://jdbc.postgresql.org/documentation/84/connect.html for a list of parameters you can/should set on the connection.
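To make that point concrete, here is a hedged standalone sketch using the driver's own PGSimpleDataSource; the databaseName property is the piece that appears to be missing from the pool when the server reports database "null". The host, port, user, password, and database below are placeholders.

import java.sql.Connection;
import java.sql.SQLException;
import org.postgresql.ds.PGSimpleDataSource;

public class DirectPostgresCheck {
    public static void main(String[] args) throws SQLException {
        PGSimpleDataSource ds = new PGSimpleDataSource();
        ds.setServerName("localhost");
        ds.setPortNumber(5432);
        // This is the property the connection pool seems to be missing;
        // leaving it unset is what leads to: FATAL: database "null" does not exist
        ds.setDatabaseName("postgres");
        ds.setUser("postgres");   // placeholder
        ds.setPassword("secret"); // placeholder
        Connection con = ds.getConnection();
        try {
            System.out.println(con.getMetaData().getDatabaseProductVersion());
        } finally {
            con.close();
        }
    }
}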
The Goal
I need to get data from a MongoDB database, updated every 15 minutes, to build into a Power BI report.
The Gear
I am connected from my Windows machine via SSH to an RHEL server (server A). This server is running the Power BI connector (mongosqld), which is connected to my MongoDB instance running on a different server (server B). I'm also running MySQL on server B. My Power BI connector is installed on server B.
Exactly where I'm at
I am following the steps listed here (and all the associated pages) and have tried everything listed short of writing a config file; the fact that things are working on mongosqld's end makes me think I don't need one... and if I can't get it working manually, having a config file won't exactly help.
https://docs.mongodb.com/bi-connector/current/connect/powerbi/
Using:
mongosqld --mongo-uri="mongodb://10.xxx.xxx.xx" --auth --mongo-username="ThisGuy" --mongo-password="test"
I successfully map the schema and see an active connection in the command window. I can also access my database from Compass using a URL with authentication enabled.
When I set up an ODBC connector, I use the IP of server A, the user and password from my URL, and port 3307. Nothing shows up in the dropdown, and when I click 'Test' I get the following message:
Connection Failed
[MongoDB][ODBC 1.4(w) Driver]Can't connect to MySQL server on '10.xxx.xxx.xxx' (10060)
I have also tried 3306, 27017, and 27015. Just to be safe, I also added firewall rules for all traffic on these ports. I've tried this many times, including (just for the hell of it, as I'm kind of new to this stuff) the IP of server B, the IP of my machine, and the credentials for MySQL: basically any combination of these things that I can think of.
In Power BI, my ODBC driver shows up, and when it is selected in the dropdown, it asks for a username and password. I have tried both the Mongo credentials and the MySQL ones; I'm not sure which I should be using.
Regardless, I get the following error inside Power BI:
Details: "ODBC: ERROR [HY000] [MySQL][ODBC 1.4(w) Driver]Can't connect to MySQL server on '10.xxx.xxx.xxx' (10061)
ERROR [HY000] [MySQL][ODBC 1.4(w) Driver]Can't connect to MySQL server on '10.xxx.xxx.xxx' (10061)"
Thoughts
I don't control either server, although I have root access; being new to this tech and this company, I am wary of screwing anything up that a co-worker will have to fix. I read in a different SO thread that downgrading the version of MySQL running on the server might fix the problem, but I don't think it will actually help, and I'm afraid I might screw up something else on the server if I do it:
The C Authentication plugin was developed against MySQL 5.7.18 Community Edition (64-bit), and tested with MySQL 5.7.18 Community Edition and the latest version of MongoDB Connector for BI. The plugin is not compatible with MySQL Server or Connector/ODBC driver version 8 and later.
https://dba.stackexchange.com/questions/219550/access-denied-when-connecting-to-mongosqld-with-mysql
Maybe the problem is that server B is listening to server A on port 3307, and there is another, unknown port (not mentioned above) that my ODBC driver should be using? I'm not sure how to test for this when I'm a step removed like this.
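Since the 10060 error is a plain TCP timeout, one way to test this from the Windows side, with ODBC and Power BI out of the loop, is a small connectivity probe. This is only a sketch; the host and the port list are placeholders for server A's IP and the ports mentioned above.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortProbe {
    public static void main(String[] args) {
        String host = "10.0.0.1";          // placeholder: server A's IP
        int[] ports = {3307, 3306, 27017}; // ports mentioned above
        for (int port : ports) {
            Socket socket = new Socket();
            try {
                socket.connect(new InetSocketAddress(host, port), 3000); // 3-second timeout
                System.out.println(host + ":" + port + " is reachable");
            } catch (IOException e) {
                System.out.println(host + ":" + port + " failed: " + e.getMessage());
            } finally {
                try { socket.close(); } catch (IOException ignored) { }
            }
        }
    }
}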
So that's it. I'm really stuck and would love some help. I am going to try the downgrade tomorrow if nothing else shakes loose, and I will keep this thread updated.
Thank you for reading
We have an application on IBM WebSphere Application Server 7.x, and it connects to a remote DB2 10.x database on z/OS. For the annual maintenance operation, DB2 was shut down and restarted. After starting the database, we first get
com.ibm.websphere.ce.cm.StaleConnectionException
and then we get
The database manager is not able to accept new requests, has terminated all requests in progress, or has terminated this particular request due to unexpected error conditions detected at the target system. ERRORCODE=-4499, SQLSTATE=58009
The connection between WebSphere and DB2 was tested with 'Test connection' on the WAS data source. Both systems are up and running, but there is no working connection between them! There was no change in DB2, WAS, or the JDBC driver.
Update: The JDBC driver version is 4.15.134, the connection properties are the IBM WebSphere defaults, and the connection goes directly to DB2. Another problem showed up later: while the connection issue persisted, executing the query directly on z/OS's DB2 produced the same error. The query consists of a select with a join on two different tables; selecting from each table on its own is fine, but the combined query fails with ERRORCODE=-4499, SQLSTATE=58009.
Update 2
The details of the environment are: IBM WebSphere Application Server 7.0.0.45, DB2 10.1, Java 1.6 SR16, and z/OS 1.13.
This specific query gets the error in every environment: from any application server, from z/OS SPUFI, and from database viewers such as DBeaver.
Any help is greatly appreciated.
Finally, we found the solution: we ran REORG and RUNSTATS on both tables and on all their partitions, and the error vanished both in the application and in SPUFI. I guess something went wrong during the restart and the tables were corrupted. Now everything is OK.
If I understood you correctly, you are saying the driver is unable to re-establish its database connections after the DB2 for z/OS restart.
If so, have you tried setting the corresponding connection properties described at the following link?
Configuration of Sysplex workload balancing and automatic client reroute for Java clients
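For reference, a minimal sketch of what those properties look like when set on a connection (in WebSphere they would normally be entered as custom properties on the data source rather than in code). The property names are the ones IBM documents for the JCC driver; verify them against your driver level (4.15.134 here). The host, port, location name, alternate server, and credentials below are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;

public class Db2ReroutePropertiesExample {
    public static void main(String[] args) throws SQLException {
        Properties props = new Properties();
        props.setProperty("user", "dbuser");         // placeholder
        props.setProperty("password", "dbpassword"); // placeholder
        // Sysplex workload balancing and automatic client reroute settings
        props.setProperty("enableSysplexWLB", "true");
        props.setProperty("clientRerouteAlternateServerName", "db2-alt.example.com"); // placeholder
        props.setProperty("clientRerouteAlternatePortNumber", "446");                 // placeholder
        props.setProperty("maxRetriesForClientReroute", "3");
        props.setProperty("retryIntervalForClientReroute", "5");

        String url = "jdbc:db2://db2.example.com:446/LOCDB1"; // placeholder host/port/location
        Connection con = DriverManager.getConnection(url, props);
        try {
            System.out.println(con.getMetaData().getDatabaseProductVersion());
        } finally {
            con.close();
        }
    }
}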
I am using Corda Enterprise 3.1 and trying to move the vault from H2 to Oracle 12c using the database migration tool supplied. I have made the changes in node.conf using values I know work when connecting from IntelliJ. The driver is Oracle's ojdbc8.jar, which came with Oracle SQL Developer. The connection string is below, with some specifics masked. It doesn't work. Any ideas?
dataSourceClassName = oracle.jdbc.pool.OracleDataSource
dataSource.url = "jdbc:oracle:thin:#xxxxx.wellsfargo.com:1539:XXXXXX"
Here is the error I get:
-- 2018-08-07T00:04:55,757Z migration.tool.handleCommand - Exporting the current db migrations ... Failed to create datasource. Please
check that the correct JDBC driver is installed in one of the
following folders:
- /apps/team/drivers/jdbc
Caused By java.sql.SQLRecoverableException: IO Error: The Network Adapter could
not establish the connection
This issue was caused by an error in the host URL.
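For anyone hitting the same "Network Adapter could not establish the connection" message, a quick way to validate the URL outside Corda is a standalone test with the same ojdbc8.jar. The thin-driver URL must start with jdbc:oracle:thin:@ (either @host:port:SID or @//host:port/service_name), which is worth noting since the question's URL shows '#' where '@' would normally go (possibly just masking). The host, port, SID, and credentials below are placeholders.

import java.sql.Connection;
import java.sql.SQLException;
import oracle.jdbc.pool.OracleDataSource;

public class OracleUrlCheck {
    public static void main(String[] args) throws SQLException {
        OracleDataSource ds = new OracleDataSource();
        // SID form; for a service name use "jdbc:oracle:thin:@//host:1539/SERVICE"
        ds.setURL("jdbc:oracle:thin:@dbhost.example.com:1539:ORCL"); // placeholders
        ds.setUser("corda");            // placeholder
        ds.setPassword("corda_secret"); // placeholder
        Connection con = ds.getConnection();
        try {
            System.out.println(con.getMetaData().getDatabaseProductVersion());
        } finally {
            con.close();
        }
    }
}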
I have installed the DSP (DreamFactory Services Platform) locally on my MacBook Pro using Bitnami.
I have a PostgreSQL server running locally on my MacBook, which I want to connect to from the DSP.
I am able to connect to my PostgreSQL server from other applications without any problem, which essentially means there is nothing wrong with the setup.
However, when trying to connect from the DSP I get the error: "Failed to launch service "sql": CDbConnection failed to open the DB connection."
My connection string is: "pgsql:host=localhost;dbname=Pinu"
Also, the password has been entered correctly.
The port is the default, 5432. Whether or not I include it in the connection string, the connection always fails.
Even though I am adding the service as a Remote SQL DB, I know it's actually on the same local host. Not sure if that is the issue.
I also tried entering 127.0.0.1 in place of localhost, but I still see the same issue.
Any help in this regard would be highly appreciated!
After talking to you via email, it looks like the root issue here is that you haven't successfully upgraded your DSP to the latest version. We are releasing DreamFactory version 1.8 on Bitnami tomorrow, so you should upgrade to the latest version.
As far as your PostgreSQL issue let's explore some options:
1) Connection strings:
a) pgsql:host=localhost;dbname=Pinu
b) pgsql:host=localhost:5432;dbname=Pinu
c) pgsql:host=localhost;port=5432;dbname=Pinu
2) If these don't work, try substituting 127.0.0.1 for localhost (as you've tried previously, but test this in all scenarios).
3) pgsql:host=localhost;port=5432;dbname=series1;schema=schema_name_here
Typically, Option A should work without a problem.
Give these a try if you would, and if you need some help upgrading then reach out to me again via email.
--Thanks,
Mark
I have Tomcat and PostgreSQL installed on a server. I'm having a connection problem trying to connect from my servlet to the PostgreSQL database using a c3p0 pool.
I can reach the DB if I run Tomcat locally on my laptop. I can also connect from the server to the DB using psql (i.e. the command-line SQL utility). But when I deploy my servlet to the server and try to establish a connection, I get the following error:
java.sql.SQLException: Connections could not be acquired from the underlying database!
com.mchange.v2.sql.SqlUtils.toSQLException(SqlUtils.java:106)
...
com.mchange.v2.resourcepool.CannotAcquireResourceException: A ResourcePool could not acquire a resource from its primary factory or source.
com.mchange.v2.resourcepool.BasicResourcePool.awaitAvailable(BasicResourcePool.java:1319)
com.mchange.v2.resourcepool.BasicResourcePool.prelimCheckoutResource(BasicResourcePool.java:557)
com.mchange.v2.resourcepool.BasicResourcePool.checkoutResource(BasicResourcePool.java:477)
What should I check to locate the problem? It should be a trivial issue, but maybe because it's 4 a.m. I'm missing something :) Thanks in advance!
PS: Connections from all network interfaces are allowed to the database. The PostgreSQL JDBC driver and c3p0 pool are distributed in the WAR. The Tomcat configuration is pretty much the default. JNDI is not used.
You need to check a few things:
- the java.policy that Tomcat is using (e.g. /etc/tomcat5.5/policy.d/02debian.policy)
- the database server settings (e.g. /etc/postgresql/pg_hba.conf)
- try connecting without the pool first; in my case c3p0 was hiding important information from me (see the sketch below)
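Expanding on the third point, a hedged sketch of what "connecting without the pool" can look like: a plain DriverManager connection using the same URL and credentials the c3p0 pool is configured with (the values below are placeholders). If this fails, the exception usually names the real problem that the pool's CannotAcquireResourceException hides.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class PlainJdbcCheck {
    public static void main(String[] args) throws SQLException {
        // Use exactly the same values the c3p0 pool is configured with (placeholders here).
        String url = "jdbc:postgresql://dbhost.example.com:5432/mydb";
        Connection con = DriverManager.getConnection(url, "dbuser", "dbpassword");
        try {
            System.out.println("Connected: " + con.getMetaData().getURL());
        } finally {
            con.close();
        }
    }
}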
Adding to @Alexey's answer, I had this issue with Tomcat and PostgreSQL 9.4. In my case, the md5 authentication method in Postgres was causing the issue.
If you are using a Windows server or an RHEL server, make sure you update the authentication method in the pg_hba.conf file. Change it to trust and restart PostgreSQL.