Can't query data in pgAdmin 4 but it works using psql and Python

I have created a PostgreSQL database on RDS, and I can connect and query it using psql in the terminal or SQLAlchemy in Python. With pgAdmin 4 I can't query or visualize the data, but I can create/alter tables, import data, and so on.
I have also tried connecting it to Metabase and DBeaver, but the connection times out.
Any ideas why this is happening?
I have also connected to the database using pgAdmin 3 and was able to query and visualize the data, but since I am using PostgreSQL 11.4, it is not fully supported by pgAdmin 3.
When I open the query tool on pgAdmin 4 I get this message:
could not send data to server: Socket is not connected
could not send startup packet: Socket is not connected

What helped me: changing 'localhost' to '127.0.0.1' in the connection settings.

Disconnecting from the server and changing the connection Host to localhost under Properties fixed it for me, running pgAdmin 4 on Windows 11 and connecting to PostgreSQL in WSL2.
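If you want to sanity-check the same TCP path outside pgAdmin, here is a minimal Python sketch (all credentials are placeholders; it assumes the psycopg2 driver, which SQLAlchemy also uses for PostgreSQL):

import psycopg2

# Placeholders only. Using 127.0.0.1 instead of 'localhost' forces an
# IPv4 TCP connection, sidestepping the name resolution (IPv6 or a Unix
# socket) that the pgAdmin fixes above work around.
conn = psycopg2.connect(host="127.0.0.1", port=5432,
                        dbname="mydb", user="myuser", password="secret")
with conn.cursor() as cur:
    cur.execute("SELECT version()")
    print(cur.fetchone())
conn.close()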

Related

Why must I use an ODBC connection to connect to a Postgres database in Power BI?

Whenever I attempt to connect to my PostgreSQL database as a data source, I get an error message.
Other users have recommended using an ODBC connection. Why does an ODBC connection work rather than a native PostgreSQL connection for a PostgreSQL database in Power BI?
I had attempted to connect to my PostgreSQL database with all the correct credentials, and it failed with that error message. I had connected to the same database with an ODBC connection perfectly.
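For reference, a DSN-less psqlODBC connection string looks something like this (the exact driver name depends on the psqlODBC build you installed; every other value is a placeholder):

Driver={PostgreSQL Unicode};Server=myserver.example.com;Port=5432;Database=mydb;Uid=myuser;Pwd=secret;SSLmode=require;

As far as I know, Power BI's native PostgreSQL connector goes through Npgsql instead, which is one reason the two paths can behave differently on the same machine.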

Google Cloud SQL MySQL 5.7 denies connection when doing large import

I'm having difficulties migrating a database (a ~3 GB SQL file) from MySQL 5.6 to MySQL 5.7 on Google Cloud SQL.
First I made a dump of the MySQL 5.6 server database:
mysqldump -hxx.xx.xx.xx -uroot -pxxxx dbname --opt --hex-blob --default-character-set=utf8 --no-autocommit > dbname.sql
I then tried to import the database with cloudsql-import:
.go/bin/cloudsql-import --dump=dbname.sql --dsn='root:password@tcp(xx.xx.xx.xx:3306)/dbname'
The import starts but after a while (around 10 minutes) I receive the following error message:
2016/06/29 13:55:48 dial tcp xx.xx.xx.xx:3306: getsockopt: connection refused
Any further connection attempts to the MySQL server are denied with the following error message:
ERROR 2003 (HY000): Can't connect to MySQL server on 'xx.xx.xx.xx' (111)
Only a full restart (made from the Google Cloud Platform console) makes it possible to connect again.
I made a full migration from 5.5 to 5.6 using this method not so long ago. Any ideas why this doesn't work with 5.7?
Would you check the storage/disk usage on the Console Overview page of the instance? If the storage is full, you can increase the storage size of your instance by changing the storage size value on the Edit page.
If binary logging is enabled, a lot of space will be taken up by binary logs. You could consider turning it off while you are running the import.
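If you manage the instance with the gcloud CLI, toggling binary logging off looks roughly like this (the instance name is a placeholder, and the flag may differ across gcloud versions):

gcloud sql instances patch my-instance --no-enable-bin-log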
If you still have trouble with the instance, you can send an email to cloud-sql@google.com for further investigation. Thanks.
I tried analyzing the different rows where the import had timed out, but didn't find anything out of the ordinary. I then fiddled with the available parameters in Google Cloud SQL and in mysqldump.
I finally just tried using a better machine type (from 2 cores / 8 GB RAM to 8 cores / 30 GB RAM) and it "solved" the problem.
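For what it's worth, the machine type change can also be made from the gcloud CLI; a sketch with a placeholder instance name, assuming db-n1-standard-8 is still the 8-core / 30 GB tier:

gcloud sql instances patch my-instance --tier=db-n1-standard-8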

Connecting locally to a HerokuConnect Postgres database

I'm building a simple web app that will deploy on Heroku, using a Postgres database that is filled with an object from Salesforce. I did that with Heroku Connect, and it works.
Now I want to see the contents of the database so that I know the table names, and I already know you can't do that on Heroku itself. So I tried to connect to the database locally via heroku pg:psql, but every time I do I get the error:
---> Connecting to DATABASE_URL
DL is deprecated, please use Fiddle
psql: could not connect to server: Connection refused (0x0000274D/10061)
Is the server running on host "***.***.***.**" (**.***.***.***) and accepting
TCP/IP connections on port ****?
So that doesn't work. I tried the following, but to no avail:
explicitly specifying the name of the database
specifying host, user, database name, password, port, and sslmode=require per the user guide and this question about connecting to a Heroku PostgreSQL database (spelled out in the sketch after this list)
using the method above, but with plain psql instead of heroku pg:psql; that asked for a password for a user I never created (a user with my corporate system name, because this is a corporate laptop)
reinstalling Postgres
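For reference, the plain-psql variant of that attempt looks roughly like this (every value is a placeholder; the real ones come from heroku pg:credentials or the DATABASE_URL config var):

psql "host=<host> port=5432 dbname=<dbname> user=<user> password=<password> sslmode=require"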
Running heroku pg:info correctly lists my database information.
I am missing something, but I don't know what it is.
"DL is deprecated, please use Fiddle" is not an error but it's only a warning.
This is due to an issue with Ruby 2 on Windows.

Protocol issue connecting to Derby via DB2 ODBC

This is a problem I ran into when trying to connect to a Derby database via DB2 ODBC.
I started the Derby Network Server and created a database named "mydb", but after following the steps described in an article on IBM's site, I encountered a problem on the server side.
Step 1: db2 catalog tcpip node MYDERBY remote localhost server 1527
Step 2: db2 catalog db mydb at node MYDERBY authentication server
Step 3: db2 connect to mydb user abc using abc
Problem:
Execution failed because of a Distributed Protocol error:DRDA_Proto_SYNTAXRM;
CODPNT arg=112e;error code value=14;Plaintext connection attempt from an SSL enabled client?
What can I do to make this work, or is it simply unworkable?
Versions: db2_v10.12_winx64_expc, db-derby-10.11.1.1-bin
Current versions of the Derby Network Server do not communicate with the DB2 ODBC client, as far as I know. You might look here: http://apache-database.10148.n7.nabble.com/ODBC-Driver-for-Derby-td102587.html or here: http://www.easysoft.com/products/data_access/odbc-derby-driver/index.html#section=tab-1 or here: Derby Database ODBC Connection for some more ideas.
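One quick way to rule out a server-side problem is Derby's own ij client, which ships in the bin distribution and speaks the native client protocol over the same port. Assuming default paths, something like:

java -jar %DERBY_HOME%\lib\derbyrun.jar ij
ij> connect 'jdbc:derby://localhost:1527/mydb;user=abc;password=abc';

If that works, the Network Server is fine and the incompatibility is purely on the DB2 ODBC side.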
Note: I gave a relatively similar answer to a relatively similar question here: Derby Database ODBC Connection and it was deleted, so for all I know this answer may be deleted as well.

PostgreSQL: One database for multiple users

I have PostgreSQL 9.3. I have created a database named db1, and now I need to share it with other users on the LAN so they can connect other applications to the same database.
In SQL Server we can do this by selecting the server name along with login details.
Question:
Is it possible in PostgreSQL?
If yes, how can I do this?
What is the procedure?
You will need to modify pg_hba.conf to allow remote connections to the database. Information about pg_hba.conf can be found here.
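A minimal sketch of what that can look like (the 192.168.1.0/24 subnet is an assumption; substitute your LAN's address range):

# in postgresql.conf: listen on all interfaces, not just localhost
listen_addresses = '*'
# in pg_hba.conf: let any user reach db1 from the LAN with password auth
host    db1    all    192.168.1.0/24    md5

Changing listen_addresses requires a server restart; a pg_hba.conf change only needs a reload (pg_ctl reload or SELECT pg_reload_conf();).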
After that, you can connect programmatically with a connection string or, as in your image, with a GUI application like pgAdmin.
To connect (remotely or locally) from pgAdmin, choose File -> Add Server... and enter the connection information into the dialog box.
Your client computers will need PostgreSQL drivers as well. If you're doing this on Windows, you'll probably be using ODBC. The PostgreSQL ODBC drivers are here; info on the connection string format can be found here.