connect: connection timed out - postgresql

I have successfully connected to a Postgres database using the Go sql package:
...
db, err := sql.Open("postgres", connStr)
I then use the returned database to execute a (long running) query:
rows, err := db.Query(...)
And am getting the error:
dial tcp xx.xxx.xxx.xx:5432: connect: connection timed out
I have a couple of questions regarding this:
1. Why is the connection timing out?
2. Is there anything I can do to prevent it from timing out?

sql.Open() may just validate its arguments without creating a connection to
the database. To verify that the data source name is valid, call Ping.
The sql.Open() function has only created a database object; your pool is currently empty. In other words, a connection to the database has not been established yet.
You need to call db.Ping() to make sure your pool has a working connection.

Related

MongoDB server selection error due to a network timeout

I have a Go program which uses MongoDB. I run it from my home PC and I keep getting this error very often:
panic: server selection error: server selection timeout, current
topology: { Type: ReplicaSetNoPrimary, Servers: [{ Addr:
cluster0-shard-00-00.tvm1o.mongodb.net:27017, Type: Unknown, Last
error: connection() error occurred during connection handshake: dial
tcp
3.216.112.85:27017: i/o timeout }, { Addr: cluster0-shard-00-01.tvm1o.mongodb.net:27017, Type: Unknown, Last
error: connection() error occurred during connection handshake: dial
tcp 34.197.85.254:27017: i/o timeout }, { Addr:
cluster0-shard-00-02.tvm1o.mongodb.net:27017, Type: Unknown, Last
error: connection() error occurred during connection handshake: dial
tcp 18.206.5.2:27017: i/o timeout }, ] }
And this is the exact code where it breaks:
if err := clientMongo.Ping(context.TODO(), readpref.Primary()); err != nil {
panic(err)
}
I understand this is a connection timeout, but I don't understand how this can happen at all during a simple client connection. I ran a speed test and my current upload speed is 22 Mbps; I am not uploading big JSON arrays or anything. It always happens when I try to connect the client. So I would like to know whether this can be caused by my internet connection or by something on Mongo's end?
You might need to add your IP to the whitelist of MongoDB.
A few things --
1. We would need to see the complete code for creating a connection. I'm going to assume you're using exactly what is in the documentation here?
2. You should also try to connect with mongosh and Compass. If you have problems with those tools as well, then the odds are it is your Atlas cluster or your security settings on the cluster, rather than your application code.
3. That being said, about 95% of the time the issue is the IP whitelist or database users. Do you have a user created in the Database Access area of the UI that has admin/read/write access to any database? Is your IP in the whitelist?
4. If 3 is good and 2 doesn't work, free Atlas support is available via the green chat button of the MongoDB UI.

Postgresql connection with gorm dial error cannot assign requested address

I'm trying to use the new gorm v2 implementation with PostgreSQL (I use Docker both for the Go app and for Postgres). I tried to do it as shown in the gorm documentation.
That gave me the following error:
web_1 | 2020/09/19 19:25:57 /go/src/caiqueservice/main.go:36 failed
to connect to host=/tmp user=admin database=caique: dial error (dial
unix /tmp/.s.PGSQL.5432: connect: no such file or directory)
Since the documentation didn't specify host, and the error message showed it defaulting to /tmp, I set that value explicitly:
dsn := fmt.Sprintf("host=%v user=%v password=%v dbname=%v port=%v sslmode=disable",
	os.Getenv("DB_HOST"),
	os.Getenv("DB_USERNAME"),
	os.Getenv("DB_PASSWORD"),
	os.Getenv("DB_DATABASE"),
	os.Getenv("DB_PORT"),
)
db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
.env
DB_HOST=localhost
DB_PORT=5432
DB_DATABASE=caique
DB_USERNAME=admin
DB_PASSWORD=password
Doing so gives me the following error message:
web_1 | 2020/09/19 19:36:47 /go/src/caiqueservice/main.go:36 failed
to connect to host=localhost user=admin database=caique: dial error
(dial tcp [::1]:5432: connect: cannot assign requested address)
The Postgres DB is reachable via pgAdmin.
I don't know what to do next and help would be very much appreciated.
Inside a container, localhost refers to the container itself, not to the host machine. If you are using docker-compose, you should be able to connect to the postgres container (from the app container) using the postgres container name as the host name. If you are running both containers separately, you will need to connect them to the same docker network or link them.
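As a sketch of what that looks like with docker-compose (the service names web and db, the compose version, and the image tag are assumptions, not taken from the question), the app container would use the Postgres service name as its DB_HOST:

```yaml
# Hypothetical docker-compose.yml illustrating the answer above.
version: "3.8"
services:
  web:
    build: .
    environment:
      DB_HOST: db        # the Postgres service name, NOT localhost
      DB_PORT: "5432"
      DB_DATABASE: caique
      DB_USERNAME: admin
      DB_PASSWORD: password
    depends_on:
      - db
  db:
    image: postgres:13
    environment:
      POSTGRES_DB: caique
      POSTGRES_USER: admin
      POSTGRES_PASSWORD: password
```

Compose puts both services on the same network and resolves the service name db via DNS, so the app reaches Postgres without touching localhost.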

Golang PostgreSQL Error: "getaddrinfow: The specified class was not found."

I am having an issue with Go and performing standard operations on my PostgreSQL database.
I first started coding with GORM, and was getting the following error message while connecting:
dial tcp: lookup tcp/fullstack_api: getaddrinfow: The specified class was not found.
After switching to the standard "database/sql" package with the _ "github.com/lib/pq" PostgreSQL driver, connecting no longer threw this error. However, now I get this error when trying to perform any query on the connected database, which I assume GORM was doing initially.
The following code causes this error on my system:
// Connect initiates a DB connection.
func (dbConn *PostgresConnection) Connect() error {
	handle, connErr := sql.Open("postgres", dbConn.getConnectionString())
	if connErr != nil { // Does NOT cause an error
		return connErr
	}
	if pingErr := handle.Ping(); pingErr != nil { // Causes the above error
		return pingErr
	}
	dbConn.handle = handle
	return nil
}
I have checked that the PostgreSQL service is running, and the database exists.
While writing this question, I checked my connection string / env variables again.
I realized that I had a stupid copy/paste error from the day before that I hadn't validated:
Connection string: host=127.0.0.1 port=5432 port=new_database user=db_user password=XXXXXX
As you can see, there is an additional port variable that should have been the dbname. After fixing this issue, everything worked as expected.
Connection string: host=127.0.0.1 port=5432 dbname=new_database user=db_user password=XXXXXX
TLDR: Always re-validate every piece of your connection information when getting this (cryptic) error!
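A small helper along these lines can catch that class of mistake before connecting. dsnDuplicates is a hypothetical name, not part of database/sql or lib/pq, and it assumes a simple space-separated key=value string (it does not handle quoted values containing spaces):

```go
package main

import (
	"fmt"
	"strings"
)

// dsnDuplicates scans a libpq-style key=value connection string and
// returns any keys that appear more than once, such as the repeated
// "port" in the question's broken connection string.
func dsnDuplicates(dsn string) []string {
	seen := map[string]int{}
	var dups []string
	for _, field := range strings.Fields(dsn) {
		key, _, ok := strings.Cut(field, "=")
		if !ok {
			continue // not a key=value field; skip it
		}
		seen[key]++
		if seen[key] == 2 { // report each duplicated key once
			dups = append(dups, key)
		}
	}
	return dups
}

func main() {
	bad := "host=127.0.0.1 port=5432 port=new_database user=db_user password=XXXXXX"
	fmt.Println(dsnDuplicates(bad)) // reports the repeated "port" key
}
```

Running the check on a string before passing it to sql.Open turns a cryptic getaddrinfo failure into an obvious validation error.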

Delphi mORMot No connection could be made because the target machine actively refused it

I am testing the mORMot component. I compiled the standard demo "28 - Simple RESTful ORM Server", ran it, and got an error.
Code:
var
  aModel: TSQLModel;
  aProps: TSQLDBConnectionProperties;
  aRestServer: TSQLRestServerDB;
  aHttpServer: TSQLHttpServer;
begin
  // set logging abilities
  SQLite3Log.Family.Level := LOG_VERBOSE;
  //SQLite3Log.Family.EchoToConsole := LOG_VERBOSE;
  SQLite3Log.Family.PerThreadLog := ptIdentifiedInOnFile;
  // ODBC driver e.g. from http://ftp.postgresql.org/pub/odbc/versions/msi
  aProps := TODBCConnectionProperties.Create('','Driver=PostgreSQL Unicode'+
    {$ifdef CPU64}'(x64)'+{$endif}';Database=postgres;'+
    'Server=127.0.0.1;Port=5433;UID=postgres;Pwd=postgres','','');
  //readln;
  try
    // get the shared data model
    aModel := DataModel;
    // use PostgreSQL database for all tables
    VirtualTableExternalRegisterAll(aModel,aProps);
    try
      // create the main mORMot server
      aRestServer := TSQLRestServerDB.Create(aModel,':memory:',false); // authentication=false
      try
        // optionally execute all PostgreSQL requests in a single thread
        aRestServer.AcquireExecutionMode[execORMGet] := amBackgroundORMSharedThread;
        aRestServer.AcquireExecutionMode[execORMWrite] := amBackgroundORMSharedThread;
        // create tables or fields if missing
        aRestServer.CreateMissingTables;
        // serve aRestServer data over HTTP
        aHttpServer := TSQLHttpServer.Create(SERVER_PORT,[aRestServer],'+',useHttpApiRegisteringURI);
        try
          aHttpServer.AccessControlAllowOrigin := '*'; // allow cross-site AJAX queries
          writeln('Background server is running.'#10);
          write('Press [Enter] to close the server.');
          readln;
        finally
          aHttpServer.Free;
        end;
      finally
        aRestServer.Free;
      end;
    finally
      aModel.Free;
    end;
  finally
    aProps.Free;
  end;
end.
error
{"Message":"TODBCLib error: [08001] Could not connect to the server;\nNo connection could be made because the target machine actively refused it.\r\n [127.0.0.1:5433]
How can I fix it?
I just tested it with Delphi 10 Seattle and the ODBC driver from http://ftp.postgresql.org/pub/odbc/versions/msi/psqlodbc_09_03_0400.zip - with no problem.
Ensure your PostgreSQL server has been set up to listen on the same port as the source code expects. Change the 5433 value to 5432 if you used the default port number.
Being paranoid, I try to always change the default port, which reduces exposure to network scan attacks (at least fast scans). I never use port 22 for ssh, nor 5432 for PostgreSQL. Sorry for the inconvenience.

Database hangs if not used

I have a web application I am starting. It works fine upon startup, but if I leave it idle (for, say, an hour) and then hit it with another request, the query hangs. I thought about closing the connection after each query and opening a new one, but the docs explicitly say "It is rare to Close a DB, as the DB handle is meant to be long-lived and shared between many goroutines." What am I doing wrong?
package main

import (
	"database/sql"
	"log"
	"net/http"

	_ "github.com/lib/pq"
)

var Db *sql.DB

func main() {
	var err error
	Db, err = sql.Open("postgres", "user=me password=openupitsme host=my.host.not.yours dbname=mydb sslmode=require")
	if err != nil {
		log.Fatal("Cannot connect to db: ", err)
	}
	http.HandleFunc("/page", myHandler)
	http.ListenAndServe(":8080", nil)
}

func myHandler(w http.ResponseWriter, r *http.Request) {
	log.Println("Handling Request....", r)
	query := `SELECT pk FROM mytable LIMIT 1`
	rows, err := Db.Query(query)
	if err != nil {
		log.Println(err)
	}
	defer rows.Close()
	for rows.Next() {
		var pk int64
		if err := rows.Scan(&pk); err != nil {
			log.Println(err)
		}
		log.Println(pk)
	}
	log.Println("Request Served...")
}
EDIT #1:
My postgres log shows:
2015-07-08 18:10:01 EDT [7710-1] user#here LOG: could not receive data from client: Connection reset by peer
2015-07-08 18:20:01 EDT [7756-1] user#here LOG: could not receive data from client: Connection reset by peer
I have experienced similar issues. In our case, the problem was caused by a connection tracking firewall located between the client machine and the database.
Such firewalls keep track of TCP-level connections and, in order to limit resource usage, will time out connections that appear inactive to them for an extended period. The symptoms we observed in this case were very similar to yours: at the client end, the connection appears to hang, while at the server end you see connection reset by peer.
One way to prevent this is to ensure that TCP Keepalives are enabled, and that the keepalive interval is less than the timeout of the firewalls, routers, etc which are causing your connection issue. This is controlled by the libpq connection parameters keepalives, keepalives_idle, keepalives_interval and keepalives_count which you can set in the connection string. See the manual for a description of these parameters.
keepalives determines whether the keepalive function is enabled. It defaults to 1 (enabled), so you probably do not need to specify it.
keepalives_idle determines the amount of idle time before a keepalive is sent. If you do not specify it, it defaults to the operating system's default.
On a Linux system you can see the default by examining /proc/sys/net/ipv4/tcp_keepalive_time - on my server it is set to 7200 seconds, which would be too long in your case, since your observation is that the connection is dropped after ~1 hour.
You could try setting it to, say, 2500 seconds.
The Linux Documentation Project provides a useful TCP Keepalive HOWTO document that describes how they work in some detail.
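Putting those parameters together, a libpq connection string with keepalive settings might be assembled like this (the buildDSN helper and the specific values are illustrative assumptions; the keepalives_* parameter names are the documented libpq ones):

```go
package main

import "fmt"

// buildDSN appends TCP keepalive settings to a libpq connection string.
// idleSec should be comfortably below the firewall's idle timeout
// (here ~1 hour), so e.g. 2500 seconds as suggested above.
func buildDSN(base string, idleSec, intervalSec, count int) string {
	return fmt.Sprintf("%s keepalives=1 keepalives_idle=%d keepalives_interval=%d keepalives_count=%d",
		base, idleSec, intervalSec, count)
}

func main() {
	dsn := buildDSN("host=my.host.not.yours user=me dbname=mydb sslmode=require", 2500, 10, 3)
	fmt.Println(dsn)
}
```

The resulting string can be passed unchanged to sql.Open with the lib/pq driver, since lib/pq forwards unrecognized-by-it libpq parameters to the connection setup.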
Note that not all operating systems support TCP keepalives. If you are unable to enable keepalives here are some other options you might like to consider:
If it is in your control, reconfigure the firewall/router which is dropping the connection so that it will not do so for Postgresql client connections
At an application level, you might be able to send some traffic that will keep the DB handles active - for example sending a statement such as SELECT 1; every hour or so. If your programming environment provides connection caching (from the comments I gather it does) then this might be tricky.