I have been given a knexfile like this:
require('dotenv').config()
module.exports = {
  client: 'pg',
  connection: process.env.DB_CONNECTION,
  pool: {
    min: 2,
    max: 10
  },
  migrations: {
    tableName: 'knex_migrations'
  }
};
The connection string I supply is:
Host=localhost;Database=heypay;Username=postgres;Password=1234
However, Knex keeps issuing the error:
password authentication failed for user "user"
Apparently, the username I have given is not "user". Moreover, I have tried hardcoding the connection string into the connection field under module.exports. That also ended in vain.
The trick is, the connection property can either be a string or an object. That's why you were able to supply an environment variable (it's a string).
The reason your original string was failing is not a Knex problem: Postgres connection strings use a slightly different format. You can take a similar approach to your first attempt, but pay attention to the key names:
host=localhost port=5432 dbname=mydb connect_timeout=10
Also note the spaces rather than semicolons. However, in my experience most people use a Postgres URI:
postgresql://[user[:password]@][netloc][:port][,...][/dbname][?param1=value1&...]
So in your example, you'd use:
module.exports = {
  client: 'pg',
  connection: 'postgresql://your_database_user:password@localhost/myapp_test',
  pool: {
    min: 2,
    max: 10
  },
  migrations: {
    tableName: 'knex_migrations'
  }
};
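Applied to the values from the question, the same URI can stay in .env (the port 5432 is an assumption; adjust it if your server listens elsewhere):
DB_CONNECTION=postgresql://postgres:1234@localhost:5432/heypay
and the original knexfile with connection: process.env.DB_CONNECTION then works unchanged.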
I was using a .NET-style connection string; the correct one would be an object in the following format:
{
  host : '127.0.0.1',
  user : 'your_database_user',
  password : 'your_database_password',
  database : 'myapp_test'
}
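For completeness, here is a minimal sketch of the original knexfile using the object form together with dotenv (the individual DB_* variable names are illustrative, not taken from the post):
require('dotenv').config()

module.exports = {
  client: 'pg',
  connection: {
    host: process.env.DB_HOST,
    user: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    database: process.env.DB_NAME
  },
  pool: {
    min: 2,
    max: 10
  },
  migrations: {
    tableName: 'knex_migrations'
  }
};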
Related
I'm trying to connect to my postgres database in Heroku with Knex.
const db = knex({
  client: "pg",
  connection: {
    connectionString: process.env.DATABASE_URL,
    ssh: true,
  },
});
process.env.DATABASE_URL is undefined, and when I instead use the connection string that I get from Heroku, it still doesn't work.
EDIT:
I fixed this issue by replacing process.env.DATABASE_URL with a
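For reference, a common pattern for Knex with Heroku Postgres is to read DATABASE_URL from the app's config vars and enable SSL on the connection; a minimal sketch (the ssl settings are an assumption about the Heroku environment, not something stated in the post):
const db = knex({
  client: "pg",
  connection: {
    connectionString: process.env.DATABASE_URL,
    ssl: { rejectUnauthorized: false },
  },
});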
Thanks for reading my issue.
I am currently using Postgres (hobby-dev) on Heroku and run into this error every time I connect to the database:
error: Uncaught (in promise) Error: Unknown auth message code 1397113172
throw new Error(`Unknown auth message code ${code}`);
^
at Connection.handleAuth (connection.ts:197:15)
at Connection.startup (connection.ts:155:16)
at async Pool._createConnection (pool.ts:32:5)
at async pool.ts:61:7
at async Promise.all (index 0)
at async Pool._startup (pool.ts:63:25)
My application is using Deno now:
import { Pool } from "https://deno.land/x/postgres/mod.ts";
import { config } from "./config.ts";
const port = config.DB_PORT ? parseInt(config.DB_PORT || "") : undefined;
const POOL_CONNECTIONS = 20;
const dbPool = new Pool({
  port,
  hostname: config.DB_HOST,
  user: config.DB_USER,
  database: config.DB_NAME,
  password: config.DB_PASS
}, POOL_CONNECTIONS);
export { dbPool };
I have found this issue post, and it mentions that SSL is missing. I am not sure how to set that up on Heroku.
I have tried some solutions, even switching the library to pg, and it still does not work. I would appreciate any clue or help to fix this issue.
Note:
I read in the Heroku documentation that "Heroku Postgres Connection Pooling is not available for Hobby-tier databases." I then switched to using Client, with syntax similar to this, to connect to Heroku Postgres:
import { Client } from "https://deno.land/x/postgres/mod.ts";
let config;
config = {
  hostname: "localhost",
  port: 5432,
  user: "user",
  database: "test",
  applicationName: "my_custom_app"
};
// alternatively
config = "postgres://user#localhost:5432/test?application_name=my_custom_app";
const client = new Client(config);
await client.connect();
await client.end();
ref: https://deno-postgres.com/#/
In GitHub Actions, I am running a JavaScript file that connects to PostgreSQL and creates the table and extension for the database.
My script looks like this:
const { Client } = require('pg')
const pgclient = new Client({
  host: process.env.POSTGRES_HOST,
  port: process.env.POSTGRES_PORT,
  user: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB,
})
pgclient.connect()
const createDB = `
drop database mydb;
create database mydb;
\c mydb;
CREATE EXTENSION "pgcrypto";
`
pgclient.query(createDB, (err, res) => {
  if (err) throw err
  pgclient.end()
})
When I run the script, I get an error:
error: syntax error at or near "c"
which I am guessing is coming from the \c command.
How do I use PostgreSQL commands like this?
You cannot use \c here because it is a psql meta-command, and you are not running psql here; see https://www.postgresql.org/docs/current/app-psql.html.
You need to reconnect to the new DB like so:
const pgclient_mydb = new Client({
  host: process.env.POSTGRES_HOST,
  port: process.env.POSTGRES_PORT,
  user: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: 'mydb',
})
pgclient_mydb.connect()
See also https://stackoverflow.com/a/43670984/10743176
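Putting the two connections together, a minimal sketch of the whole setup script could look like this (the database name and extension are taken from the question; the "if exists" guard and the async wrapper are additions):
const { Client } = require('pg')

const baseConfig = {
  host: process.env.POSTGRES_HOST,
  port: process.env.POSTGRES_PORT,
  user: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
}

async function setup() {
  // First connection: the default database, used only to recreate mydb.
  const admin = new Client({ ...baseConfig, database: process.env.POSTGRES_DB })
  await admin.connect()
  await admin.query('drop database if exists mydb')
  await admin.query('create database mydb')
  await admin.end()

  // Second connection: the freshly created database, replacing psql's \c.
  const mydb = new Client({ ...baseConfig, database: 'mydb' })
  await mydb.connect()
  await mydb.query('CREATE EXTENSION "pgcrypto"')
  await mydb.end()
}

setup().catch((err) => {
  console.error(err)
  process.exit(1)
})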
I am using golang in my application server and GORM as the ORM. I am using PostgreSQL as the database in Google Cloud SQL.
I created 2 read replicas for Postgres, which are used by the application server.
Previously, I used Node.js and Sequelize, and there I was able to define the read replicas as:
replication: {
  read: [
    { host: '8.8.8.8', username: 'anotherusernamethanroot', password: 'lolcats!' },
    { host: 'localhost', username: 'root', password: null }
  ],
  write: { host: 'localhost', username: 'root', password: null }
}
However, for GORM I don't see any way to do that in the documentation.
So, is there a way that I can define read replicas and have GORM take care of them? If not, what is the best practice for this use case?
Now that GORM v2 is out, you can use the dbresolver plugin for exactly this use case. Replicating what you gave as an example would look like:
import (
    "gorm.io/gorm"
    "gorm.io/plugin/dbresolver"
    "gorm.io/driver/postgres"
)

db, err := gorm.Open(postgres.Open("host=localhost user=root"), &gorm.Config{})

db.Use(dbresolver.Register(dbresolver.Config{
    Replicas: []gorm.Dialector{
        postgres.Open("host=8.8.8.8 user=anotherusernamethanroot password=lolcats!"),
        postgres.Open("host=localhost user=root"),
    },
    Policy: dbresolver.RandomPolicy{},
}))
Check out the documentation: https://gorm.io/docs/dbresolver.html
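As a follow-up, Register returns a plugin and db.Use returns an error that is worth checking; the plugin also lets you pin individual statements to the primary or a replica via clauses such as dbresolver.Write and dbresolver.Read, as described in the linked documentation.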
I have a node-mysql pool configuration of
var db_init = {
  host : 'ip_address_of_GCS_SQL',
  user : 'user_name_of_GCS_SQL',
  password : 'password here',
  database : 'db here',
  supportBigNumbers: true,
  connectionLimit: 100
};
Pool was created using
GLOBAL.db_foobar = mysql.createPool(db_init);
I basically just left the connection open for a couple of hours, and then I saw this error reported by my connection.query request (after getConnection, of course):
prodAPI-104 (out): { status: 'Error',
prodAPI-104 (out): details: '[foobar_function]Error in query',
prodAPI-104 (out): err: '{ [Error: read ETIMEDOUT]\n code: \'ETIMEDOUT\',\n errno: \'ETIMEDOUT\',\n syscall: \'read\',\n fatal: true }',
prodAPI-104 (out): query: 'SELECT * FROM `foobar_table`;' }
Why is this happening? The MySQL instance in Google Cloud SQL didn't report any query taking too long to execute, so I don't know why this happened.
I suspect the reason is that keepalive is not enabled on the connection to the MySQL server.
node-mysql does not have an option to enable keepalive and neither does node-mysql2, but node-mysql2 provides a way to supply a custom function for creating sockets which we can use to enable keepalive:
var mysql = require('mysql2');
var net = require('net');

var pool = mysql.createPool({
  connectionLimit : 100,
  host : '123.123.123.123',
  user : 'foo',
  password : 'bar',
  database : 'baz',
  // create the socket ourselves so we can enable TCP keepalive on it
  stream : function(opts) {
    var socket = net.connect(opts.config.port, opts.config.host);
    socket.setKeepAlive(true);
    return socket;
  }
});
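If the connection can sit idle for hours, as in the question, note that Node's socket.setKeepAlive also accepts an optional second argument, the initial delay in milliseconds before the first keepalive probe is sent, e.g. socket.setKeepAlive(true, 10000) inside the stream callback above.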