Node postgres - (node:43028) UnhandledPromiseRejectionWarning: error: sorry, too many clients already - postgresql

I am trying to add data to my database. There are multiple entries in an array:

dataArray.forEach((element) => {
  queryData(element);
});

The function is therefore called multiple times:
async function queryData(data) {
  const queryString = `INSERT............')`;
  const query = {
    text: queryString
  };
  const pool = await new pg.Pool({
    host: 'localhost',
    port: 'xxxx',
    user: 'xxxxxx',
    database: 'xxxxx',
    max: 100,
    idleTimeoutMillis: 50000,
    connectionTimeoutMillis: 3000,
  });
  await pool.connect();
  await pool.query(query);
  await pool.end();
}
It does the inserts, but it also throws the too-many-clients error. I have tried .release() and .end().
When querying the connections I get:
max_conn = 500
used = 6
res_for_super = 3
res_for_normal = 491
I don't really know what these mean, but they seem to add up to 500.
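A likely culprit: queryData constructs a brand-new Pool on every call, and each call also checks out a client via pool.connect() that is never released. The usual fix is to create one pool at module scope and let pool.query() manage the clients. A minimal sketch of that pattern (table and column names are placeholders, not from the original code):

const pg = require('pg');

// One pool for the whole process; pg checks clients in and out internally.
const pool = new pg.Pool({
  host: 'localhost',
  port: 5432,
  user: 'xxxxxx',
  database: 'xxxxx',
  max: 20,
});

async function queryData(data) {
  // pool.query() acquires a client, runs the query, and releases the client.
  await pool.query('INSERT INTO my_table (col) VALUES ($1)', [data]);
}

async function run(dataArray) {
  // Await every insert, then close the pool once at shutdown.
  await Promise.all(dataArray.map((element) => queryData(element)));
  await pool.end();
}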

Related

Typesense: Firebase Function timeout using Typesense client

I have a Firebase function that triggers when an item is updated. In most instances this function completes and updates the item in the Typesense database; however, occasionally this function fails with a timeout.
This happens seemingly randomly for this and other onCreate and onUpdate Firebase Functions using the Typesense client.
Screenshot examples (output from https://console.cloud.google.com/logs): one showing the logs when the function times out, and one showing the logs when the function behaves as expected.
Firebase Function:
// Firebase function triggered when a program is updated
export const onUpdate = functions
  // 256MB memory and timeout of 2 minutes
  .runWith({ memory: '256MB', timeoutSeconds: 120 })
  .firestore.document('programs/{docId}')
  .onCreate(async (snap, context) => {
    const id = context.params.docId;
    const data = snap.data();
    const client = useTypesense();
    const item = { ...data, id };
    console.log('Typesense: Updating Program');
    await client
      .collections('programs')
      .documents()
      .upsert(item)
      .then(() => {
        console.log(`Typesense: Program ${item.id} successfully updated!`);
      })
      .catch((error) => {
        console.error('Typesense: Error creating Program: ' + item.id, error);
      });
    return null;
  });
useTypesense() function (Initializing the client)
import Typesense from 'typesense';

const useTypesense = () => {
  const host = 'api.oniworkout.app';
  const port = '443';
  const protocol = 'https';
  const apiKey = 'b3fxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx';
  console.log(
    `Typesense: Initializing client with host: ${host}, port: ${port}, protocol: ${protocol}, apiKey: ${apiKey.substring(
      0,
      3
    )}`
  );
  const client = new Typesense.Client({
    nodes: [
      {
        host,
        port,
        protocol,
      },
    ],
    apiKey,
    connectionTimeoutSeconds: 120,
    retryIntervalSeconds: 120,
  });
  return client;
};
I have tried extending the Firebase Function timeout from 60 seconds to 120 seconds; the issue persists.
I have checked https://api.oniworkout.app/health, which returns {"ok":true}.
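One detail that may explain the random timeouts: with connectionTimeoutSeconds and retryIntervalSeconds both at 120, a single slow connection attempt plus one retry already exceeds the function's 120-second budget, so the function is killed before the client gives up. A hedged sketch of a tighter configuration, with the client created at module scope so warm invocations reuse it (the numRetries value and the environment variable are assumptions, not from the original post):

import Typesense from 'typesense';

// Module-scope client: warm Firebase invocations reuse the same instance.
const client = new Typesense.Client({
  nodes: [{ host: 'api.oniworkout.app', port: 443, protocol: 'https' }],
  apiKey: process.env.TYPESENSE_API_KEY, // assumed env var instead of a hardcoded key
  connectionTimeoutSeconds: 10, // fail fast instead of eating the whole budget
  numRetries: 3,                // illustrative value
  retryIntervalSeconds: 2,      // retries fit well inside the 120 s function limit
});

export const useTypesense = () => client;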

Minimal Sveltekit + pg integration fails with "status" error

I'm trying to get Postgres working with SvelteKit, and a very minimal example is giving me issues. This is probably a configuration thing, but the error I'm getting back from SvelteKit makes no sense to me.
I start by installing a new project:
npm create svelte@latest my-testapp
Then I install "pg" to get Postgres pooling:
npm i pg
Then I add a module under src/lib/db.js:
import { Pool } from 'pg';

const pool = new Pool({
  user: 'xxx',
  host: 'xxx',
  database: 'xxx',
  password: 'xxx',
  port: 5432,
});

export const connectToDB = async () => await pool.connect();
Finally I add src/hooks.server.js to give me access to the pool within routes:
import { connectToDB } from '$lib/db';

export const handle = async ({ event, resolve }) => {
  const dbconn = await connectToDB();
  event.locals = { dbconn };
  const response = await resolve(event);
  dbconn.release();
};
The dev server then fails with a couple of these errors:
Cannot read properties of undefined (reading 'status')
TypeError: Cannot read properties of undefined (reading 'status')
    at respond (file:///C:/Users/user/code/svelte/my-testapp/node_modules/@sveltejs/kit/src/runtime/server/index.js:314:16)
    at async file:///C:/Users/user/code/svelte/my-testapp/node_modules/@sveltejs/kit/src/exports/vite/dev/index.js:406:22
Not sure where "status" is coming from; it seems to be part of the initial scaffolding. Any help appreciated.
Also, if there is a more straightforward way to integrate pg with SvelteKit, I'm happy to hear about it. Thanks.
My bad - the hooks function wasn't returning the response.
hooks.server.js should read:
import { connectToDB } from '$lib/db';

export const handle = async ({ event, resolve }) => {
  const dbconn = await connectToDB();
  event.locals = { dbconn };
  const response = await resolve(event);
  dbconn.release();
  return response;
};
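One further hardening step worth considering (not in the original answer): release the client in a finally block, so it goes back to the pool even when a route throws:

import { connectToDB } from '$lib/db';

export const handle = async ({ event, resolve }) => {
  const dbconn = await connectToDB();
  event.locals = { dbconn };
  try {
    return await resolve(event);
  } finally {
    // Runs even if resolve() throws, so the pool never leaks clients.
    dbconn.release();
  }
};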

Exceeded timeout of 5000 ms for a test with Jest and MongoDB

I'm trying to implement database population using a migration function. The code works perfectly: it saves all the data into the database. But the test for the function is failing, and I would like to know why.
I'm getting the "Exceeded timeout of 5000 ms" error for this particular test. I've written 166 tests for this app and all of them pass.
Here is the function I want to test:
const doMigration = async ({ model, data }) => {
  await model.collection.insertMany(data)
}
And here is the test:
const { Amodel } = require('../../../models/Amodel')
const { doMigration } = require('../../../database/migrations')

describe('Database Population', () => {
  it('Should populate the database using migrations', async () => {
    const data = [{ name: 'A' }, { name: 'B' }]
    const model = Amodel
    const migration = { name: 'Amodel', model, data }

    await doMigration(migration)
    const countAfter = await Amodel.count()

    expect(countAfter).toBe(2)
  })
})
In this test I simply import the function and the model, and create a migration object that is then passed to the function.
What did I try?
I tried using just countAfter, without calling doMigration, and it still produced the same timeout error.
I tried increasing the timeout for this test to 30000 ms; it then failed with an error saying that the MongoDB operation exceeded 10000 ms.
Here is the github repository: https://github.com/Elvissamir/Fullrvmovies
What is happening, and how can I solve this error?
The problem was the way the MongoDB connection was handled. When testing, the app created a connection to the db on startup, and the Jest tests then used that connection, which caused issues.
The solution was to connect to the database on startup only if the environment is not set to testing; in testing, the connection is handled by each set of tests instead.
In each set I added a beforeAll and an afterAll to open and close the connection to the database.
Hope it helps anyone who finds the same problem or has similar issues.
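A minimal sketch of that pattern, assuming mongoose and an illustrative MONGO_TEST_URI environment variable:

const mongoose = require('mongoose');

beforeAll(async () => {
  // Each test suite opens its own connection.
  await mongoose.connect(process.env.MONGO_TEST_URI);
});

afterAll(async () => {
  // And closes it, so Jest can exit cleanly.
  await mongoose.disconnect();
});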
The guiding principle is that the error message reflects the actual reason, so I recommend the following steps.
First, use the following code to check the Mongo connection state:
const { MongoMemoryServer } = require("mongodb-memory-server");
const mongoose = require("mongoose");

(async () => {
  // Spin up an in-memory MongoDB instance and connect to it.
  const mongod = await MongoMemoryServer.create();
  const mongoUri = mongod.getUri();
  await mongoose.connect(mongoUri, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  }).then((result) => {
    // readyState 1 means connected.
    console.log(result.connection.readyState)
    console.log(result.connection.host)
  }).catch((err) => {
    console.log(err)
  });
})();
If you are using mongodb-memory-server, add a "testTimeout" attribute:
"jest": {
"preset": "ts-jest",
"testEnvironment": "node",
"setupFilesAfterEnv": [
"./src/test/setup.ts"
],
"testTimeout": 15000
},
If the problem still happens after all of the above, check the timeouts of all inter-test operations.

Store cookie information with express-rate-limit

Is there a way I can store user cookies (jwt) in my MongoDB database with the express-rate-limit and rate-limit-mongo packages?
Code that I am currently using:
var limiter = new RateLimit({
  store: new MongoStore({
    uri: process.env.MONGO_URI,
    expireTimeMs: 60 * 1000 * 60,
  }),
  max: 150,
  windowMs: 10 * 60 * 1000,
  message: "Too many requests in a short duration, IP Banned for an hour.",
});
I also want to know the jwt cookie (if it exists) associated with the request, so that I can identify the culprit.
Basically, how can I access the request object and store part of it in the rate-limiting database?
I was able to do this with the onLimitReached function available in express-rate-limit like so:
var queslimiter = new RateLimit({
  store: new MongoStore({
    uri: process.env.MONGO_URI,
    expireTimeMs: 60 * 1000 * 60,
    collectionName: "ansForce",
  }),
  max: 30,
  message: "Too many requests in a short duration, IP Banned for an hour.",
  onLimitReached: async function (req, res) {
    try {
      // Record the offending request's jwt cookie.
      const blacklist = new Blacklist({
        team: req.cookies.team,
      });
      await blacklist.save();
    } catch (e) {
      console.log(e);
    }
  },
});
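Worth noting: in recent versions of express-rate-limit, onLimitReached was deprecated and then removed; the replacement is the handler option, which also receives the request. A hedged sketch of the same idea against the newer API (assuming the same Blacklist model and a store compatible with that version):

const { rateLimit } = require('express-rate-limit');
const MongoStore = require('rate-limit-mongo');

const queslimiter = rateLimit({
  store: new MongoStore({
    uri: process.env.MONGO_URI,
    expireTimeMs: 60 * 1000 * 60,
    collectionName: 'ansForce',
  }),
  max: 30,
  message: 'Too many requests in a short duration, IP Banned for an hour.',
  handler: async (req, res, next, options) => {
    try {
      // Record the offending request's jwt cookie before responding.
      await new Blacklist({ team: req.cookies.team }).save();
    } catch (e) {
      console.log(e);
    }
    res.status(options.statusCode).send(options.message);
  },
});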

ConnectionError: Connection lost - read ECONNRESET in protractor

I am using Protractor 5.2.2 and Cucumber 3.2.2, with Selenium Grid (selenium-server-standalone-3.14.0.jar), running my script in 4 browsers on 4 different nodes. I have a table of 600 rows in the DB. My script reads the data of each row from this table, enters it through the UI, and updates a DB column after each row is entered successfully. But after entering some rows successfully, the Protractor script ends abruptly with the error "ConnectionError: Connection lost - read ECONNRESET". I also get an error from the update SQL query: "RequestError: Resource ID: 1. The request limit for the database is 60 and has been reached. See 'http://go.microsoft.com/fwlink/?LinkId=267637' for assistance." The update query I am using is given below (I am using Azure SQL). I do not have a clear idea of how to solve this. Thanks in advance.
var Connection = require('tedious').Connection;
var Request = require('tedious').Request;

var config = {
  userName: 'xxx',
  password: 'xxxxx',
  server: 'xxxxxx',
  options: {
    database: 'xxx',
    encrypt: true,
    rowCollectionOnRequestCompletion: true
  }
};

var connection = new Connection(config);
defineSupportCode(function ({ setDefaultTimeout, Given, When, Then }) {
  setDefaultTimeout(30000 * 1000);

  function updatedb(LPAID) {
    var request = new Request("UPDATE COM_Location_Post with (rowlock) SET IsPublished = 1 WHERE Id = " + LPAID, function (err, rowCount, rows) {
      if (err) {
        console.log(err)
      }
    });
    connection.execSql(request);
  }
});
You did not close the connection in your script.
From your question it is clear that you hit this problem once the maximum number of requests is reached.
Try closing your connection after each transaction:
const sql = require('mssql');

(async () => {
  const config = {
    user: 'User',
    password: 'iPg$',
    server: 'cp-sql',
    database: 'DBI',
    options: {
      encrypt: true // Use this if you're on Windows Azure
    }
  }
  try {
    let pool = await sql.connect(config)
    let result1 = await pool.request()
      .query(`query 1 goes here`)
    // console.dir(result1)
    pool.close();
    sql.close();

    let pool1 = await sql.connect(config)
    let result2 = await pool1.request()
      .query(`query 2 goes here`)
    // console.dir(result2)
    pool1.close();
    sql.close();
    return result2;
  } catch (err) {
    console.log(err)
  }
})()
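A further option (not in the original answer) is to keep a single shared pool for the whole run instead of opening and closing one per query; with the pool capped below Azure's request limit, the connection churn behind the ECONNRESET and request-limit errors goes away. A minimal sketch with mssql, using a parameterized query for the update (names and pool size are illustrative):

const sql = require('mssql');

// One shared pool for the whole test run, sized well below the limit of 60.
const poolPromise = new sql.ConnectionPool({
  user: 'xxx',
  password: 'xxxxx',
  server: 'xxxxxx',
  database: 'xxx',
  pool: { max: 10 },
  options: { encrypt: true },
}).connect();

async function updatedb(lpaId) {
  const pool = await poolPromise;
  // Parameterized input avoids building SQL via string concatenation.
  await pool.request()
    .input('id', sql.Int, lpaId)
    .query('UPDATE COM_Location_Post WITH (ROWLOCK) SET IsPublished = 1 WHERE Id = @id');
}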