Typesense: Firebase Function timeout using Typesense client - google-cloud-firestore

I have a Firebase function trigger for when an item is updated. In most instances this function completes and updates the item in the Typesense database; however, occasionally the function fails with a timeout.
This happens seemingly randomly for this and other onCreate and onUpdate Firebase Functions using the Typesense client.
Failed function screenshot example (output from https://console.cloud.google.com/logs): logs when the function times out.
Successful function screenshot example: logs when the function behaves as expected.
Firebase Function:
// Firebase function triggered when a program is updated
export const onUpdate = functions
  // 256MB memory and timeout of 2 minutes
  .runWith({ memory: '256MB', timeoutSeconds: 120 })
  .firestore.document('programs/{docId}')
  .onCreate(async (snap, context) => {
    const id = context.params.docId;
    const data = snap.data();
    const client = useTypesense();
    const item = { ...data, id };
    console.log('Typesense: Updating Program');
    await client
      .collections('programs')
      .documents()
      .upsert(item)
      .then(() => {
        console.log(`Typesense: Program ${item.id} successfully updated!`);
      })
      .catch((error) => {
        console.error('Typesense: Error creating Program: ' + item.id, error);
      });
    return null;
  });
useTypesense() function (Initializing the client)
import Typesense from 'typesense';

const useTypesense = () => {
  const host = 'api.oniworkout.app';
  const port = '443';
  const protocol = 'https';
  const apiKey = 'b3fxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx';
  console.log(
    `Typesense: Initializing client with host: ${host}, port: ${port}, protocol: ${protocol}, apiKey: ${apiKey.substring(
      0,
      3
    )}`
  );
  const client = new Typesense.Client({
    nodes: [
      {
        host,
        port,
        protocol,
      },
    ],
    apiKey,
    connectionTimeoutSeconds: 120,
    retryIntervalSeconds: 120,
  });
  return client;
};
I have tried extending the duration of timeout on the Firebase Function from 60 seconds to 120 seconds. The issue persists.
I have checked https://api.oniworkout.app/health, which returns {"ok":true}.
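One thing that may be worth checking (an observation, not a confirmed diagnosis): the client is configured with connectionTimeoutSeconds: 120 and retryIntervalSeconds: 120, so a single slow or unreachable node can keep the client waiting or retrying for as long as the function's own 2-minute limit. A sketch of a tighter configuration, with illustrative values only:

// Sketch only: shorter client-side timeouts so a failing request surfaces
// well within the 120-second function timeout (values are illustrative)
const client = new Typesense.Client({
  nodes: [{ host, port, protocol }],
  apiKey,
  connectionTimeoutSeconds: 10, // give up on a single request after 10s
  retryIntervalSeconds: 1,      // short pause before the client retries
});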

Related

Minimal Sveltekit + pg integration fails with "status" error

I'm trying to get Postgres working with SvelteKit, and a very minimal example is giving me issues. This is probably a configuration thing, but the error I'm getting back from SvelteKit makes no sense to me.
I start by installing a new project:
npm create svelte@latest my-testapp
Then I install "pg" to get Postgres pooling:
npm i pg
Then I add a page under src/lib/db.js:
import { Client, Pool } from 'pg';

const pool = new Pool({
  user: 'xxx',
  host: 'xxx',
  database: 'xxx',
  password: 'xxx',
  port: 5432,
})

export const connectToDB = async () => await pool.connect();
Finally I add src/hooks.server.js to give me access to the pool within routes:
import { connectToDB } from '$lib/db';

export const handle = async ({ event, resolve }) => {
  const dbconn = await connectToDB();
  event.locals = { dbconn };
  const response = await resolve(event);
  dbconn.release();
}
The server fails to compile with a couple of these errors:
Cannot read properties of undefined (reading 'status')
TypeError: Cannot read properties of undefined (reading 'status')
at respond (file:///C:/Users/user/code/svelte/my-testapp/node_modules/@sveltejs/kit/src/runtime/server/index.js:314:16)
at async file:///C:/Users/user/code/svelte/my-testapp/node_modules/@sveltejs/kit/src/exports/vite/dev/index.js:406:22
Not sure where "status" is coming from, seems to be part of the initial scaffolding. Any help appreciated.
Also - if there is a more straightforward way to integrate pg with sveltekit then I'm happy to hear about it. Thanks
My bad: the handle hook wasn't returning the response.
hooks.server.js should read:
import { connectToDB } from '$lib/db';

export const handle = async ({ event, resolve }) => {
  const dbconn = await connectToDB();
  event.locals = { dbconn };
  const response = await resolve(event);
  dbconn.release();
  return response;
}
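A further refinement worth considering (my suggestion, not part of the original answer): releasing the client in a finally block, so it is returned to the pool even if resolve() throws:

import { connectToDB } from '$lib/db';

export const handle = async ({ event, resolve }) => {
  const dbconn = await connectToDB();
  event.locals = { dbconn };
  try {
    // Always return the response from resolve()
    return await resolve(event);
  } finally {
    // Release the client back to the pool even if resolve() throws
    dbconn.release();
  }
};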

Exceeded timeout of 5000 ms for a test with Jest and MongoDB

I'm trying to implement database population using a migration function. The code works perfectly and saves all the data into the database, but the test for the function is failing, and I would like to know why.
I'm getting the "Exceeded timeout of 5000 ms" error for this particular test. I've written 166 tests for this app and all of them are passing.
Here is the function I want to test:
const doMigration = async ({ model, data }) => {
  await model.collection.insertMany(data)
}
And here is the test:
const { Amodel } = require('../../../models/Amodel')
const { doMigration } = require('../../../database/migrations')

describe('Database Population', () => {
  it('Should populate the database using migrations', async () => {
    const data = [{ name: 'A' }, { name: 'B' }]
    const model = Amodel
    const migration = { name: 'Amodel', model, data }

    await doMigration(migration)

    const countAfter = await Amodel.count()
    expect(countAfter).toBe(2)
  })
})
In this test I simply import the function, the model and create a migration object that then is passed to the function.
What did I try?
I tried using just the countAfter query without calling the doMigration function, and it still produced the same timeout error.
I also tried increasing the timeout for this test to 30000 ms; that failed with an error saying that the MongoDB operation exceeded 10000 ms.
Here is the github repository: https://github.com/Elvissamir/Fullrvmovies
What is happening, how can I solve this error?
The problem was the way the MongoDB connection was handled. When testing, the app created a connection to the db on startup, and the Jest tests then used that connection, which caused some issues.
The solution was to connect to the database on startup only if the environment is set to testing; otherwise the connection is handled by each set of tests.
In each set I added a beforeAll and afterAll to open and close the connection to the database.
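A minimal sketch of what that per-suite setup could look like (assuming Mongoose and a test database URI provided via the environment; names are illustrative):

const mongoose = require('mongoose')

beforeAll(async () => {
  // Open a dedicated connection for this test suite
  await mongoose.connect(process.env.TEST_DB_URI)
})

afterAll(async () => {
  // Close the connection so Jest can exit cleanly
  await mongoose.connection.close()
})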
Hope it helps anyone that finds the same problem or has similar issues.
The general guidance is that the error message should reflect the actual cause, so I recommend the following steps:
Use the following code to check the MongoDB connection state:
const { MongoMemoryServer } = require("mongodb-memory-server");
const mongoose = require("mongoose");

(async () => {
  const mongod = await MongoMemoryServer.create();
  const mongoUri = mongod.getUri();
  await mongoose.connect(mongoUri, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  }).then((result) => {
    console.log(result.connection.readyState)
    console.log(result.connection.host)
  }).catch((err) => {
    console.error(err)
  });
})();
If you are using mongodb-memory-server, add the "testTimeout" attribute:
"jest": {
"preset": "ts-jest",
"testEnvironment": "node",
"setupFilesAfterEnv": [
"./src/test/setup.ts"
],
"testTimeout": 15000
},
If the error still happens after all of the above, check the timeout of every inter-test operation.
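Jest also accepts a per-test timeout as the third argument to it/test, or a global jest.setTimeout() call, for example:

// Per-test timeout (milliseconds) as the third argument
it('Should populate the database using migrations', async () => {
  // ...test body from above...
}, 30000)

// Or globally, e.g. in a setup file such as ./src/test/setup.ts
jest.setTimeout(30000)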

Node postgres - (node:43028) UnhandledPromiseRejectionWarning: error: sorry, too many clients already

I am trying to add data to my databases. There are multiple entries in an array:
dataArray.forEach((element) => {
  queryData(element);
});
The function is called multiple times
async function queryData(data) {
  const queryString = `INSERT............')`;
  const query = {
    text: queryString
  };
  const pool = await new pg.Pool({
    host: 'localhost',
    port: 'xxxx',
    user: 'xxxxxx',
    database: 'xxxxx',
    max: 100,
    idleTimeoutMillis: 50000,
    connectionTimeoutMillis: 3000,
  });
  await pool.connect();
  await pool.query(query)
  await pool.end()
}
It does the insert but it does throw the too many connections error. I have tried .release() and .end()
When querying the connections I get
max_conn = 500
used = 6
res_for_super = 3
res_for_normal = 491
I don't really know what these mean, but they seem to add up to 500.
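For comparison, a common pattern with node-postgres is to create a single Pool once at module scope and reuse it for every query, letting pool.query() handle client checkout and release; this is a sketch under that assumption (connection details are placeholders, not a verified fix for this setup):

const pg = require('pg');

// One pool for the whole process, created once at module load
const pool = new pg.Pool({
  host: 'localhost',
  port: 5432, // placeholder
  user: 'xxxxxx',
  database: 'xxxxx',
  max: 20,
});

async function queryData(data) {
  const queryString = `INSERT............')`; // elided in the question
  // pool.query() checks a client out and releases it automatically
  await pool.query({ text: queryString });
}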

Image returned from REST API always displays broken

I am building a content management system for an art portfolio app, with React. The client will POST to the API which uses Mongoose to insert into a MongoDB. The API then queries the DB for the newly inserted image, and returns it to the client.
Here's my code to connect to MongoDB using Mongoose:
mongoose.connect('mongodb://localhost/test')
  .then(() => console.log('connected to db'))
  .catch(err => console.log(err))

mongoose.Promise = global.Promise

const db = mongoose.connection
db.on('error', console.error.bind(console, 'MongoDB connection error:'))

const Schema = mongoose.Schema;

const ImgSchema = new Schema({
  img: { data: Buffer, contentType: String }
})

const Img = mongoose.model('Img', ImgSchema)
I am using multer and fs to handle the image file. My POST endpoint looks like this:
router.post('/', upload.single('image'), (req, res) => {
  if (!req.file) {
    res.send('no file')
  } else {
    const imgItem = new Img()
    imgItem.img.data = fs.readFileSync(req.file.path)
    imgItem.contentType = 'image/png'
    imgItem
      .save()
      .then(data =>
        Img.findById(data, (err, findImg) => {
          console.log(findImg.img)
          fs.writeFileSync('api/uploads/image.png', findImg.img.data)
          res.sendFile(__dirname + '/uploads/image.png')
        }))
  }
})
I can see in the file structure that writeFileSync is writing the image to the disk. res.sendFile grabs it and sends it down to the client.
Client side code looks like this:
handleSubmit = e => {
  e.preventDefault()
  const img = new FormData()
  img.append('image', this.state.file, this.state.file.name)
  axios
    .post('http://localhost:8000/api/gallery', img, {
      onUploadProgress: progressEvent => {
        console.log(progressEvent.loaded / progressEvent.total)
      }
    })
    .then(res => {
      console.log('responsed')
      console.log(res)
      const returnedFile = new File([res.data], 'image.png', { type: 'image/png' })
      const reader = new FileReader()
      reader.onloadend = () => {
        this.setState({ returnedFile, returned: reader.result })
      }
      reader.readAsDataURL(returnedFile)
    })
    .catch(err => console.log(err))
}
This does successfully place both the returned file and the img data url on state. However, in my application, the image always displays broken.
Here are some screenshots:
How to fix this?
Avoid sending back base64-encoded images (multiple images + large files + large encoded strings = very slow performance). I'd highly recommend creating a microservice that only handles image uploads and any other image-related GET/POST/PUT/DELETE requests, and separating it from your main application.
For example:
I use multer to create an image buffer
Then use sharp or fs to save the image (depending upon file type)
Then I send the filepath to my controller to be saved to my DB
Then, the front-end does a GET request when it tries to access: http://localhost:4000/uploads/timestamp-randomstring-originalname.fileext
In simple terms, my microservice acts like a CDN solely for images.
For example, a user sends a post request to http://localhost:4000/api/avatar/create with some FormData:
It first passes through some Express middlewares:
libs/middlewares.js
...
app.use(cors({ credentials: true, origin: "http://localhost:3000" })) // allows receiving of cookies from front-end

app.use(morgan(`tiny`)); // logging framework

app.use(multer({
  limits: {
    fileSize: 10240000,
    files: 1,
    fields: 1
  },
  fileFilter: (req, file, next) => {
    if (!/\.(jpe?g|png|gif|bmp)$/i.test(file.originalname)) {
      req.err = `That file extension is not accepted!`
      return next(null, false)
    }
    next(null, true);
  }
}).single(`file`))

app.use(bodyParser.json()); // parses header requests (req.body)

app.use(bodyParser.urlencoded({ limit: `10mb`, extended: true })); // allows objects and arrays to be URL-encoded
...etc
Then, hits the avatars route:
routes/avatars.js
app.post(`/api/avatar/create`, requireAuth, saveImage, create);
It then passes through some user authentication, then goes through my saveImage middleware:
services/saveImage.js
const createRandomString = require('../shared/helpers');
const fs = require("fs");
const sharp = require("sharp");
const randomString = createRandomString();
if (req.err || !req.file) {
return res.status(500).json({ err: req.err || `Unable to locate the requested file to be saved` })
next();
}
const filename = `${Date.now()}-${randomString}-${req.file.originalname}`;
const filepath = `uploads/${filename}`;
const setFilePath = () => { req.file.path = filepath; return next();}
(/\.(gif|bmp)$/i.test(req.file.originalname))
? fs.writeFile(filepath, req.file.buffer, (err) => {
if (err) {
return res.status(500).json({ err: `There was a problem saving the image.`});
next();
}
setFilePath();
})
: sharp(req.file.buffer).resize(256, 256).max().withoutEnlargement().toFile(filepath).then(() => setFilePath())
If the file is saved, it then sends a req.file.path to my create controller. This gets saved to my DB as a file path and as an image path (the avatarFilePath or /uploads/imagefile.ext is saved for removal purposes and the avatarURL or [http://localhost:4000]/uploads/imagefile.ext is saved and used for the front-end GET request):
controllers/avatars.js (I'm using Postgres, but you can substitute for Mongo)
create: async (req, res, done) => {
  try {
    const avatarurl = `${apiURL}/${req.file.path}`;
    await db.result("INSERT INTO avatars(userid, avatarURL, avatarFilePath) VALUES ($1, $2, $3)", [req.session.id, avatarurl, req.file.path]);
    res.status(201).json({ avatarurl });
  } catch (err) {
    return res.status(500).json({ err: err.toString() });
  }
},
Then when the front-end tries to access the uploads folder via <img src={avatarURL} alt="image" /> or <img src="[http://localhost:4000]/uploads/imagefile.ext" alt="image" />, it gets served up by the microservice:
libs/server.js
const express = require("express");

const app = express();
const path = app.get("path");
const PORT = 4000;

//============================================================//
// EXPRESS SERVE AVATAR IMAGES
//============================================================//
app.use(`/uploads`, express.static(`uploads`));

//============================================================//
/* CREATE EXPRESS SERVER */
//============================================================//
app.listen(PORT);
What it looks like when logging requests:
19:17:54 INSERT INTO avatars(userid, avatarURL, avatarFilePath) VALUES ('08861626-b6d0-11e8-9047-672b670fe126', 'http://localhost:4000/uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png', 'uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png')
POST /api/avatar/create 201 109 - 61.614 ms
GET /uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png 200 3027 - 3.877 ms
What the user sees upon successful GET request:

Write test axios-mock-adapter with axios.create()

I want to test my HTTP service but get an error.
So, here are my files:
api.js
import axios from 'axios';
export const api = axios.create();
fetchUsers.js
import { api } from './api';

export const fetchUsers = (params) => api.get('/api/users', { params })
  .then(({ data }) => data)
fetchUsers.spec.js
import MockAdapter from 'axios-mock-adapter'
import { api } from './api';
import { fetchUsers } from './fetchUsers';

const mock = new MockAdapter(api);

describe('fetchUsers', () => {
  it('should send request', (done) => {
    const data = { data: ['user'] };
    mock.onGet('/api/users').reply(200, data);
    fetchUsers().then((response) => {
      expect(response).toEqual(data.data);
      done();
    });
  });
});
But I get this error:
Error: connect ECONNREFUSED 127.0.0.1:80
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1158:14)
If I replace axios.create() with axios in api.js, it works. But how do I test with a created axios instance? I'll need to pass parameters there when creating it.
Can anyone help with that?
Hi, I had the same issue and had to answer it myself here: https://stackoverflow.com/a/51414152/73323
Here is the gist:
First off, you don't need the axios-mock-adapter library.
Create a mock for axios in src/__mocks__:
// src/__mocks__/axios.ts
const mockAxios = jest.genMockFromModule('axios')
// this is the key to fix the axios.create() undefined error!
mockAxios.create = jest.fn(() => mockAxios)
export default mockAxios
Then in your test file, the gist would look like:
import mockAxios from 'axios'
import configureMockStore from 'redux-mock-store'
import thunk from 'redux-thunk'

// for some reason i need this to fix reducer keys undefined errors..
jest.mock('../../store/rootStore.ts')

// you need the 'async'!
test('Retrieve transaction data based on a date range', async () => {
  const middlewares = [thunk]
  const mockStore = configureMockStore(middlewares)
  const store = mockStore()
  const mockData = {
    'data': 123
  }

  /**
   * SETUP
   * This is where you override the 'post' method of your mocked axios and return
   * mocked data in an appropriate data structure -- {data: YOUR_DATA} -- which
   * mirrors the actual API call, in this case, the 'reportGet'
   */
  mockAxios.post.mockImplementationOnce(() =>
    Promise.resolve({ data: mockData }),
  )

  const expectedActions = [
    { type: REQUEST_TRANSACTION_DATA },
    { type: RECEIVE_TRANSACTION_DATA, data: mockData },
  ]

  // work
  await store.dispatch(reportGet())

  // assertions / expects
  expect(store.getActions()).toEqual(expectedActions)
  expect(mockAxios.post).toHaveBeenCalledTimes(1)
})
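Applied back to the fetchUsers example from the question, the same manual mock could be used roughly like this (a sketch assuming the src/__mocks__/axios file above, so that axios.create() returns the mock):

import mockAxios from 'axios'
import { fetchUsers } from './fetchUsers'

test('fetchUsers resolves with the response data', async () => {
  const users = ['user']
  // Override the mocked get() for this single call
  mockAxios.get.mockImplementationOnce(() =>
    Promise.resolve({ data: users })
  )

  const result = await fetchUsers({ page: 1 })

  expect(result).toEqual(users)
  expect(mockAxios.get).toHaveBeenCalledWith('/api/users', { params: { page: 1 } })
})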