How to call the pg-promise helpers.insert function?

What is the best way to provide 'pgp' to the following function, considering that 'db' can be a database connection, a transaction (tx), or a task?
module.exports = async (db, tableName, records) => {
  const record_ = records[0];
  const columns_ = [];
  for (const c_ in record_) {
    columns_.push(c_);
  }
  const insert_ = pgp.helpers.insert(records, columns_, tableName);
  return db.result(insert_, null, r => r.rowCount);
};

See Where should I initialize pg-promise, which shows you:
module.exports = {
  pgp, db
};
...so you can import pgp and use it wherever you need it.
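For example, a minimal sketch (the connection string and file names here are assumptions, not from the original answer):

// db.js - the one place where pg-promise is initialized
const pgp = require('pg-promise')(/* initialization options */);
const db = pgp('postgres://user:password@localhost:5432/mydb'); // assumed connection details

module.exports = { pgp, db };

// insert-records.js - the function from the question, importing pgp from db.js
const { pgp } = require('./db');

module.exports = async (db, tableName, records) => {
  // pgp.helpers.insert(data, columns, table) builds a multi-row INSERT statement
  const insert = pgp.helpers.insert(records, Object.keys(records[0]), tableName);
  return db.result(insert, null, r => r.rowCount);
};

Since db can equally be a Database, Task, or Transaction object, it stays a parameter; only pgp, which is connection-independent, comes from the shared module.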

Related

React-Query useQueries hook to run useInfiniteQuery hooks in parallel

I am new to React-Query, but I have not been able to find an example answering the following question:
Is it possible to use useInfiniteQuery within useQueries?
I can see from the parallel queries documentation on GitHub that it's fairly easy to set up a map of normal queries.
The example provided:
function App({ users }) {
  const userQueries = useQueries({
    queries: users.map(user => {
      return {
        queryKey: ['user', user.id],
        queryFn: () => fetchUserById(user.id),
      };
    }),
  });
}
If I have an infinite query like the following, how would I be able to provide the individual query options, specifically the page parameter?
const ids: string[] = ['a', 'b', 'c'];

const useGetDetailsById = () => {
  return useInfiniteQuery<GetDetailsByIdResponse, AxiosError>(
    ['getDetailsById', id],
    async ({ pageParam = '' }) => {
      const { data } = await getDetailsById(
        id, // I want to run queries for `id` in _parallel_
        pageParam
      );
      return data;
    },
    {
      getNextPageParam: (lastPage: GetDetailsByIdResponse) =>
        lastPage.nextPageToken,
      retry: false,
    }
  );
};
No, I'm afraid there is currently no such thing as useInfiniteQueries.
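One workaround, sketched below under the assumption that getDetailsById from the question is available, is to mount one component per id, each running its own useInfiniteQuery; the queries then execute in parallel while each keeps its own page cursor:

function DetailsForId({ id }) {
  // each mounted component owns one infinite query and its own pageParam state
  const query = useInfiniteQuery(
    ['getDetailsById', id],
    async ({ pageParam = '' }) => {
      const { data } = await getDetailsById(id, pageParam);
      return data;
    },
    {
      getNextPageParam: (lastPage) => lastPage.nextPageToken,
      retry: false,
    }
  );
  return null; // render query.data.pages here
}

function AllDetails({ ids }) {
  // one component per id keeps the rules of hooks intact
  return ids.map((id) => <DetailsForId key={id} id={id} />);
}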

MongoDB and Mocha

Has anyone ever used async/await in their mocha tests?
I'm creating this simple test to check if my code successfully saves an object to the database:
const mocha = require('mocha');
const assert = require('assert');
const marioChar = require('../models/mariochar');

async function saveAMarioChar(paramname, paramweight) {
  var char = new marioChar({
    name: paramname,
    weight: paramweight
  });
  const saveresult = await char.save();
  return !saveresult.isNew;
}

describe('saving record', () => {
  it('Save a mariochar', async () => {
    const result = await saveAMarioChar('luigi', 64);
    assert(result);
  });
});
Sorry for the trouble, guys: this code is working fine. I had written "mongooose" with a triple "o" in my schema creation.
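For reference, the fix is just spelling the require correctly in the model file; a sketch (the schema fields are inferred from the test above):

// models/mariochar.js
const mongoose = require('mongoose'); // not 'mongooose'

const MarioCharSchema = new mongoose.Schema({
  name: String,
  weight: Number
});

module.exports = mongoose.model('mariochar', MarioCharSchema);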

Mongoose query - how to create an object for every dataset that is returned

I'm querying the database and returning an array of objects. I then want to create an object for each set of data, based on new object properties, and push each new object into an array. I believe the problem is an unresolved promise, but I can't figure out how to resolve it.
The data from the query returns fine, but when it enters the for loop, the object isn't created and execution goes into the catch statement.
const express = require('express');
const router = express.Router();
const userTxModel = require('../models/userTx.model');

var RecurringTxObj = (name, user_id, next_amt, next_date, transactions) => {
  this.name = name;
  this.user_id = user_id;
  this.next_amt = next_amt;
  this.next_date = next_date;
  this.transactions = [];
};

router.get('/getRecurringTx', (req, res) => {
  const recurringTxArr = [];
  userTxModel
    .find({ recurring: true })
    .exec()
    .then((recurringTxData) => {
      for (let data of recurringTxData) {
        recurringTxArr.push(
          new RecurringTxObj(
            data.name,
            data.user_id,
            data.amount,
            data.date,
            []
          )
        );
      }
      res.status(200).send(recurringTxArr);
    })
    .catch((err) => {
      console.log('Could not find recurring transactions');
      res.status(500).send('Could not find recurring transactions');
    });
});

router.get('/error', (req, res) => {
  throw new Error('Something went wrong');
});

module.exports = router;
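A likely cause, for what it's worth: RecurringTxObj is an arrow function, and arrow functions cannot be called with new, so every loop iteration throws a TypeError that lands in the .catch block. A minimal sketch of a constructor that does work:

// Arrow functions have no [[Construct]] behavior, so `new` throws;
// a class (or a regular function declaration) fixes this.
class RecurringTxObj {
  constructor(name, user_id, next_amt, next_date, transactions) {
    this.name = name;
    this.user_id = user_id;
    this.next_amt = next_amt;
    this.next_date = next_date;
    this.transactions = transactions || [];
  }
}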

apollostack/graphql-server - how to get the fields requested in a query from resolver

I am trying to figure out a clean way to work with queries and MongoDB projections so I don't have to retrieve excessive information from the database.
So assuming I have:
// the query
type Query {
  getUserByEmail(email: String!): User
}
And I have a User with an email and a username, to keep things simple. If I send a query and I only want to retrieve the email, I can do the following:
query { getUserByEmail(email: "test@test.com") { email } }
But in the resolver, my DB query still retrieves both username and email, but only one of those is passed back by apollo server as the query result.
I only want the DB to retrieve what the query asks for:
// the resolver
getUserByEmail(root, args, context, info) {
  // check what fields the query requested
  // create a projection to only request those fields
  return db.collection('users').findOne({ email: args.email }, { /* projection */ });
}
Of course the problem is that getting information about what the client is requesting isn't so straightforward.
Assuming I pass the request in as context, I considered using context.payload (hapi.js), which has the query string, and searching it with various .split() calls, but that feels kind of dirty. As far as I can tell, info.fieldASTs[0].selectionSet.selections has the list of fields, and I could check for a field's existence in there. I'm not sure how reliable this is, especially once I start using more complex queries.
Is there a simpler way?
In case you don't use MongoDB: a projection is an additional argument you pass in, telling the query explicitly what to retrieve:
// telling MongoDB to not retrieve _id
db.collection('users').findOne({ email: 'test@test.com' }, { _id: 0 })
As always, thanks to the amazing community.
January 2020 answer
The current answer to getting the fields requested in a GraphQL query is to use the graphql-parse-resolve-info library to parse the info parameter.
The library is "a pretty complete solution and is actually used under the hood by postgraphile", and is recommended going forward by the author of the other top library for parsing the info field, graphql-fields.
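A minimal sketch of the idea, applied to the MongoDB case from the question (the projectionFromInfo helper name is made up here; parse and simplify are the library's exports):

const { parse, simplify } = require('graphql-parse-resolve-info');

function projectionFromInfo(info) {
  // parse the resolver's info argument, then simplify it against the return type
  const { fields } = simplify(parse(info), info.returnType);
  // map the requested field names to a MongoDB projection, e.g. { email: 1 }
  return Object.keys(fields).reduce((proj, name) => ({ ...proj, [name]: 1 }), {});
}

// in the resolver:
// return db.collection('users').findOne({ email: args.email }, { projection: projectionFromInfo(info) });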
Use graphql-fields
Apollo server example
// imports added for completeness
const { makeExecutableSchema } = require('graphql-tools');
const graphqlFields = require('graphql-fields');

const rootSchema = [`
  type Person {
    id: String!
    name: String!
    email: String!
    picture: String!
    type: Int!
    status: Int!
    createdAt: Float
    updatedAt: Float
  }
  # Query and Mutation types assumed; the original snippet elided them
  type Query {
    users: [Person]
  }
  type Mutation {
    noop: Boolean
  }
  schema {
    query: Query
    mutation: Mutation
  }
`];

const rootResolvers = {
  Query: {
    users(root, args, context, info) {
      const topLevelFields = Object.keys(graphqlFields(info));
      return fetch(`/api/user?fields=${topLevelFields.join(',')}`);
    },
  },
};

const schema = [...rootSchema];
const resolvers = Object.assign({}, rootResolvers);

// Create schema
const executableSchema = makeExecutableSchema({
  typeDefs: schema,
  resolvers,
});
Sure you can. This is actually the same functionality that is implemented in the join-monster package for SQL-based DBs. There's a talk by its creator: https://www.youtube.com/watch?v=Y7AdMIuXOgs
Take a look at their info-analysing code to get you started: https://github.com/stems/join-monster/blob/master/src/queryASTToSqlAST.js#L6-L30
Would love to see a projection-monster package for us mongo users :)
UPDATE:
There is a package that creates a projection object from info on npm: https://www.npmjs.com/package/graphql-mongodb-projection
You can generate a MongoDB projection from the info argument. Here is sample code that you can follow:
/**
 * @description - Gets a MongoDB projection from the graphql query
 *
 * @return { object }
 * @param { object } info
 * @param { model } model - MongoDB model for referencing
 */
function getDBProjection(info, model) {
  const {
    schema: { obj }
  } = model;
  const keys = Object.keys(obj);
  const projection = {};
  const { selections } = info.fieldNodes[0].selectionSet;
  for (let i = 0; i < keys.length; i++) {
    const key = keys[i];
    const isSelected = selections.some(
      selection => selection.name.value === key
    );
    projection[key] = isSelected;
  }
  return projection;
}

module.exports = getDBProjection;
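Usage might then look like this in the question's resolver (a sketch; UserModel stands in for whatever Mongoose model backs the users collection):

const getDBProjection = require('./getDBProjection');

// inside the resolver map:
getUserByEmail(root, args, context, info) {
  const projection = getDBProjection(info, UserModel);
  // newer MongoDB drivers take { projection }; older ones accepted the object directly
  return db.collection('users').findOne({ email: args.email }, { projection });
}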
With a few helper functions you can use it like this (TypeScript version):

import { parceGqlInfo, query } from "@backend";
import { GraphQLResolveInfo } from "graphql";

export const user = async (parent: unknown, args: unknown, ctx: unknown, info: GraphQLResolveInfo): Promise<User | null> => {
  const { dbQueryStr } = parceGqlInfo(info, userFields, "id");
  const [user] = await query(`SELECT ${dbQueryStr} FROM users WHERE id=$1;`, [1]);
  return user;
};
Helper functions, and a few points:
gql_uid is used as an ID! string type derived from the primary key, so the DB types don't change
the required option is used for dataloaders (in case a field was not requested by the user)
allowedFields is used to filter out additional fields from info, like '__typename'
queryPrefix is used if you need to prefix the selected fields, as in select u.id from users u
const userFields = [
  "gql_uid",
  "id",
  "email"
];

// merge arrays and delete duplicates
export const mergeDedupe = <T>(arr: any[][]): T => {
  // @ts-ignore
  return ([...new Set([].concat(...arr))] as unknown) as T;
};
import { parse, simplify, ResolveTree } from "graphql-parse-resolve-info";
import { GraphQLResolveInfo } from "graphql";

export const getQueryFieldsFromInfo = <Required = string>(info: GraphQLResolveInfo, options: { required?: Required[] } = {}): string[] => {
  const { fields } = simplify(parse(info) as ResolveTree, info.returnType) as { fields: { [key: string]: { name: string } } };
  let astFields = Object.entries(fields).map(([, v]) => v.name);

  if (options.required) {
    astFields = mergeDedupe([astFields, options.required]);
  }

  return astFields;
};

export const onlyAllowedFields = <T extends string | number>(raw: T[] | readonly T[], allowed: T[] | readonly T[]): T[] => {
  return allowed.filter((f) => raw.includes(f));
};

export const parceGqlInfo = (
  info: GraphQLResolveInfo,
  allowedFields: string[] | readonly string[],
  gqlUidDbAlliasField: string,
  options: { required?: string[]; queryPrefix?: string } = {}
): { pureDbFields: string[]; gqlUidRequested: boolean; dbQueryStr: string } => {
  const fieldsWithGqlUid = onlyAllowedFields(getQueryFieldsFromInfo(info, options), allowedFields);

  return {
    pureDbFields: fieldsWithGqlUid.filter((i) => i !== "gql_uid"),
    gqlUidRequested: fieldsWithGqlUid.includes("gql_uid"),
    dbQueryStr: fieldsWithGqlUid
      .map((f) => {
        const dbQueryStrField = f === "gql_uid" ? `${gqlUidDbAlliasField}::Text AS gql_uid` : f;
        return options.queryPrefix ? `${options.queryPrefix}.${dbQueryStrField}` : dbQueryStrField;
      })
      .join(),
  };
};
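With queryPrefix, a call might look like this (a sketch built on the helpers above, with "pk" as an assumed primary-key column):

// produces e.g.: SELECT u.id, u.email, u.pk::Text AS gql_uid FROM users u
const { dbQueryStr } = parceGqlInfo(info, userFields, "pk", { queryPrefix: "u" });
const users = await query(`SELECT ${dbQueryStr} FROM users u;`, []);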

How can I wrap sails-mongo db methods for profiling?

I'm trying to set up a sails hook with miniprofiler to help profile mongo usage. I'm struggling with how to wrap the db methods in a function that will execute the profiling. I'm trying to do this via a user hook:
setupMiniprofilerMongo(req, res, next) {
  const adapter = sails.hooks.orm.datastores.default.adapter;
  const adapterPrototype = Object.getPrototypeOf(adapter);
  const originalMethod = adapter.adapter.find;

  adapterPrototype.find = function profiledMongoCommand(connectionName, collectionName, options, cb) {
    sails.log.info(`${collectionName}.find`);
    return originalMethod.call(adapter, connectionName, collectionName, options, cb);
  };
}
That causes the following error to be thrown:
TypeError: Cannot read property 'collections' of undefined
at Object.module.exports.adapter.find (/Users/jgeurts/dev/platform/node_modules/sails-mongo/lib/adapter.js:349:40)
at Object.profiledMongoCommand [as find] (/Users/jgeurts/dev/platform/config/http.js:234:37)
Any help would be appreciated. I tried to wrap the methods of the mongodb package directly, but that doesn't seem to work either. :/
I got this working by wrapping waterline query methods. There is room for improvement, though.
setupMiniprofilerWaterline(req, res, next) {
  const dbOperations = [
    'count',
    'create',
    'createEach',
    'define',
    'describe',
    'destroy',
    'drop',
    'find',
    'join',
    // 'native',
    // 'registerConnection',
    'update',
  ];
  const waterlineMethodByModels = {};

  const miniprofilerWaterline = () => {
    return {
      name: 'mongodb',
      handler(req, res, next) {
        if (!req.miniprofiler || !req.miniprofiler.enabled) {
          return next();
        }

        const profiler = req.miniprofiler;

        for (const modelName of _.keys(sails.models)) {
          for (const dbOperation of dbOperations) {
            const model = sails.models[modelName];
            if (!model[dbOperation]) {
              continue;
            }
            if (!waterlineMethodByModels[modelName]) {
              waterlineMethodByModels[modelName] = {};
            }
            // Prevent wrapping a method more than once
            if (waterlineMethodByModels[modelName][dbOperation]) {
              continue;
            }
            waterlineMethodByModels[modelName][dbOperation] = true;

            const originalMethod = model[dbOperation];
            model[dbOperation] = function profiledMongoCommand(...args) {
              const query = args && args.length ? args[0] : '';
              const lastArg = args && args.length ? args[args.length - 1] : null;
              const modelAndMethod = `${modelName}.${dbOperation}`;

              if (lastArg && typeof lastArg === 'function') {
                sails.log.debug(`mongo::${modelAndMethod} - ${JSON.stringify(query)}`);
                const callback = args.pop();
                const timing = profiler.startTimeQuery('mongodb', query ? JSON.stringify(query || '') : '');
                // In general, the callstack is kind of useless to us for these profiles.
                // The model/db method is more useful in the miniprofiler UI.
                timing.callStack = `\n\nMethod: ${modelAndMethod}`;
                return originalMethod.call(this, ...args, function profiledResult(...results) {
                  profiler.stopTimeQuery(timing);
                  callback(...results);
                });
              }

              const methodResult = originalMethod.call(this, ...args);
              const methodResultPrototype = Object.getPrototypeOf(methodResult);
              const isDeferred = !!methodResultPrototype.exec;
              // If methodResult is a Deferred object type, then the query method will be profiled
              // above when the deferred is executed (with a callback), so we only care to log this
              // if the methodResult is not a deferred object.
              if (!isDeferred) {
                sails.log.warn(`Was not able to profile mongo::${modelAndMethod}. Maybe it's a promise? query: ${JSON.stringify(query)}`);
              }
              return methodResult;
            };
          }
        }

        next();
      },
    };
  };

  miniprofiler.express.for(miniprofilerWaterline())(req, res, next);
},
The code is available as miniprofiler-waterline if you want to contribute to it or use it in your own projects.