I am trying to query the database (Postgres) through Prisma. My query is
const products = await prisma.products.findMany({
  where: { category: ProductsCategoryEnum[category] },
  include: {
    vehicles: {
      include: {
        manufacturers: { name: { in: { manufacturers.map(item => `"${item}"`) } } },
      },
    },
  },
});
The error message is
Type '{ name: { in: { manufacturers: string; "": any; }; }; }' is not assignable to type 'boolean | manufacturersArgs'.
Object literal may only specify known properties, and 'name' does not exist in type 'manufacturersArgs'.ts(2322)
Manufacturers have the field name, and it is unique. I am not sure why this is not working or how I can update this code so it queries the database. It seems like I need to cast the values into Prisma arguments.
The TypeScript error is pretty self-explanatory: the name property does not exist in manufacturersArgs. The emitted Prisma Client does a great job of telling you what properties do and do not exist when filtering.
If you are trying to perform a nested filter, you need to use select instead of include.
Documentation: https://www.prisma.io/docs/concepts/components/prisma-client/relation-queries#filter-a-list-of-relations
Your query is going to look something like this:
const products = await prisma.products.findMany({
  where: { category: ProductsCategoryEnum[category] },
  select: {
    // also need to select any other fields you need here
    vehicles: {
      // Updated this
      select: { manufacturers: true },
      // Updated this to add an explicit "where" clause;
      // "in" takes a plain array of strings, so pass the names directly
      where: {
        manufacturers: { name: { in: manufacturers } },
      },
    },
  },
});
The final code ultimately depends on your Prisma schema. If you are using an editor like VS Code, it should provide IntelliSense into the Prisma Client's TypeScript definitions. In VS Code, hold Ctrl [Windows] or Cmd [macOS] and click on findMany in prisma.products.findMany. This lets you browse the full Prisma Client, see exactly what is and is not available, and construct your query from that.
The in keyword isn't working for me. I use hasSome to find items in an array. hasEvery is also available, depending on what the requirements are.
hasSome: manufacturers.map(item => `"${item}"`),
See https://www.prisma.io/docs/reference/api-reference/prisma-client-reference#scalar-list-filters
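For reference, hasSome and hasEvery are scalar list filters, so they apply to array columns rather than relations. A minimal sketch, assuming a hypothetical tags String[] field on the model (not part of the original schema):

// Hypothetical sketch: "tags" is assumed to be a scalar list (String[])
// column; hasSome matches rows whose list contains at least one of the values.
const tagged = await prisma.products.findMany({
  where: {
    tags: { hasSome: manufacturers },
  },
});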
I have the following Prisma query that returns all users whose campaign is one of the values in the array I provide and who were added to the system within the defined time range. Each user also has a related Click entity that should be included in the response.
const users = await this.prisma.user.findMany({
  where: {
    campaign: {
      in: [
        ...campaigns.map((campaign) => campaign.id),
        ...campaigns.map((campaign) => campaign.name),
      ],
    },
    createdAt: {
      gte: dateRange.since,
      lt: dateRange.until,
    },
  },
  include: {
    clicks: true,
  },
});
The problem is that this query runs fine on localhost, where I don't have much data, but the production database has nearly 500,000 users and 250,000 clicks in total. I am not sure if that is the root cause, but the query fails with the following exception:
Error:
Invalid `this.prisma.user.findMany()` invocation in
/usr/src/app/dist/xyx/xyx.service.js:135:58
132 }
133 async getUsers(campaigns, dateRange) {
134 try {
→ 135 const users = await this.prisma.user.findMany(
Can't reach database server at `xyz`:`25060`
Please make sure your database server is running at `xyz`:`25060`.
Prisma error code is P1001.
xyz replaces the real values in the paths and the database connection string, for obvious reasons.
The only solution we found is to check what the limit is for your query and then use pagination (skip, take) parameters in a loop to download the data part by part and glue it back together. Not optimal, but it works. See this existing bug report for an example:
https://github.com/prisma/prisma/issues/8832
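A minimal sketch of that workaround, reusing the filters from the query above; the page size is an assumption you would tune to whatever limit your setup tolerates:

// Sketch: fetch users page by page with skip/take instead of one huge
// findMany call, then glue the pages back together.
const PAGE_SIZE = 10000; // assumption: adjust to your own limit
const allUsers = [];
for (let skip = 0; ; skip += PAGE_SIZE) {
  const page = await this.prisma.user.findMany({
    where: { /* same campaign/createdAt filters as above */ },
    include: { clicks: true },
    orderBy: { id: 'asc' }, // stable order so pages don't overlap
    skip,
    take: PAGE_SIZE,
  });
  allUsers.push(...page);
  if (page.length < PAGE_SIZE) break; // last page reached
}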
I need to add a column to my table of riders, allowing us to store the name of the image that will display on that rider's card. I then need to update all of the records with the auto-generated image names.
I've done a bunch of searching, and all roads seem to lead back to this thread or this one. I've tried the code from both of these threads, swapping in my own table and column names, but I still can't get it to work.
This is the latest version of the code:
export async function up(knex, Promise) {
  return knex.transaction(trx => {
    const riders = [
      {
        name: 'Fabio Quartararo',
        card: 'rider_card_FabioQuartararo'
      },
      ...24 other riders here...
      {
        name: 'Garrett Gerloff',
        card: 'rider_card_GarrettGerloff'
      },
    ];
    return knex.schema.table('riders', (table) => table.string('card')).transacting(trx)
      .then(() => {
        const queries = [];
        riders.forEach(rider => {
          const query = knex('riders')
            .update({
              card: rider.card
            })
            .where('name', rider.name)
            .transacting(trx); // This makes every update be in the same transaction
          queries.push(query);
        });
        Promise.all(queries) // Once every query is written
          .then(() => trx.commit) // We try to execute all of them
          .catch(() => trx.rollback); // And rollback in case any of them goes wrong
      });
  });
}
When I run the migration, however, it fails with the following error:
migration file "20211202225332_update_rider_card_imgs.js" failed
migration failed with error: Cannot read properties of undefined (reading 'all')
Error running migrations: TypeError: Cannot read properties of undefined (reading 'all')
at D:\Users\seona\Documents\_Blowfish\repos\MotoGP\dist\database\migrations\20211202225332_update_rider_card_imgs.js:134:25
at processTicksAndRejections (node:internal/process/task_queues:96:5)
So it's clearly having some sort of problem with Promise.all(), but I can't for the life of me figure out what. Searching has not turned up any useful results.
Does anyone have any ideas about how I can get this working? Thanks in advance.
I think you might be following some older documentation and/or examples (at least that's what I was doing).
The Promise argument is no longer passed into the migration up and down functions.
So, the signature should be something like this:
function up(knex) {
  // Use the built-in Promise class
  Promise.all(<ARRAY_OF_QUERY_PROMISES>);
  ...
}
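Applied to the migration above, a sketch without the removed Promise argument might look like the following. This is an assumption about how you'd restructure it, with the rider list abbreviated and the commit/rollback left to knex.transaction based on the returned promise:

// Sketch, not the verified fix: the global Promise class replaces the old
// second argument, and the transaction commits or rolls back automatically
// depending on whether the promise returned from the callback resolves.
export async function up(knex) {
  const riders = [
    { name: 'Fabio Quartararo', card: 'rider_card_FabioQuartararo' },
    // ...other riders...
    { name: 'Garrett Gerloff', card: 'rider_card_GarrettGerloff' },
  ];

  // Add the new column first
  await knex.schema.table('riders', (table) => table.string('card'));

  // Run every update inside one transaction
  await knex.transaction((trx) =>
    Promise.all(
      riders.map((rider) =>
        trx('riders').update({ card: rider.card }).where('name', rider.name)
      )
    )
  );
}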
I am trying to handle errors using findOne in meteor-mongo.
From this Stack Overflow question, it appears that I should be able to handle errors by doing collection.findOne({query}, function(err, result){ <handleError> }), but doing so results in an error message:
"Match error: Failed Match.OneOf, Match.Maybe or Match.Optional validation"
The following code works:
export default createContainer((props) => {
  let theID = props.params.theID;
  Meteor.subscribe('thePubSub');
  return {
    x: theData.findOne({_id: theID}),
  };
}, App);
The following code does not:
export default createContainer((props) => {
  let theID = props.params.theID;
  Meteor.subscribe('thePubSub');
  return {
    x: theData.findOne({_id: theID}, function(err, result){
      if(!result){
        return {}
      };
    }),
  };
}, App);
What am I doing wrong, and how should I resolve this error? Is this a Meteor-specific error?
Any help is greatly appreciated!
What kind of error are you exactly trying to handle with your callback?
Meteor's findOne is different from node's mongodb driver's findOne that the post you link to uses.
The expected signature is:
collection.findOne([selector], [options])
There is no callback involved, since the method runs synchronously (but is reactive).
If you want to return a default value when the document is not found, you can simply use a JS logical OR:
// Provide an alternative value on the right that will be used
// if the left one is falsy.
theData.findOne({_id: theID}) || {};
A more rigorous approach would be to compare its type with
typeof queryResult === 'undefined'
Note that if theData collection is fed by the above subscription Meteor.subscribe('thePubSub'), I doubt Meteor will have time to populate the collection on the client by the time you query it…
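A sketch of gating on subscription readiness, reusing the container from the question (the ready flag is an addition, not part of the original code):

// Sketch: expose the subscription handle's ready() state so the component
// can wait until the client-side collection is populated before querying.
export default createContainer((props) => {
  const theID = props.params.theID;
  const handle = Meteor.subscribe('thePubSub');
  return {
    ready: handle.ready(),
    x: theData.findOne({ _id: theID }) || {},
  };
}, App);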
I am running the latest, as of 5.3.2016, minified Lokijs from Lokijs.org and NW.js v0.12.3-win-x64. I have a document already saved in Lokijs:
"collections":[{"name":"admins","data":[{"username":"erik","meta":{"revision":1,"created":1459028934981,"version":0,"updated":1462333795190},"$loki":1}],"idIndex":[1],"binaryIndices":{},"constraints":null,"uniqueNames":["username"],"transforms":{},"objType":"admins","dirty":true,"cachedIndex":null,"cachedBinaryIndex":null,"cachedData":null,"transactional":false,"cloneObjects":false,"cloneMethod":"parse-stringify","asyncListeners":false,"disableChangesApi":true,"autoupdate":false,"ttl":{"age":null,"ttlInterval":null,"daemon":null},"maxId":2,"DynamicViews":[],"events":{"insert":[null],"update":[null],"pre-insert":[],"pre-update":[],"close":[],"flushbuffer":[],"error":[null],"delete":[null],"warning":[null]},"changes":[],"username":{"name":"username","regExp":{}}}.
I am trying to generate an error when I attempt to insert a duplicate key value. I have added a unique constraint on the 'username' key in this collection and have verified the collection.uniqueNames array contains 'username'.
When I run the code below, as expected, no additional documents are inserted into the collection.data array and the database is saved. However, no errors are generated. Also, when I console.log the document object after the insert method has run, it changes to:
Object {username: "erik", meta: Object, $loki: 2}.
When I change the key value to something else, the unique document is then inserted and saved properly.
How do I go about generating an error when attempting to insert a document containing key(s) that violate unique constraints? Thank you.
insertDocument: function(objParameters) {
  var collection = objParameters.insert.collection;
  collection.ensureUniqueIndex('username');
  var document = {username: ''};
  document.username = 'erik';
  collection.on('error', function(err) {
    return console.log(err);
  });
  collection.insert(document);
  return thisModule.$body.triggerHandler('app.database.save');
}
EDIT: the loki.db used to test clone:
{"filename":"loki.db","collections":[{"name":"test","data":[{"name":"erik","meta":{"revision":0,"created":1462493328062,"version":0},"$loki":1}],"idIndex":[1],"binaryIndices":{},"constraints":null,"uniqueNames":["name"],"transforms":{},"objType":"test","dirty":true,"cachedIndex":null,"cachedBinaryIndex":null,"cachedData":null,"transactional":false,"cloneObjects":true,"cloneMethod":"parse-stringify","asyncListeners":false,"disableChangesApi":true,"autoupdate":false,"ttl":{"age":null,"ttlInterval":null,"daemon":null},"maxId":2,"DynamicViews":[],"events":{"insert":[null],"update":[null],"pre-insert":[],"pre-update":[],"close":[],"flushbuffer":[],"error":[null],"delete":[null],"warning":[null]},"changes":[]}],"databaseVersion":1.1,"engineVersion":1.1,"autosave":false,"autosaveInterval":5000,"autosaveHandle":null,"options":{},"persistenceMethod":"fs","persistenceAdapter":null,"verbose":false,"events":{"init":[null],"loaded":[],"flushChanges":[],"close":[],"changes":[],"warning":[]},"ENV":"NODEJS"}
Code to test clone:
var loki = require('lokijs-1.3.min.js');
var db = new loki();
var collection = db.addCollection('test', {
  clone: true,
  unique: 'name'
});
collection.on('error', function(error) {
  return console.log(error);
});
collection.insert({ name: 'erik'});
collection.insert({ name: 'erik'});
db.saveDatabase();
If you don't use clone: true then you need to call coll.update(document) to force index recomputation, which will trigger the error.
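A sketch of that suggestion against the non-clone setup, reusing the collection and field names from the example above:

// Sketch, per the suggestion above: without clone: true, insert() alone does
// not surface the unique-constraint violation; calling update() on the
// duplicate forces index recomputation so the error shows up.
var loki = require('lokijs-1.3.min.js');
var db = new loki();
var collection = db.addCollection('test', { unique: ['name'] });

collection.on('error', function(error) {
  // per the question's setup, constraint errors should land here
  return console.log(error);
});

collection.insert({ name: 'erik' });
var duplicate = collection.insert({ name: 'erik' });
collection.update(duplicate); // force index recomputation, triggering the error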