Dataloader did not return an array of the same length? - mongodb

I am building an Express JS application with GraphQL and MongoDB (Mongoose). I am using Facebook's DataLoader to batch and cache requests.
It's working perfectly fine except for this use case.
I have a database filled with users' posts. Each post contains the user's ID for reference. When I make a call to return all the posts in the database, the posts are returned fine, but if I try to get the user in each post, users with multiple posts will only return a single user, because the key for the second occurrence is cached. So 2 posts (keys) from user "x" will only return 1 user object "x".
However, DataLoader has to return the same number of promises as the keys it receives.
It has an option to set cache to false so that each key makes a request, but this doesn't seem to work for my use case.
Sorry if I haven't explained this very well.
This is my GraphQL request:
query {
  getAllPosts {
    _id    # This is returned fine
    user {
      _id
    }
  }
}
Returned error:
DataLoader must be constructed with a function which accepts Array<key> and returns Promise<Array<value>>, but the function did not return a Promise of an Array of the same length as the Array of keys.

Are you trying to batch post keys [1, 2, 3] and expecting to get user results [{ user1 }, { user2 }, { user1 }]?
Or are you trying to batch user keys [1, 2] and expecting to get post results [{ post1 }, { post3 }] and [{ post2 }]?
It seems like only in the second case will you run into a situation where the length of the keys differs from the length of the results array.
To solve the second, you could do something like this in SQL:
const loader = new Dataloader(userIDs => {
  const promises = userIDs.map(id => {
    return db('user_posts')
      .where('user_id', id);
  });
  return Promise.all(promises);
});
loader.load(1)
loader.load(2)
So you return [[{ post1 }, { post3 }], [{ post2 }]], which DataLoader can unwrap.
If you had done this instead:
const loader = new Dataloader(userIDs => {
  return db('user_posts')
    .whereIn('user_id', userIDs);
});
loader.load(1)
loader.load(2)
You will instead get [{ post1 }, { post3 }, { post2 }] and hence the error: the function did not return a Promise of an Array of the same length as the Array of keys.
Not sure if the above is relevant/helpful. I can revise if you can provide a snippet of your batch load function.

You need to map the data returned from the database to the Array of keys.
Dataloader: The Array of values must be the same length as the Array of keys
This issue is well explained in this YouTube video on DataLoader - highly recommended.
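Building on that, below is a minimal sketch of such a batch function for the Mongoose setup in the question (the User model name is an assumption). It groups the fetched users by _id and then maps every incoming key, including duplicates, back to its user, so the returned array always matches the keys array in length and order.
const DataLoader = require('dataloader');

// Sketch: batch-load users by id, preserving key order and duplicate keys.
// `User` is assumed to be the Mongoose user model.
const userLoader = new DataLoader(async userIds => {
  const users = await User.find({ _id: { $in: userIds } });
  const usersById = new Map(users.map(u => [String(u._id), u]));
  // One entry per key, in key order; duplicate keys resolve to the same user.
  return userIds.map(id => usersById.get(String(id)) || null);
});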


Mongoose $in [ObjectIds] returns 0 records

In our Mongoose model, we have a product referring to an article.
This is a piece of the schema:
const product = new Schema({
  article_id: Schema.Types.ObjectId,
  title: String,
  description: String,
  ...
In the API we are looking for products that are referring to a list of specific articles, and I wanted to use the $in operator:
const articles = ["5dcd2a95d7e2999332441825",
"5dcd2a95d7e2999332441827",
"5dcd2a96d7e2999332441829"]
filter.article_id = {
$in: articles.map(
article => new mongoose.Types.ObjectId(article)
),
};
return Product.find({ ...filter })
This returns 0 records, whereas I know for sure it should have returned at least 3. Looking at the console log, all that has happened is that the double quotes have been removed from the array during the ObjectId conversion.
Then I tried a different approach by returning {$oid: "id goes here"} for each mapped array item:
const articles = ["5dcd2a95d7e2999332441825",
"5dcd2a95d7e2999332441827",
"5dcd2a96d7e2999332441829"]
filter.article_id = {
$in: articles.map(
article => ({$oid: article})
),
};
return Product.find({ ...filter })
This gives a different array:
console.log(filter);
// {article_id: {$in: [{$oid: "5dcd2a95d7e2999332441825"}, {$oid: "5dcd2a95d7e2999332441827"}, {$oid: "5dcd2a96d7e2999332441829"}]}}
But in this case I get following error:
CastError: Cast to ObjectId failed for value "\"{$oid: \"5dcd2a95d7e2999332441825\"}\"".
Though - if I take that particular console logged filter and pass it in Studio 3T as a filter, I do indeed get the desired results.
Any idea what I am doing wrong in this case?
Frick me! I am just a big idiot... Apparently there was a .skip(10) added after the find() method... Now I understand why 0 records were returned. I've been spending hours on this.
For future reference, Mongoose casts strings to ObjectIds automatically if they are defined as such in the schema. Therefore, the following works exactly as it should, given you don't skip the first 10 records:
const articles = ["5dcd2a95d7e2999332441825",
"5dcd2a95d7e2999332441827",
"5dcd2a96d7e2999332441829"]
filter.article_id = {
$in: articles
};
return Product.find({ ...filter }) // Note that I DON'T put .skip() here..

vue js 2 - for loop in multiple rest calls fetchData

I am trying to get wp-rest and Vue.js 2 to work together. So far things are coming along nicely, apart from this one REST call that requires a second request for the design to be complete. Essentially, I want to be able to iterate/loop through the first request's results and dynamically update the second request.
My second question is about performance: overall, the REST calls are taking a bit long to load - is there something I can do to optimize them?
Context:
The first request's data gives me an id, slug and title for all the posts I want to display only on the homepage as featured - through that id or slug, I want to pass it to the second request so I can pull in more information about those posts, like the featured image and other meta field data.
export default {
  name: 'work',
  data () {
    return {
      loading: false,
      page: null,
      pagesingle: null,
      error: null
    }
  },
  created() {
    this.fetchData()
  },
  methods: {
    fetchData() {
      this.$http.get('/cms/wp-json/wp/v2/pages/?slug=work&_embed')
        .then(result => {
          this.page = result.data
          this.$http.get('/cms/wp-json/wp/v2/cases-studes/?slug=case-study-name').then(
            result => this.pagesingle = result.data
          );
        })
    }
  }
}
I think you want to look at Promise.all. It will take an array of promises, wait for them all to complete, and then resolve with an array of results.
You would build your array of promises based on the array of slugs and ids from your first request. Maybe something like:
const promises = result.data.articles.map((article) =>
this.$http.get(`/cms/wp-json/wp/v2/cases-studies/?slug=${encodeURIComponent(article.slug)}`)
);
Getting the results is as easy as
Promise.all(promises).then((results) => {
this.arrayOfSinglePages = results.map((result) => result.data);
});
Now this.page has the array of ids (and related data), and this.arrayOfSinglePages has the page details for each of them, in the same order.
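Putting the pieces together, a minimal sketch of fetchData() along these lines might look as follows. The shape of result.data (an array with a slug per entry) is an assumption; adjust the mapping to the actual wp-json payload. As a side benefit, Promise.all fires the detail requests in parallel, which should help with the performance concern.
fetchData() {
  this.loading = true;
  this.$http.get('/cms/wp-json/wp/v2/pages/?slug=work&_embed')
    .then(result => {
      this.page = result.data;
      // One detail request per featured post (slug field assumed).
      const promises = result.data.map(article =>
        this.$http.get(`/cms/wp-json/wp/v2/cases-studies/?slug=${encodeURIComponent(article.slug)}`)
      );
      return Promise.all(promises);
    })
    .then(results => {
      this.arrayOfSinglePages = results.map(result => result.data);
      this.loading = false;
    })
    .catch(error => {
      this.error = error;
      this.loading = false;
    });
}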

Query sailsjs blueprint endpoints by id array using request

I'm using the request library to make calls from one Sails app to another one which exposes the default blueprint endpoints. It works fine when I query by non-id fields, but I need to run some queries by passing id arrays. The problem is that the moment you provide an id, only the first id is considered, effectively ruling out this kind of query.
Is there a way to get around this? I could switch over to another attribute if all else fails, but I need to know if there is a proper way around this.
Here's how I'm querying:
var idArr = []; // array of ids
var queryParams = { id: idArr };
var options = {
  // headers, method and url here
  json: queryParams
};
request(options, function(err, response, body){
if (err) return next(err);
return next(null, body);
});
Thanks in advance.
Sails blueprint APIs allow you to use the same Waterline query language that you would otherwise use in code.
You can directly pass the array of ids in the GET call to receive the objects, as follows:
GET /city?where={"id":[1, 2]}
Refer here for more.
Have fun!
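For reference, here is a sketch of sending that same query through the request library used in the question; the Waterline criteria is passed as a JSON string in the where query parameter (the URL is a placeholder):
var request = require('request');

var idArr = [1, 2]; // array of ids
var options = {
  method: 'GET',
  url: 'http://example.com/city', // placeholder URL
  qs: { where: JSON.stringify({ id: idArr }) }, // -> /city?where={"id":[1,2]}
  json: true
};

request(options, function (err, response, body) {
  if (err) return next(err);
  return next(null, body);
});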
Alright, I switched to a hacky solution to get moving.
For all models that needed querying by id arrays, I added a secondary attribute to the model. Let's call it code. Then, in afterCreate(), I updated code and set it equal to the id. This incurs an additional database call, but it's fine since it's called just once - when the object is created.
Here's the code.
module.exports = {
  attributes: {
    code: {
      type: 'string' // the secondary attribute
    },
    // other attributes
  },
  afterCreate: function (newObj, next) {
    Model.update({ id: newObj.id }, { code: newObj.id }, next);
  }
}
Note that newObj isn't a Model object, as even I was led to believe, so we cannot simply update its code and call newObj.save().
After this, substituting id with code in the queries that take id arrays makes them work as expected!
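For example, a hypothetical version of the original query against the secondary attribute (code holds the id as a string):
var queryParams = { code: idArr.map(String) };
var options = {
  // headers, method and url here
  json: queryParams
};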

Subscribing to Meteor.Users Collection

// in server.js
Meteor.publish("directory", function () {
  return Meteor.users.find({}, {fields: {emails: 1, profile: 1}});
});
// in client.js
Meteor.subscribe("directory");
I now want to query the directory listings from the client, e.g. directory.findOne() from the browser's console (for testing purposes).
Doing directory = Meteor.subscribe('directory') or directory = Meteor.Collection('directory') and then performing directory.findOne() doesn't work. When I do directory = new Meteor.Collection('directory'), it returns undefined, and I bet it CREATES a mongo collection on the server, which I don't like, because the users collection already exists and the handle would point to a new collection rather than the users collection.
NOTE: I don't want to mess with how the Meteor.users collection handles its functions... I just want to retrieve some specific data from it using a different handle that will only return the specified fields, without overriding its default behavior.
Ex:
Meteor.users.findOne() // will return the currently logged-in user's data
directory.findOne() // will return different fields taken from the Meteor.users collection
If you want this setup to work, you need to do the following:
Meteor.publish('thisNameDoesNotMatter', function () {
  var self = this;
  var handle = Meteor.users.find({}, {
    fields: {emails: 1, profile: 1}
  }).observeChanges({
    added: function (id, fields) {
      self.added('thisNameMatters', id, fields);
    },
    changed: function (id, fields) {
      self.changed('thisNameMatters', id, fields);
    },
    removed: function (id) {
      self.removed('thisNameMatters', id);
    }
  });
  self.ready();
  self.onStop(function () {
    handle.stop();
  });
});
Now on the client side, you need to define a client-side-only collection:
directories = new Meteor.Collection('thisNameMatters');
and subscribe to the corresponding data set:
Meteor.subscribe('thisNameDoesNotMatter');
This should work now. Let me know if you think this explanation is not clear enough.
EDIT
Here, the self.added/changed/removed methods act more or less as an event dispatcher. Briefly speaking they give instructions to every client who called
Meteor.subscribe('thisNameDoesNotMatter');
about the updates that should be applied to the client's collection named thisNameMatters, assuming that this collection exists. The name, passed as the first parameter, can be chosen almost arbitrarily, but if there's no corresponding collection on the client side, all the updates will be ignored. Note that this collection can be client-side-only, so it does not necessarily have to correspond to a "real" collection in your database.
Returning a cursor from your publish method is only a shortcut for the above code, with the only difference being that the name of an actual collection is used instead of our thisNameMatters. This mechanism actually allows you to create as many "mirrors" of your datasets as you wish. In some situations this might be quite useful. The only problem is that these "collections" will be read-only (which totally makes sense, BTW), because if they're not defined on the server, the corresponding insert/update/remove methods do not exist.
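To illustrate, here is a small sketch of reading from the mirror once the subscription is ready, using the onReady callback that Meteor.subscribe accepts as its last argument:
directories = new Meteor.Collection('thisNameMatters');

Meteor.subscribe('thisNameDoesNotMatter', function () {
  // onReady: the mirrored user documents are now available on the client
  console.log(directories.find().fetch());
});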
The collection is called Meteor.users, and there is no need to declare a new one on either the server or the client.
Your publish/subscribe code is correct:
// in server.js
Meteor.publish("directory", function () {
  return Meteor.users.find({}, {fields: {emails: 1, profile: 1}});
});
// in client.js
Meteor.subscribe("directory");
To access documents in the users collection that have been published by the server you need to do something like this:
var usersArray = Meteor.users.find().fetch();
or
var oneUser = Meteor.users.findOne();

Mongoose - update after populate (Cast Exception)

I am not able to update my Mongoose model because of a CastError, which makes sense, but I don't know how to solve it.
Trip Schema:
var TripSchema = new Schema({
  name: String,
  _users: [{type: Schema.Types.ObjectId, ref: 'User'}]
});
User Schema:
var UserSchema = new Schema({
  name: String,
  email: String,
});
In my HTML page I render a trip with the possibility to add new users to it. I retrieve the data by calling the findById method on the model:
exports.readById = function (request, result) {
  Trip.findById(request.params.tripId).populate('_users').exec(function (error, trip) {
    if (error) {
      console.log('error getting trips');
    } else {
      console.log('found single trip: ' + trip);
      result.json(trip);
    }
  })
};
This works fine. In my UI I can add new users to the trip; here is the code:
var user = new UserService();
user.email = $scope.newMail;
user.$save(function (response) {
  trip._users.push(user._id);
  trip.$update(function (response) {
    console.log('OK - user ' + user.email + ' was linked to trip ' + trip.name);
    // call for the updated document in database
    this.readOne();
  })
});
The problem is that when I update, the existing users in the trip are populated, i.e. stored as objects rather than ids on the trip, while the new user is stored as an ObjectId.
How can I make sure the populated users go back to ObjectIds before I update? Otherwise the update will fail with a CastError.
I've been searching around for a graceful way to handle this without finding a satisfactory solution, or at least one I feel confident is what the mongoosejs folks had in mind when using populate. Nonetheless, here's the route I took:
First, I tried to separate adding to the list from saving. So in your example, move trip._users.push(user._id); out of the $save function. I put actions like this on the client side of things, since I want the UI to show the changes before I persist them.
Second, when adding the user, I kept working with the populated model -- that is, I don't push(user._id) but instead add the full user: push(user). This keeps the _users list consistent, since the ids of other users have already been replaced with their corresponding objects during population.
So now you should be working with a consistent list of populated users. In the server code, just before calling $update, I replace trip._users with a list of ObjectIds. In other words, I "un-populate" _users:
var user_ids = [];
for (var i in trip._users) {
  /* It might be a good idea to do more validation here if you like, to make
   * sure you don't have any naked userIds in this array already, as you
   * would in your original code. */
  user_ids.push(trip._users[i]._id);
}
trip._users = user_ids;
trip.$update(....
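An equivalent, more compact sketch of the same un-populate step, which also tolerates a mix of populated objects and raw ids:
trip._users = trip._users.map(function (u) {
  // populated entries carry an _id; raw ids pass through unchanged
  return u._id ? u._id : u;
});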
As I read through your example code again, it looks like the user you are adding to the trip might be a new user? I'm not sure if that's just a relic of your simplification for question purposes, but if not, you'll need to save the user first so mongo can assign an ObjectId before you can save the trip.
I have written a function which accepts an array and, in its callback, returns an array of ObjectIds. To do it asynchronously in NodeJS, I am using async.js. The function looks like this:
let converter = function (array, callback) {
  let idArray = []; // must be initialized, or push() below throws
  async.each(array, function (item, itemCallback) {
    idArray.push(item._id);
    itemCallback();
  }, function (err) {
    callback(idArray);
  });
};
This works totally fine for me, and I hope it will work for you as well.
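A hypothetical usage with the populated trip document from the question:
converter(trip._users, function (idArray) {
  trip._users = idArray; // back to plain ObjectIds
  trip.$update(function (response) {
    // the update now passes without a CastError
  });
});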