I have an RTK Query API that fetches a user by ID. This ID is stored in state.
const useUser = () => {
  // This can be a string or undefined.
  const userId = useTSelector((state) => state.users.userId)
  // If the userId above becomes undefined,
  // for example if it is set from somewhere else,
  // then we should clear the result of the query below from the cache.
  const { data: user } = useGetUserByIdQuery(userId)
  return { user }
}
However, if that ID becomes undefined, I would then like to remove the cached user from the query.
Is this possible out of the box?
You can set the skip option or pass skipToken; either one resets the full hook state and prevents the query from firing.
import { skipToken } from '@reduxjs/toolkit/query/react'
useGetUserByIdQuery(userId == undefined ? skipToken : userId)
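For completeness, here is what the skip option mentioned above looks like; a minimal sketch assuming the same useGetUserByIdQuery hook:

// Equivalent approach using the skip option instead of skipToken:
// while userId is undefined the query does not run and the hook state is reset.
const { data: user } = useGetUserByIdQuery(userId, {
  skip: userId === undefined,
})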
I have a collection with the following schema:
const CategorySchema = Schema({
  name: String,
  order: Number,
});
I'm trying to update the order field of the categories. The way I'm planning to do it is to have a local array with the ids of the categories in the order I want. Then I'd fetch all categories (there are not many) and start looping over the local array of ids. For each id, I'll locate it in the fetched array and update its order according to the index of that id in the local array. The issue now is how to save it. Below is what I'm trying to do:
// Get all categories.
const categories = await Category.find({}, 'order');
console.log(categories);

// Get the order from the request.
const orderedItemIds = req.body.itemIds || [];
orderedItemIds.forEach((id, idx) => {
  categories.find(x => x._id === id).order = idx;
});

// Save.
try {
  await categories.save();
  res.sendStatus(200);
} catch (e) {
  console.log(e);
  res.sendStatus(423);
}
When you query your categories, Mongoose by default returns an array of instances of the Mongoose Document class. That means you can call their save() method whenever you mutate them.
So you can save each doc immediately after you assign idx to its order field:
const orderedItemIds = req.body.itemIds || [];
orderedItemIds.forEach((id, idx) => {
  // Compare as strings: x._id is an ObjectId, id is a plain string.
  const cat = categories.find(x => x._id.toString() === id);
  cat.order = idx;
  cat.save();
});
Note a few things about this code.
I assume that req.body.itemIds is an array of strings representing ObjectIds (e.g. '602454847756575710020545'). So, in order to find a category in categories, you need to call the .toString() method of the x._id object; otherwise you would be comparing an ObjectId and a string, which will never be equal.
You can save the category right after assigning idx to cat.order without having to await it, because the next update does not depend on the save status of the previous one.
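If you do want the 200 response to be sent only after every document has actually been written, a variation (not part of the original answer, just a sketch reusing the same categories array and req/res as above) is to collect the save() promises and await them all:

// Sketch: wait for all saves before responding.
const orderedItemIds = req.body.itemIds || [];
try {
  await Promise.all(orderedItemIds.map((id, idx) => {
    const cat = categories.find(x => x._id.toString() === id);
    cat.order = idx;
    return cat.save();
  }));
  res.sendStatus(200);
} catch (e) {
  console.log(e);
  res.sendStatus(423);
}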
I would like to update several rows of my DB with the same object.
Let's say I have a column customText of type jsonb which contains an array of objects.
Here is my Sequelize model:
customText: {
  type: DataTypes.JSONB,
  allowNull: true,
  field: "custom_text"
}
Now, from the client, I send an object:
const obj = {};
const data = {
  textid: "d9fec1d4-0f7a-2c00-9d36-0c5055d64d04",
  textLabel: null,
  textValue: null
};
obj.customText = data;
api.service("activity").patch(null, obj).catch(err => console.log(err));
As the Feathers.js documentation says, if I want to update multiple records, I send an id equal to null.
Now here comes the problem: if I do that, my column customText will contain only the new object, but I want an array of objects, i.e. I want to push the new data into the existing array. How can I patch the data?
My guess is to use a Feathers.js hook and a raw query with Sequelize, but I'm not sure how to do that.
I'm not really sure of my answer, but this hook works:
module.exports = function() {
  return async context => {
    const sequelize = context.app.get("sequelizeClient");
    // Append the incoming object to the existing jsonb array.
    const customText = JSON.stringify(context.data.customText[0]);
    console.log(customText);
    let query =
      "UPDATE activity SET custom_text = custom_text || '" +
      customText +
      "' ::jsonb";
    console.log(query);
    await sequelize
      .query(query)
      .then(results => {
        console.log(results);
        context.results = results;
      })
      .catch(err => console.log(err));
    return context;
  };
};
I still have a problem: after this hook runs, Feathers continues with the patch, so it would update my DB again, which is why I put a disallow() hook after it.
Also, with this hook I lose the ability to listen to events.
I also have a concern about the query: I'm not sure whether it would be better to use jsonb_insert instead of ||.
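As a side note on building the SQL by string concatenation: Sequelize raw queries also accept replacement parameters, which avoids splicing the JSON into the query text. A sketch, assuming the same activity table and custom_text column as above:

// Sketch: same UPDATE, but with a replacement parameter instead of string concatenation.
const customText = JSON.stringify(context.data.customText[0]);
await sequelize.query(
  "UPDATE activity SET custom_text = custom_text || CAST(? AS jsonb)",
  { replacements: [customText] }
);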
I am working on a Node.js application and I am using the mongoose package.
Sample Code
I am using the following method to create dynamic collections, and these collections sometimes fail to persist the data in the database:
const Mongoose = require("mongoose");

const Schema = new Mongoose.Schema({
  // schema goes here
});

module.exports = function (suffix) {
  if (!suffix || typeof suffix !== "string" || !suffix.trim()) {
    throw Error("Invalid suffix provided!");
  }
  return Mongoose.model("Model", Schema, `collection_${suffix}`);
};
I am using this exported module to create dynamic collections based on unique ids passed as the suffix parameter, something like this (skipping unnecessary code):
const saveData = require("./data-service");
const createModel = require("./db-schema");

// test 1
it("should save data1", function (done) {
  const data1 = [];
  const response1 = saveData(request1); // here response1.id is "cjmt8litu0000ktvamfipm9qn"
  const dbModel1 = createModel(response1.id);
  dbModel1.insertMany(data1)
    .then(dbResponse1 => {
      // assert for count
      done();
    });
});

// test 2
it("should save data2", function (done) {
  const data2 = [];
  const response2 = saveData(request2); // here response2.id is "cjmt8lm380006ktvafhesadmo"
  const dbModel2 = createModel(response2.id);
  dbModel2.insertMany(data2)
    .then(dbResponse2 => {
      // assert for count
      done();
    });
});
Problem
The issue is, test 2 fails! The insertMany API results in 0 records, failing the count assert.
If we swap the order of the tests, test 1 will fail.
If I run the two tests separately, both will pass.
If there are n tests, only the first test will pass and the remaining ones will fail.
Findings
I suspected the Mongoose model creation step to be faulty, as it uses the same model name (Model) while creating multiple model instances.
I changed it to the following and the tests worked perfectly fine in all scenarios:
return Mongoose.model(`Model_${suffix}`, Schema, `collection_${suffix}`);
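For reference, this is just the factory from the Sample Code section above with the unique model name folded back in:

module.exports = function (suffix) {
  if (!suffix || typeof suffix !== "string" || !suffix.trim()) {
    throw Error("Invalid suffix provided!");
  }
  // Unique model name per suffix, so each dynamic collection gets its own compiled model.
  return Mongoose.model(`Model_${suffix}`, Schema, `collection_${suffix}`);
};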
Questions
This leaves me with the following questions:
Am I following correct coding conventions while creating dynamic collections?
Is the suspected code the actual cause of this issue (i.e. should the model name also be unique)?
If yes, why is it failing? (I followed the mongoose docs but they don't provide any information regarding uniqueness of the model name argument.)
Thanks.
I think you are calling the insertMany method on dbModel1, whereas your variable is declared as dbModel2.
Change your test 2 from:

dbModel1.insertMany(data2)
  .then(dbResponse1 => {
    // assert for count
    done();
  });

To:

dbModel2.insertMany(data2)
  .then(dbResponse1 => {
    // assert for count
    done();
  });
I am running a Firestore query to get data, but the query first returns data cached from earlier queries and only then returns the additional data (which was not queried earlier) in a second pass from the server. Is there a way I can disable caching for Firestore queries so that the request goes to the DB every time I query something?
this.parts$ = this.db.collection<OrderBom>('OrderBom', ref => {
  let query: firebase.firestore.Query = ref;
  query = query.where('orderPartLC', '==', this.searchValue.toLowerCase());
  return query;
}).valueChanges();
Change that .valueChanges() to a .snapshotChanges(), then you can apply a filter. See the example below.
I don't like changing default behavior (default configurations). I've read that this is desired behavior and that the good practice is to show the data to the user as soon as possible, even if the screen refreshes twice.
However, I don't think it is bad practice to filter on fromCache === false when we don't have a choice. (In my case I make more requests after I receive this first one, so due to promises and other async 'tasks' the cache/server order is completely lost.)
See this closed issue
getChats(user: User) {
  return this.afs.collection<Chat>("chats",
    ref => ref.where('participantsId', 'array-contains', user.id))
    .snapshotChanges()
    .pipe(
      // Drop emissions that still contain documents served from the local cache.
      filter(actions => actions.every(a => a.payload.doc.metadata.fromCache === false)),
      // map(...) – you probably want to parse your objects here.
    );
}
If using AngularFire2 you can try the following.
I read on the Internet that you can disable offline persistence (which caches your results) by not calling enablePersistence() on AngularFirestoreModule.
I tried that and still had no success, but try it first. What I did manage to do to get rid of cached results was to use the get() method of the DocumentReference class. This method receives a GetOptions parameter, with which you can force the data to come from the server. Usage example:
// fireStore is an instance of AngularFirestore injected by AngularFire2.
let collection = fireStore.collection<any>("my-collection-name");
let options: GetOptions = { source: "server" };
collection.ref.get(options).then(results => {
  // results contains an array property called docs with the collection's documents.
});
Persistence and caching should be disabled in angular/fire by default, but they are not, and there is no way to turn them off. As such, @BorisD's answer is correct, but he hasn't explained it very well. Here's a full example of converting valueChanges to snapshotChanges.
constructor(private afs: AngularFirestore) {}

private getSequences(collection: string): Observable<IPaddedSequence[]> {
  return this.afs.collection<IFirestoreVideo>('videos', ref => {
    return ref
      .where('flowPlayerProcessed', '==', true)
      .orderBy('sequence', 'asc')
  }).valueChanges().pipe(
    map((results: IFirestoreVideo[]) => results.map((result: IFirestoreVideo) => ({ videoId: result.id, sequence: result.sequence })))
  )
}
Converting the above to use snapshotChanges to filter out stuff from cache:
constructor(private afs: AngularFirestore) {}

private getSequences(collection: string): Observable<IPaddedSequence[]> {
  return this.afs.collection<IFirestoreVideo>('videos', ref => {
    return ref
      .where('flowPlayerProcessed', '==', true)
      .orderBy('sequence', 'asc')
  }).snapshotChanges().pipe(
    filter((actions: DocumentChangeAction<any>[], idx: number) => idx > 0 || actions.every(a => a.payload.doc.metadata.fromCache === false)),
    map((actions: DocumentChangeAction<any>[]) => actions.map(a => ({ id: a.payload.doc.id, ...a.payload.doc.data() }))),
    map((results: IFirestoreVideo[]) => results.map((result: IFirestoreVideo) => ({ videoId: result.id, sequence: result.sequence })))
  )
}
The only differences are that valueChanges changes to snapshotChanges, and the filter(DocumentChangeAction...) and map(DocumentChangeAction...) lines are added at the top of the snapshotChanges pipe; everything else remains unchanged.
This approach is discussed here
I want to update the itemsToUpdate collection.
This collection has already been used in a query, so the resulting entities are already tracked in the context's Local property.
What is the most efficient way of overriding the properties of the entities in context.items.Local with the values from the itemsToUpdate collection?
private async Task<IEnumerable<item>> GetitemsAsync(IEnumerable<item> itemIds)
{
    return await context.items.Where(t => itemIds.Select(x => x.Id).Contains(t.Id)).ToListAsync();
}

public async Task Update(...)
{
    // Update
    var queryUpdateitems = await GetitemsAsync(itemsToUpdate);
    bool canUpdate = queryUpdateitems.All(t => t.UserId == userId);
    if (!canUpdate)
    {
        throw new NotAuthorizedException();
    }
    else
    {
        // update here the itemsToUpdate collection
    }
    await context.SaveChangesAsync();
}
In your case, you know that you have to update all these items; you just want to make sure that the current user is allowed to update them (by comparing Item.UserId). Instead of fetching all the existing items from the database to make the check, you can ask the database for the result of the check and then just send the update to the database if the check passes.
var itemIds = itemsToUpdate.Select(x => x.Id).ToList();
// Ask the database directly whether every matching item belongs to the current user.
var canUpdate = await db.items.Where(i => itemIds.Contains(i.Id)).AllAsync(i => i.UserId == userId);
if (canUpdate)
{
    db.UpdateRange(itemsToUpdate);
}
else
{
    throw new NotAuthorizedException();
}
await db.SaveChangesAsync();
Here, you have to build the list of itemIds first, because EF cannot inline a list of full items in a query and will otherwise evaluate it on the client. That means EF fetches the whole table. The same is true for your GetitemsAsync method; it also queries the whole table. Consider creating itemIds locally in that method too (see the sketch below).
Once you pass a List<int> into the method, EF will be happy to inline it in the query, and for the canUpdate query it will send a single query to the database and fetch just true/false. Then you can use UpdateRange directly, since there are no more tracked records. Since it does not fetch all items from the database, it will be faster too.
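A sketch of what that could look like for GetitemsAsync (same context and item type as in the question, with the id list materialized before the query so EF can translate Contains):

private async Task<IEnumerable<item>> GetitemsAsync(IEnumerable<item> items)
{
    // Materialize the ids locally so EF can inline them into an IN clause.
    var itemIds = items.Select(x => x.Id).ToList();
    return await context.items.Where(t => itemIds.Contains(t.Id)).ToListAsync();
}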