Persist react-query pagination after unmount - react-query

So I've got a paginated query that looks like this:
const [filters, setFilters] = useState({
  search_text: undefined,
  page_size: 10,
  page: 1,
});

return useQuery({
  queryKey: ['users', { filters }],
  queryFn: () => /* some fetch */,
  keepPreviousData: true,
});
My problem is that I want to remember the last filters used, so that when I unmount my component and later return to it, I see the exact same data AND filters. Obviously, if I keep the filters as component state, I lose them when the component unmounts. I was wondering if react-query can somehow help me with this?

Since this is a client state problem, react-query can't really help here. You would need to move that client state up: either to a parent component, to a global client state manager like zustand, or into the URL - which works very well for search queries and pagination because it also gives you shareable URLs for free.
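For illustration, here is a minimal sketch of the URL approach, assuming react-router v6's useSearchParams hook and the object syntax from the question (useUsers and fetchUsers are made-up names):

import { useSearchParams } from 'react-router-dom';
import { useQuery } from 'react-query';

function useUsers() {
  // The filters live in the URL, so they survive unmounts and make the page shareable
  const [searchParams, setSearchParams] = useSearchParams();
  const filters = {
    search_text: searchParams.get('search_text') ?? undefined,
    page_size: Number(searchParams.get('page_size') ?? 10),
    page: Number(searchParams.get('page') ?? 1),
  };

  const query = useQuery({
    queryKey: ['users', { filters }],
    queryFn: () => fetchUsers(filters), // stands in for "some fetch"
    keepPreviousData: true,
  });

  // Instead of setState, write the next filters back to the URL
  const setFilters = (next) => {
    const params = new URLSearchParams();
    Object.entries({ ...filters, ...next }).forEach(([key, value]) => {
      if (value !== undefined) params.set(key, String(value));
    });
    setSearchParams(params);
  };

  return { ...query, filters, setFilters };
}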

Related

How to call multiple different apis using useEffect hook in React

I have a concern about calling APIs using axios in a useEffect hook. For example, I have a React page where I show the list of consignments. A consignment can have a user from the user list, a carrier from the carrier list, an account from the account list, a status from the status list, and a service from the service list. All these lists are enumerated data.
On this page I have to fetch all the enumerated lists before rendering, because the component displays them as dropdowns so that users can apply filters on top of them. But to get each enumerated list, I have to call a separate API: for users I call the /users API, for customers the /customers API, and so on. My concern is whether I should call them all from a single useEffect hook using axios. I have to hit the server multiple times to get the enumerated data, and if the number of enumerated lists increases, my API requests to the server will also increase.
I don't know what the best practice is for dealing with this kind of situation: create a single API so that the server is hit only once and all the enumerated data is returned, or keep separate APIs and request each enumerated list separately? And does hitting the server multiple times to get the enumerated data impact performance on the client side, e.g. some memory leak? Just need some advice on that. Thanks in advance.
useEffect(() => {
  const loadData = async () => {
    try {
      dispatch(getLoad(true));
      const services = await Axios.get("/Services");
      const customers = await Axios.get("/Accounts/Customers");
      const resCarrier = await Axios.get("/Accounts/Carriers");
      const resStatuses = await Axios.get("/Status");
      setFilterData((prev) => ({
        ...prev,
        services: services.data,
        customers: customers.data,
        carriers: resCarrier.data,
        statuses: resStatuses.data,
      }));
      dispatch(getLoad(false));
    } catch (error) {
      dispatch(getLoad(false));
    }
  };
  loadData();
}, []);
You can use axios.all:
axios.all([
  axios.get(`/Services`),
  axios.get(`/Accounts/Customers`),
  axios.get(`/Accounts/Carriers`),
  axios.get(`/Status`)
])
.then(axios.spread((services, customers, carriers, statuses) => {
  setFilterData((prev) => ({
    ...prev,
    services: services.data,
    customers: customers.data,
    carriers: carriers.data,
    statuses: statuses.data,
  }));
}));
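Note that axios.all and axios.spread are thin wrappers that the axios documentation has since deprecated in favor of the native Promise.all. A sketch of the equivalent, which also speaks to the performance worry: the awaited requests in the question run one after another, whereas this fires all four in parallel:

Promise.all([
  Axios.get("/Services"),
  Axios.get("/Accounts/Customers"),
  Axios.get("/Accounts/Carriers"),
  Axios.get("/Status"),
]).then(([services, customers, carriers, statuses]) => {
  setFilterData((prev) => ({
    ...prev,
    services: services.data,
    customers: customers.data,
    carriers: carriers.data,
    statuses: statuses.data,
  }));
});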

Working with URL parameters in custom Kibana plugin

I am working on a custom plugin to Kibana (7.5.2). The plugin is of type 'app'. I would like to be able to pass parameters to this plugin in order to pre-load some data from Elasticsearch. I.e., I need to provide users with some specific URLs containing parameters that will be used by the plugin to show only a relevant portion of data.
My problem is that I was not able to find sufficient documentation on this and I do not know what the correct approach should be. I will try to summarize what I know/have done so far:
I have read the official resources on plugin development
I am aware of the fact that _g and _a URL parameters are used to pass state in Kibana applications. However, a) I am not sure if this is the correct approach in my case and b) I also failed to find any information on how my plugin should access the data from these parameters.
I checked the sources of other known plugins, but again, failed to find any clues.
I am able to inject some configuration values using injectUiAppVars in the init method of my plugin (index.js) and retrieve these values in my app (main.js):
index.js:
export default function (kibana) {
  return new kibana.Plugin({
    require: ['elasticsearch'],
    name: ...,
    uiExports: {
      ...
    },
    ...
    init(server, options) { // eslint-disable-line no-unused-vars
      server.injectUiAppVars('logviewer', async () => {
        var kibana_vars = await server.getInjectedUiAppVars('kibana');
        var aggregated_vars = { ...kibana_vars, ...{ mycustomparameter: "some value" } };
        return aggregated_vars;
      });
      ...
    }
  });
}
main.js
import chrome from 'ui/chrome';
. . .
const mycustomparameter = chrome.getInjected('mycustomparameter');
Providing that I manage to obtain parameters from URL, this would allow me to pass them to my app (via mycustomparameter), but again, I am not sure if this approach is correct.
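For what it's worth, a framework-agnostic sketch of reading a custom parameter on the client side with the standard URLSearchParams API (the parameter name preselect is made up for illustration; Kibana's own _g/_a parameters are rison-encoded, which this does not handle):

// main.js - e.g. /app/logviewer?preselect=some-value
// (if the app uses hash-based routing, the query part may live in
// window.location.hash rather than window.location.search)
const params = new URLSearchParams(window.location.search);
const preselect = params.get('preselect'); // null when absent
if (preselect) {
  // use it to restrict the initial Elasticsearch query
}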
I tried to get some help via the Elastic forum, but did not receive any answer yet.
My questions:
1. Is there any source of information on this particular topic? I am aware of the fact that the plugin API changes frequently, hence I do not expect to find extensive documentation. Maybe a good example?
2. Am I completely off course with the way I am trying to achieve this?
Thanks for reading this, any help would be much appreciated!

How does RestBase wiki handle caching?

Following the installation of RestBase using the standard config, I have a working version of the summary API.
The problem is that the caching mechanism seems strange to me.
The piece of code below decides whether to look at a table cache for a fast response. But I cannot make the server cache depend on some time constraint (for example, the max-age at the time the cache entry was written). It means that the decision to use the cache or not depends entirely on clients.
Can someone explain the workflow of the RestBase caching mechanism?
// Inside key.value.js
getRevision(hyper, req) {
    // This gets the header from the client request and decides whether to use
    // the cache depending on its value. Does it mean server caching is non-existent?
    if (mwUtil.isNoCacheRequest(req)) {
        throw new HTTPError({ status: 404 });
    }
    // If the cache should be used, the code below runs
    const rp = req.params;
    const storeReq = {
        uri: new URI([rp.domain, 'sys', 'table', rp.bucket, '']),
        body: {
            table: rp.bucket,
            attributes: {
                key: rp.key
            },
            limit: 1
        }
    };
    return hyper.get(storeReq).then(returnRevision(req));
}
Cache invalidation is done by the change propagation service, which is triggered on page edits and similar events. Cache-control headers are probably set in the Varnish VCL logic. See here for a full Wikimedia infrastructure diagram - it is outdated, but it gives you a general idea of how things are wired together.
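If the goal is a server-side time constraint, one option would be to store a write timestamp alongside each cached row and treat rows older than a configured max-age as misses. A rough sketch only - storedAt, the 300-second limit, and the response shape are assumptions for illustration, not RestBase API:

// Sketch: treat cached rows older than maxAgeSeconds as misses.
function isFresh(row, maxAgeSeconds) {
    const ageMs = Date.now() - new Date(row.storedAt).getTime();
    return ageMs < maxAgeSeconds * 1000;
}

return hyper.get(storeReq).then((res) => {
    if (!isFresh(res.body.items[0], 300)) { // assumed response shape
        throw new HTTPError({ status: 404 }); // fall through to regeneration
    }
    return returnRevision(req)(res);
});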

Sails pubsub: how to subscribe to a model instance?

I am struggling to receive pubsub events in my client. The client store (reflux) gets the data for a project using its id. As I understand it, this automatically subscribes the Sails socket to realtime events (since version 0.10), but I don't see that happening.
Here's my client store getting data from Sails (this is ES6 syntax):
onLoadProject(id) {
  var url = '/api/projects/' + id;
  io.socket.get(url, (p, jwres) => {
    console.log('loaded project', id);
    this.project = p;
    this.trigger(p);
  });
  io.socket.on("project", function(event) {
    console.log('realtime event', event);
  });
},
Then I created a test "touch" action in my project controller, just to have the modifiedAt field updated.
touch: function(req, res) {
  var id = req.param('id');
  Project.findOne(id)
    .then(function(project) {
      if (!project) throw new Error('No project with id ' + id);
      return Project.update({id: id}, {touched: project.touched + 1});
    })
    .then(function() {
      // this should not be required right?
      return Project.publishUpdate(id);
    })
    .done(function() {
      sails.log('touched ok');
      res.ok();
    }, function(e) {
      sails.log("touch failed", e.message, e.stack);
      res.serverError(e.message);
    });
}
This doesn't trigger any realtime event in my client code. I also added a manual Project.publishUpdate(), but this shouldn't be required right?
What am I missing?
-------- edit ----------
There was a complication as a result of my model's touched attribute: I had set its type to 'number' instead of 'integer', and the resulting ORM exception wasn't caught by the promise error handling because there was no catch() part. So the code above works, hurray! But the realtime events are received for every instance of Project.
So let me rephrase my question:
How can I subscribe the client socket to an instance instead of a model? I could check the id on the client side and retrieve the updated instance data but that seems inefficient since every client receives a notification about every project even though they only should care about a single one.
So, to answer my own question: the reason I was getting updates for every instance is simply that at the start of my application I triggered a findAll to get the list of available projects. As a result, my socket got subscribed to all of them.
The workaround is to either initiate that call via plain HTTP instead of a socket, or to use a separate controller action for retrieving the list (thereby bypassing the blueprint route). I picked the second option, because in my case it's silly to fetch all resource data prior to selecting one.
Here's the function I used to list all resources, where I filter part of the data which is not relevant for browsing the list initially.
list: function(req, res) {
  Project.find()
    .then(function(projects) {
      // only expose the fields needed for browsing the list
      var keys = [
        'id',
        'name',
        'createdAt',
        'updatedAt',
        'author',
        'description',
      ];
      return projects.map(function(project) {
        return _.pick(project, keys);
      });
    })
    .done(function(list) {
      res.json(list);
    }, function(e) {
      res.serverError(e.message);
    });
},
Note that when the user loads a resource (a project in my case) and then switches to another one, the client will be subscribed to both resources. I believe preventing this requires a request to an action where you unsubscribe the socket explicitly. In my case this isn't such a problem, but I plan to solve it later.
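If you want per-instance control from the start, here is a sketch of a custom controller action that subscribes only the requesting socket to a single record, assuming Sails v0.10's resourceful pubsub methods Project.subscribe(req.socket, records) and the corresponding Project.unsubscribe (the action name watch is made up):

// ProjectController.js - subscribe the requesting socket to one project only
watch: function(req, res) {
  if (!req.isSocket) return res.badRequest('This action expects a socket request');
  Project.findOne(req.param('id')).exec(function(err, project) {
    if (err) return res.serverError(err.message);
    if (!project) return res.notFound();
    Project.subscribe(req.socket, [project]); // realtime events for this instance only
    res.json(project);
  });
},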
I hope this is helpful to someone.

Angular JS: Full example of GET/POST/DELETE/PUT client for a REST/CRUD backend?

I've implemented a REST/CRUD backend by following this article as an example: http://coenraets.org/blog/2012/10/creating-a-rest-api-using-node-js-express-and-mongodb/ . I have MongoDB running locally, I'm not using MongoLabs.
I've followed the Google tutorial that uses ngResource and a Factory pattern and I have query (GET all items), get an item (GET), create an item (POST), and delete an item (DELETE) working. I'm having difficulty implementing PUT the way the backend API wants it -- a PUT to a URL that includes the id (.../foo/) and also includes the updated data.
I have this bit of code to define my services:
angular.module('realmenServices', ['ngResource']).
  factory('RealMen', function($resource) {
    return $resource('http://localhost\\:3000/realmen/:entryId', {}, {
      query: {method: 'GET', params: {entryId: ''}, isArray: true},
      post: {method: 'POST'},
      update: {method: 'PUT'},
      remove: {method: 'DELETE'}
    });
  });
I call the method from this controller code:
$scope.change = function() {
  RealMen.update({entryId: $scope.entryId}, function() {
    $location.path('/');
  });
};
but when I call the update function, the URL does not include the ID value: it's only "/realmen", not "/realmen/ID".
I've tried various solutions involving adding a "RealMen.prototype.update", but still cannot get the entryId to show up on the URL. (It also looks like I'll have to build the JSON holding just the DB field values myself -- the POST operation does it for me automatically when creating a new entry, but there doesn't seem to be a data structure that only contains the field values when I'm viewing/editing a single entry).
Is there an example client app that uses all four verbs in the expected RESTful way?
I've also seen references to Restangular and another solution that overrides $save so that it can issue either a POST or PUT (http://kirkbushell.me/angular-js-using-ng-resource-in-a-more-restful-manner/). This technology seems to be changing so rapidly that there doesn't seem to be a good reference solution that folks can use as an example.
I'm the creator of Restangular.
You can take a look at this CRUD example to see how you can PUT/POST/GET elements without all the URL and $resource configuration that you would otherwise need. Besides that, you can then use nested resources without any configuration :).
Check out this plunkr example:
http://plnkr.co/edit/d6yDka?p=preview
You could also see the README and check the documentation here https://github.com/mgonto/restangular
If you need some feature that's not there, just create an issue. I usually add requested features within a week, as I also use this library for all my AngularJS projects :)
Hope it helps!
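To give a taste of the style, a small sketch from memory of Restangular's API (realmen and the fields are placeholders; see the README for the real documentation):

// GET /realmen, then PUT /realmen/:id - no URL or action configuration needed
var realMen = Restangular.all('realmen');
realMen.getList().then(function(men) {
  var first = men[0];
  first.name = 'Updated';
  first.put(); // Restangular elements carry their own save methods
});
Restangular.one('realmen', 1).get(); // GET /realmen/1
realMen.post({ name: 'New guy' });   // POST /realmen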
Because your update uses the PUT method, {entryId: $scope.entryId} is treated as the request data. To tell Angular to fill the URL parameter from the PUT data, you need to add params: {entryId: '@entryId'} when you define update, which means:
return $resource('http://localhost\\:3000/realmen/:entryId', {}, {
  query: {method: 'GET', params: {entryId: ''}, isArray: true},
  post: {method: 'POST'},
  update: {method: 'PUT', params: {entryId: '@entryId'}},
  remove: {method: 'DELETE'}
});
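With '@entryId', $resource reads the id from the data object itself, so the controller can pass the loaded entry straight through (a sketch; $scope.realman is a made-up name for an entry previously loaded via RealMen.get and still carrying its entryId field):

$scope.change = function() {
  RealMen.update($scope.realman, function() {
    $location.path('/'); // PUT /realmen/:entryId with the entry as the request body
  });
};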
You can implement it this way:
$resource('http://localhost\\:3000/realmen/:entryId', {entryId: '@entryId'}, {
  UPDATE: {method: 'PUT', url: 'http://localhost\\:3000/realmen/:entryId'},
  ACTION: {method: 'PUT', url: 'http://localhost\\:3000/realmen/:entryId/action'}
})

RealMen.query() // GET /realmen/
RealMen.save({entryId: 1}, {post data}) // POST /realmen/1
RealMen.delete({entryId: 1}) // DELETE /realmen/1

// any optional method
RealMen.UPDATE({entryId: 1}, {post data}) // PUT /realmen/1

// query string
RealMen.query({name: 'john'}) // GET /realmen?name=john
Documentation:
https://docs.angularjs.org/api/ngResource/service/$resource
Hope it helps