Why is ag-Grid clearing filter after being set in onGridReady - ag-grid

I am experiencing strange behavior where, if I set a filter model in the onGridReady event, it is cleared afterwards. I've been logging filterChanged events: I see one fire when I set the filter, but it never fires again, yet the filter is cleared without a filterChanged event. When I was using the Community edition I didn't experience this, but when I upgraded to Enterprise and started using the set filter this began happening. Any ideas?
onGridReady(params: GridReadyEvent): void { // GridReadyEvent, not ICellRendererParams
  this.gridApi = params.api;
  this.gridApi.sizeColumnsToFit();
  this.resetDefaults();
  window.addEventListener('resize', function() {
    setTimeout(function() {
      params.api.sizeColumnsToFit();
    });
  });
}
resetDefaults(): void {
  this.gridApi.setFilterModel({
    ColorStatus: {
      filterType: 'set',
      values: [ColorStatus.red.toString(), ColorStatus.yellow.toString()]
    }
  });
  this.gridApi.onFilterChanged(); // I've tried with and without this line
}
Oddly, when I set sorting in onGridReady the sort model is not affected; only the filter model gets cleared.
In the meantime I've moved resetDefaults() to the onFirstDataRendered event, but this is not ideal because the user briefly sees all the data before it gets filtered.

Try the approach below instead of setting the filter model with gridApi.setFilterModel:
Get the filter instance for the column using its colId (set while defining the ColDef).
Call setModel on that filter instance.
// define the model for this one column's filter
// (note: per-instance setModel takes the single column's model,
// not the object keyed by column id that setFilterModel takes)
const colorStatusModel = {
  filterType: 'set',
  values: [ColorStatus.red.toString(), ColorStatus.yellow.toString()]
};
// colId is the ColorStatus column's id — you need to set it inside the ColDef for this column
const filterInstance = this.gridApi.getFilterInstance(colId);
filterInstance.setModel(colorStatusModel);

My coworker found that adding newRowsAction: 'keep' to the filterParams of the column in question resolves the issue.
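For reference, a sketch of what that column definition might look like — the field name and filter type here are assumptions based on the question; filterParams.newRowsAction: 'keep' is the actual fix:

```javascript
// Sketch of the ColorStatus column definition with the fix applied.
// Only filterParams.newRowsAction is the fix; the rest is assumed.
const colorStatusColDef = {
  field: 'ColorStatus',
  filter: 'agSetColumnFilter',
  filterParams: {
    // Keep the current filter selection when new rows are loaded,
    // instead of resetting the set filter's selected values
    newRowsAction: 'keep',
  },
};
```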

Related

Saving ag-grid filter model across page reloads

I have an ag-grid with infinite scroll and data retrieved from an IDatasource.
What I'm trying to do is to save the filter model to session storage when it changes, and then load it and apply it when the grid is reloaded, i.e. when the user leaves the page and then comes back.
I have an onFilterChanged event handler that does
onFilterChanged(params) {
  sessionStorage["myFilters"] = JSON.stringify(this.gridApi.getFilterModel());
}
And what I'm trying to do is
onGridReady(params) {
  this.gridApi = params.api;
  setTimeout(() => {
    if (sessionStorage["myFilters"] !== undefined) {
      const filters = JSON.parse(sessionStorage["myFilters"]);
      this.gridApi.setFilterModel(filters);
    }
    this.gridApi.setDatasource(this.myDataSource);
  }, 0);
}
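The save/restore round trip itself can be isolated into two small helpers that work with any string-keyed storage (sessionStorage in the browser); the key and function names here are illustrative:

```javascript
// Minimal sketch of the save/restore round trip.
// "storage" is any string-keyed object, e.g. sessionStorage in the browser.
function saveFilterModel(storage, model) {
  storage['myFilters'] = JSON.stringify(model);
}

function loadFilterModel(storage) {
  return storage['myFilters'] !== undefined
    ? JSON.parse(storage['myFilters'])
    : null; // nothing saved yet
}
```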
However, even if the JSON saved to session storage is correct, when getRows is invoked on my IDatasource, its filterModel param has empty values for the filters:
Does this have to do with the fact that my filter is a set filter and the values for the set are loaded dynamically from another API endpoint?
Is there a way to do this?
It turns out I had a bug in my custom set filter, which was not implementing setModel and getModel properly. The solution was to store the filter's value in the filter component itself when setModel is called, and to check against it when getModel is called:
getModel() {
  return {
    filter: this.items
      .filter((item) => item.checked || item.name === this.selected)
      .map((item) => item.name)
      .join(),
  };
}

setModel(model: any): void {
  if (model && model.filter) {
    this.selected = model.filter.name || model.filter;
  }
}
This way the filter is able to compare the value retrieved from sessionStorage against the existing items, and it works as expected.
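For what it's worth, the selection logic inside getModel() above can be pulled out into a pure function, which makes it easy to test in isolation (a sketch; the item shape { name, checked } is taken from the code above, and the function name is illustrative):

```javascript
// Pure sketch of the selection logic inside getModel() above:
// keep items that are checked, or whose name matches the current
// selection, and join their names into a comma-separated string.
function buildFilterValue(items, selected) {
  return items
    .filter((item) => item.checked || item.name === selected)
    .map((item) => item.name)
    .join();
}
```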

Deleted/Modified/Added records from extjs store

I have the store and model below for a grid in ExtJS 4.2.
Ext.define('myapp.store.myStore', {
  extend: 'Ext.data.Store',      // the config key is 'extend', not 'extends'
  model: 'myapp.modal.myModal',  // the config key is 'model', not 'modal'
  storeId: 'myGridStore',
  data: [], // used this only when trying inline data
  proxy: {
    type: 'memory',
    reader: {
      type: 'json'
    }
  }
});
Ext.define('myapp.modal.myModal', {
  extend: 'Ext.data.Model',      // base class is Ext.data.Model
  fields: ['bla', 'blha']
});
The mapping between the grid, store and model looks fine, and the data is loaded and displayed properly in the grid.
The problem is that when there are modifications to the store, like
grid.getStore().removeAt(rowIndex)
or
grid.getStore().add(record)
I'm not able to retrieve them through
getRemovedRecords()
and
getNewRecords()
when I load the data into the store with
grid.getStore().loadData(ajaxCallResponse).
It works fine when I provide the data inline.
Please help me understand what I'm doing wrong.
if (record.phantom != true) {
  record.phantom = true;
}
store.loadData(record);
Make sure phantom is true before calling loadData, then try store.getNewRecords(); only records whose phantom flag is true will appear in getNewRecords().
Try store.getModifiedRecords() instead. This will get you both the new and the edited records. To check whether a record is new, just check the record's "phantom" property, which will equal true.
Also, if getNewRecords() and getRemovedRecords() aren't returning any records, try store.sync() after a record has been added/deleted.
When adding a new record to the store I had to set the phantom attribute to true (as suggested by @Naren Sathya in a previous answer) in order for the getModifiedRecords() method to actually list those newly added records:
// Create a new default record
var newServerConfig = new App.model.mConfigServer({
  id: idForNewRecord,
  server: 'server',
  port: 8443,
  username: 'user',
  password: ''
});

/* Setting the phantom property to 'true' will ensure this record will be listed
 * when retrieving the new records with store.getModifiedRecords() later on
 */
newServerConfig.phantom = true;

// Add it to the store
storeServers.add(newServerConfig);
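Conceptually, the phantom-based answers above boil down to this: a record only counts as "new" if its phantom flag is true. A rough sketch of that filtering (this is illustrative, not the actual Ext source):

```javascript
// Conceptual sketch of what getNewRecords() does: only records whose
// phantom flag is true are considered "new" by the store.
function getNewRecords(records) {
  return records.filter(function (record) {
    return record.phantom === true;
  });
}
```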

Updating MongoDB in Meteor Router Filter Methods

I am currently trying to log user page views in a Meteor app by storing the userId, Meteor.Router.page() and a timestamp whenever a user navigates to another page.
// userlog.js
Meteor.methods({
  createLog: function(page) {
    var timeStamp = Meteor.user().lastActionTimestamp;
    // Tracks whether this counts as a fresh login
    var hasLoggedIn = false;
    // Check if the user's lastActionTimestamp is more than an hour ago
    if (moment(new Date().getTime()).diff(moment(timeStamp), 'hours') >= 1) {
      hasLoggedIn = true;
    }
    console.log("this ran");
    var log = {
      submitted: new Date().getTime(),
      userId: Meteor.userId(),
      page: page,
      login: hasLoggedIn
    };
    var logId = Userlogs.insert(log);
    Meteor.users.update(Meteor.userId(), {$set: {lastActionTimestamp: log.submitted}});
    return logId;
  }
});
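As an aside, the hour-gap check inside createLog can be isolated into a small pure function. Here is a minimal sketch with moment.js replaced by plain Date math (the function name is illustrative):

```javascript
// Pure sketch of the "has at least an hour passed?" check used in
// createLog, using millisecond timestamps instead of moment.js.
var HOUR_MS = 60 * 60 * 1000;

function hasLoggedInAgain(lastActionTimestamp, now) {
  return now - lastActionTimestamp >= HOUR_MS;
}
```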
// router.js — this method runs in a filter on every page
'checkLoginStatus': function(page) {
  if (Meteor.userId()) {
    // Log the page that the user has switched to
    Meteor.call('createLog', page);
    return page;
  } else if (Meteor.loggingIn()) {
    return 'loading';
  } else {
    return 'loginPage';
  }
}
However, this does not work and it ends up recursively creating userlogs. I believe that this is because I did a Collection.find in a router filter method. Does anyone have a workaround for this issue?
When you update Meteor.users and set lastActionTimestamp, Meteor.user will be updated and will send an invalidation signal to all reactive contexts that depend on it. If Meteor.user is used in a filter, then that filter and all subsequent ones, including checkLoginStatus, will rerun, causing a loop.
Best practices that I've found:
Avoid using reactive data sources as much as possible within filters.
Use Meteor.userId() where possible instead of Meteor.user()._id because the former will not trigger an invalidation when an attribute of the user object changes.
Order your filters so that they run with the most frequently updated reactive data source first. For example, if you have a trackPage filter that requires a user, let it run after another filter called requireUser so that you are certain you have a user before you track. Otherwise, if you tracked first and checked the user second, then when Meteor.loggingIn changes from false to true you would track the page again.
This is the main reason we switched to meteor-mini-pages instead of Meteor-Router because it handles reactive data sources much easier. A filter can redirect, and it can stop() the router from running, etc.
Lastly, cmather and others are working on a new router which is a merger of mini-pages and Meteor.Router. It will be called Iron Router and I recommend using it once it's out!

Incrementally update Kendo UI autocomplete

I have a Kendo UI autocomplete bound to a remote transport that I need to tweak how it works and am coming up blank.
Currently, I perform a bunch of searches on the server and integrate the results into a JSON response and then return this to the datasource for the autocomplete. The problem is that this can take a long time and our application is time sensitive.
We have identified which searches are most important and found that one search accounts for 95% of the chosen results. However, I still need to provide the data from the other searches. I was thinking of kicking off separate requests for data on the server and adding them to the autocomplete as they return. Our main search returns extremely fast and its results would be the first items added to the list. Then, as the other searches return, I would like their results to be added dynamically to the list.
Our application uses knockout.js and I thought about making the datasource part of our view model, but from looking around, Kendo doesn't update based on changes to your observables.
I am currently stumped and any advice would be welcomed.
Edit:
I have been experimenting and have had some success simulating what I want with the following datasource:
var dataSource = new kendo.data.DataSource({
  transport: {
    read: {
      url: window.performLookupUrl,
      data: function () {
        return {
          param1: $("#Input").val()
        };
      }
    },
    parameterMap: function (options) {
      return {
        param1: options.param1
      };
    }
  },
  serverFiltering: true,
  serverPaging: true,
  requestEnd: function (e) {
    if (e.type == "read") {
      window.setTimeout(function () {
        dataSource.add({ Name: "testin1234", Id: "X1234" });
      }, 2000);
    }
  }
});
If the first search returns results, then after 2 seconds, a new item pops into the list. However, if the first search fails, then nothing happens. Is it proper to use (abuse??) the requestEnd like this? My eventual goal is to kick off the rest of the searches from this function.
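The merging step you would eventually need — folding late-arriving search results into the list without duplicating items the fast search already returned — can be sketched as a small pure function. The Id key is an assumption based on the dataSource.add call above:

```javascript
// Sketch of merging late-arriving search results into the existing
// result list, skipping items already present (deduplicated by Id).
function mergeResults(existing, incoming) {
  var seen = {};
  existing.forEach(function (item) { seen[item.Id] = true; });
  return existing.concat(incoming.filter(function (item) {
    return !seen[item.Id];
  }));
}
```

In a real datasource you would call something like dataSource.add for each item mergeResults lets through, rather than replacing the whole list.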
I contacted Telerik and they gave me the following jsbin that I was able to modify to suit my needs.
http://jsbin.com/ezucuk/5/edit

Mongoose won't remove embedded documents

I'm scratching my head here, as usual it seems with Node projects, and I'm not sure if I'm doing something wrong or if I've run into a bug.
I've got a Server schema that can have any number of embedded docs called services. I'm running into a problem where, even though I've successfully removed an individual service from the server object, it isn't removed from the database when I save. The save function is otherwise working: it saves any changes I've made and also pushes in new embedded docs; it just doesn't remove ones that are already there.
Here is a relatively simplified example of my code:
app.put('/server/:id', function(req, res, next) {
  app.Server.findOne({_id: req.params.id}, function(err, server) {
    server.updated = new Date();
    ...
    for (var num = _.size(req.body.server.services) - 1; num >= 0; num--) {
      // Is this a new service or an existing one?
      if (server.services[num]) {
        // Is it marked for deletion? If so, delete it
        if (req.body.server.services[num].delete == "true") {
          server.services[num].remove();
        } else { // else, update it
          server.services[num].type = req.body.server.services[num].type;
          ...
        }
      } else {
        // It's new, add it
        delete req.body.server.services[num]["delete"];
        server.services.push(req.body.server.services[num]);
      }
    }
    server.save(function(err) {
      if (!err) {
        req.flash('success', 'Server updated');
      } else {
        req.flash('error', 'Err, something broke when we tried to save your server. Sorry!');
        console.log(err);
      }
      res.redirect('/');
    });
  });
});
So the remove() is actually removing the service. If I do a server.toObject() before the save, it's not there. Any ideas why it wouldn't be removing it from the database when it saves?
Edit: I suppose the version numbers would be helpful: node#0.4.2, mongoose#1.1.5, express#2.0.0rc
I could be wrong, since I've not tested your example, but this sounds like Mongoose isn't detecting that the embedded document is modified.
From the schema types documentation page:
Since it is a schema-less type, you can change the value to anything else you like, but Mongoose loses the ability to auto detect/save those changes. To "tell" Mongoose that the value of a Mixed type has changed, call the .markModified(path) method of the document passing the path to the Mixed type you just changed.
person.anything = { x: [3, 4, { y: "changed" }] };
person.markModified('anything');
person.save(); // anything will now get saved
So your answer might be as simple as using the markModified() function.
I found a way to temporarily fix this problem.
What I did was load the embedded documents into an array, splice out the one to be deleted, and replace the array. Something like this:
var oldusers = dl.users;
oldusers.splice(dl.users.indexOf(req.currentUser.id), 1);
dl.users = oldusers;
dl.save(function(err) {...
I know that depending on the size of the document it will