Deleted/Modified/Added records from an ExtJS store - ExtJS 4.2

I have the store and model below for a grid in ExtJS 4.2.
Ext.define('myapp.store.myStore', {
    extend: 'Ext.data.Store',
    model: 'myapp.model.myModal',
    storeId: 'myGridStore',
    data: [], // used this only when trying inline data
    proxy: {
        type: 'memory',
        reader: {
            type: 'json'
        }
    }
});
Ext.define('myapp.model.myModal', {
    extend: 'Ext.data.Model',
    fields: ['bla', 'blha']
});
The mapping between the grid, store and model looks fine, and the data is loaded and displayed properly in the grid.
The problem is that when there are modifications to the store, such as
grid.getStore().removeAt(rowIndex)
or
grid.getStore().add(record)
I am not able to retrieve them through
getRemovedRecords()
and
getNewRecords()
when I load the data into the store with
grid.getStore().loadData(ajaxCallResponse).
It works fine when I give the data inline.
Please help me understand what I'm doing wrong...

if (record.phantom !== true) {
    record.phantom = true; // mark as a new, not-yet-persisted record
}
store.loadData(record);
Check that phantom is true before calling loadData, then try store.getNewRecords(); only records whose phantom flag is true will show up in getNewRecords().

Try store.getModifiedRecords() instead. This will get you both the new and the edited records. To check whether a record is new, just check the record's phantom property, which will equal true.
Also, if getNewRecords() and getRemovedRecords() aren't returning any records, try store.sync() after a record has been added/deleted.
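For example, a minimal sketch of that check (assuming a store instance named store):
var modified = store.getModifiedRecords(); // new (phantom) + edited records
var newRecords = Ext.Array.filter(modified, function (rec) {
    return rec.phantom === true; // true only for records not yet persisted
});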

When adding a new record to the store, I had to set the phantom attribute to true (as suggested by @Naren Sathya in a previous answer) in order for the getModifiedRecords() method to actually list those newly added records:
// Create a new default record
var newServerConfig = new App.model.mConfigServer({
    id: idForNewRecord,
    server: 'server',
    port: 8443,
    username: 'user',
    password: ''
});
/* Setting the phantom property to 'true' will ensure this record will be listed
 * when trying to retrieve those new records with store.getModifiedRecords() later on
 */
newServerConfig.phantom = true;
// Add it to the store
storeServers.add(newServerConfig);
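The removal side can be checked the same way; a short sketch using the grid from the question:
grid.getStore().removeAt(rowIndex);
var removed = grid.getStore().getRemovedRecords(); // records removed since the last load/sync
Note that removing a phantom (unsaved) record does not add it to the removed list, which may be part of what the question is seeing.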

Related

Flutter Parse Server SDK not saving the second object in the table (class)

This function takes a ServicePoint object as an argument, which has the following attributes:
adminId (String)
name (String)
serviceType (enum)
I want this function to create a new table with the name "name + adminId". This is achieved.
I also want this function to create a new table (if it is not already there) named ServicePoints.
ServicePoints stores the relationship between the user (with objectId = adminId) and the new table.
To achieve this, I set the "serviceTable" attribute with the newly created table as its value, acting as a pointer.
When I run the code the first time, I get the required tables. But when I run the function a second time, it doesn't add the new row/record to the ServicePoints table.
I don't know why.
UPDATE: I found that the set ParseObject operation is the culprit. But, to my surprise, it executes successfully the very first time and fails every time after that. This is really absurd behaviour from parse_server_sdk_flutter.
Future<bool> createServicePoint(ServicePoint servicePoint) async {
  String newServicePointName = servicePoint.name + servicePoint.adminId;
  var newServiceTable = ParseObject(newServicePointName);
  var response = await newServiceTable.save();
  if (response.success) {
    print('Now adding new row to ServicePoints table');
    var servicePointsTable = ParseObject('ServicePoints')
      ..set<String>("serviceName", servicePoint.name)
      ..set<String>("adminId", servicePoint.adminId)
      ..set<String>("serviceType", _typeToLabel[servicePoint.serviceType])
      ..set<ParseObject>("serviceTable", newServiceTable);
    var recentResponse = await servicePointsTable.save();
    return recentResponse.success;
  } else {
    return false;
  }
}
If anyone runs into this problem: check the result after saving the ParseObject. If there is an error like "Can't save into non-existing class/table", just go to the dashboard and create the table first.

Why is ag-Grid clearing the filter after it is set in onGridReady?

I am experiencing strange behavior where, if I set a filter model in the onGridReady event, it is deleted afterwards. I've been logging the filterChanged events: I see one fired when I set the filter, but it is never fired again, yet the filter is cleared without a filterChanged event. When I was using the Community edition I didn't experience this, but when I upgraded to Enterprise and started using the set filter, this began happening. Any ideas?
onGridReady(params: GridReadyEvent): void {
    this.gridApi = params.api
    this.gridApi.sizeColumnsToFit()
    this.resetDefaults()
    window.addEventListener('resize', function() {
        setTimeout(function() {
            params.api.sizeColumnsToFit()
        })
    })
}

resetDefaults(): void {
    this.gridApi.setFilterModel({
        ColorStatus: {
            filterType: 'set',
            values: [ColorStatus.red.toString(), ColorStatus.yellow.toString()]
        }
    })
    this.gridApi.onFilterChanged() // I've tried with and without this line
}
Oddly, when I set sorting in onGridReady the sort model is not affected; only the filter model gets cleared.
In the meantime I've moved resetDefaults() to the onFirstDataRendered event, but this is not ideal because the user briefly sees all the data before it gets filtered.
Try the approach below instead of setting the filter model with gridApi.setFilterModel:
Get the filter instance of the column using its colId (set it while defining the ColDef).
Call setModel on that filter instance.
// the model for this single filter
// (unlike setFilterModel, setModel on a filter instance is not keyed by column)
const filterModel = {
    filterType: 'set',
    values: [ColorStatus.red.toString(), ColorStatus.yellow.toString()]
};
const filterInstance = this.gridApi.getFilterInstance(colId); // <- the ColorStatus column's colId
// you need to set colId inside the ColDef for this column
filterInstance.setModel(filterModel);
My coworker found that adding newRowsAction: 'keep' to the filterParams of the column in question resolves the issue, as sketched below.
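A sketch of what that ColDef might look like (the ColorStatus field comes from the question; the other values are assumptions):
const colDef = {
    field: 'ColorStatus',
    colId: 'ColorStatus',
    filter: 'agSetColumnFilter',
    filterParams: {
        newRowsAction: 'keep' // keep the current filter model when new rows arrive
    }
};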

How to return an ODataModel in SAPUI5 to its original state after using setDeferredGroups?

I have a SAPUI5 application that uses OData V2.
In one part of the application, for deleting items in a list, I have to close the change set after each call.
So I use the following code:
sGroupId = "dmsch" + new Date().getTime();
oDataModel.setDeferredGroups([sGroupId]);
for (var i = 0; i < aSelectedContexts.length; i++) {
    var sObjectPath = aSelectedContexts[i].getPath();
    this._deleteObject(sObjectPath, sGroupId, fnAllRequestCompleted, fnAllRequestFailed);
}
oDataModel.submitChanges({
    groupId: sGroupId
});
And in the _deleteObject function I set a different changeSetId for each request:
_deleteObject: function(sObjectPath, sGroupId, fnSuccessCallBackFunction, fnFailedCallBackFunction) {
    var oDataModel = this.getModel();
    var sChangeSetId = "cs" + (new Date().getTime() * (1 + Math.random()));
    oDataModel.remove(sObjectPath, {
        groupId: sGroupId,
        changeSetId: sChangeSetId,
        ......
Now, after a successful delete, as soon as I create a new entry with the createEntry function, it tries to send that entry's data to the server immediately.
The question is: how can I reset the effect of the setDeferredGroups function?
Note: I need to use setDeferredGroups, and I am sure it is the reason newly created entries are automatically sent to the server on each change. I need to set the ODataModel's settings back to their original state.
Note 2: Here is something regarding OData V4 that explains this automatic behavior after a failure.
The SAP docs here - I've tried to summarize below.
The default change groups are
{"*": {
groupId: "changes"
}
}
And the default deferred groups are
["changes"]
You can reset the data model change groups to default using
oModel.setChangeGroups({"*": {
groupId: "changes"
}
});
oModel.setDeferredGroups(["changes"]);
With this default configuration, all changes to all entity types will be collected in the changes group, and are deferred (not sent to the server automatically).
So oModel.setChangeGroups(...) is how change groups are defined, and oModel.setDeferredGroups(...) is how each of those groups is marked as deferred or not.
The reason I mention the default change groups AND the default deferred groups, is because if not set properly, you may see unexpected behavior when using two way data binding.
For example: removing the default change group by calling oModel.setChangeGroups({}) will result in all changes to all entity types NOT getting collected into any change group, and thus not being deferred. You will see any changes made sent to the server automatically.
So let's say you have an entity type Employee and you want any changes made to this entity type to be collected in one group and deferred:
var oChangeGroups = oModel.getChangeGroups();
oChangeGroups.Employee = {groupId: "employees"};
oModel.setChangeGroups(oChangeGroups);
var aDeferredGroups = oModel.getDeferredGroups();
aDeferredGroups.push("employees");
oModel.setDeferredGroups(aDeferredGroups);
Now you have two change groups, * with ID changes and Employee with ID employees. Any changes made to any Employee entities will be in the employees group, and all other changes will be in the changes group.
So now any create/delete/update of an employee can be submitted separately from changes to other entity types:
oModel.createEntry("/EmployeeSet", {
groupId: "employees",
properties: {
name: "New Guy"
}
});
oModel.submitChanges({groupId: "employees"});
From this point, to go back to the defaults and get rid of the employees change group, you can use what I wrote above to reset everything back to default, as sketched below.
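Applied to the original question, a minimal sketch (assuming the oDataModel and sGroupId variables from the question): restore the default groups once the deferred deletes have been submitted, so that subsequent createEntry calls are deferred again.
oDataModel.submitChanges({
    groupId: sGroupId,
    success: function() {
        // back to the defaults: a single "changes" group, deferred
        oDataModel.setChangeGroups({ "*": { groupId: "changes" } });
        oDataModel.setDeferredGroups(["changes"]);
    }
});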

Updating MongoDB in Meteor Router Filter Methods

I am currently trying to log user page views in a Meteor app by storing the userId, Meteor.Router.page(), and a timestamp when a user clicks through to other pages.
//userlog.js
Meteor.methods({
    createLog: function(page) {
        var timeStamp = Meteor.user().lastActionTimestamp;
        // Set variable to store validation if user is logging in
        var hasLoggedIn = false;
        // Checks if lastActionTimestamp of user is more than an hour ago
        if (moment(new Date().getTime()).diff(moment(timeStamp), 'hours') >= 1) {
            hasLoggedIn = true;
        }
        console.log("this ran");
        var log = {
            submitted: new Date().getTime(),
            userId: Meteor.userId(),
            page: page,
            login: hasLoggedIn
        };
        var logId = Userlogs.insert(log);
        Meteor.users.update(Meteor.userId(), {$set: {lastActionTimestamp: log.submitted}});
        return logId;
    }
});
//router.js - this method runs in a filter on every page
'checkLoginStatus': function(page) {
    if (Meteor.userId()) {
        // Logs the page that the user has switched to
        Meteor.call('createLog', page);
        return page;
    } else if (Meteor.loggingIn()) {
        return 'loading';
    } else {
        return 'loginPage';
    }
}
However, this does not work; it ends up recursively creating userlogs. I believe this is due to the fact that I did a Collection.find in a router filter method. Does anyone have a workaround for this issue?
When you update Meteor.users and set lastActionTimestamp, Meteor.user will be updated and will send an invalidation signal to all reactive contexts that depend on it. If Meteor.user is used in a filter, then that filter and all consecutive ones, including checkLoginStatus, will rerun, causing a loop.
Best practices that I've found:
Avoid using reactive data sources as much as possible within filters.
Use Meteor.userId() where possible instead of Meteor.user()._id, because the former will not trigger an invalidation when an attribute of the user object changes (see the sketch after this list).
Order your filters so that they run with the most frequently updated reactive data source first. For example, if you have a trackPage filter that requires a user, let it run after another filter called requireUser so that you are certain you have a user before you track. Otherwise, if you tracked first and checked the user second, then when Meteor.loggingIn changes from false to true, you'd track the page again.
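A minimal sketch of the second point (the requireUser filter name is hypothetical):
'requireUser': function(page) {
    // Meteor.userId() only invalidates on login/logout; Meteor.user() also
    // invalidates when any field on the user document changes
    // (e.g. lastActionTimestamp), which is what causes the loop here
    return Meteor.userId() ? page : 'loginPage';
}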
This is the main reason we switched to meteor-mini-pages instead of Meteor-Router: it handles reactive data sources much more easily. A filter can redirect, it can stop() the router from running, etc.
Lastly, cmather and others are working on a new router which is a merger of mini-pages and Meteor.Router. It will be called Iron Router, and I recommend using it once it's out!

Mongoose won't remove embedded documents

I'm scratching my head here, as usual it seems with node projects, and I'm not sure if I'm doing something wrong or if I've run into a bug.
I've got a schema called Server that can have any number of embedded docs called services. I'm running into a problem where, even though I've successfully removed the individual service from the server object, when I tell it to save, it doesn't remove it from the database. The save function is working, because it saves any changes I make and also pushes in new embedded docs; it just doesn't remove ones that are already there.
Here is a relatively simplified example of my code:
app.put('/server/:id', function(req, res, next) {
    app.Server.findOne({_id: req.params.id}, function(err, server) {
        server.updated = new Date();
        ...
        for (var num = _.size(req.body.server.services) - 1; num >= 0; num--) {
            // Is this a new service or an existing one?
            if (server.services[num]) {
                // Is it marked for deletion? If so, delete it
                if (req.body.server.services[num].delete == "true") {
                    server.services[num].remove();
                } else { // else, update it
                    server.services[num].type = req.body.server.services[num].type;
                    ...
                }
            } else {
                // It's new, add it
                delete req.body.server.services[num]["delete"];
                server.services.push(req.body.server.services[num]);
            }
        }
        server.save(function(err) {
            if (!err) {
                req.flash('success', 'Server updated');
            } else {
                req.flash('error', 'Err, something broke when we tried to save your server. Sorry!');
                console.log(err);
            }
            res.redirect('/');
        });
    });
});
So the remove() is actually removing the service: if I do a server.toObject() before the save, it's not there. Any ideas why it wouldn't be removed from the database when it saves?
Edit: I suppose the version numbers would be helpful: node@0.4.2, mongoose@1.1.5, express@2.0.0rc.
I could be wrong, since I've not tested your example, but this sounds like Mongoose isn't detecting that the embedded document is modified.
From the schema types documentation page:
Since it is a schema-less type, you can change the value to anything else you like, but Mongoose loses the ability to auto detect/save those changes. To "tell" Mongoose that the value of a Mixed type has changed, call the .markModified(path) method of the document passing the path to the Mixed type you just changed.
person.anything = { x: [3, 4, { y: "changed" }] };
person.markModified('anything');
person.save(); // anything will now get saved
So your answer might be as simple as using the markModified() function, as applied below.
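Applied to the loop from the question, a sketch (the server document and services path come from the example above):
server.services[num].remove();
server.markModified('services'); // tell Mongoose the embedded array has changed
server.save(function(err) { /* ... */ });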
I found a way to temporarily fix this problem.
What I did is load the embedded documents into an array, splice out the one to be deleted, and replace the array. Something like this:
var oldusers = dl.users;
oldusers.splice(dl.users.indexOf(req.currentUser.id), 1);
dl.users = oldusers;
dl.save(function(err) {...
I know that, depending on the size of the document, this may not be the most efficient approach, but it works.