I am learning BackboneJS. With a REST backend, I am trying to issue an HTTP DELETE via the line this.at(0).destroy(); in the code below:
var Task = Backbone.Model.extend({
  defaults: {
    name: 'Testing Just'
  },
  url: 'http://localhost:8080/todos/'
});

var Tasks = Backbone.Collection.extend({
  model: Task,
  url: 'http://localhost:8080/todos/'
});

var tasks = new Tasks();
tasks.fetch({
  context: tasks
}).done(function() {
  console.log("Tasks:" + this.length);
  console.log(this.at(0).get('name'));
  this.at(0).destroy();
  console.log("Tasks:" + this.length);
  console.log(this.at(0).get('name'));
});
The model is deleted from the collection, but no DELETE request is sent to the backend. Deleting directly on the REST backend works with 'localhost:8080/todos/0'.
Please advise what I am missing.
Deleting directly on the REST backend works with 'localhost:8080/todos/0'
You probably assumed that Backbone issues requests based on the index of the model in the collection, which is not the case. The parameter appended to the collection's url is the id of the model, and the id is what Backbone uses to check whether a model is persisted or not.
Your models probably don't have an id attribute (or idAttribute set), so Backbone thinks the model is not yet saved in the persistence layer and that there is no need to issue a DELETE request.
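As a minimal sketch, assuming your backend returns its primary key under a non-standard name such as _id (adjust to whatever your API actually returns):

var Task = Backbone.Model.extend({
  // Tell Backbone which attribute is the id, so fetched models count as
  // persisted and destroy() issues DELETE http://localhost:8080/todos/<id>.
  // '_id' is an assumption here; use your API's actual key name.
  idAttribute: '_id',
  defaults: {
    name: 'Testing Just'
  }
  // Note: no model-level url here. A hard-coded url string would override
  // Backbone's default of collection.url + '/' + encodeURIComponent(id).
});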
Maybe I don't really understand this.getView().getModel().refresh(true) or updateBindings. Somehow it doesn't refresh the model, or my whole approach is wrong. I could work around it by calling a function that reads the OData service again, but that is not really elegant. So, I read the model in onInit:
onInit: function () {
  var that = this;
  var oViewModel = new sap.ui.model.json.JSONModel({});
  this.getView().setModel(oViewModel, "detailView");
  sap.ui.getCore().setModel(oViewModel, "detailView");
  var oFilter = [];
  var zAppFilter = new sap.ui.model.Filter("XXX", sap.ui.model.FilterOperator.EQ, "XXXX");
  oFilter.push(zAppFilter);
  var oModel = that.getView().getModel();
  oModel.setDefaultBindingMode("TwoWay");
  oModel.read("/XXXXSet", {
    filters: oFilter,
    success: function (oData) {
      that.getView().getModel("detailView").setData(oData.results);
    },
    // ...
  });
},
I use this "detailView" JSONModel in my view for bindings, and that works. Now, the add or delete function, for example:
onDelete: function (oEvent) {
  var that = this;
  var oModel = this.getOwnerComponent().getModel();
  var oSelectedItem = oEvent.getSource().getParent();
  var oSourceID = oSelectedItem.getBindingContext("detailView").getObject().Zid;
  oModel.remove("/XXX(XXX='XXX',XXXX='" + XXXX + "')", {
    method: "DELETE",
    success: function (data) {
      that.getView().getModel("detailView").refresh(true);
      sap.ui.getCore().getModel("detailView").refresh(true);
    },
    // ...
  });
},
That does not work. But why? The same happens when I use updateBindings or anything else. Am I misunderstanding or doing something wrong?
Your JSONModel is not connected to anything. It's just a bunch of JSON data. So if you tell it to refresh, how should it know where to get the new data?
What refresh does not do is fetch new data.
What refresh actually does (in a JSONModel) is telling the bindings that it has new data. One of these bindings can be the items of a sap.m.List for example. The list then knows that it needs to rerender to show the new data.
If you call refresh without fetching new data, nothing will happen: the actual data is still the same.
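As a minimal sketch of that distinction (oJSONModel and aNewResults are placeholder names):

// refresh() only notifies the bindings; it never fetches anything.
// Calling it without changing the data rerenders the same old data:
oJSONModel.refresh(true); // no effect on the data itself

// To actually show something new, put new data into the model first;
// setData already notifies the bindings, so no extra refresh is needed:
oJSONModel.setData(aNewResults);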
I could work around it by calling a function that reads the OData service again, but that is not really elegant
Well, using an additional JSONModel when you already have a perfectly fine ODataModel isn't elegant in the first place. If you just dropped your JSONModel and bound your view to your ODataModel, the view would update automatically after calling remove.
To bind the view to your ODataModel, you can start with:
<Table id="table0" items="{/XXXXSet}">
Don't forget to remove detailView from your cells.
You're mixing a client-side model (JSONModel) with a server-side model (ODataModel), expecting them to synchronize.
Client-side models and server-side models are two separate models serving two different purposes.
Client-side models
The main purpose of client-side models is to provide and sync data that is only available during the runtime of the application. If the app is gone, the data is gone. Some prominent use cases of client-side models are:
A device model via JSONModel, which provides information about the user's device and its states.
A ResourceModel, which provides client-side translatable UI texts for i18n purposes.
Synchronizing states from the UI or application.
The models here are not aware of any server-side data, and they shouldn't be, since that's not their purpose.
When dealing with a remote data provider that complies with a certain specification (e.g. OData or FHIR), the appropriate server-side model should be used instead.
Server-side models
Server-side models, such as the ODataModel, have the advantage that they're server-aware.
They know how to fetch, delete, update, create data, and even call functions from the backend system. They can be used to share states between the client and the server efficiently.
How? Simply use the server-side model in the binding definition directly. With OData as the default model for example:
<List items="{
path: '/MyEntitySet',
filters: [
{
path: 'ThatProperty',
operator: 'EQ',
value1: 'something'
}
]
}"> <!-- given "MyEntitySet", "ThatProperty", "EntityTitle", and "EntityDesc" are defined in $metadata -->
<StandardListItem title="{EntityTitle}" description="{EntityDesc}" />
</List>
This creates an ODataListBinding instance which will send a request to the service with the following URL:
https://....svc/MyEntitySet?$filter=ThatProperty eq 'something'
When the request succeeds, the list will show the entities accordingly. Afterwards, when calling myODataModel.remove(...);, the corresponding list will be refreshed automatically.
TL;DR
Am I understanding or doing something wrong?
Yes. Having an intermediate JSONModel in such cases is a common anti-pattern that creates high maintenance costs. Try using the ODataModel only. The framework will do the work for you.
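For completeness, a minimal sketch of what onDelete could look like with the ODataModel alone (the entity path comes from the item's binding context; names mirror the redacted code above, and error handling is elided):

onDelete: function (oEvent) {
  // With the view bound to the default ODataModel, the list item's
  // binding context already points at the entity to delete.
  var oContext = oEvent.getSource().getParent().getBindingContext();
  this.getView().getModel().remove(oContext.getPath(), {
    success: function () {
      // Nothing else to do: bound lists refresh automatically after remove.
    },
    error: function (oError) {
      // handle the error
    }
  });
}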
Sails allows an id property to be passed when creating an entity.
I want to ignore the id value the user sends and just set my own with auto-increment.
How can I do this?
You can do this per model using Sails lifecycle callbacks. For example, if you have a User model, you can add the following in models/User.js:
module.exports = {
  attributes: {
    // etc.
  },

  beforeCreate: function(attribs, cb) {
    // modify the attributes as needed here
    delete attribs.id;
    cb();
  }
};
There are similar callbacks for beforeUpdate, etc. Unfortunately, this would have to be done in every model you want to affect.
One way to remove ids from every blueprint create request would be to use a policy. Create a policy that strips id from req.body, then apply that policy to the route POST /:model (there's an example of applying policies directly to routes here). If you do this, be careful as this could mask other POST routes you are trying to use.
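As a sketch of that policy approach (the file name, policy name, and exact route wiring are assumptions; adapt them to your project):

// api/policies/stripId.js
module.exports = function stripId(req, res, next) {
  // Ignore any client-supplied id; the database will assign its own.
  if (req.body && req.body.id !== undefined) {
    delete req.body.id;
  }
  return next();
};

The policy then needs to be mapped to the create route(s), as described in the linked example.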
I am struggling to receive pubsub events in my client. The client store (Reflux) gets the data for a project using its id. As I understand it, this automatically subscribes the Sails socket to realtime events (since version 0.10), but I don't see that happening.
Here's my client store getting data from Sails (in ES6 syntax):
onLoadProject(id) {
  var url = '/api/projects/' + id;
  io.socket.get(url, (p, jwres) => {
    console.log('loaded project', id);
    this.project = p;
    this.trigger(p);
  });
  io.socket.on("project", function(event) {
    console.log('realtime event', event);
  });
},
Then I created a test "touch" action in my project controller, just to have the modifiedAt field updated.
touch: function(req, res) {
  var id = req.param('id');
  Project.findOne(id)
    .then(function(project) {
      if (!project) throw new Error('No project with id ' + id);
      return Project.update({id: id}, {touched: project.touched + 1});
    })
    .then(function() {
      // this should not be required, right?
      return Project.publishUpdate(id);
    })
    .done(function() {
      sails.log('touched ok');
      res.ok();
    }, function(e) {
      sails.log("touch failed", e.message, e.stack);
      res.serverError(e.message);
    });
}
This doesn't trigger any realtime event in my client code. I also added a manual Project.publishUpdate(), but that shouldn't be required, right?
What am I missing?
-------- edit ----------
There was a complication caused by my model's touched attribute: I had set its type to 'number' instead of 'integer', and the resulting ORM exception wasn't caught by the promise error handling because the chain had no catch() part. So the code above works, hurray! But the realtime events are received for every instance of Project.
So let me rephrase my question:
How can I subscribe the client socket to a single instance instead of the whole model? I could check the id on the client side and retrieve the updated instance data, but that seems inefficient, since every client receives a notification about every project even though each should only care about a single one.
----- edit again ------
Never mind, I figured it out; see my answer below.
So, to answer my own question: the reason I was getting updates for every instance is simply that at the start of my application I triggered a findAll to get the list of available projects. As a result, my socket got subscribed to all of them.
The workaround is to either make that call via plain HTTP instead of a socket, or use a separate controller action for retrieving the list (thereby bypassing the blueprint route). I picked the second option, because in my case it's silly to fetch all resource data before selecting one.
Here's the function I used to list all resources, where I filter out the parts of the data that are not relevant for browsing the list initially.
list: function(req, res) {
  Project.find()
    .then(function(projects) {
      var keys = [
        'id',
        'name',
        'createdAt',
        'updatedAt',
        'author',
        'description',
      ];
      return projects.map(function(project) {
        return _.pick(project, keys);
      });
    })
    .done(function(list) {
      res.json(list);
    }, function(e) {
      // a single error handler here avoids responding twice:
      // an earlier catch() would swallow the error and let the
      // success handler run with undefined afterwards
      res.serverError(e.message);
    });
},
Note that when the user loads a resource (a project in my case) and then switches to another resource, the client will be subscribed to both resources. I believe preventing this requires a request to an action where you unsubscribe the socket explicitly. In my case this isn't a big problem, but I plan to solve it later.
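A hypothetical sketch of such an action (the action name "unwatch" is made up; this assumes Sails v0.10's resourceful pubsub, where Model.unsubscribe takes the socket and the record ids):

unwatch: function(req, res) {
  // Only socket requests carry a subscription to drop.
  if (!req.isSocket) return res.badRequest('Socket request expected');
  Project.unsubscribe(req.socket, [req.param('id')]);
  return res.ok();
}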
I hope this is helpful to someone.
There is a /users url in my Ember application. I am using the RESTAdapter to fetch data from the server. When a template showing the list of users is loaded, it loads the latest data from the server (e.g. if I change one of the rows in the users table, say the name of a user, that change is reflected at the front-end as well). But if I delete a user from the database, I still get that user in the users list at the front-end.
The model hook for the users route simply returns the list of users from the server:
this.store.find('user').then(onSuccess, onError);
When I try to do any operation on the deleted user (say, updating its name at the front-end), I get an error (which is expected, as the user is no longer in the database and the server responds with an appropriate error). How do I force Ember to load the list of users that are present in the database, not just those in the local ember-data store? Also, why is the local ember-data store in sync with the database for updates (and additions as well) but not for deletions?
The problem is explained in the GitHub issue.
The store has a cache for each model type. When the store fetches data with the _findAll method, it does not assume that the response returns all the data, so the following steps are applied:
store.pushMany(type, payload); // push the server response data to the local store
store.didUpdateAll(type); // update record with the most up-to-date record
return store.all(type); // return all the record in the local store
It will not take any record deletion into consideration, as discussed in the issue: the new response data is added to the local store if a record with the same primaryKey cannot be found; otherwise, it updates the local copy.
I created a test to check the defined behaviour:
test('store.findAll does not manage record deletion', function() {
  $.mockjaxClear();
  $.mockjax({ url: '/categories', dataType: 'json', responseText: {
    categories: [
      {id: 1, name: "Name1"},
      {id: 2, name: "Name2"}
    ]}});
  var store = this.store();
  stop();
  store.find('category').then(function(result) {
    equal(result.get('length'), 2, 'the new item returned with ajax is added to the local store');
    $.mockjaxClear();
    $.mockjax({ url: '/categories', dataType: 'json', responseText: {
      categories: [
        {id: 3, name: "Name3"}
      ]}});
    store.find('category').then(function(result) {
      equal(result.get('length'), 3);
      $.mockjaxClear();
      $.mockjax({ url: '/categories', dataType: 'json', responseText: {
        categories: [
          {id: 3, name: "Name4"}
        ]}});
      store.find('category').then(function(result) {
        equal(result.get('length'), 3);
        equal(result.objectAt(2).get('name'), 'Name4', 'the old item returned with the new ajax request updates the local store');
        start();
      });
    });
  });
});
As you pointed out, something like find('modelName', {}) would solve your issue:
this.store.find('user', {})
It seems that this is a known bug in Ember. The workaround that I found (accidentally) was to supply an arbitrary query parameter to the find() method; the data loaded is then exactly what was fetched from the server.
The example below shows an xyz parameter being passed with the value abc.
this.store.find('user', {xyz:"abc"}).then(onSuccess, onError);
I am not sure if this is THE RIGHT WAY to do it, but it works for me. The other solutions would be those mentioned in the link (like sending some metadata from the server, i.e. maintaining a list of deleted records).
But I still couldn't figure out why this happens only in the case of DELETE (and not POST or PUT).
I am not sure if this really is a problem or I just don't understand EmberJS that well. Any help or comments on this would be much appreciated.
To overcome this problem, you can unload all of the records from the store and then call find():
this.store.unloadAll('user');
this.store.find('user').then(
  function(d) {
    console.log(d.get('length')); // the store now only contains the users returned from the server
  }
);
In Backbone, I have a User model:
var User = Backbone.Model.extend({
  url: '/api/user'
});
Next, I instantiate a user object:
var user = new User({ id: "123" });
Then I call:
user.fetch();
Upon inspecting the Network pane in the Web Inspector, it appears that an API call is made to /api/user when calling fetch on user. My question is simply this: should I not expect it to make an API call to /api/user/123?
You are using a model outside of a collection, so you need to set urlRoot instead of url:
http://backbonejs.org/#Model-url
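For example, with urlRoot Backbone appends the model's id itself:

var User = Backbone.Model.extend({
  urlRoot: '/api/user'
});

var user = new User({ id: "123" });
user.fetch(); // GET /api/user/123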