I am struggling to receive pubsub events in my client. The client store (Reflux) gets the data for a project using its id. As I understand it, this automatically subscribes the Sails socket to realtime events (since version 0.10), but I don't see it happening.
Here's my client store getting data from Sails
(this is ES6 syntax)
onLoadProject(id) {
  var url = '/api/projects/' + id;
  io.socket.get(url, (p, jwres) => {
    console.log('loaded project', id);
    this.project = p;
    this.trigger(p);
  });
  io.socket.on("project", function(event) {
    console.log('realtime event', event);
  });
},
Then I created a test "touch" action in my project controller, just to modify the record (and have its updatedAt field updated).
touch: function(req, res) {
  var id = req.param('id');
  Project.findOne(id)
    .then(function(project) {
      if (!project) throw new Error('No project with id ' + id);
      return Project.update({id: id}, {touched: project.touched + 1});
    })
    .then(function() {
      // this should not be required, right?
      return Project.publishUpdate(id);
    })
    .done(function() {
      sails.log('touched ok');
      res.ok();
    }, function(e) {
      sails.log("touch failed", e.message, e.stack);
      res.serverError(e.message);
    });
}
This doesn't trigger any realtime event in my client code. I also added a manual Project.publishUpdate(), but that shouldn't be required, right?
What am I missing?
-------- edit ----------
There was a complication caused by my model's touched attribute: I had set its type to 'number' instead of 'integer', and the resulting ORM exception wasn't caught by the promise error handling because the chain had no catch() part. So the code above works, hurray! But the realtime events are received for every instance of Project.
So let me rephrase my question:
How can I subscribe the client socket to an instance instead of a model? I could check the id on the client side and retrieve the updated instance data, but that seems inefficient, since every client receives a notification about every project even though each client should only care about a single one.
----- edit again ------
So never mind; I figured out the cause myself (see my own answer below).
So, to answer my own question: the reason I was getting updates from every instance is simply that at the start of my application I triggered a findAll to get the list of available projects. As a result my socket got subscribed to all of them.
The workaround is to either initiate that call via plain HTTP instead of over the socket, or use a separate controller action for retrieving the list (thereby bypassing the blueprint route). I picked the second option because in my case it's wasteful to fetch all resource data before selecting one.
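For reference, the first option would simply mean fetching the list with a plain XHR instead of through the sails.io socket, so the blueprint never subscribes the socket to those records. A minimal sketch with jQuery (showProjectList is just a hypothetical render function):
// Plain HTTP request: no socket is involved, so no blueprint subscription happens.
$.getJSON('/api/projects', function (projects) {
  // 'showProjectList' is a placeholder for whatever renders the list client-side.
  showProjectList(projects);
});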
Here's the controller action I used to list all resources, where I filter out the fields that aren't relevant when initially browsing the list.
list: function(req, res) {
  Project.find()
    .then(function(projects) {
      // Only expose the fields needed for browsing the list.
      var keys = [
        'id',
        'name',
        'createdAt',
        'updatedAt',
        'author',
        'description',
      ];
      return projects.map(function(project) {
        return _.pick(project, keys);
      });
    })
    .done(function(list) {
      res.json(list);
    }, function(e) {
      // Errors are handled here; an extra .catch() before .done() would
      // swallow the error and cause a second response to be attempted.
      res.serverError(e.message);
    });
},
Note that when the user loads a resource (a project in my case) and then switches to another one, the client will be subscribed to both resources. I believe preventing this requires a request to an action that explicitly unsubscribes the socket; a sketch of what that could look like is below. In my case this isn't a big problem, but I plan to solve it later.
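A rough sketch of such an unsubscribe action, assuming the Sails 0.10-style Project.unsubscribe(socket, records) signature (the action name and route are made up for illustration):
// ProjectController.js -- hypothetical action, e.g. routed as GET /api/projects/:id/leave
leave: function(req, res) {
  if (!req.isSocket) {
    return res.badRequest('This action is only meant to be called over a socket.');
  }
  var id = req.param('id');
  Project.findOne(id).exec(function(err, project) {
    if (err) return res.serverError(err.message);
    if (!project) return res.notFound();
    // Stop sending this socket realtime events for this project instance.
    Project.unsubscribe(req.socket, project);
    res.ok();
  });
},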
I hope this is helpful to someone.
Related
I'm using Sails sockets.
From a browser I can get all 'tasks' where the user id is 1.
I can now listen for the 'task' event and look for 'created' in the verb to get new tasks and add them to the list.
However I get events for ALL created tasks regardless of user. This seems to me like a major security issue: all someone needs to do is jump into the console and set up a listener to get notified whenever any user creates a new task.
I had a look around for some time but can't find any posts on the topic.
Being new to this kind of thing, could someone be kind enough to help out?
What is the best practice for dealing with lists over socket.io in Sails?
Cheers!
This should be what you're looking for; it avoids subscribing to all existing tasks from the client side. It only subscribes you if you're logged in, and only to tasks that belong to you. Keep in mind that this is just a first step in implementing a secure REST API for your app, but it should get you started.
In your client-side app you'd write:
socket.on('connect', function socketConnected() {
  // This subscribes the user to all tasks that belong to him and only him.
  socket.get('/task/subscribe', null, function response(data, jwres) {
    // We don't really care about the response.
  });

  // This 1) creates a new task and 2) subscribes the user to that task.
  // If the 'rest' blueprint is on, POSTing to /task gets routed to TaskController.create automatically by Sails.
  // If it's not on, you write "socket.get('/task/create' ..." instead.
  socket.post('/task', {name: 'MyNewTask'}, function response(data, jwres) {
    // Add the created task inside of 'data' to your client-side app.
  });
});
Then in TaskController.js you would write:
subscribe: function(req, res) {
  // Is the user logged in?
  if (!req.session.user) {
    return res.badRequest();
  }
  // Find all tasks that belong to the currently logged-in user.
  Task.find({userID: req.session.user.id}, function findUsersCB(err, tasks) {
    if (err) {
      return res.serverError(err);
    }
    // Subscribe the user's socket to all of his tasks.
    Task.subscribe(req.socket, tasks);
    // Send the user's tasks back to the client.
    res.json(tasks);
  });
},

create: function(req, res) {
  // Is the user logged in?
  if (!req.session.user) {
    return res.badRequest();
  }
  var taskToBeCreated = {
    name: req.param('name'),
    userID: req.session.user.id
  };
  // Attempt to create the given task.
  Task.create(taskToBeCreated, function createTaskCB(err, createdTask) {
    if (err) {
      return res.serverError(err);
    }
    // Subscribe the user's socket to the newly-created task.
    Task.subscribe(req.socket, createdTask);
    // Send the created task back to the client.
    res.json(createdTask);
  });
}
I haven't shown an example for the 'update' and 'destroy' actions but the idea is the same for both.
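For illustration, here's a rough sketch of how an 'update' action could follow the same pattern, assuming the Sails 0.10-style Task.update and Task.publishUpdate methods (the field being updated is just a placeholder):
update: function(req, res) {
  // Is the user logged in?
  if (!req.session.user) {
    return res.badRequest();
  }
  var id = req.param('id');
  // Only update tasks that belong to the logged-in user.
  Task.update({id: id, userID: req.session.user.id}, {name: req.param('name')})
    .exec(function updateTaskCB(err, updatedTasks) {
      if (err) {
        return res.serverError(err);
      }
      if (!updatedTasks.length) {
        return res.notFound();
      }
      // Notify subscribed sockets (i.e. the task's owner) about the change.
      Task.publishUpdate(id, updatedTasks[0]);
      res.json(updatedTasks[0]);
    });
}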
I have a User controller with a create method that checks the database for email and username uniqueness before creating the user (this is to work around a bug in the MongoDB adapter for SailsJS, version 0.10.5, that doesn't honour the unique attribute flag).
The code looks like the following:
User.find({ email: req.body.email }, function (err, user) {
  if (user) {
    return res.badRequest('Unique email constraint. Email is already used.');
  }
});

User.create(req.body).exec(function (err, user) {
  // Code to catch and manage err or new user
});
What I expect is that if the email already exists in the database (MongoDB), a 400 is sent using res.badRequest() and execution ends there.
What happens is that the response is sent, but then control moves to User.create(): execution doesn't end. I suspect that return res.badRequest() is just returning control back to the calling function (User.findOne), and execution continues from there.
I tried using res.badRequest().end() but that leaves the client hanging (there is no response), and using res.end() after the return res.badRequest() generated 'header send' errors.
How do I have execution of this request end if an existing email is found?
First of all, what you call findOne is actually a find in your code. That's not related to your problem, but it is slightly confusing, and you should make sure you are getting data in the format you expect (find returns an array, findOne a single record).
As for finishing the request after marking it bad, I have not used sails, but I was able to end execution in the past by using res.send(). EDIT: after looking at the docs, it seems this is done for you by .badRequest(), so ignore that part.
That said, even THAT is not actually your problem. Your problem is that you start an asynchronous User.find(), and then you immediately start running User.create() (also asynchronously), and so your request doesn't get marked bad until after you have already attempted to create a new user.
What you need to do is one of two things:
1. Use promises (NOTE: this is how it works for Mongoose; Sails may be different) to only run User.create() after User.find() has completed, e.g.:
var userQuery = User.findOne({ email: req.body.email }).exec();
userQuery.addBack(function(err, user) {
  if (!!user) res.badRequest('...');
  else create_user();
});
2. Put your user creation logic inside of your findOne callback, e.g.:
User.findOne({ email: req.body.email }, function(err, user) {
  if (err) {
    return res.serverError(err);
  }
  if (user) {
    // Email already taken: reject and stop here.
    return res.badRequest('Unique email constraint. Email is already used.');
  }
  User.create(req.body).exec(function(err, newUser) {
    // Code to catch and manage err or the new user.
  });
});
Personally, I would advise that you use promises (especially later, when you have long chains of requests happening one on top of the other), but take your pick.
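For what it's worth, since Waterline queries in Sails are also then-able (the project controller code earlier in this thread uses .then/.catch), a promise-based sketch of the same flow could look roughly like this:
User.findOne({ email: req.body.email })
  .then(function (existing) {
    if (existing) {
      res.badRequest('Unique email constraint. Email is already used.');
      return null; // short-circuit the chain
    }
    return User.create(req.body);
  })
  .then(function (user) {
    // Only respond here if we actually created a user above.
    if (user) {
      res.json(user);
    }
  })
  .catch(function (err) {
    res.serverError(err.message);
  });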
Each Sails.js model has the method publishAdd(). This notifies every listener when a new record is added to an associated collection.
The notification does not contain the newly created record, so I have to make another request from the client side to fetch it.
Is there a way to have Sails.js send the new record along with the notification, so I can reduce my request count?
Solution
I implemented the accepted answer like this:
https://gist.github.com/openscript/7016c5fd8c5053b5e3a3
There's no way to get this record using the default publishAdd method. However, you can override that method and do the child record lookup in your implementation.
You can override publishAdd on a per-model basis by adding a publishAdd method to that model class, or override it for all models by adding the method to the config/models.js file.
I would start by copying the default publishAdd() method and then tweaking as necessary.
I know this is old, but I just had to solve this again and didn't like the idea of dropping in duplicate code, so if someone is looking for an alternative: the trick is to add an afterCreate lifecycle callback to the model of the newly created record.
Say you have a Game that you want your Players to subscribe to. Games have notifications, a collection of text alerts that you only want players in the game to receive. To do this, subscribe to the Game on the client by requesting it. Here I'm getting a particular game by calling /game/:gameId, then building my page based on the notifications and players already on the model:
io.socket.get('/game/' + gameId, function(resData, jwres) {
  let players = resData.players;
  let notifications = resData.notifications;
  $.each(players, function (k, v) {
    if (v.id != playerId) {
      addPartyMember(v);
    }
  });
  $.each(notifications, function (k, v) {
    addNotification(v.text);
  });
});
Being subscribed to the game will only give you the ids by default, as we know, but when I add a notification I have both the game id and the notification record, so I can add the following to the Notification model:
afterCreate: function (newlyCreatedRecord, cb) {
  Game.publishAdd(newlyCreatedRecord.game, 'notifications', newlyCreatedRecord);
  cb();
}
Since my original socket.get subscribed me to a particular game, I can publish only to that game's subscribers by using Game.publishAdd(). Now back on the client side, listen for the data coming back:
io.socket.on('game', function (event) {
  if (event.attribute == 'notifications') {
    addNotification(event.added.text);
  }
});
The incoming records will look something like this:
{"id":"59fdd1439aee4e031e61f91f",
"verb":"addedTo",
"attribute" :"notifications",
"addedId":"59fef31ba264a60e2a88e5c1",
"added":{"game":"59fdd1439aee4e031e61f91f",
"text":"some messages",
"createdAt":"2017-11-05T11:16:43.488Z",
"updatedAt":"2017-11-05T11:16:43.488Z",
"id":"59fef31ba264a60e2a88e5c1"}}
There is a /users URL in my Ember application. I am using the RESTAdapter to fetch data from the server. When a template showing the list of users is loaded, it loads the latest data from the server (e.g. if I change one of the rows in the users table, say the name of a user, that change is reflected at the front-end as well). But if I delete a user from the database, I still get that user in the users list at the front-end.
The model hook for the users route simply returns the list of users from the server:
this.store.find('user').then(onSuccess, onError);
When I try to do any operation on a deleted user (say updating its name at the front-end), I get an error (which is expected, as the user is no longer in the database and the server responds with an appropriate error). How do I force Ember to load the list of users that are present in the database, and not just those in the local ember-data store? Also, why is the local ember-data store in sync with the database for updates (and additions) but not for deletions?
The problem is explained in the GitHub issue.
The store has a cache for each model type. When the store fetches data with the _findAll method, it does not assume the response contains all the data, so the following steps are applied:
store.pushMany(type, payload); // push the server response data to the local store
store.didUpdateAll(type); // update record with the most up-to-date record
return store.all(type); // return all the record in the local store
It will not take record deletions into consideration, as discussed in the issue: the new response data is added to the local store if no record with the same primaryKey can be found; otherwise, it updates the local copy.
I created a test to check the defined behaviour:
test('store.findAll does not manage record deletion', function() {
  $.mockjaxClear();
  $.mockjax({ url: '/categories', dataType: 'json', responseText: {
    categories: [
      {id: 1, name: "Name1"},
      {id: 2, name: "Name2"}
    ]}});
  var store = this.store();
  stop();
  store.find('category').then(function(result) {
    equal(result.get('length'), 2, 'the new item returned with ajax is added to the local store');
    $.mockjaxClear();
    $.mockjax({ url: '/categories', dataType: 'json', responseText: {
      categories: [
        {id: 3, name: "Name3"}
      ]}});
    store.find('category').then(function(result) {
      equal(result.get('length'), 3);
      $.mockjaxClear();
      $.mockjax({ url: '/categories', dataType: 'json', responseText: {
        categories: [
          {id: 3, name: "Name4"}
        ]}});
      store.find('category').then(function(result) {
        equal(result.get('length'), 3);
        equal(result.objectAt(2).get('name'), 'Name4', 'the old item returned with the new ajax request updates the local store');
        start();
      });
    });
  });
});
As you pointed out, something like find('modelName', {}) would solve your issue:
this.store.find('user', {})
It seems that this is a known bug in Ember.js. The workaround that I found (accidentally) was to supply an arbitrary query parameter to the find() method, so the data loaded is exactly what is fetched from the server.
The example below passes an xyz parameter with the value abc.
this.store.find('user', {xyz:"abc"}).then(onSuccess, onError);
I am not sure if this is THE RIGHT WAY to do it, but it works for me. The other solutions would be those mentioned in the link (like sending some metadata from the server, i.e. maintaining a list of deleted records).
But I still couldn't figure out why this happens only in the case of DELETE (and not in the case of POST and PUT).
I am not sure if this really is a bug or I just don't understand Ember.js well enough. Any help or comments would be much appreciated.
To overcome this problem, you can unload all of the records from the store and then call find():
this.store.unloadAll('user');
this.store.find('user').then(
  function(d) {
    console.log(d.get('length')); // store now only contains the users returned from the server
  }
);
I'm working on a website using Node.js on the server side, Ember.js on the client side and MongoDB as the database. I have a page where a user profile is created and saved, but the id of the record is stored as undefined unless I refresh. Is there a way around this?
I would have to see the specific code in order to answer this with certainty, but I suspect that you're either not waiting for a response from the server, or you're not passing the model in when you transition to the new route. Ember Data automatically updates when it gets a response from the server.
The general flow should go like this:
1. Send your POST request to the server.
2. The server creates the user in MongoDB, and when it gets that object back, it sends it back to the client.
3. On the client, you wait to get the user back from the server, then pass the model into your transitionTo helper.
Here's an example on the Ember side:
App.UserCreateController = Ember.ObjectController.extend({
  actions: {
    createUser: function() {
      var self = this;
      this.get('model')
        .save()
        .then(function() {
          self.transitionToRoute('profile', self.get('model'));
        }, function() {
          alert('User not successfully saved');
        });
    }
  }
});
Another possible issue is that you're not sending the data back in the format Ember Data expects, i.e. your payload should look something like this:
{
  user: {
    _id: 'lkj234l23jlk5j4l32j5lk34',
    name: 'Jon Snow'
  }
}
And you should let Ember know that it should be using the _id instead:
App.ApplicationSerializer = DS.RESTSerializer.extend({
primaryKey: "_id"
});
If this isn't your problem, post some code or give more details.