Starting to use Vue + Rethink, love the concepts of both of them and am starting to create a simple todo list.
I've got a skeleton app done with Vue + Express + RethinkDB working properly with CRD (no update yet) operations. Everything is dandy, but of course the data is not real time. When I insert a task/todo item, I need to refresh the page to see it appear, etc.
I've looked into RethinkDB's changefeeds, which look interesting, but I'm not 100% sure how to implement them in my current Express API setup.
I found a few examples playing around with Socket.io and now I am trying to implement that in my setup as well (is this necessary??).
My codebase is all up here: https://github.com/patrickbolle/vue-todo
Here's an example of my TaskList.vue component, where I will display all the items:
import TaskNew from './TaskNew'
import io from 'socket.io-client'

var socket = io.connect('http://localhost:8099')

export default {
  components: {
    TaskNew
  },
  data () {
    return {
      tasks: []
    }
  },
  created () {
    // initial load over the REST API
    this.$http.get('http://localhost:8090/api/tasks').then(response => {
      this.tasks = response.data
    })
    // new tasks pushed from the server
    var vm = this
    socket.on('tasksSocket', function (task) {
      vm.tasks.push(task)
    })
  }
}
This sort of works: when I create a new task by POSTing to my API, a (blank) task appears, but none of my pre-existing tasks are shown.
Here is my GET route for /tasks:
//GET - All Tasks
router.get('/tasks', (req, res) => {
  r.table("tasks").changes().run().then(function(cursor) {
    cursor.each(function(err, task) {
      io.sockets.emit("tasksSocket", task);
    })
  })
})
Honestly, I'm just confusing myself more and more. I come from a Meteor background, so this web socket/API stuff is confusing for me.
If anyone could just give me a pointer in the right direction I would be very grateful.
Essentially I need to:
- Retrieve tasks from my GET /api/tasks route
- Grab NEW tasks from the socket... which is also on the GET /api/tasks route?
- Display them in Vue JS (my best guess at the wiring is sketched below)
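From the reading I've done so far, my best guess is that the changefeed shouldn't live inside the GET route at all: the route should run a plain query, and the changefeed should be opened once at server startup and pushed through Socket.io. A rough, untested sketch of what I mean (assuming r is a connected rethinkdbdash instance and io is the Socket.io server from my setup):

//GET - All Tasks: just a normal query, no changefeed here
router.get('/tasks', (req, res) => {
  r.table('tasks').run().then(tasks => res.json(tasks))
})

// Opened ONCE when the server boots, not per request
r.table('tasks').changes().run().then(cursor => {
  cursor.each((err, change) => {
    if (err) return console.error(err)
    // a changefeed delivers { new_val, old_val } wrappers, not bare rows;
    // emitting change.new_val would explain/avoid the blank tasks I'm seeing
    if (change.new_val) io.sockets.emit('tasksSocket', change.new_val)
  })
})

Is that the right direction?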
I was wondering if it is possible to use DOMPurify to sanitize user input on a form before it is saved to the database. Here's what I've got in my routes.js file for my form post:
.post('/questionForm', (req, res, next) =>{
console.log(req.body);
/*console.log(req.headers);*/
const questions = new QuestionForm({
_id: mongoose.Types.ObjectId(),
price: req.body.price,
seats: req.body.seats,
body_style: req.body.body_style,
personality: req.body.personality,
activity: req.body.activity,
driving: req.body.driving,
priority: req.body.priority
});
var qClean = DOMPurify.sanitize(questions);
//res.redirect(200, path)({
// res: "Message recieved. Check for a response later."
//});
qClean.save()
.then(result => {
//res.redirect(200, '/path')({
// //res: "Message recieved. Check for a response later."
//});
res.status(200).json({
docs:[questions]
});
})
.catch(err => {
console.log(err);
});
});
I also imported the package at the top of the file with
import DOMPurify from 'dompurify';
When I run the server and submit a post request, it throws a 500 error and claims that dompurify.sanitize is not a function. Am I using it in the wrong place, and/or is it even correct to use it in the back end at all?
This might be a bit late, but for others who run into this use case like I did, I found an npm package that seems well suited so far: isomorphic-dompurify.
DOMPurify needs a DOM to interact with, which is usually supplied by the browser. isomorphic-dompurify feeds DOMPurify another package, jsdom, as a dependency that acts as a stand-in DOM, so DOMPurify knows how to sanitize your input server-side.
In the package's own words: "DOMPurify needs a DOM tree to base on, which is not available in Node by default. To work on the server side, we need a fake DOM to be created and supplied to DOMPurify. It means that DOMPurify initialization logic on server is not the same as on client".
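Usage is then the same on the server as in the browser. A quick sketch (note that sanitize() works on strings, so each field gets cleaned individually rather than the whole Mongoose document):

const DOMPurify = require('isomorphic-dompurify');

// sanitize every incoming field, then build the model from the clean values
const clean = {};
for (const key of Object.keys(req.body)) {
  clean[key] = DOMPurify.sanitize(req.body[key]);
}
const questions = new QuestionForm(Object.assign({ _id: mongoose.Types.ObjectId() }, clean));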
Building on @Seth Lyness's excellent answer: if you'd rather not add another dependency, you can just use this code before you require DOMPurify. Basically, what isomorphic-dompurify does is create a jsdom window object and put it in global.window.
const jsdom = require('jsdom');
const {JSDOM} = jsdom;

// give DOMPurify the browser-like window it expects
const {window} = new JSDOM('<!DOCTYPE html>');
global.window = window;
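For completeness, DOMPurify's own README documents a factory-style setup for Node that avoids touching globals, roughly:

const createDOMPurify = require('dompurify');
const { JSDOM } = require('jsdom');

// build a window for DOMPurify to work against, then create an instance from it
const window = new JSDOM('').window;
const DOMPurify = createDOMPurify(window);

const clean = DOMPurify.sanitize('<img src=x onerror=alert(1)>');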
I have the following scenario. Here is my template:
<ul>
{{#each persons}}
{{Name}}
{{/each}}
</ul>
where persons = new ReactiveVar([]) in the template's .js file,
and I'm updating the persons variable in the callback of an HTTP REST API call:
var instance = Template.instance();
API(url, (error, result) => instance.persons.set(result)) // result is an array
Nothing happens in the UI. How can I fix this? (I am willing to use a plain array as well, but the requirement is to populate it from an API callback.)
Binding external APIs to a template can be solved with a classic Template-instance ReactiveVar / ReactiveDict (let's call it a reactive source). Note that you should not make these calls, or update a reactive source, inside a helper, but rather inside an event handler or inside onCreated.
Let's take your template:
<ul>
{{#each persons}}
{{Name}}
{{/each}}
</ul>
We then make the call inside the onCreated function:
Template.myTemplate.onCreated(function () {
const instance = this
instance.state = new ReactiveDict()
instance.state.set('persons', [])
// Template's internal tracker
instance.autorun(() => {
API(url, (error, result) => instance.state.set('persons', result)) //result is an array
})
})
And return the data only by the reactive source in the helper:
Template.myTemplate.helpers({
persons() {
return Template.instance().state.get('persons')
}
})
Now this brings up another problem: the external API is usually not reactive, so the autorun will not trigger again when the data in the external API changes. If the source were a Mongo collection, the Template's internal Tracker would automatically re-run and update your persons state.
If you only need to fetch the external data once, that is fine. However, to pick up changes in the external API you have a few options:
Easy way: use a timer (setInterval):
Template.myTemplate.onCreated(function () {
  const instance = this
  instance.state = new ReactiveDict()
  instance.state.set('persons', [])
  // keep the timer id on the instance so each instance cleans up its own timer
  instance.timerId = setInterval(() => {
    API(url, (error, result) => instance.state.set('persons', result)) // result is an array
  }, 5000) // polls for updates every 5 seconds
})

Template.myTemplate.onDestroyed(function () {
  if (this.timerId) {
    clearInterval(this.timerId)
    this.timerId = null
  }
})
Pros
simple to implement
fine grained tuning of timing for a fluent experience
Cons
setInterval keeps polling (and consuming resources) whether or not anything has changed
you have to clean it up to prevent memory leaks (in onDestroyed)
Hard way: Let the external API's service call you!
If you have the option to let the external service connect to your app and call it via DDP, you can let the external service decide when the data has changed and push it, so your current app updates automatically.
You need a method and a collection for this:
server and client:
export const ExternalData = new Mongo.Collection('externalData')
server:
import { ExternalData } from 'path/to/externalData'
Meteor.methods({
'myApp.updateExternalData'(args) {
// check permissions...
// check data integrity...
const {url} = args
const {data} = args
ExternalData.update({url}, {$set: data})
}
})
Meteor.publish('myApp.externalData', function (url) {
  return ExternalData.find({url})
})
Now on the client you just need to subscribe to the data and update the reactive var automatically:
client:
import { ExternalData } from 'path/to/externalData'
Template.myTemplate.onCreated(function () {
  const instance = this
  // subscribe to changes; url is whatever identifies the external resource
  instance.autorun(() => {
    const subscription = instance.subscribe('myApp.externalData', url)
    if (subscription.ready()) {
      console.log('myApp.externalData is ready')
    }
  })
})
Template.myTemplate.helpers({
persons() {
return ExternalData.find({})
}
})
External Service / APP:
// if the external app is a meteor app you are lucky and can go with:
// https://docs.meteor.com/api/connections.html#DDP-connect
// Otherwise you can use the npm package:
// https://www.npmjs.com/package/ddp
// For authentication you can use:
// https://github.com/reactioncommerce/meteor-ddp-login
// or
// https://www.npmjs.com/package/ddp-login
const connection = null // TODO: create a DDP connection (see the links above)

function onDataChanged () {
  const data = {} // TODO: get data from the backend of your ext. service
  const url = ''  // TODO: the url for which the data is relevant

  // call the app to update the data:
  connection.call('myApp.updateExternalData', {url, data})
}
Pros:
Template automatically updates when the collection updates
No timers = no sinks!
Requires no additional reactive variable
You can use the collection to make external data persistent, cache it or create a revision / history
You can plug / unplug the external services (better scaling, less dependencies)
Cons:
High learning curve (but it's worth the effort)
Works only if you have control over the external service
More code = more potential errors, so more tests to write
I'm working on a word game and am trying to return a random wordpair (my collection) on a page load. I'm using Express and have adapted my code from this tutorial if that's of any use.
A GET request renders my page just fine, and I'm trying to send a random WordPair object alongside the title:
router.get('/', function(req, res, next) {
res.render('play', { title: 'play', random_wordpair: wordpair_controller.wordpair_random});
});
The wordpair_random function is inside a controller file I've made (which also successfully manages listing the wordpairs, creating new ones, etc.).
// Get random WordPair
exports.wordpair_random = function() {
WordPair.aggregate(
[{
$sample: {
size: 1
}
}]
)
.exec(function(err, random_wordpair) {
if (err) {
return next(err);
}
console.log(random_wordpair);
return random_wordpair;
});
};
Then inside a play.pug template, I'm simply trying to display this result:
h3 random wordpair selection is: #{random_wordpair}
But all I can see is the function rendered as HTML text. Can anyone tell me what I'm doing wrong?
I also understand from the MongoDB documentation on $sample aggregation that I should be calling the function on the database object, but I've seen various examples and some don't do this. When I try calling db.wordpair.aggregate(...) (or WordPair, or wordpairs as it appears in mLab) directly after initializing db in my app.js file, I get undefined errors. My db object doesn't seem to contain the correct data for this request.
Thanks!
I guess you're writing this in Node.js. A core feature of Node.js is its non-blocking I/O model: the code won't wait for a database call to complete before moving on.
Another concept to get right is that in JavaScript functions are first-class values. Assigning a function to a property of an object, as below, won't cause the function to execute. It simply stores a reference to the function body, which is why your application prints the function itself.
{ title: 'play', random_wordpair: wordpair_controller.wordpair_random}
To fix this, use a callback:
exports.wordpair_random = function(callback) {
  WordPair.aggregate([{$sample: {size: 1}}]).exec(callback);
};
Then in your route handler:
router.get('/', function(req, res, next) {
wordpair_controller.wordpair_random(function(err, result) {
//Handle errors if needed.
res.render('play', { title: 'play', random_wordpair:result });
})
});
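If you prefer promises over callbacks, the same idea looks roughly like this (a sketch assuming a Mongoose version where .exec() without a callback returns a promise):

// Controller: return the promise instead of accepting a callback
exports.wordpair_random = function() {
  return WordPair.aggregate([{ $sample: { size: 1 } }]).exec();
};

// Route: render only after the query resolves
router.get('/', function(req, res, next) {
  wordpair_controller.wordpair_random()
    .then(function(result) {
      res.render('play', { title: 'play', random_wordpair: result });
    })
    .catch(next);
});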
I have a Kendo UI AutoComplete bound to a remote transport, and I need to tweak how it works but am coming up blank.
Currently, I perform a bunch of searches on the server, integrate the results into a single JSON response, and return it to the datasource for the autocomplete. The problem is that this can take a long time, and our application is time sensitive.
We have identified which searches are most important and found that one search accounts for 95% of the chosen results. However, I still need to provide the data from the other searches. I was thinking of kicking off separate requests for the data on the server and adding the results to the autocomplete as they return. Our main search returns extremely fast and would supply the first items added to the list. Then, as the other searches return, I would like them to add dynamically to the list.
Our application uses knockout.js and I thought about making the datasource part of our view model, but from looking around, Kendo doesn't update based on changes to your observables.
I am currently stumped, and any advice would be welcome.
Edit:
I have been experimenting and have had some success simulating what I want with the following datasource:
var dataSource = new kendo.data.DataSource({
transport: {
read: {
url: window.performLookupUrl,
data: function () {
return {
param1: $("#Input").val()
};
}
},
parameterMap: function (options) {
return {
param1: options.param1
};
}
},
serverFiltering: true,
serverPaging: true,
requestEnd: function (e) {
if (e.type == "read") {
window.setTimeout(function() {
dataSource.add({ Name: "testin1234", Id: "X1234" })
}, 2000);
}
}
});
If the first search returns results, then after 2 seconds a new item pops into the list. However, if the first search fails, nothing happens. Is it proper to use (abuse??) requestEnd like this? My eventual goal is to kick off the rest of the searches from this function.
I contacted Telerik and they gave me the following jsbin that I was able to modify to suit my needs.
http://jsbin.com/ezucuk/5/edit
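The gist of the pattern, reconstructed here as a rough, untested sketch (the endpoint names are made up): let the fast primary search populate the list through the normal transport, then push late results into the same DataSource with add() as the slower searches come back:

var dataSource = new kendo.data.DataSource({
  transport: {
    read: { url: "/api/primarySearch" } // hypothetical fast search
  },
  serverFiltering: true,
  requestEnd: function (e) {
    if (e.type === "read") {
      // kick off the slower searches once the fast one has returned
      $.getJSON("/api/secondarySearch", { q: $("#Input").val() })
        .done(function (items) {
          items.forEach(function (item) {
            dataSource.add(item); // items pop into the open list as they arrive
          });
        });
    }
  }
});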
I have been searching for an example of how I can stream the result of a MongoDB query to a Node.js client. All solutions I have found so far seem to read the query result at once and then send it back to the client.
Instead, I would (obviously) like to supply a callback to the query method and have MongoDB call that when the next chunk of the result set is available.
I have been looking at Mongoose; should I use a different driver?
Jan
node-mongodb-native (the underlying driver that every MongoDB client in Node.js uses), besides the cursor API that others mentioned, has a nice stream API (#458). Unfortunately I did not find it documented elsewhere.
Update: there are docs.
It can be used like this:
var stream = collection.find().stream()
stream.on('error', function (err) {
console.error(err)
})
stream.on('data', function (doc) {
console.log(doc)
})
It actually implements the ReadableStream interface, so it has all the goodies (pause/resume, etc.).
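For instance, you can throttle the cursor while doing async work per document (saveSomewhere below is a made-up async operation):

var stream = collection.find().stream()

stream.on('data', function (doc) {
  stream.pause() // stop pulling from the cursor while we work on this doc
  saveSomewhere(doc, function () {
    stream.resume() // ask the cursor for the next document
  })
})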
Streaming in Mongoose became available in version 2.4.0, which appeared three months after you posted this question:
Model.where('created').gte(twoWeeksAgo).stream().pipe(writeStream);
More elaborate examples can be found on their documentation page.
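If you would rather handle documents yourself than pipe, the returned stream is event-based; something like:

var stream = Model.find().stream();

stream.on('data', function (doc) {
  // each hydrated document arrives here as it is read
});
stream.on('error', function (err) {
  // an error occurred mid-stream
});
stream.on('close', function () {
  // the cursor is exhausted
});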
mongoose is not really a "driver"; it's actually an ORM-style wrapper around the MongoDB driver (node-mongodb-native).
To do what you're doing, take a look at the driver's .find and .each methods. Here's some code from the examples:
// Find all records. find() returns a cursor
collection.find(function(err, cursor) {
sys.puts("Printing docs from Cursor Each")
cursor.each(function(err, doc) {
if(doc != null) sys.puts("Doc from Each " + sys.inspect(doc));
})
});
To stream the results, you basically replace that sys.puts with your "stream" function. I'm not sure how you plan to stream the results; I think you can do response.write() + response.flush(), but you may also want to check out socket.io.
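A rough sketch of that substitution, writing each document to an HTTP response as a line of JSON (res here is an http.ServerResponse; untested):

collection.find(function(err, cursor) {
  cursor.each(function(err, doc) {
    if (doc != null) {
      res.write(JSON.stringify(doc) + "\n"); // one document per line
    } else {
      res.end(); // a null doc means the cursor is exhausted
    }
  });
});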
Here is the solution I found (please correct me, anyone, if this is the wrong way to do it):
(Also excuse the bad coding; it's too late at night for me to prettify this.)
var sys = require('sys');
var http = require("http");

var mongodb = require('mongodb');
var Db = mongodb.Db,
    Connection = mongodb.Connection,
    Collection = mongodb.Collection,
    Server = mongodb.Server;

var db = new Db('test', new Server('localhost', Connection.DEFAULT_PORT, {}));
var products;

db.open(function (error, client) {
    if (error) throw error;
    products = new Collection(client, 'products');
});
function ProductReader(collection) {
this.collection = collection;
}
ProductReader.prototype = new process.EventEmitter();
ProductReader.prototype.do = function() {
var self = this;
this.collection.find(function(err, cursor) {
if (err) {
self.emit('e1');
return;
}
sys.puts("Printing docs from Cursor Each");
self.emit('start');
        cursor.each(function(err, doc) {
            if (err) {
                self.emit('e2');
                self.emit('end');
                return;
            }
            if (doc != null) {
                sys.puts("doc:" + doc.name);
                self.emit('doc', doc);
            } else {
                self.emit('end');
            }
        })
});
};
http.createServer(function(req,res){
var pr = new ProductReader(products);
pr.on('e1',function(){
sys.puts("E1");
res.writeHead(400,{"Content-Type": "text/plain"});
res.write("e1 occurred\n");
res.end();
});
pr.on('e2',function(){
sys.puts("E2");
res.write("ERROR\n");
});
pr.on('start',function(){
sys.puts("START");
res.writeHead(200,{"Content-Type": "text/plain"});
res.write("<products>\n");
});
pr.on('doc',function(doc){
sys.puts("A DOCUMENT" + doc.name);
res.write("<product><name>" + doc.name + "</name></product>\n");
});
pr.on('end',function(){
sys.puts("END");
res.write("</products>");
res.end();
});
pr.do();
}).listen(8000);
I have been studying MongoDB streams myself; while I do not have the entire answer you are looking for, I do have part of it.
You can set up a socket.io stream.
This uses the socket.io and socket.io-stream packages available on npm, with MongoDB for the database, because using a 40-year-old database that has issues is incorrect; time to modernize. Also, the 40-year-old DB is SQL, and SQL doesn't do streams to my knowledge.
So although you only asked about data going from server to client, I also want to cover client to server in my answer, because I can NEVER find it anywhere when I search, and I wanted to set up one place with both the send and receive elements via stream so everyone can get the hang of it quickly.
client side sending data to server via streaming
var stream = ss.createStream();
var blobstream = ss.createBlobReadStream(data);
blobstream.pipe(stream);
ss(socket).emit('data.stream.out', stream, {}, function(err, successful_db_insert_id) {
  // if you get back the id, it went into the db and everything worked
});
server receiving stream from the client side and then replying when done
ss(socket).on('data.stream.out', function(stream, o, c) {
  var buffer = [];
  stream.on('data', function(chunk) { buffer.push(chunk); });
  stream.on('end', function() {
    buffer = Buffer.concat(buffer);
    db.insert(buffer, function(err, res) {
      // hand the inserted id (or the error) back to the client's callback
      c(err, res && res.insertedIds && res.insertedIds[0]);
    });
  });
});
The other half: client side requesting and receiving stream data from the server
var stream = ss.createStream();
var binarystring = '';
stream.on('data', function(chunk) {
  for (var i = 0; i < chunk.length; i++) {
    binarystring += String.fromCharCode(chunk[i]);
  }
});
stream.on('end', function() { data = window.btoa(binarystring); c(null, data); });
ss(socket).emit('data.stream.get', stream, o, c);
server side replying to request for streaming data
ss(socket).on('data.stream.get',function(stream,o,c){
stream.on('end',function(){
c(null,true);
});
db.find().stream().pipe(stream);
});
The very last one is the only one where I am kind of just pulling it out of my butt, because I have not yet tried it, but it should work. I actually do something similar, except I write the file to the hard drive and then use fs.createReadStream to stream it to the client. So I'm not 100% sure, but from what I've read it should work; I'll get back to you once I test it.
P.S. If anyone wants to bug me about my colloquial way of talking: I'm Canadian, and I love saying "eh". Come at me with your hugs and hits, bros/sis' :D