Caching LoopBack REST connector

LoopBack has the concept of non-database connectors, including a REST connector.
What is the right way to cache GET requests to such a data source?

Interesting thought... I think you'd have to do this yourself by creating a new custom remote method and checking a local hash of values:
// in /common/models/myModel.js
var cache = {};

MyModel.lookup = function lookup(someParam, next) {
  // first see if the value is already in the cache
  if (cache[someParam]) {
    return next(null, cache[someParam]);
  } else {
    // otherwise do the REST remote method call...
    MyModel.restLookup(someParam, function lookupCallback(err, data) {
      if (err) { return next(err); }
      cache[someParam] = data; // ...and then set the new cache value.
      next(null, data);
    });
  }
};

MyModel.remoteMethod(
  'lookup',
  {
    accepts: { arg: 'param', type: 'object', http: { source: 'query' } },
    returns: { arg: 'results', type: 'object' },
    http: { verb: 'get', path: '/lookup' }
  }
);
This code would set up an endpoint at .../api/MyModels/lookup?param=foobar for the calling code to hit. Note that you would probably also want to set an expiration time for the data and properly manage the "cache". You could also use something like a Redis store for the values instead of the in-memory object I've used above.
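For example, here is a minimal sketch of giving that in-memory cache an expiration time (the one-hour TTL is an arbitrary choice, and restLookup again stands in for the connector-backed remote call):
var cache = {};
var TTL_MS = 60 * 60 * 1000; // assumed one-hour lifetime; tune to how stale your data may get

MyModel.lookup = function lookup(someParam, next) {
  var entry = cache[someParam];
  if (entry && (Date.now() - entry.storedAt) < TTL_MS) {
    // still fresh: serve from the cache
    return next(null, entry.data);
  }
  // stale or missing: hit the REST data source again
  MyModel.restLookup(someParam, function (err, data) {
    if (err) { return next(err); }
    cache[someParam] = { data: data, storedAt: Date.now() };
    next(null, data);
  });
};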
Good luck!

Related

In an isomorphic Flux application, should REST API calls be implemented in the action?

Should it be implemented in the action creator, or in a service class or component? Does the recommendation change if it's an isomorphic web app?
I've seen two different examples:
The action creator dispatches a login_success/login_failure action after making the REST call.
The component calls an API service first, and that service creates the login_success or login_failure action directly.
Example 1: https://github.com/schempy/react-flux-api-calls
/actions/LoginActions.js
The action itself triggers a call to the API and then dispatches success or failure:
var LoginActions = {
  authenticate: function () {
    RESTApi
      .get('/api/login')
      .then(function (user) {
        AppDispatcher.dispatch({
          actionType: "login_success",
          user: user
        });
      })
      .catch(function (err) {
        AppDispatcher.dispatch({ actionType: "login_failure" });
      });
  }
};
Example 2: https://github.com/auth0/react-flux-jwt-authentication-sample
The component's onClick calls an AuthService function, which then creates an action after it gets back the authentication result:
/services/AuthService.js
class AuthService {
  // request, when, LOGIN_URL, SIGNUP_URL and LoginActions are imported
  // at the top of the file in the full sample
  login(username, password) {
    return this.handleAuth(when(request({
      url: LOGIN_URL,
      method: 'POST',
      crossOrigin: true,
      type: 'json',
      data: {
        username, password
      }
    })));
  }

  logout() {
    LoginActions.logoutUser();
  }

  signup(username, password, extra) {
    return this.handleAuth(when(request({
      url: SIGNUP_URL,
      method: 'POST',
      crossOrigin: true,
      type: 'json',
      data: {
        username, password, extra
      }
    })));
  }

  handleAuth(loginPromise) {
    return loginPromise
      .then(function (response) {
        var jwt = response.id_token;
        LoginActions.loginUser(jwt);
        return true;
      });
  }
}
What's the better/standard place for this call to live in a Flux architecture?
I use an api store with an api utility, from https://github.com/calitek/ReactPatterns (React.14/ReFluxSuperAgent):
// the api store (e.g. ApiStore.js)
import Reflux from 'reflux';
import Actions from './Actions';
import ApiFct from './../utils/api.js';

let ApiStoreObject = {
  newData: {
    "React version": "0.14",
    "Project": "ReFluxSuperAgent",
    "currentDateTime": new Date().toLocaleString()
  },
  listenables: Actions,
  apiInit() { ApiFct.setData(this.newData); },
  apiInitDone() { ApiFct.getData(); },
  apiSetData(data) { ApiFct.setData(data); }
}
const ApiStore = Reflux.createStore(ApiStoreObject);
export default ApiStore;

// the api utility (utils/api.js)
import request from 'superagent';
import Actions from '../flux/Actions';

let uri = 'http://localhost:3500';

module.exports = {
  getData() { request.get(uri + '/routes/getData').end((err, res) => { this.gotData(res.body); }); },
  gotData(data) { Actions.gotData1(data); Actions.gotData2(data); Actions.gotData3(data); },
  setData(data) { request.post('/routes/setData').send(data).end((err, res) => { Actions.apiInitDone(); }); },
};
In my experience it is better to use option 1:
Putting API calls in an action creator instead of in a component lets you separate concerns better: your component (tree) only calls a "log me in" action and can remain ignorant about where the response comes from; in theory it could even come from the store if the login details are already known (a brief sketch follows below).
Calls to the API are centralized in the action, and therefore more easily debugged.
Option 2 looks like it still fits the Flux design principles, though.
There are also advocates of a third alternative: calling the web API from the store. This makes it easier to keep the data structures on the server and client closely coupled and compartmentalized, and it may work better if syncing independent data structures between client and server is a key concern. My experience with this third option has not been positive: having stores (indirectly) create actions breaks the unidirectional Flux pattern, and for me the benefits never outweighed the extra trouble in debugging. But your results may vary.
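For illustration, with option 1 the component can stay this small; a minimal sketch in the style of example 1 (the component itself is hypothetical):
// LoginButton.jsx - knows nothing about the API, only about the action
var React = require('react');
var LoginActions = require('../actions/LoginActions');

var LoginButton = React.createClass({
  handleClick: function () {
    LoginActions.authenticate(); // "log me in"; the action creator talks to the REST API
  },
  render: function () {
    return <button onClick={this.handleClick}>Log in</button>;
  }
});

module.exports = LoginButton;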

Meteor onRendered function and access to Collections

When a user refreshes a certain page, I want to set some initial values from the MongoDB database.
I tried using the onRendered method, which the documentation states will run when the template it is attached to is inserted into the DOM. However, the database does not seem to be available at that instant.
When I try to access the database from the function:
Template.scienceMC.onRendered(function () {
  var currentRad = radiationCollection.find().fetch()[0].rad;
});
I get the following error messages:
Exception from Tracker afterFlush function:
TypeError: Cannot read property 'rad' of undefined
However, when I run the line radiationCollection.find().fetch()[0].rad; in the console, I can access the value.
How can I make sure that the client copy of the MongoDB collection is available?
The best way for me was to use the waitOn function in the router. Thanks to @David Weldon for the tip.
Router.route('/templateName', {
  waitOn: function () {
    return Meteor.subscribe('collectionName');
  },
  action: function () {
    // render all templates and regions for this route
    this.render();
  }
});
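For completeness, that subscribe call assumes a matching publication on the server, roughly along these lines (the publication name mirrors the snippet above; the collection is the one from the question):
// server-side publication
Meteor.publish('collectionName', function () {
  // publish whatever subset of the collection the template needs
  return radiationCollection.find();
});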
You need to set up a proper publication (it seems you did) and subscribe in the route parameters. If you want to make sure that you actually have your data in the onRendered function, you need to add an extra step.
Here is an example of how to make it in your route definition:
this.templateController = RouteController.extend({
  template: "YourTemplate",
  action: function () {
    if (this.isReady()) { this.render(); } else { this.render("yourTemplate"); this.render("loading"); }
    /*ACTION_FUNCTION*/
  },
  isReady: function () {
    var subs = [
      Meteor.subscribe("yoursubscription1"),
      Meteor.subscribe("yoursubscription2")
    ];
    var ready = true;
    _.each(subs, function (sub) {
      if (!sub.ready())
        ready = false;
    });
    return ready;
  },
  data: function () {
    return {
      params: this.params || {}, // if you have params
      yourData: radiationCollection.find()
    };
  }
});
In this example, in the onRendered function you can get your data using either this.data.yourData or radiationCollection.find().
EDIT: as @David Weldon stated in a comment, you could also use an easier alternative: waitOn.
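Either way, once the route waits on the subscription, the onRendered code from the question should find its document; a minimal sketch with a defensive check (template, collection and field names taken from the question):
Template.scienceMC.onRendered(function () {
  // the router has already waited on the subscription at this point
  var doc = radiationCollection.findOne();
  if (doc) {
    var currentRad = doc.rad;
    // ... use currentRad to set the initial values
  }
});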
I can't see your collection, so I can't guarantee that rad is a key in it; that said, I believe your problem is that your collection isn't available yet. As @David Weldon says, you need to guard against this or wait on your subscription to be ready (remember it has to load).
What I do in iron:router is this:
data: function () {
  var first = radiationCollection.find().fetch()[0];
  if (typeof first != 'undefined') {
    // only return data once the first document has actually arrived
    return first.rad;
  }
}

In Sails.js: fetch a model's data from a remote server

My Sails.js app is embedded in a PHP project where PHP generates some data for a Sails model. I can't find a way to fetch this data in the beforeCreate callback or in some custom method. I don't want to use the DB to sync the two models (the Sails model and the PHP model). I also need to send some data to the remote PHP app.
sails.js v.0.9.x
Here is my controller:
module.exports = {
  index: function (req, res) {},

  create: function (req, res) {
    if (!req.param('login'))
      throw new Error('Login is undefined');
    if (!req.param('message'))
      throw new Error('Initial message is undefined');

    var user, thread;
    User.create({
      id: 1,
      name: req.param('login'),
      ip: req.ip
    }).done(function (err, model) {
      user = model;
      if (err) {
        return res.redirect('/500', err);
      }
      user.fetch(); // my custom method
    });

    return res.view({ thread: thread });
  }
};
And the model:
module.exports = {
  attributes: {
    id: 'integer',
    name: 'string',
    ip: 'string',
    fetch: function (url) {
      var app = sails.express.app;
      // tried this, but no luck :)
      app.get('/path/to/other/loc', function (req, res, next) {
        console.log(req, res);
        return next();
      });
    }
  }
};
UPDATE: my solution
Model:
beforeCreate: function (values, next) {
  var options = 'http://localhost:1337/test',
      get = require('http').get;

  get(options, function (res) {
    res.on('data', function (data) {
      _.extend(values, JSON.parse(data.toString()));
      next();
    });
  }).on('error', function () {
    // pass the error to the callback instead of throwing inside an async handler
    next(new Error('Unable to fetch remote data'));
  });
}
Yikes, app.get is nowhere near what you need. That binds a route handler inside your app and has nothing to do with requesting remote data. Don't do this.
The easiest way to fetch remote data in Node is using the Request library. First install it in your project directory using npm install request. Then at the top of your model file:
var request = require('request');
and in your fetch method:
fetch: function (url, callback) {
  request.get(url, function (error, response, body) {
    return callback(error, body);
  });
}
Note the added callback parameter for fetch; this is needed because the request is an asynchronous operation. That is, the call to fetch() will return immediately, but the request will take some time, and when it's done it will send the result back via the callback function. Seeing as fetch is at this point just a wrapper around request.get, I'm not sure why it's necessary to have it as a model method at all, but if the URL was based on something within the model instance then it would make sense.
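For instance, the create action from the question could then consume the model method like this (the host is a placeholder and the response handling is only an assumption about what the remote PHP app returns):
// inside User.create(...).done(function (err, model) { ... })
user.fetch('http://example.com/path/to/other/loc', function (err, body) {
  if (err) { return res.redirect('/500', err); }
  var remoteData = JSON.parse(body); // assumes the PHP app returns JSON
  res.view({ user: user, remote: remoteData });
});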

How to resolve an optional URL path using ngResource

There are RESTful APIs, for instance:
/players - to get the list of all players
/players{/playerName} - to get info for a specific player
and I already have a function using ngResource like:
function Play() {
  return $resource('/players');
}
Can I reuse this function for a specific player, like:
function Play(name) {
  return $resource('/players/:name', {
    name: name
  });
}
So I want to:
send a request to /players if I don't pass the name parameter;
send a request to /players/someone if I pass the name parameter as someone.
Otherwise, do I have to write another function for a specific player?
Using ngResource it's very, very simple (it's basically a two-liner). You don't even need to create any custom actions here*.
I've posted a working Plunkr here (just open Chrome Developer tools and go to the Network tab to see the results).
Service body:
return $resource('/users/:id/:name', { id: '@id', name: '@name' });
Controller:
function ($scope, Users) {
  Users.query();              // GET /users (expects an array)
  Users.get({ id: 2 });       // GET /users/2
  Users.get({ name: 'Joe' }); // GET /users/Joe
}
* of course, you could, if you really wanted to :)
This is how I did it. This way you don't have to write a custom resource function for each one of your endpoints; you just add the endpoint to your resources list. I defined a list of the endpoints I wanted to use like this:
var constants = {
  "serverAddress": "foobar.com/",
  "resources": {
    "Foo": {
      "endpoint": "foo"
    },
    "Bar": {
      "endpoint": "bar"
    }
  }
};
Then I created resources out of each one of them like this:
var service = angular.module('app.services', ['ngResource']);
var resourceObjects = constants.resources;

for (var resourceName in resourceObjects) {
  if (resourceObjects.hasOwnProperty(resourceName)) {
    addResourceFactoryToService(service, resourceName, resourceObjects[resourceName].endpoint);
  }
}

function addResourceFactoryToService(service, resourceName, resourceEndpoint) {
  service.factory(resourceName, function ($resource) {
    return $resource(
      constants.serverAddress + resourceEndpoint + '/:id',
      {
        id: '@id'
      },
      {
        update: {
          method: 'PUT',
          params: { id: '@id' }
        }
      }
    );
  });
}
The nice thing about this is that it takes two seconds to add a new endpoint, and I even threw in a PUT method for you. Then you can inject any of your resources into your controllers like this:
.controller('homeCtrl', function ($scope, Foo, Bar) {
  $scope.foo = Foo.query();
  $scope.bar = Bar.get({ id: 4 });
});
Use Play.query() to find all players
Use Play.get({name:$scope.name}) to find one player
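For those calls to work, the Play resource needs the optional :name parameter from the question; a minimal sketch (the module name 'app' is a placeholder):
angular.module('app').factory('Play', function ($resource) {
  // :name is simply dropped from the URL when it isn't supplied, so
  // Play.query() requests /players and Play.get({name: 'someone'}) requests /players/someone
  return $resource('/players/:name', { name: '@name' });
});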

Backbone.js Model different url for create and update?

Let's say I have a Backbone model and I create an instance of it like this:
var User = Backbone.Model.extend({ ... });
var John = new User({ name : 'John', age : 33 });
I wonder if it is possible to have John.save() target /user/create the first time, John.save() target /user/update on subsequent saves (update/PUT), John.fetch() target /user/get, and John.remove() target /user/remove.
I know that I could define John.url each time before I trigger any method, but I'm wondering if it could happen automatically somehow, without overriding any Backbone method.
I know that I could use one URL like /user/handle and handle the request based on the request method (GET/POST/PUT/DELETE), but I'm just wondering if there is a way to have a different URL per action in Backbone.
Thanks!
The .fetch(), .save() and .destroy() methods on Backbone.Model check whether the model has a .sync() method defined; if it does, it gets called, otherwise Backbone.sync() gets called (see the last lines of the linked source code).
So one of the solutions is to implement .sync() method.
Example:
var User = Backbone.Model.extend({
  // ...
  methodToURL: {
    'read':   '/user/get',
    'create': '/user/create',
    'update': '/user/update',
    'delete': '/user/remove'
  },

  sync: function (method, model, options) {
    options = options || {};
    options.url = model.methodToURL[method.toLowerCase()];
    return Backbone.sync.apply(this, arguments);
  }
});
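A quick sketch of what that mapping gives you at the call site (assuming the server assigns an id on create, so later calls count as updates):
var john = new User({ name: 'John', age: 33 });
john.save();            // isNew()  -> 'create' -> POST   /user/create
john.save({ age: 34 }); // has id   -> 'update' -> PUT    /user/update
john.fetch();           //             'read'   -> GET    /user/get
john.destroy();         //             'delete' -> DELETE /user/remove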
To abstract dzejkej's solution one level further, you might wrap the Backbone.sync function to query the model for method-specific URLs.
function setDefaultUrlOptionByMethod(syncFunc) {
  return function sync(method, model, options) {
    options = options || {};
    if (!options.url) {
      // let Backbone.sync handle the model.url fallback value
      options.url = _.result(model, method + 'Url');
    }
    return syncFunc.call(this, method, model, options);
  };
}
Then you could define the model with:
var User = Backbone.Model.extend({
  sync: setDefaultUrlOptionByMethod(Backbone.sync),
  readUrl: '/user/get',
  createUrl: '/user/create',
  updateUrl: '/user/update',
  deleteUrl: '/user/delete'
});
Are you dealing with a REST implementation that isn't to spec or that needs some kind of workaround?
If so, consider using the emulateHTTP option instead:
http://documentcloud.github.com/backbone/#Sync
Otherwise, you can override the default Backbone.sync method if you really want to get fancy with this, but I don't suggest it. It's best to just use a true RESTful interface.
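If the only problem is a backend that cannot handle PUT/DELETE verbs, that option is a one-liner; a minimal sketch:
// Backbone will then send PUT/DELETE requests as POST,
// with an X-HTTP-Method-Override header carrying the real verb
Backbone.emulateHTTP = true;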
No, you can't do this by default with Backbone. What you could do is add code to the model that changes the model's url on every event the model triggers. But then you still have the problem that Backbone will use POST the first time the model is saved and PUT for every call afterwards, so you would need to override the save() method or Backbone.sync as well.
All in all it doesn't seem like a good idea, because it breaks the REST pattern Backbone is built on.
I was inspired by this solution, where you just create your own AJAX call for the methods that are not for fetching the model. Here is a trimmed-down version of it:
var Backbone = require("backbone");
var $ = require("jquery");
var _ = require("underscore");

function _request(url, method, data, callback) {
  $.ajax({
    url: url,
    contentType: "application/json",
    dataType: "json",
    type: method,
    data: JSON.stringify(data),
    success: function (response) {
      if (!response.error) {
        if (callback && _.isFunction(callback.success)) {
          callback.success(response);
        }
      } else {
        if (callback && _.isFunction(callback.error)) {
          callback.error(response);
        }
      }
    },
    error: function (mod, response) {
      if (callback && _.isFunction(callback.error)) {
        callback.error(response);
      }
    }
  });
}

var User = Backbone.Model.extend({
  initialize: function () {
    _.bindAll(this, "login", "logout", "signup");
  },
  login: function (data, callback) {
    _request("api/auth/login", "POST", data, callback);
  },
  logout: function (callback) {
    // isLoggedIn() is assumed to be implemented elsewhere in the full version
    if (this.isLoggedIn()) {
      _request("api/auth/logout", "GET", null, callback);
    }
  },
  signup: function (data, callback) {
    // the signup URL was left undefined in the trimmed-down version; adjust to your endpoint
    _request("api/auth/signup", "POST", data, callback);
  },
  url: "api/auth/user"
});

module.exports = User;
And then you can use it like this:
var user = new User();

// user signup
user.signup(data, {
  success: function (response) {
    // signup success
  }
});

// user login
user.login(data, {
  success: function (response) {
    // login success
  }
});

// user logout
user.logout({
  success: function (response) {
    // logout success
  }
});

// fetch user details
user.fetch({
  success: function () {
    // logged in, go to home
    window.location.hash = "";
  },
  error: function () {
    // logged out, go to signin
    window.location.hash = "signin";
  }
});