How to implement REST services with Sails.js?

I am quite new to Node. I came across Sails.js, which I understand is built on WebSockets; that seems to be really good for building real-time applications. I would like to know whether Sails can be used to implement a REST architecture, given that it uses WebSockets, and if so, how?

Yes, it can. Sails.js lets you build a RESTful API with essentially no effort to get started. WebSockets (through socket.io) are also integrated by default into the views and API.
Creating a fully RESTful app from the ground up actually requires no JavaScript. Try:
sails new testapp
cd testapp
sails generate model user
sails generate controller user
cd <main root>
sails lift
The CRUD (Create, Read, Update, Delete) actions are already created for you. No code!
You can create a user in your browser by doing the following:
HTTP POST (using a tool like Postman) to http://localhost:1337/user/create with the body:
{
    "firstName": "Bob",
    "lastName": "Jones"
}
Next, do a GET to see the new user:
HTTP GET http://localhost:1337/user/
FYI: Sails.js uses a disk-based database by default to get you going.
Done.
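For reference, here is a minimal sketch of what api/models/User.js could look like with the two attributes used in the POST above. The generator creates the file with an empty attributes object; depending on your Sails version the blueprint may accept the fields even before you declare them, but declaring them is tidier:
// api/models/User.js (sketch; these attribute definitions are an assumption)
module.exports = {
  attributes: {
    firstName: { type: 'string' },
    lastName: { type: 'string' }
  }
};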

sails new testapp
cd testapp
sails generate api apiName
Controller (api/controllers/BookController.js):
  create: function (req, res) {
    var payload = {
      name: req.body.name,
      price: req.body.price,
      category: req.body.category,
      author: req.body.author,
      description: req.body.description
    };
    Book.create(payload).exec(function (err) {
      if (err) {
        return res.status(500).json({ 'error': 'something is not right' });
      }
      return res.status(200).json({ 'success': true, 'result': payload, 'message': 'Book created successfully' });
    });
  },

  readone: async function (req, res) {
    var id = req.params.id;
    var records = await Book.find({ id: id });
    if (records.length === 0) {
      return res.status(404).json({ 'error': 'No record found for this ID' });
    }
    return res.status(200).json({ 'success': true, 'result': records, 'message': 'Record found' });
  },
Model (api/models/Book.js):
  attributes: {
    id: { type: 'number', autoIncrement: true },
    name: { type: 'string', required: true },
    price: { type: 'number', required: true },
    category: { type: 'string', required: true },
    author: { type: 'string' },
    description: { type: 'string' }
  },
Routes (config/routes.js):
  'post /newbook': 'BookController.create',
  'get /book/:id': 'BookController.readone',
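For context, a minimal sketch of how those two entries sit in the complete config/routes.js module (the controller and action names match the snippets above):
// config/routes.js
module.exports.routes = {
  'post /newbook': 'BookController.create',
  'get /book/:id': 'BookController.readone'
};
With the app lifted, you can exercise them the same way as in the first answer, e.g. POST http://localhost:1337/newbook with the book fields in the body, then GET http://localhost:1337/book/1 to read it back (assuming the first record gets id 1).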

Related

How to connect PostgreSQL with GraphQL

GraphQL has mutations, Postgres has INSERT; GraphQL has queries, Postgres has SELECTs; and so on. I haven't found an example showing how you could use both in a project, for example passing all the queries from the front end (React, Relay) through GraphQL, but actually storing the data in Postgres.
Does anyone know what Facebook is using as DB and how it's connected with GraphQL?
Is the only option of storing data in Postgres right now to build custom "adapters" that take the GraphQL query and convert it into SQL?
GraphQL is database agnostic, so you can use whatever you normally use to interact with the database, and use the query or mutation's resolve method to call a function you've defined that will get/add something to the database.
Without Relay
Here is an example of a mutation using the promise-based Knex SQL query builder, first without Relay to get a feel for the concept. I'm going to assume that you have created a userType in your GraphQL schema with three fields, id, username, and created (all required), and that you have a getUser function already defined that queries the database and returns a user object. In the database I also have a password column, but since I don't want that queried I leave it out of my userType.
// db.js
// take a user object and use knex to add it to the database, then return the newly
// created user from the db.
const addUser = (user) => (
  knex('users')
    .returning('id') // returns [id]
    .insert({
      username: user.username,
      password: yourPasswordHashFunction(user.password),
      created: Math.floor(Date.now() / 1000), // Unix time in seconds
    })
    .then((id) => (getUser(id[0])))
    .catch((error) => (
      console.log(error)
    ))
);
// schema.js
// the resolve function receives the query inputs as args, then you can call
// your addUser function using them
const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  description: 'Functions to add things to the database.',
  fields: () => ({
    addUser: {
      type: userType,
      args: {
        username: {
          type: new GraphQLNonNull(GraphQLString),
        },
        password: {
          type: new GraphQLNonNull(GraphQLString),
        },
      },
      resolve: (_, args) => (
        addUser({
          username: args.username,
          password: args.password,
        })
      ),
    },
  }),
});
Since Postgres creates the id for me and I calculate the created timestamp, I don't need them in my mutation query.
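For completeness, here is a hedged sketch of the getUser helper assumed above, also using Knex; it selects everything except the password column, matching the userType. (As noted in the Relay section below, the Relay version would need to return an instance of a User class instead of a plain row.)
// db.js (sketch of the assumed helper)
const getUser = (id) => (
  knex('users')
    .where({ id })
    .first('id', 'username', 'created') // deliberately leaves out the password column
);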
The Relay Way
Using the helpers in graphql-relay and sticking pretty close to the Relay Starter Kit helped me, because it was a lot to take in all at once. Relay requires you to set up your schema in a specific way so that it can work properly, but the idea is the same: use your functions to fetch from or add to the database in the resolve methods.
One important caveat is that the Relay way expects that the object returned from getUser is an instance of a class User, so you'll have to modify getUser to accommodate that.
The final example using Relay (fromGlobalId, globalIdField, mutationWithClientMutationId, and nodeDefinitions are all from graphql-relay):
/**
 * We get the node interface and field from the Relay library.
 *
 * The first method defines the way we resolve an ID to its object.
 * The second defines the way we resolve an object to its GraphQL type.
 *
 * All your types will implement this nodeInterface.
 */
const { nodeInterface, nodeField } = nodeDefinitions(
  (globalId) => {
    const { type, id } = fromGlobalId(globalId);
    if (type === 'User') {
      return getUser(id);
    }
    return null;
  },
  (obj) => {
    if (obj instanceof User) {
      return userType;
    }
    return null;
  }
);
// a globalId is just a base64 encoding of the database id and the type
const userType = new GraphQLObjectType({
  name: 'User',
  description: 'A user.',
  fields: () => ({
    id: globalIdField('User'),
    username: {
      type: new GraphQLNonNull(GraphQLString),
      description: 'The username the user has selected.',
    },
    created: {
      type: GraphQLInt,
      description: 'The Unix timestamp in seconds of when the user was created.',
    },
  }),
  interfaces: [nodeInterface],
});
// The "payload" is the data that will be returned from the mutation
const userMutation = mutationWithClientMutationId({
  name: 'AddUser',
  inputFields: {
    username: {
      type: GraphQLString,
    },
    password: {
      type: new GraphQLNonNull(GraphQLString),
    },
  },
  outputFields: {
    user: {
      type: userType,
      resolve: (payload) => getUser(payload.userId),
    },
  },
  mutateAndGetPayload: ({ username, password }) =>
    addUser(
      { username, password }
    ).then((user) => ({ userId: user.id })), // passed to resolve in outputFields
});

const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  description: 'Functions to add things to the database.',
  fields: () => ({
    addUser: userMutation,
  }),
});
const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    node: nodeField,
    user: {
      type: userType,
      args: {
        id: {
          description: 'ID number of the user.',
          type: new GraphQLNonNull(GraphQLID),
        },
      },
      resolve: (root, args) => getUser(args.id),
    },
  }),
});
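Once the schema is assembled, a quick hedged way to exercise the AddUser mutation from a Node script (before wiring up the Relay client) is to run it through graphql-js directly; queryType and mutationType are the ones defined above, and the field and input names follow from mutationWithClientMutationId:
const { graphql, GraphQLSchema } = require('graphql');

const schema = new GraphQLSchema({ query: queryType, mutation: mutationType });

const mutation = `
  mutation {
    addUser(input: { username: "jane", password: "hunter2", clientMutationId: "1" }) {
      user { id username created }
      clientMutationId
    }
  }
`;

graphql(schema, mutation).then((result) => {
  console.log(JSON.stringify(result, null, 2)); // { data: { addUser: { user: { ... } } } }
});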
We address this problem in Join Monster, a library we recently open-sourced to automatically translate GraphQL queries to SQL based on your schema definitions.
This GraphQL Starter Kit can be used for experimenting with GraphQL.js and PostgreSQL:
https://github.com/kriasoft/graphql-starter-kit - Node.js, GraphQL.js, PostgreSQL, Babel, Flow
(disclaimer: I'm the author)
Have a look at graphql-sequelize for how to work with Postgres.
For mutations (create/update/delete) you can look at the examples in the relay repo for instance.
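A hedged sketch of the graphql-sequelize idea: its resolver() helper turns a Sequelize model into a resolve function, so a list field needs almost no hand-written data access (User is assumed to be a Sequelize model backed by Postgres, and userType a matching GraphQLObjectType):
const { resolver } = require('graphql-sequelize');
const { GraphQLObjectType, GraphQLList } = require('graphql');

const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    users: {
      type: new GraphQLList(userType),
      resolve: resolver(User), // builds and runs the SELECT against Postgres for you
    },
  }),
});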
PostGraphile (https://www.graphile.org/postgraphile/) is open source.
"Rapidly build highly customisable, lightning-fast GraphQL APIs. PostGraphile is an open-source tool to help you rapidly design and serve a high-performance, secure, client-facing GraphQL API backed primarily by your PostgreSQL database. Delight your customers with incredible performance whilst maintaining full control over your data and your database. Use our powerful plugin system to customise every facet of your GraphQL API to your liking."
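If you want to try it, a minimal hedged example mounts PostGraphile as Express middleware; the connection string and schema name are placeholders for your own database:
const express = require('express');
const { postgraphile } = require('postgraphile');

const app = express();
app.use(postgraphile('postgres://user:pass@localhost:5432/mydb', 'public', {
  graphiql: true, // also serves the GraphiQL IDE
}));
app.listen(5000);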
You can use an ORM like Sequelize if you're using JavaScript, or TypeORM if you're using TypeScript.
Facebook is probably using MongoDB or another NoSQL database on the backend. I recently read a blog entry that explains how to connect to MongoDB. Basically, you need to build a graph model to match the data you already have in your DB, then write resolve/reject functions to tell GraphQL how to behave when a query request is posted.
See https://www.compose.io/articles/using-graphql-with-mongodb/
Have a look at Sequelize (SequelizeJS), a promise-based ORM that works with a number of dialects: PostgreSQL, MySQL, SQLite and MSSQL.
The code below is pulled right from its example:
const Sequelize = require('sequelize');
const sequelize = new Sequelize('database', 'username', 'password', {
  host: 'localhost',
  dialect: 'mysql'|'sqlite'|'postgres'|'mssql',
  pool: {
    max: 5,
    min: 0,
    acquire: 30000,
    idle: 10000
  },
  // SQLite only
  storage: 'path/to/database.sqlite',
  // http://docs.sequelizejs.com/manual/tutorial/querying.html#operators
  operatorsAliases: false
});

const User = sequelize.define('user', {
  username: Sequelize.STRING,
  birthday: Sequelize.DATE
});

sequelize.sync()
  .then(() => User.create({
    username: 'janedoe',
    birthday: new Date(1980, 6, 20)
  }))
  .then(jane => {
    console.log(jane.toJSON());
  });
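To tie this back to the question, a hedged sketch of how the GraphQL layer would sit on top: the resolvers simply call the Sequelize model's methods, so GraphQL never needs to know it is talking to Postgres (userType is assumed to be a GraphQLObjectType mirroring the user model above):
const {
  GraphQLObjectType,
  GraphQLList,
  GraphQLNonNull,
  GraphQLString,
} = require('graphql');

const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    users: {
      type: new GraphQLList(userType),
      resolve: () => User.findAll(), // SELECT * FROM users
    },
  }),
});

const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  fields: () => ({
    addUser: {
      type: userType,
      args: {
        username: { type: new GraphQLNonNull(GraphQLString) },
      },
      resolve: (_, args) => User.create({ username: args.username }), // INSERT INTO users
    },
  }),
});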

Sails.JS association validation

I'm using several one to many associations in Sails.JS that look like the following:
User:
  email: {
    type: 'string',
    required: true,
    unique: true
  },
  projects: {
    collection: 'project',
    via: 'user'
  }

Project:
  name: {
    type: 'string',
    required: true,
    minLength: 3,
    maxLength: 50
  },
  user: {
    model: 'user',
    required: true
  },
  sites: {
    collection: 'site',
    via: 'project'
  }

Site:
  project: {
    model: 'project',
    required: true
  },
  name: {
    type: 'string',
    required: true
  }
Now when I fire off a POST request to /project it creates the project fine, and specifying the param 'user' (taken from the session) associates the project with that particular user.
The same goes for when I create a new site. However, I appear to be able to specify any number for the param 'project', even if that particular project ID doesn't exist. Really it should fail the validation if the project doesn't exist and not create the site. I thought it'd look up the association with project and check that the project ID specified is valid?
Also, I only want to be able to create a site that is associated with a project that belongs to the current user. How would I go about doing this?
Thanks in advance.
I'm not sure if it's a bug or intended behavior with your non-existent project ID association, but one work-around is to have a beforeCreate hook in your models to verify that the project ID exists:
// In your Site model
beforeCreate: function(values, next) {
  ...
  var projectID = values['project'];
  Project.findOne(projectID, function (err, project) {
    if (err || !project) return next("some error message");
    return next();
  });
}
You can also do a check in the beforeCreate hook for your second question:
// In your Site model
beforeCreate: function(values, next) {
  ...
  var projectID = values['project'];
  Project.findOne(projectID).populate('user').exec(function (err, project) {
    if (err || !project) return next("some error message");
    if (project.user.id != values['userID']) return next("some other error message");
    return next();
  });
}
Note that you'll have to pass 'userID' as a param into the params for creating a Site instance.
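One hedged way to supply that userID param is a small custom create action in SiteController that merges in the logged-in user's id before calling Site.create; the session shape (req.session.user.id) is an assumption based on the question's mention of taking the user from the session:
// api/controllers/SiteController.js (sketch)
module.exports = {
  create: function (req, res) {
    Site.create({
      project: req.param('project'),
      name: req.param('name'),
      userID: req.session.user.id // consumed by the beforeCreate check above
    }).exec(function (err, site) {
      if (err) return res.serverError(err);
      return res.json(site);
    });
  }
};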

How to access complex REST resources with ExtJS 5

I am using ExtJS 5 and I want to access complex REST resources as discussed in this similar thread using ExtJS 4.
The REST service that I am accessing exposes these resources:
GET /rest/clients - it returns a list of clients
GET /rest/users - it returns a list of all users
GET /rest/clients/{clientId}/users - it returns a list of users from the specified client.
I have these models:
Ext.define('App.model.Base', {
    extend: 'Ext.data.Model',
    schema: {
        namespace: 'App.model'
    }
});

Ext.define('App.model.Client', {
    extend: 'App.model.Base',
    fields: [{
        name: 'name',
        type: 'string'
    }],
    proxy: {
        url: 'rest/clients',
        type: 'rest'
    }
});

Ext.define('App.model.User', {
    extend: 'App.model.Base',
    fields: [{
        name: 'name',
        type: 'string'
    }, {
        name: 'clientId',
        reference: 'Client'
    }],
    proxy: {
        url: 'rest/users',
        type: 'rest'
    }
});
I did this:
var client = App.model.Client.load(2);
var users = client.users().load();
And it sent, respectively:
//GET rest/clients/2
//GET rest/users?filter:[{"property":"personId","value":"Person-1","exactMatch":true}]
Questions:
Is there any way that I can send my request to "GET rest/clients/2/users" without updating the user proxy url manually with its clientId?
How can I send the above request without losing the original url defined in App.model.User ("rest/users")?
I think this is essentially the same as this question:
Accessing complex REST resources with Ext JS
I don't think much has changed since it was first asked.
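That said, one common workaround (a hedged sketch, not from the linked answer) is to override buildUrl on the User proxy so the request is rewritten to the nested rest/clients/{clientId}/users resource when a clientId is supplied with the load, and otherwise falls back to the flat rest/users url. buildUrl is the standard server-proxy hook in Ext JS 5, while reading clientId out of the request params is an assumption about how you choose to pass it in:
Ext.define('App.proxy.NestedUsers', {
    extend: 'Ext.data.proxy.Rest',
    alias: 'proxy.nestedusers',

    buildUrl: function (request) {
        var params = request.getParams() || {},
            clientId = params.clientId;

        if (clientId) {
            delete params.clientId; // keep it out of the query string
            request.setUrl('rest/clients/' + clientId + '/users');
        }
        return this.callParent([request]);
    }
});

// Usage sketch: point App.model.User's proxy type at 'nestedusers' and load with
// store.load({ params: { clientId: 2 } });  // -> GET rest/clients/2/users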

OData Jaydata - odata update request returns error 404 (SAPUI5, node)

I'm building a web application with SAPUI5 which exposes a list of services that are stored in MongoDB and made available as OData.
I followed this guide, jaydata-install-your-own-odata-server-with-nodejs-and-mongodb, and this is my model.js:
$data.Class.define("marketplace.Service", $data.Entity, null, {
    Id: { type: "id", key: true, computed: true, nullable: false },
    Name: { type: "string", nullable: false, maxLength: 50 },
}, null);

$data.Class.defineEx("marketplace.Context", [$data.EntityContext, $data.ServiceBase], null, {
    Services: { type: $data.EntitySet, elementType: marketplace.Service }
});

exports = marketplace.Context;
and server.js:
var c = require('express');
require('jaydata');
window.DOMParser = require('xmldom').DOMParser;
require('q');
require('./model.js');

var app = c();
app.use(c.query());
app.use(c.bodyParser());
app.use(c.cookieParser());
app.use(c.methodOverride());
app.configure(function () { app.use(app.router); });
app.use(c.session({ secret: 'session key' }));

app.use("/marketplace", $data.JayService.OData.Utils.simpleBodyReader());
app.use("/marketplace", $data.JayService.createAdapter(marketplace.Context, function (req, res) {
    return new marketplace.Context({
        name: "mongoDB",
        databaseName: "marketplace",
        address: "localhost",
        port: 27017
    });
}));

app.use("/", c.static(__dirname));
app.use(c.errorHandler());
app.listen(8080);
The client is developed using SAPUI5, and these are the parts of the code related to the OData model creation:
var oModel = new sap.ui.model.odata.ODataModel("http://localhost:8080/marketplace", false); // connection to the OData endpoint
oModel.setDefaultBindingMode(sap.ui.model.BindingMode.TwoWay);
sap.ui.getCore().setModel(oModel);
The various services are correctly showed in a SAPUI5 table and I'm easily able to insert a new service by using the POST OData.request in this way:
OData.request({
        requestUri: "http://localhost:8080/marketplace/Services",
        method: "POST",
        data: newEntry // JSON object with the new entry
    },
    function (insertedItem) {
        // success notifier
    },
    function (err) {
        // error notifier
    }
);
and delete a service by using the SAPUI5 function oModel.remove() in this way (oParams is a json object which contains the alert notification functions):
var serviceId = oTable.getRows()[selectedIndex].getCells()[0].getText();
oModel.remove("/Services('" + serviceId + "')", oParams);
Everything works fine except the update request for a single service. I've tried the functions provided by SAPUI5 (oModel.update and oModel.submitChanges), using OData.request with method "PUT", and creating an Ajax PUT request; I also tried to craft a PUT request with Fiddler.
I always get error 404:
Request URL:http://localhost:8080/marketplace/Services('NTMzZDM3M2JlNjY2YjY3ODIwZjlmOTQ0')
Request Method:PUT
Status Code:404 Not Found
Where could the problem be?
I tried with Chrome, IE, and Firefox; same problem...
Thanks
Try updating with the MERGE verb, passing the modified fields as JSON in the body.
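For example, a hedged sketch using the same datajs OData.request call as the POST above, but with the MERGE verb against the single-entity URL and only the changed fields in the body (the key comes from the question; the field value is just an illustration):
OData.request({
        requestUri: "http://localhost:8080/marketplace/Services('NTMzZDM3M2JlNjY2YjY3ODIwZjlmOTQ0')",
        method: "MERGE",
        data: { Name: "Updated service name" } // only the modified fields
    },
    function (updatedItem) {
        // success notifier
    },
    function (err) {
        // error notifier
    }
);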

sencha touch rest proxy url

Why does my code work when I use a local JSON file (foo.json), but not when I change the URL to something.com/foo.json?
This is working in my project:
var store = Ext.create('Ext.data.Store', {
    model: 'Client',
    autoLoad: true,
    autoSync: true,
    proxy: {
        type: 'rest',
        url: 'clients.json',
        reader: {
            type: 'json'
        }
    }
});
What I want is to replace the static file with a URL:
var store = Ext.create('Ext.data.Store', {
    model: 'Client',
    autoLoad: true,
    autoSync: true,
    proxy: {
        type: 'rest',
        url: 'http://rails-api.herokuapp.com/clients.json',
        reader: {
            type: 'json'
        }
    }
});
The clients.json file is a copy/paste from http://rails-api.herokuapp.com/clients.json; it is the same data.
Where do you run your application? Are you able to track the HTTP requests? Do you get any output on your JavaScript console?
If I had to guess, I'd say your issue might be related to CORS: http://en.wikipedia.org/wiki/Cross-origin_resource_sharing.
Edit:
Note that you only need to have a look at CORS or use jsonp if you are running your app and the "backend"/api on two different webservers.
Most people will probably...
a) ...run the app on the same webserver as the backend, or...
b) ...use native packaging (Cordova, PhoneGap or Sencha Cmd's own packaging).
In both cases you can simply use the "normal" ajax or rest proxies.
On the server side I had to add the callback (see this question: sencha-seems-to-not-like-rails-json). I also had to change the proxy type from rest to jsonp and removed some useless code; in the end my code looks like this:
var store = Ext.create('Ext.data.Store', {
    model: 'Client',
    autoLoad: true,
    autoSync: true,
    proxy: {
        type: 'jsonp',
        url: 'http://rails-api.herokuapp.com/clients.json'
    }
});
On the server side:
def index
  @clients = Client.all

  respond_to do |format|
    format.html
    format.json { render :json => @clients, :callback => params[:callback] }
  end
end
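For reference, what the JsonP proxy expects back is the JSON payload wrapped in a call to the callback name it sent in the ?callback= query parameter; the function name below is just an illustration of the kind of name Ext generates:
// Response body for GET /clients.json?callback=Ext.data.JsonP.callback1 (sketch)
Ext.data.JsonP.callback1([
    { "id": 1, "name": "Some client" }
]);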