Using Grunt to Mock Endpoints

I'm using Yeoman, Grunt, and Bower to construct a platform for building a frontend independently of a backend. The idea is that all of my (AngularJS) controllers, services, factories, etc. live in this project and are injected afterwards into my server-side codebase based on the result of grunt build.
My question is:
How can I mock endpoints so that the Grunt server responds to the same endpoints as my (Rails) app will?
At the moment I am using:
angular.module('myApp', ['ngResource'])
  .run(['$rootScope', function ($rootScope) {
    $rootScope.testState = 'test';
  }]);
And then in each of my individual services:
var mockJSON = {'foo': 'myMockJSON'};
And on every method:
if ($rootScope.testState === 'test') {
  return mockJSON;
} else {
  // real service logic with $q/$http goes here
}
Then after grunt build, testState = 'test' gets removed.
This is clearly a relatively janky architecture. How can I avoid it? How can I have Grunt respond to the same endpoints as my app (some of which have dynamic params), apply some logic (if necessary), and serve out a JSON file (possibly dependent on path params)?

I've fixed this issue by using express to write a server that responds with static JSON.
First I created a directory in my project called 'api'. Within that directory I have the following files:
package.json:
{
  "name": "mockAPI",
  "version": "0.0.0",
  "dependencies": {
    "express": "~3.3.4"
  }
}
Then I run npm install in this directory.
index.js:
module.exports = require('./lib/server');
lib/server.js:
var express = require('express');
var app = express();

app.get('/my/endpoint', function (req, res) {
  res.json({'foo': 'myMockJSON'});
});

module.exports = app;
And finally, in my global Gruntfile.js:
connect: {
  options: {
    port: 9000,
    hostname: 'localhost'
  },
  livereload: {
    options: {
      // lrSnippet and mountFolder are the helpers defined at the top
      // of the Yeoman-generated Gruntfile
      middleware: function (connect, options) {
        return [
          lrSnippet,
          mountFolder(connect, '.tmp'),
          mountFolder(connect, yeomanConfig.app),
          require('./api')
        ];
      }
    }
  }
},
Then the services make the requests, and the express server serves the correct JSON.
After grunt build, the express server is simply replaced by a rails server.
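The question also mentions endpoints with dynamic params; express route parameters handle those directly. A minimal sketch for lib/server.js (the /users/:id route and its payload are made up for illustration):
var express = require('express');
var app = express();

// ':id' is a dynamic path segment; express exposes it on req.params
app.get('/users/:id', function (req, res) {
  // apply any logic you need, then serve JSON derived from the param
  res.json({ id: req.params.id, name: 'mock user ' + req.params.id });
});

module.exports = app;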

As of grunt-contrib-connect v0.7.0 you can also just add your custom middleware to the default middleware stack, without having to rebuild it manually.
livereload: {
  options: {
    open: true,
    base: [
      '.tmp',
      '<%= config.app %>'
    ],
    middleware: function (connect, options, middlewares) {
      // inject a custom middleware into the array of default middlewares
      middlewares.push(function (req, res, next) {
        if (req.url !== '/my/endpoint') {
          return next();
        }
        res.writeHead(200, { 'Content-Type': 'application/json' });
        // note: JSON requires double quotes, so build the body with JSON.stringify
        res.end(JSON.stringify({ foo: 'myMockJSON' }));
      });
      return middlewares;
    }
  }
},
See https://github.com/gruntjs/grunt-contrib-connect#middleware for the official documentation.

Alternatively, you can use grunt-connect-proxy to proxy everything that is missing from your test server to an actual backend.
It's quite easy to set up; the one thing to remember when adding the proxy to your livereload connect middleware is to add it last, like this:
middleware: function (connect) {
  return [
    lrSnippet,
    mountFolder(connect, '.tmp'),
    mountFolder(connect, yeomanConfig.app),
    proxySnippet
  ];
}
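For reference, a rough sketch of the rest of the wiring (the /api context and backend port are placeholders; double-check the option names against the grunt-connect-proxy README):
// near the top of Gruntfile.js
var proxySnippet = require('grunt-connect-proxy/lib/utils').proxyRequest;

// inside the connect config: requests under /api are forwarded
// to the real backend instead of being served statically
connect: {
  proxies: [{
    context: '/api',
    host: 'localhost',
    port: 3000
  }]
}

// the plugin's configureProxies task must run before the server starts
grunt.registerTask('server', ['configureProxies', 'connect:livereload']);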

grunt-connect-prism is similar to the Ruby project VCR. It provides an easy way for front end developers to record HTTP responses returned by their API (or some other remote source) and replay them later. It's basically an HTTP cache, but for developers working on a Single Page Application (SPA). You can also generate stubs for API calls that don't exist, and populate them the way you want.
It's useful for mocking complex & high latency API calls during development. It's also useful when writing e2e tests for your SPA only, removing the server from the equation. This results in much faster execution of your e2e test suite.
Prism works by adding a custom connect middleware to the connect server provided by the grunt-contrib-connect plugin. While in 'record' mode it will generate a file per response on the filesystem with content like the following:
{
  "requestUrl": "/api/ponies",
  "contentType": "application/json",
  "statusCode": 200,
  "data": {
    "text": "my little ponies"
  }
}
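To illustrate how this plugs into the Gruntfile, here is a hypothetical configuration sketch (the option names below are illustrative; the plugin's README is the authority on the real ones):
prism: {
  server: {
    options: {
      mode: 'record',       // 'record', 'mock' or 'proxy'
      context: '/api',      // which requests prism should intercept
      host: 'localhost',
      port: 8080,           // the real API to record from
      mocksPath: './mocks'  // where recorded responses are stored
    }
  }
}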
DISCLAIMER: I'm the author of this project.

You can use an Apache proxy to connect your REST server with the grunt server.
Apache would do this:
proxy / -> gruntjs
proxy /service -> REST server
Your application would hit Apache, and the AngularJS application would think it is talking to itself, so there is no cross-domain problem.
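A minimal Apache virtual host sketch of that routing (the ports are assumptions; mod_proxy and mod_proxy_http must be enabled):
<VirtualHost *:80>
    # the more specific prefix must come first: /service goes to the REST server
    ProxyPass        /service http://localhost:8080/service
    ProxyPassReverse /service http://localhost:8080/service

    # everything else goes to the grunt connect server
    ProxyPass        /        http://localhost:9000/
    ProxyPassReverse /        http://localhost:9000/
</VirtualHost>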
Here is a great tutorial on how to set this up:
http://alfrescoblog.com/2014/06/14/angular-js-activiti-webapp-with-activiti-rest/

Here is my alternative approach, based on Abraham P's answer. It does not require installing express within the 'api' folder, and it lets me separate the mock services into one file per resource. For example, my 'api' folder contains 3 files:
api/
index.js    // assigns all the "modules" so we can simply require the folder
user.js     // all mocking for user
product.js  // all mocking for product
user.js:
var user = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/user') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'role': 'admin'
      })
    );
  } else {
    next();
  }
};
module.exports = user;
product.js:
var product = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/product') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'name': 'test',
        'category': 'test'
      })
    );
  } else {
    next();
  }
};
module.exports = product;
index.js just collects the modules so we can simply require them in one place:
module.exports = {
  product: require('./product.js'),
  user: require('./user.js')
};
My Gruntfile.js:
connect: {
  options: {
    port: 9000,
    // Change this to '0.0.0.0' to access the server from outside.
    hostname: 'localhost',
    livereload: 35729
  },
  livereload: {
    options: {
      open: true,
      middleware: function (connect) {
        return [
          connect.static('.tmp'),
          connect().use(
            '/bower_components',
            connect.static('./bower_components')
          ),
          connect.static(appConfig.app),
          require('./api').user,
          require('./api').product
        ];
      }
    }
  }
}

Related

Can Bunjs be used as a backend server?

Now we can start a React app with Bun as the server.
Can we use Bun as a complete backend server?
For example, can Bun run this code?
const express = require('express')
const app = express()

app.get('/', (req, res) => {
  res.send('hello world')
})

app.listen(3000)
I guess Bun does not YET implement all of the Node.js APIs. I tried http and it seems to be missing at the moment. As far as I understand, Bun currently has its own built-in HTTP server.
Check the "Getting started" section at https://bun.sh/
A sample server:
export default {
  port: 3000,
  fetch(request) {
    return new Response("Welcome to Bun!");
  },
};
(This example reminds me of serverless functions.)
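Routing in this model is manual: you branch on the request URL inside fetch. A small sketch using only standard web APIs (the routes are made up):
export default {
  port: 3000,
  fetch(request) {
    const { pathname } = new URL(request.url);
    if (pathname === '/') {
      return new Response('hello world');
    }
    if (pathname === '/api/ping') {
      return Response.json({ message: 'pong' });
    }
    return new Response('Not Found', { status: 404 });
  },
};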
That being the case, it seems you cannot rely on Node.js' http module, or most probably on any server framework built on it, like express.
At least for now, Bun's roadmap (https://github.com/oven-sh/bun/issues/159) contains one line about this, though I am not sure whether it refers to Node's http server or to something else about Bun's own server:
"Once complete, the next step is integration with the HTTP server and other Bun APIs"
Bun's API is really different from Node.js's, so I created a library called bunrest, an express-like API, so new users don't need to learn much about Bun.
Here is how to use it.
Install the package from npm:
npm i bunrest
To create a server:
const App = require('bunrest');
const server = new App.BunServer();
After that, you can use it like you would use express:
server.get('/test', (req, res) => {
  res.status(200).json({ message: 'succeed' });
});

server.put('/test', (req, res) => {
  res.status(200).json({ message: 'succeed' });
});

server.post('/test', (req, res) => {
  res.status(200).json({ message: 'succeed' });
});
To start the server:
server.listen(3000, () => {
  console.log('App is listening on port 3000');
});
Another way is to use Hono: https://honojs.dev/
There is a working demo: https://github.com/cachac/bun-api
Import the package and create a new instance:
import { Hono } from 'hono'
const app = new Hono()
Create routes:
Instead of Node.js' res.send(), use c.json({}):
app.get('/hello', c => c.json({ message: 'Hello World' }))
Export and run the API server:
export default {
  fetch: app.fetch,
  port: 3000
}

Parse Server - Image files' path returns localhost

I have deployed 2 Ubuntu servers on Azure. On the first I installed Parse Server, and on the second MongoDB. (I also restored a ready-made db there from my previous server via mongorestore.)
Everything works fine! Both the Parse Server and the MongoDB server. They also communicate well. The thing is, when I run my iOS app, it fetches all data correctly, except images. I printed the URL of an image, and here's what it returned: http://localhost:1337/parse/files/filename.jpeg
If I replace localhost with my server's IP, the image is fetched nicely!
Here's what I have in my index.js:
var express = require('express');
var ParseServer = require('parse-server').ParseServer;
var ParseDashboard = require('parse-dashboard');
var allowInsecureHTTP = true;
var path = require('path');

var databaseUri = process.env.DATABASE_URI || process.env.MONGODB_URI;
if (!databaseUri) {
  console.log('DATABASE_URI not specified, falling back to localhost.');
}

var api = new ParseServer({
  databaseURI: databaseUri || 'mongodb://IP:27017/db',
  cloud: './cloud/main.js',
  appId: process.env.APP_ID || 'xxx',
  masterKey: process.env.MASTER_KEY || 'xxx', // Add your master key here. Keep it secret!
  fileKey: 'xxx',
  serverURL: process.env.SERVER_URL || 'http://localhost:1337/parse', // Don't forget to change to https if needed
  // Enable email verification
  verifyUserEmails: false,
  // The public URL of your app.
  // This will appear in the link that is used to verify email addresses and reset passwords.
  // Set the mount path as it is in serverURL
  publicServerURL: 'http://localhost:1337/parse',
});
// Client-keys like the javascript key or the .NET key are not necessary with parse-server
// If you wish to require them, you can set them as options in the initialization above:
// javascriptKey, restAPIKey, dotNetKey, clientKey

var app = express();

// Serve static assets from the /public folder
app.use('/public', express.static(path.join(__dirname, '/public')));

// Serve the Parse API on the /parse URL prefix
var mountPath = process.env.PARSE_MOUNT || '/parse';
app.use(mountPath, api);

// Parse Server plays nicely with the rest of your web routes
app.get('/', function(req, res) {
  res.status(200).send('Make sure to star the parse-server repo on GitHub!');
});

// There will be a test page available on the /test path of your server url
// Remove this before launching your app
app.get('/test', function(req, res) {
  res.sendFile(path.join(__dirname, '/public/test.html'));
});

var port = process.env.PORT || 1337;
var httpServer = require('http').createServer(app);
httpServer.listen(port, function() {
  console.log('parse-server-example running on port ' + port + '.');
});

// Set up parse dashboard
var config = {
  "allowInsecureHTTP": true,
  "apps": [
    {
      "serverURL": "http://localhost:1337/parse",
      "appId": "xxx",
      "masterKey": "xxx",
      "appName": "name",
      "production": true
    }
  ],
  "users": [
    {
      "user": "username",
      "pass": "pass"
    }
  ]
};

var dashboard = new ParseDashboard(config, config.allowInsecureHTTP);
var dashApp = express();

// make the Parse Dashboard available at /dashboard
dashApp.use('/dashboard', dashboard);

// Parse Server plays nicely with the rest of your web routes
dashApp.get('/', function(req, res) {
  res.status(200).send('Parse Dashboard App');
});

var httpServerDash = require('http').createServer(dashApp);
httpServerDash.listen(4040, function() {
  console.log('dashboard-server running on port 4040.');
});
One thing I noticed in Parse's documentation is this: "When using files on Parse, you will need to use the publicServerURL option in your Parse Server config. This is the URL that files will be accessed from, so it should be a URL that resolves to your Parse Server. Make sure to include your mount point in this URL."
The thing is, that documentation seems to assume MongoDB is on the same server as Parse, which in my case it isn't.
Any ideas on what to do?
I had to change the publicServerURL in the Parse Server config from http://localhost:1337/parse to http://publicIP:1337/parse, and everything worked out great!
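In terms of the index.js above, the change amounts to something like this (the public IP is left as a placeholder):
var api = new ParseServer({
  // ... other options unchanged ...
  serverURL: process.env.SERVER_URL || 'http://publicIP:1337/parse',
  // files are served from this URL, so it must resolve to the Parse Server
  publicServerURL: 'http://publicIP:1337/parse',
});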
If you want to work with files (images) and download them, just use publicServerURL as mentioned by Sotiris Kaniras.
I would add that the config.json is in ~/stack/parse/config.json. Also, here is the difference between serverURL and publicServerURL:
Difference between serverURL and publicServerURL on ParseServer
In my case, I needed to add the publicServerURL parameter alongside serverURL, because it didn't exist yet.
So the two parameters (publicServerURL & serverURL) are complementary, not mutually exclusive; use them both.

Permissive CORS check in meanjs.org application during development?

I am using the meanjs.org stack to develop a simple application. Now I am working on the mobile client with the Ionic framework.
The current issue I have is that the CORS check causes an error when testing the Ionic app, as described here: http://blog.ionic.io/handling-cors-issues-in-ionic/
While I could set up a proxy as described in that link, it also works if I add the response header on the server side, like: res.header("Access-Control-Allow-Origin", "*");
Adding this to each endpoint doesn't feel good, and I would like to find a development-only solution that disables the CORS check.
I've also tried changing the lusca configuration in config/env/default.js, setting xssProtection: false, but it didn't work.
How can I, during development, enable the response header res.header("Access-Control-Allow-Origin", "*"); on all endpoints? Is that possible, or is setting up the proxy the only viable solution when using meanjs.org + Ionic during development?
Thank you
For development you can set up a proxy to avoid CORS issues.
Set up your ionic.project file to be something like:
{
  "name": "proxy-example",
  "app_id": "",
  "proxies": [{
    "path": "/api",
    "proxyUrl": "http://example.com/api"
  }]
}
Then call the API from your controller like below:
.controller('MyController', function ($http) {
  // 'data' stands for whatever request payload you want to send
  $http.post('/api/getMoreDetails', data)
    .then(function (response) {
      // do something
    });
})
Hope this is helpful for you
Try using a middleware like this in your server's main file:
app.use(function (req, res, next) {
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
  next();
});
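Since the question asks for development-only behavior, the same middleware can be gated on the standard NODE_ENV convention (a small sketch; it assumes your production environment actually sets NODE_ENV=production, which MEAN.js normally relies on to pick its config/env file):
// enable the permissive CORS headers only outside production
if (process.env.NODE_ENV !== 'production') {
  app.use(function (req, res, next) {
    res.header("Access-Control-Allow-Origin", "*");
    res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
    next();
  });
}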
After some time I finally got a hint on the meanjs.org forums and built a solution.
Create the following method in the config/lib/express.js file:
/**
 * Disable CORS check, to be used during development with access from mobile clients (currently tested with ionic)
 */
module.exports.disableCORSForDev = function (app) {
  // note: this relies on the 'cors' npm package being required in this file,
  // e.g. var cors = require('cors');
  app.all('/*', function (req, res, next) {
    res.header('Access-Control-Allow-Origin', '*');
    res.header('Access-Control-Allow-Credentials', true);
    res.header('Access-Control-Allow-Methods', 'POST');
    res.header('Access-Control-Allow-Headers', 'Origin, X-Requested-With, Content-Type, Accept');
    next();
  }).options('*', function (req, res) {
    res.end();
  });

  var corsOptionsDelegate = function (req, callback) {
    var corsOptions = { credentials: true };
    callback(null, corsOptions); // callback expects two parameters: error and options
  };

  app.use(cors(corsOptionsDelegate));
};
Then add the following to the init method in the same file, just after the declaration of var app = express():
var app = express();
this.disableCORSForDev(app);
Note: this is a server-side solution to bypass CORS during development and should not be used in production; many may prefer to set up a proxy on the Ionic client instead, as described in the Ionic article linked above.

How to sync data from a ydn-db web app to a backend server?

With ydn-db, I want to automatically synchronize data between my web app and my REST backend.
I read the documentation and searched the examples, but I cannot make it work.
https://yathit.github.io/ydn-db/synchronization.html
http://dev.yathit.com/api/ydn/db/schema.html#sync
I tried to define a schema with a sync configuration like this:
var schema = {
  stores: [{
    name: 'contact',
    keyPath: 'id',
    Sync: {
      format: 'rest',
      transport: service,
      Options: {
        baseUri: '/'
      }
    }
  }]
};
and created a transport function:
var service = function (args) {
  console.log('contact sync');
};
but my service function is never called.
I have certainly misunderstood how ydn-db works, but I didn't find any examples.
To complete the picture, here is a jsfiddle:
http://jsfiddle.net/asicfr/y7sL7b3j/
Please see the example http://yathit.github.io/ydndb-demo/entity-sync/app.html
An older example is http://yathit.github.io/sprintly-service/playground.html from https://github.com/yathit/sprintly-service

grunt.initConfig in a callback does not work

I want to use grunt for deployment and therefore want to read in the remote host configuration from the already existing ~/.ssh/config file.
To load that configuration I'm using sshconf, but I need to place the grunt.initConfig() call inside its callback so that the configuration is available when defining environments.
var sshconf = require('sshconf');

module.exports = function (grunt) {
  // Read in ssh configuration
  sshconf.read(function (err, sshHosts) {
    if (err)
      console.log(err);

    // SSH config loaded, now init grunt
    grunt.initConfig({
      sshconfig: {
        staging: {
          privateKey: grunt.file.read(sshHosts['project_staging'].properties.IdentityFile),
          host: sshHosts['project_staging'].properties.HostName,
          username: sshHosts['project_staging'].properties.User,
          port: sshHosts['project_staging'].properties.Port || 22,
          path: "/var/www/project"
        },
        production: {
          // ...
        }
      },
      // Tasks to be executed on remote server
      sshexec: {
        example_task: {
          command: 'uptime && hostname'
        }
      },
      sftp: {
        deploy: {
          files: {
            "./": ["*.json", "*.js", "config/**", "controllers/**", "lib/**", "models/**", "public/**", "views/**"]
          },
          options: {
            //srcBasePath: "test/",
            createDirectories: true
          }
        }
      }
      // More tasks
      // ...
    });

    grunt.loadNpmTasks('grunt-ssh');
    // More plugins ...
  });
};
When I call grunt --help it states:
> grunt --help
Grunt: The JavaScript Task Runner (v0.4.1)
…
Available tasks
(no tasks found)
If I do not wrap the grunt initialization in that callback (sshconf.read(function(err, sshHosts) {})), everything works fine (except that the ssh config is not loaded or not yet ready to be used).
Is what I am trying even possible and if so, how? Am I missing something obvious?
Grunt's initConfig cannot be used in an async fashion like this. Either read the sshconf synchronously, or use a task, as described in this answer: How can I perform an asynchronous operation before grunt.initConfig()?
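A sketch of that task-based approach, reusing the sshconf call from the question (the loadSshConfig task name and the 'deploy' alias are made up):
var sshconf = require('sshconf');

module.exports = function (grunt) {
  grunt.initConfig({
    // static configuration goes here as usual
    sshexec: {
      example_task: {
        command: 'uptime && hostname'
      }
    }
  });

  grunt.loadNpmTasks('grunt-ssh');

  // the async work happens inside a task, before the tasks that need it
  grunt.registerTask('loadSshConfig', function () {
    var done = this.async();
    sshconf.read(function (err, sshHosts) {
      if (err) {
        return done(err);
      }
      var staging = sshHosts['project_staging'].properties;
      grunt.config.set('sshconfig.staging', {
        privateKey: grunt.file.read(staging.IdentityFile),
        host: staging.HostName,
        username: staging.User,
        port: staging.Port || 22,
        path: '/var/www/project'
      });
      done();
    });
  });

  // run the loader first so the config is in place for the ssh tasks
  grunt.registerTask('deploy', ['loadSshConfig', 'sshexec:example_task']);
};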