When I set migrate: 'safe' in config/models.js, sails.log stopped printing to the console. When I set migrate: 'alter', everything is fine. The same thing happens with console.log.
Is this because sails.log isn't supposed to work in production? If so, how can I log to the console in production?
My models.js:
module.exports.models = {
  schema: true,
  migrate: 'safe'
};
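For reference (an aside, not from the original question): Sails reads its log level from config/log.js, so a production process that seems quiet may just be filtering below the configured level. A minimal sketch:

// config/log.js -- a sketch; pick one of: silly, verbose, info, debug, warn, error, silent
module.exports.log = {
  level: 'verbose'
};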
UPDATE 1
I logged from my model's blueprint API as follows:
module.exports = {
  create: function (req, res) {
    console.log('req.body');
    console.log(req.body);
    sails.log.error('req.body');
    sails.log.error(req.body);
    return res.status(200).send('HAPPY');
  }
};
When I send a POST request, I receive the 'HAPPY' return value with status code 200, so there couldn't have been any errors.
UPDATE 2
This also happens when I set csrf: false in config/security. I think this might be a bug and will report it to Sails.
Related
This is my first time posting a question here. I hope you can help me out. I am fairly new to Node.js and Express, so sorry in advance for my inexperience.
I am currently having a problem with my authentication session in my Node.js/Express app. I use Passport.js to handle authentication and store the login session with connect-pg-simple (a PostgreSQL session store). After clicking the login button, the session is stored in my PostgreSQL database, but somehow Express can't find it. In fact, it stores the session twice in the database, but only one of them has the passport cookie in it.
This issue was not present when the server was still on localhost. It appeared when I hosted my server on Heroku.
Also, whenever I push to the Heroku repo, it shows this warning:
"connect.session() MemoryStore is not designed for a production environment, as it will leak memory, and will not scale past a single process."
My guess is that I didn't connect express-session to the PostgreSQL session store properly. Below is my code:
This is how I set up the PostgreSQL database:
const Pool = require("pg").Pool;

const pool = new Pool({
  user: process.env.PGUSER,
  password: process.env.PGPASSWORD,
  host: process.env.PGHOST,
  port: process.env.PGPORT,
  database: process.env.PGDATABASE
});

module.exports = pool;
This is how I set up the session:
// requires added for completeness; adjust the pool path to your project
const session = require('express-session');
const pool = require('./db'); // the pg Pool exported above

const poolSession = new (require('connect-pg-simple')(session))({
  pool: pool
});

app.set('trust proxy', 1);
app.use(session({
  store: poolSession,
  secret: process.env.SESSION_SECRET,
  saveUninitialized: true,
  resave: false,
  cookie: {
    secure: true,
    maxAge: 30 * 24 * 60 * 60 * 1000 // 30 days
  }
}));
app.use(passport.initialize());
app.use(passport.session());
This is an image of the two sessions that were stored in the database after clicking the login button:
https://i.stack.imgur.com/lzAby.png
This is my login route (hit when the login button is clicked):
router
  .route("/signin")
  .post((req, res, next) => {
    console.log("Signing in...");
    passport.authenticate('local', function (err, user, info) {
      //code....
      req.logIn(user, function (err) {
        console.log(user);
        if (err) {
          console.log(err);
          res.send(apiResponse(500, err.message, false, null));
          return next(err);
        }
        console.log(req.sessionID); // the id of the 1st session stored in the db
        console.log(req.isAuthenticated()); // true
        res.redirect('/auth');
      });
    })(req, res, next);
  });
This is the route that is redirected to when login succeeds:
router.get("/", (req, res) => {
  console.log("/ ", req.isAuthenticated()); // false
  console.log("/ ", req.sessionID); // the id of the 2nd session stored in the db
  if (req.isAuthenticated()) {
    // Notify user login success
  }
});
I have been stuck here for a few days now. Please tell me if you need more code!
I have also tried all sorts of combinations of web origins and valid redirect URIs.
I log in via Keycloak and it continuously redirects me back and forth between my localhost application and this URL: http://localhost:4200/#state=166446fd-daf6-4b76-b595-583c01c663df&session_state=57ead1f3-bf41-4117-9ddf-75e37c9248e7&code=8692b58b-0868-4762-b82e-acde9911dd34.57ead1f3-bf41-4117-9ddf-75e37c9248e7.1e8b5b9d-b590-453e-b396-62b46c18cc9f
I have tried it on Firefox and Chrome with the same issue. In the network tab it seems to be looking for the keycloak.json file, even though I can log in to the correct realm via Keycloak:
GET http://localhost:4200/keycloak.json 404 (Not Found)
scheduleTask # zone.js:2969
ERROR An error happened during Keycloak initialization. core.js:1601
Unhandled Promise rejection: An error happened during Keycloak initialization. ; Zone: ; Task: Promise.then ; Value: An error happened during Keycloak initialization. undefined
static init(): Promise<any> {
  const keycloakAuth: any = Keycloak({
    url: 'http://localhost:8080/auth',
    realm: 'ContractPortal',
    clientId: 'secretkey2',
    'ssl-required': 'external',
    'public-client': true
  });
  KeycloakService.auth.loggedIn = false;
  return new Promise((resolve, reject) => {
    keycloakAuth.init({ onLoad: 'login-required' })
      .success(() => {
        console.log(keycloakAuth);
        KeycloakService.auth.loggedIn = true;
        KeycloakService.auth.authz = keycloakAuth;
        KeycloakService.auth.logoutUrl = keycloakAuth.authServerUrl
          + '/realms/angular_keycloak/protocol/openid-connect/logout?redirect_uri='
          + document.baseURI;
        resolve();
      })
      .error(() => {
        reject();
      });
  });
}
It'd be great if one of you could point me in the right direction to solve this issue.
I have found another question similar to mine (link), but I am not sure how to implement its solution. This is my provider setup:
providers: [
  {
    provide: APP_INITIALIZER,
    useFactory: initializer,
    multi: true,
    deps: [KeycloakService]
  }
],
but the OP puts the following in their providers:
providers: [
  KeyCloakService,
  AssetService,
  {
    provide: LocationStrategy,
    useClass: PathLocationStrategy
  }
]
Please let me know if you require any other information
For anybody who ran into the same issue: I misunderstood the keycloak.json file and did not know where to get it, so I was exporting the complete JSON file from Keycloak - but this is not how you should get it!
First go to your realm > clients and click on Installation,
then download the Keycloak OIDC JSON file,
then place it next to the index.html file in your application.
This solved my issue - hope it helps somebody else
I was researching this on the internet and hit the same case: my Angular page was reloading automatically in an infinite loop.
I made a change in the keycloak-init.ts file (find the following code and change checkLoginIframe from true to false).
Old code: keycloak.init({ onLoad: 'login-required', "checkLoginIframe": true })
New code: keycloak.init({ onLoad: 'login-required', "checkLoginIframe": false })
this.loadingService.isLoading$.pipe(delay(100)).subscribe((data) => {
  this.loading = data;
});

this.isLogging = await this.keycloak.isLoggedIn();
type roleUser = Array<{ id: number, text: string }>;

if (this.isLogging) {
  this.userProfile = await this.keycloak.loadUserProfile();
  this.userRoles = await this.keycloak.getUserRoles();
} else {
  // this.clearSession();
}
If you are using this.keycloak.loadUserProfile(), remove the clearSession() call.
I'm looking for the best way to seed my development database in Sails.js.
In Rails I would just use the seeds.rb file, or failing that, a rake task.
With Sails I am unsure how to do this outside of doing it manually with sails console.
Please note that solutions which add logic to config/models and the models themselves for production seeding are not what I am looking for. I don't want these records to exist in production.
You can seed your database in the config/bootstrap.js file.
To seed it for a particular environment, what I usually do is:
// config/bootstrap.js
module.exports.bootstrap = function (cb) {
  if (process.env.NODE_ENV !== 'development')
    return cb();

  // Do whatever you want

  // And don't forget to...
  return cb();
};
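For instance, the seeding itself might look like this (a sketch only, assuming a hypothetical User model with made-up fields and callback-style Waterline methods):

// config/bootstrap.js -- fleshed-out sketch; the User model and its fields are illustrative
module.exports.bootstrap = function (cb) {
  if (process.env.NODE_ENV !== 'development')
    return cb();

  User.count().exec(function (err, count) {
    if (err) return cb(err);
    if (count > 0) return cb(); // already seeded, nothing to do

    // on older Sails versions, User.create([...]) also accepts an array
    User.createEach([
      { name: 'Alice', email: 'alice@example.com' },
      { name: 'Bob', email: 'bob@example.com' }
    ]).exec(cb);
  });
};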
And to drop the database each time Sails lifts:
// config/env/development.js
module.exports = {
  models: {
    migrate: 'drop'
  }
};
You can use a test framework like Mocha, and in development mode switch to a dedicated test database. Here it is step by step:
Install Mocha with npm install mocha --save-dev.
Create test/bootstrap.test.js and fill it in as follows (configure to your needs); note the configured connections, which override the default connections in config.
var Sails = require('sails'),
    sails;

before(function (done) {
  Sails.lift({
    log: {
      level: 'error'
    },
    connections: {
      mongodbServer: {
        database: 'table_test'
      }
    },
    models: {
      migrate: 'drop'
    }
  }, function (err, server) {
    sails = server;
    done(err, sails);
  });
});
after(function (done) {
  // here you can clear fixtures, etc.
  sails.lower(done);
});
Create another file for seeding your data, for example test/inject/seed.inject.js (the name has to match the glob in the test script below), and fill it with something like:
describe('data seeding', function () {
  it('should seed data', function (done) {
    sails.models.somemodel // note: model identities are lowercased on sails.models
      .create({
        name: 'Some Name'
      })
      .then(function (result) {
        done();
      })
      .catch(done);
  });
});
Add this to your package.json under the "scripts" key:
"test": "_mocha test/bootstrap.test.js test/inject/**/*.inject.js --no-timeouts"
Run it with npm test to seed your data.
If you need this in development mode when you run sails lift, edit config/env/development.js and add something like this:
module.exports = {
  connections: {
    mongodbServer: {
      database: 'table_test'
    }
  }
};
Now sails lift will use table_test instead of the production database, so your production data stays clean.
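This works because files under config/env/ are merged over the base configuration. For context, the connection being overridden would live somewhere like this (a sketch, assuming the sails-mongo adapter; names are illustrative):

// config/connections.js -- base config that config/env/development.js overrides
module.exports.connections = {
  mongodbServer: {
    adapter: 'sails-mongo',
    host: 'localhost',
    port: 27017,
    database: 'table_production' // swapped for 'table_test' in development
  }
};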
My code keeps failing here when the user tries to log in:
isAuthenticated: function (req, res) {
  if (req.isAuthenticated()) { return res.json(req.user); }
  else { return res.send(401); }
},
It fails and I get GET http://localhost:1337/user/authenticated 401 (Unauthorized) in the console, even though the user has entered a correct email and password.
What in the code makes that test pass?
I have the related StackOverflow question with more info HERE.
The problem was that my frontend application has a different origin than my backend application, so the AJAX requests will not include the session cookie and req.isAuthenticated() will never return true.
Use the withCredentials option to force it:
$http({ withCredentials: true, ... })
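Note that credentialed cross-origin requests also have to be allowed on the server side, or the browser will discard the cookie. A minimal sketch, assuming a plain Express backend with the cors package (the origin shown is made up):

var express = require('express');
var cors = require('cors');

var app = express();
app.use(cors({
  origin: 'http://localhost:9000', // the frontend's exact origin; '*' is not allowed with credentials
  credentials: true                // sends Access-Control-Allow-Credentials: true
}));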
I'm using Yeoman, Grunt, and Bower to construct a platform for building a frontend independently of a backend. The idea is that all of my (AngularJS) controllers, services, factories, etc. live in this project and get injected afterwards into my server-side codebase based on the result of grunt build.
My question is:
How can I mock endpoints so that the Grunt server responds to the same endpoints as my (Rails) app will?
At the moment I am using:
angular.module('myApp', ['ngResource'])
  .run(['$rootScope', function ($rootScope) {
    $rootScope.testState = 'test';
  }]);
And then in each of my individual services:
mockJSON = {'foo': 'myMockJSON'}
And on every method:
if ($rootScope.testState == 'test') {
  return mockJSON;
} else {
  // real service logic with $q/$http goes here
}
Then after grunt build, testState = 'test' gets removed.
This is clearly a relatively janky architecture. How can I avoid it? How can I have Grunt respond to the same endpoints as my app (some of which have dynamic params), apply some logic (if necessary), and serve out a JSON file (possibly dependent on path params)?
I've fixed this issue by using Express to write a server that responds with static JSON.
First I created a directory in my project called 'api'. Within that directory I have the following files:
package.json:
{
  "name": "mockAPI",
  "version": "0.0.0",
  "dependencies": {
    "express": "~3.3.4"
  }
}
Then I run npm install in this directory.
index.js:
module.exports = require('./lib/server');
lib/server.js:
var express = require('express');
var app = express();

app.get('/my/endpoint', function (req, res) {
  res.json({ 'foo': 'myMockJSON' });
});

module.exports = app;
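To sanity-check the mock API on its own (a sketch, not part of the original setup; the filename and port are arbitrary):

// check.js -- run with `node check.js`, then curl http://localhost:9001/my/endpoint
var api = require('./api');
api.listen(9001);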
and finally in my global Gruntfile.js:
connect: {
  options: {
    port: 9000,
    hostname: 'localhost'
  },
  livereload: {
    options: {
      middleware: function (connect, options) {
        return [
          lrSnippet,
          mountFolder(connect, '.tmp'),
          mountFolder(connect, yeomanConfig.app),
          require('./api')
        ];
      }
    }
  },
Then the services make the requests, and the Express server serves the correct JSON.
After grunt build, the Express server is simply replaced by the Rails server.
As of grunt-contrib-connect v0.7.0 you can also just add your custom middleware to the existing middleware stack without having to rebuild it manually:
livereload: {
  options: {
    open: true,
    base: [
      '.tmp',
      '<%= config.app %>'
    ],
    middleware: function (connect, options, middlewares) {
      // inject a custom middleware into the array of default middlewares
      middlewares.push(function (req, res, next) {
        if (req.url !== '/my/endpoint') {
          return next();
        }
        res.writeHead(200, { 'Content-Type': 'application/json' });
        // note: the body must be valid JSON, so double-quote keys and values
        res.end(JSON.stringify({ foo: 'myMockJSON' }));
      });
      return middlewares;
    }
  }
},
See https://github.com/gruntjs/grunt-contrib-connect#middleware for the official documentation.
Alternatively you can use grunt-connect-proxy to proxy everything that is missing from your test server to an actual backend.
It's quite easy to install; the one thing to remember when adding the proxy to your livereload connect middleware is to add it last, like this:
middleware: function (connect) {
  return [
    lrSnippet,
    mountFolder(connect, '.tmp'),
    mountFolder(connect, yeomanConfig.app),
    proxySnippet
  ];
}
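For context, a sketch of where proxySnippet comes from and how the proxy targets are declared, following the grunt-connect-proxy README pattern (the context, host, and port here are made-up examples):

// Gruntfile.js
var proxySnippet = require('grunt-connect-proxy/lib/utils').proxyRequest;

grunt.initConfig({
  connect: {
    proxies: [{
      context: '/api',   // requests under /api...
      host: 'localhost', // ...are forwarded to this host
      port: 3000         // ...and port (e.g. the Rails server)
    }]
    // plus the livereload target using the middleware shown above
  }
});

// the configureProxies task must run before the connect server starts
grunt.registerTask('serve', ['configureProxies', 'connect:livereload']);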
grunt-connect-prism is similar to the Ruby project VCR. It provides an easy way for front end developers to record HTTP responses returned by their API (or some other remote source) and replay them later. It's basically an HTTP cache, but for developers working on a Single Page Application (SPA). You can also generate stubs for API calls that don't exist, and populate them the way you want.
It's useful for mocking complex & high latency API calls during development. It's also useful when writing e2e tests for your SPA only, removing the server from the equation. This results in much faster execution of your e2e test suite.
Prism works by adding a custom connect middleware to the connect server provided by the grunt-contrib-connect plugin. While in 'record' mode it will generate a file per response on the filesystem with content like the following:
{
  "requestUrl": "/api/ponies",
  "contentType": "application/json",
  "statusCode": 200,
  "data": {
    "text": "my little ponies"
  }
}
DISCLAIMER: I'm the author of this project.
You can use an Apache proxy to connect your REST server with gruntjs.
Apache would do this:
proxy / -> gruntjs
proxy /service -> REST server
You would point your application at Apache, and the angular.js application would think it is talking to itself, so there is no cross-domain problem.
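A minimal sketch of such a vhost (ports and paths are assumptions; mod_proxy and mod_proxy_http must be enabled):

<VirtualHost *:80>
    # More specific path first: /service goes to the REST server
    ProxyPass        /service http://localhost:8080/service
    ProxyPassReverse /service http://localhost:8080/service

    # Everything else goes to the grunt connect server
    ProxyPass        /        http://localhost:9000/
    ProxyPassReverse /        http://localhost:9000/
</VirtualHost>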
Here is a great tutorial on how to set this up:
http://alfrescoblog.com/2014/06/14/angular-js-activiti-webapp-with-activiti-rest/
Just my alternative way, based on Abraham P's answer. It does not require installing express within the 'api' folder, and it lets me separate the mock services into individual files. For example, my 'api' folder contains 3 files:
api/
  index.js   // assigns all the "modules" so the folder can simply be required
  user.js    // all mocking for user
  product.js // all mocking for product
File user.js:
var user = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/user') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'role': 'admin'
      })
    );
  } else {
    next();
  }
};

module.exports = user;
File product.js:
var product = function (req, res, next) {
  if (req.method === 'POST' && req.url.indexOf('/product') === 0) {
    res.end(
      JSON.stringify({
        'id': '5463c277-87c4-4f1d-8f95-7d895304de12',
        'name': 'test',
        'category': 'test'
      })
    );
  } else {
    next();
  }
};

module.exports = product;
index.js just assigns all the "modules", and then we simply require that:
module.exports = {
  product: require('./product.js'),
  user: require('./user.js')
};
My Gruntfile.js:
connect: {
  options: {
    port: 9000,
    // Change this to '0.0.0.0' to access the server from outside.
    hostname: 'localhost',
    livereload: 35729
  },
  livereload: {
    options: {
      open: true,
      middleware: function (connect) {
        return [
          connect.static('.tmp'),
          connect().use(
            '/bower_components',
            connect.static('./bower_components')
          ),
          connect.static(appConfig.app),
          require('./api').user,
          require('./api').product
        ];
      }
    }
  }
}