Hi, I followed this Serverless + AWS REST API tutorial and it went great; I got it to work.
Now I'm trying to modify it but have hit a wall while trying to submit data into the DynamoDB table.
Using Postman to submit a valid JSON object, I get a 502 response. If I test the function in Lambda, I get the following error:
{
"errorType": "SyntaxError",
"errorMessage": "Unexpected token o in JSON at position 1",
"trace": [
"SyntaxError: Unexpected token o in JSON at position 1",
" at JSON.parse (<anonymous>)",
" at Runtime.module.exports.submit [as handler] (/var/task/api/interview.js:11:28)",
" at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)",
" at process._tickCallback (internal/process/next_tick.js:68:7)"
]
}
After searching for solutions, what I found out is that it seems like the event being passed to JSON.parse(event) is undefined.
Here's the serverless.yml:
service: interview
frameworkVersion: ">=1.1.0 <2.0.0"
provider:
  name: aws
  runtime: nodejs10.x
  stage: dev
  region: us-east-1
  environment:
    INTERVIEW_TABLE: ${self:service}-${opt:stage, self:provider.stage}
    INTERVIEW_EMAIL_TABLE: "interview-email-${opt:stage, self:provider.stage}"
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
      Resource: "*"
resources:
  Resources:
    CandidatesDynamoDbTable:
      Type: 'AWS::DynamoDB::Table'
      DeletionPolicy: Retain
      Properties:
        AttributeDefinitions:
          -
            AttributeName: "id"
            AttributeType: "S"
        KeySchema:
          -
            AttributeName: "id"
            KeyType: "HASH"
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        StreamSpecification:
          StreamViewType: "NEW_AND_OLD_IMAGES"
        TableName: ${self:provider.environment.INTERVIEW_TABLE}
functions:
  interviewSubmission:
    handler: api/interview.submit
    memorySize: 128
    description: Submit interview information and starts interview process.
    events:
      - http:
          path: interviews
          method: post
and the interview.js
'use strict';

const uuid = require('uuid');
const AWS = require('aws-sdk');

AWS.config.setPromisesDependency(require('bluebird'));

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.submit = (event, context, callback) => {
  const requestBody = JSON.parse(event);
  const fullname = requestBody.fullname;
  const email = requestBody.email;
  const test = requestBody.test;
  const experience = requestBody.experience;

  if (typeof fullname !== 'string' || typeof email !== 'string' || typeof experience !== 'number') {
    console.error('Validation Failed');
    callback(new Error('Couldn\'t submit interview because of validation errors.'));
    return;
  }

  submitInterviewP(interviewInfo(fullname, email, experience, test))
    .then(res => {
      callback(null, {
        statusCode: 200,
        body: JSON.stringify({
          message: `Sucessfully submitted interview with email ${email}`,
          interviewId: res.id
        })
      });
    })
    .catch(err => {
      console.log(err);
      callback(null, {
        statusCode: 500,
        body: JSON.stringify({
          message: `Unable to submit interview with email ${email}`
        })
      })
    });
};

const submitInterviewP = interview => {
  console.log('Submitting interview');
  const interviewInfo = {
    TableName: process.env.INTERVIEW_TABLE,
    Item: interview,
  };
  return dynamoDb.put(interviewInfo).promise()
    .then(res => interview);
};

const interviewInfo = (fullname, email, experience, test) => {
  const timestamp = new Date().getTime();
  return {
    id: uuid.v1(),
    fullname: fullname,
    email: email,
    experience: experience,
    test: test,
    submittedAt: timestamp,
    updatedAt: timestamp,
  };
};
If I replace the event param with a valid JSON object and then deploy again, I'm able to successfully insert the object into DynamoDB.
Any clues? Please let me know if there's anything I'm missing that could help.
Thanks!
API Gateway stringifies the request body and passes it in the event's body property.
Currently you are trying to parse the whole event object (const requestBody = JSON.parse(event);), which is wrong. You need to parse the event.body property:
const requestBody = JSON.parse(event.body);
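A minimal sketch of the fixed handler start, assuming the same API Gateway Lambda proxy integration used in the tutorial (everything after the parse stays the same as in the question):

module.exports.submit = (event, context, callback) => {
  // With the Lambda proxy integration, API Gateway delivers the request body
  // as a JSON string in event.body
  const requestBody = JSON.parse(event.body);
  const fullname = requestBody.fullname;
  const email = requestBody.email;
  const test = requestBody.test;
  const experience = requestBody.experience;
  // ...validation and the DynamoDB put stay exactly as in the question
};

When testing from the Lambda console, make sure the test event mimics an API Gateway event, i.e. it has a body property containing the JSON as a string; otherwise JSON.parse(event.body) will fail there too.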
I'm new to AWS so I apologize for any newbie stuff.
I'm trying to connect a MongoDB Atlas M0 cluster to our AWS EC2 instance, which is running a Node.js / React stack. The problem is that I can't get these two instances, AWS and MongoDB, to connect. When trying to use the backend sign-in function (our Node.js API), it just gives this error:
Operation `user_profile.findOne()` buffering timed out after 10000ms
This is our index / connection:
import config from './config';
import app from './app';
import { connect } from 'mongoose'; // MongoDB
import { ServerApiVersion } from 'mongodb';
import https from 'https';
import AWS from 'aws-sdk';
const makeLogger = (bucket: string) => {
  const s3 = new AWS.S3({
    accessKeyId: <ACCESS_KEY_ID>,
    secretAccessKey: <SECRET_ACCESS_KEY>
  });
  return (logData: any, filename: string) => {
    s3.upload({
      Bucket: bucket, // pass your bucket name
      Key: filename, // file will be saved as testBucket/contacts.csv
      Body: JSON.stringify(logData, null, 2)
    }, function (s3Err: any, data: any) {
      if (s3Err) throw s3Err
      console.log(`File uploaded successfully at ${data.Location}`)
    });
    console.log(`log (${filename}): ${logData}`);
  };
};

const log = makeLogger('xxx-xxxx');
log(config.MONGO_DB_ADDRESS, 'mongo_db_address.txt');

const credentials = <CREDENTIALS>

connect(config.MONGO_DB_ADDRESS, {
  sslKey: credentials,
  sslCert: credentials,
  serverApi: ServerApiVersion.v1
}) //, { useNewUrlParser: true })
  .then(() => console.log('Connected to MongoDB'))
  .catch((err) => console.error('Failed connection to MongoDB', err));

app.on('error', error => {
  console.error('app error: ' + error);
});

app.listen(config.WEB_PORT, () => {
  console.log(`Example app listening on port ${config.WEB_PORT}`);
});
One of the endpoints giving the timeout error:
router.post('/signin', async (req, res) => {
  var form_validation = signin_schema.validate({
    email: req.body.email,
    password: req.body.password,
  });

  if (form_validation.error) {
    console.log('form validation sent');
    //return res.status(400).send(form_validation);
    return res.status(400).send({
      kind: 'ERROR',
      message: 'Sorry - something didn\'t go well. Please try again.'
    });
  }

  var User = model('model', UserSchema, 'user_profile');

  User.findOne({ email: req.body.email }, (err: any, the_user: any) => {
    if (err) {
      return res.status(400).send({
        kind: 'ERROR',
        message: err.message
      });
    }
    if (!the_user) {
      return res.status(400).send({
        kind: 'ERROR',
        message: 'the_user undefined',
      });
    }
    compare(req.body.password, the_user.password)
      .then((result) => {
        if (result == true) {
          const user_payload = { name: the_user.name, email: the_user.email };
          const access_token = sign(user_payload, config.SECRET_TOKEN);
          res.cookie('authorization', access_token, {
            httpOnly: true,
            secure: false,
            maxAge: 3600000,
          });
          return res.send({ kind: "LOADING" });
          // return res.send(access_token);
        } else {
          return res.status(400).send({
            kind: 'ERROR',
            message: 'Sorry - wrong password or email used.'
          });
        }
      })
  })
});
The strange thing is that I can connect from my local developer machine when running our frontend, just as I can connect from the WSL2 Ubuntu CLI.
On the Mongo side, I have whitelisted every possible IP address. On the AWS side, I have created the required outbound security group rule. Regarding the inbound rules, I think they are correct; I've allowed access on ports 27000 - 28018.
Again, I'm new to AWS, so if anyone can tell me what it is I'm simply not understanding here, I would be very grateful.
Thanks
Open MongoDB Atlas Network Access and allow access from 0.0.0.0/0 (this includes your current IP address).
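If Atlas network access is already wide open, the 10000 ms buffering timeout usually just means Mongoose never actually connected (security group, VPC, or connection-string problem) and quietly queued the findOne() call. A minimal sketch, reusing the connect call from the question, that fails fast and logs the underlying connection error instead of buffering (the option values here are assumptions, not requirements):

import mongoose, { connect } from 'mongoose';

// Fail queries immediately instead of buffering them while disconnected
mongoose.set('bufferCommands', false);

connect(config.MONGO_DB_ADDRESS, {
  serverSelectionTimeoutMS: 5000, // give up on reaching the cluster after 5s
})
  .then(() => console.log('Connected to MongoDB'))
  .catch((err) => console.error('Failed connection to MongoDB', err));

With buffering disabled, user_profile.findOne() returns the real connection error right away, which makes it much easier to tell whether the problem is on the EC2 side or the Atlas side.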
I'm setting up an AWS S3 bucket to upload audio files to using MongoDB Stitch (here are the docs: mongo s3 docs). After following the instructions and authenticating my user, I keep getting this error when trying to upload the selected file: error image from console
On line 119, where the error is coming from, I'm just catching the error after running the AWS request build:
const aws = stitchClient.getServiceClient(AwsServiceClient.factory, "AWS");

convertAudioToBSONBinaryObject(file).then((result) => {
  const audiofile = mongodb.db("data").collection("audiofile");
  // now we need an instance of AWS service client
  const key = `${stitchClient.auth.user.id}-${file.name}`;
  // const key = `${stitchClient.auth.user.id}-${file.name}`;
  const bucket = "myBucketName";
  const url =
    "http://" + bucket + ".s3.amazonaws.com/" + encodeURIComponent(key);

  const args = {
    ACL: "public-read",
    Bucket: bucket,
    ContentType: file.type,
    Key: key,
    Body: result,
    // aws_service: "s3",
  };

  // building the request
  const request = new AwsRequest.Builder()
    .withService("s3")
    .withAction("PutObject")
    .withRegion("us-east-1")
    .withArgs(args);

  aws
    .execute(request.build)
    .then((result) => {
      console.log(result);
      console.log(url);
      return audiofile.insertOne({
        owner_id: stitchClient.auth.user.id,
        url,
        file: {
          name: file.name,
          type: file.type,
        },
        Etag: result.Etag,
        ts: new Date(),
      });
    })
    .then((result) => {
      console.log("last result", result);
    })
    .catch((err) => {
      console.log(err);
    });
});
My Stitch rule for S3 looks like this: Stitch rule for AWS s3
So it seems to me that everything is set up the way it's intended to be, but the error tells me I'm not passing all the needed args. I'd really appreciate any thoughts on how to fix this error.
P.S. If I change "AWS" to "AWS_S3" in this line :
const aws = stitchClient.getServiceClient(AwsServiceClient.factory, "AWS");
The error message changes to this:
StitchServiceError {message: "service not found: 'AWS_S3'", name: "StitchServiceError", errorCode: 18, errorCodeName: "ServiceNotFound",
And the log in Stitch shows this for information for both errors: Stitch Logs
The answer to this is a simple typo in this line:
aws
  .execute(request.build)
  .then((result) => ...
build is a function, so I just needed to call it: .execute(request.build()).then((result) => ...).
Issue solved, thanks all!
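For reference, the corrected call from the snippet above looks like this (only the build() call changes):

aws
  .execute(request.build()) // build() constructs the AwsRequest to execute
  .then((result) => {
    console.log(result);
    // ...rest of the chain unchanged
  });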
I want to add an avatar to the user registration, but I don't know how. Can someone please share with me a full example (form, JS frontend, and JS backend)? I'm using SailsJS 1.0 (the stable version) with VueJS.
Thanks in advance.
I figured it out. Watch these platzi tutorials:
https://courses.platzi.com/classes/1273-sails-js/10757-uploading-backend-file/
https://courses.platzi.com/classes/1273-sails-js/10758-uploading-frontend-files/
https://courses.platzi.com/classes/1273-sails-js/10759-downloading-files/
Here is what the videos tell you to do:
npm i sails-hook-uploads.
In api/controllers/entrance/signup.js
Above the inputs key, add a new key/value of files: ['avatar'],
In the inputs add:
avatar: {
  type: 'ref',
  required: true
}
In the body of the fn, find var newUserRecord and above it add the following (even if avatar is not required, make sure to include this line, otherwise you will get a "timeout of unconsumed file stream"):
const avatarInfo = await sails.uploadOne(inputs.avatar);
Then add the following to the first argument object of var newUserRecord = await User.create(_.extend({:
  avatarFd: avatarInfo.fd,
  avatarMime: avatarInfo.type
In api/models/User.js, add these attributes to your User model:
avatarFd: {
  type: 'string',
  required: false,
  description: 'will either have "text" or "avatarFd"'
},
avatarMime: {
  type: 'string',
  required: false,
  description: 'required if "avatarFd" provided'
},
Then create a download endpoint; here is how the action for it would look:
const user = await User.findOne(id);
this.res.type(user.avatarMime);
const avatarStream = await sails.startDownload(user.avatarFd);
return exits.success(avatarStream);
Add a route for this download avatar endpoint to your routes (see the sketch after these steps).
Then you can display the avatar by pointing the <img src=""> in your frontend to this download endpoint.
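A minimal sketch of what that route could look like in config/routes.js; the URL and action name here are assumptions, so use whatever action you created for the download:

'GET /api/v1/account/avatar/:id': { action: 'account/download-avatar' },

The <img> in your Vue template would then point its src at /api/v1/account/avatar/<userId>.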
------APPENDIX-----
----signup.js-----
module.exports = {

  friendlyName: 'Signup',

  description: 'Sign up for a new user account.',

  extendedDescription:
`This creates a new user record in the database, signs in the requesting user agent
by modifying its [session](https://sailsjs.com/documentation/concepts/sessions), and
(if emailing with Mailgun is enabled) sends an account verification email.
If a verification email is sent, the new user's account is put in an "unconfirmed" state
until they confirm they are using a legitimate email address (by clicking the link in
the account verification message.)`,

  files: ['avatar'],

  inputs: {

    emailAddress: {
      required: true,
      type: 'string',
      isEmail: true,
      description: 'The email address for the new account, e.g. m#example.com.',
      extendedDescription: 'Must be a valid email address.',
    },

    password: {
      required: true,
      type: 'string',
      maxLength: 200,
      example: 'passwordlol',
      description: 'The unencrypted password to use for the new account.'
    },

    fullName: {
      required: true,
      type: 'string',
      example: 'Frida Kahlo de Rivera',
      description: 'The user\'s full name.',
    },

    avatar: {
    }

  },

  exits: {

    success: {
      description: 'New user account was created successfully.'
    },

    invalid: {
      responseType: 'badRequest',
      description: 'The provided fullName, password and/or email address are invalid.',
      extendedDescription: 'If this request was sent from a graphical user interface, the request '+
        'parameters should have been validated/coerced _before_ they were sent.'
    },

    emailAlreadyInUse: {
      statusCode: 409,
      description: 'The provided email address is already in use.',
    },

  },

  fn: async function (inputs) {

    var newEmailAddress = inputs.emailAddress.toLowerCase();

    // must do this even if inputs.avatar is not required
    const avatarInfo = await sails.uploadOne(inputs.avatar);

    // Build up data for the new user record and save it to the database.
    // (Also use `fetch` to retrieve the new ID so that we can use it below.)
    var newUserRecord = await User.create(_.extend({
      emailAddress: newEmailAddress,
      password: await sails.helpers.passwords.hashPassword(inputs.password),
      fullName: inputs.fullName,
      tosAcceptedByIp: this.req.ip,
      avatarFd: avatarInfo.fd,
      avatarMime: avatarInfo.type
    }, sails.config.custom.verifyEmailAddresses ? {
      emailProofToken: await sails.helpers.strings.random('url-friendly'),
      emailProofTokenExpiresAt: Date.now() + sails.config.custom.emailProofTokenTTL,
      emailStatus: 'unconfirmed'
    } : {}))
      .intercept('E_UNIQUE', 'emailAlreadyInUse')
      .intercept({name: 'UsageError'}, 'invalid')
      .fetch();

    // If billing features are enabled, save a new customer entry in the Stripe API.
    // Then persist the Stripe customer id in the database.
    if (sails.config.custom.enableBillingFeatures) {
      let stripeCustomerId = await sails.helpers.stripe.saveBillingInfo.with({
        emailAddress: newEmailAddress
      }).timeout(5000).retry();
      await User.updateOne(newUserRecord.id)
        .set({
          stripeCustomerId
        });
    }

    // Store the user's new id in their session.
    this.req.session.userId = newUserRecord.id;

    if (sails.config.custom.verifyEmailAddresses) {
      // Send "confirm account" email
      await sails.helpers.sendTemplateEmail.with({
        to: newEmailAddress,
        subject: 'Please confirm your account',
        template: 'email-verify-account',
        templateData: {
          fullName: inputs.fullName,
          token: newUserRecord.emailProofToken
        }
      });
    } else {
      sails.log.info('Skipping new account email verification... (since `verifyEmailAddresses` is disabled)');
    }

    // add to public group
    const publicGroup = await Group.fetchPublicGroup();
    await Group.addMember(publicGroup.id, newUserRecord.id);

  }

};
I don't understand how to get standard JSON back from an orientjs query. I see people talking about "serializing" the result, but I don't understand why or how to do that. There is a toJSON() method, but I only see it being used with fetch plans etc.
I am trying to pipe a stream to a CSV file and it isn't working properly because of the incorrect JSON format.
I would love an explanation of how and when to serialize. :-)
My Query:
return db.query(
  `SELECT
    id,
    name,
    out('posted_to').name as page,
    out('posted_to').id as page_id,
    out('posted_to').out('is_language').name as language,
    out('posted_to').out('is_network').name as network
  FROM post
  WHERE posted_at
  BETWEEN
    '${since}'
  AND
    '${until}'
  UNWIND
    page,
    page_id,
    language,
    network
  `
);
My Result:
[ { '#type': 'd',
id: '207109605968597_1053732754639607',
name: '10 maneiras pelas quais você está ferindo seus relacionamentos',
page: 'Eu Amo o Meu Irmão',
page_id: '207109605968597',
language: 'portuguese',
network: 'facebook',
'#rid': { [String: '#-2:1'] cluster: -2, position: 1 },
'#version': 0 },
{ '#type': 'd',
id: '268487636604575_822548567865143',
name: '10 maneiras pelas quais você está ferindo seus relacionamentos',
page: 'Amo meus Filhos',
page_id: '268487636604575',
language: 'portuguese',
network: 'facebook',
'#rid': { [String: '#-2:3'] cluster: -2, position: 3 },
'#version': 0 }]
This is my dataset:
Query:
db.select('id','code').from('tablename').where({deleted:true}).all()
  .then(function (vertex) {
    console.log('Vertexes found: ');
    console.log(vertex);
  });
Output:
Vertexes found:
[ { '#type': 'd',
id: '6256650b-f5f2-4b55-ab79-489e8069b474',
code: '4b7d99fa-16ed-4fdb-9baf-b33771c37cf4',
'#rid': { [String: '#-2:0'] cluster: -2, position: 0 },
'#version': 0 },
{ '#type': 'd',
id: '2751c2a0-6b95-44c8-966a-4af7e240752b',
code: '50356d95-7fe7-41b6-b7d9-53abb8ad3e6d',
'#rid': { [String: '#-2:1'] cluster: -2, position: 1 },
'#version': 0 } ]
If I add the instruction JSON.stringify():
Query:
db.select('id','code').from('tablename').where({deleted:true}).all()
  .then(function (vertex) {
    console.log('Vertexes found: ');
    console.log(JSON.stringify(vertex));
  });
Output:
Vertexes found:
[{"#type":"d","id":"6256650b-f5f2-4b55-ab79-489e8069b474","code":"4b7d99fa-16ed-
4fdb-9baf-b33771c37cf4","#rid":"#-2:0","#version":0},{"#type":"d","id":"2751c2a0
-6b95-44c8-966a-4af7e240752b","code":"50356d95-7fe7-41b6-b7d9-53abb8ad3e6d","#ri
d":"#-2:1","#version":0}]
Hope it helps
I found a way that worked for me. Instead of using:
db.query()
I used an HTTP request in Node to query the database. The OrientDB documentation also says the REST API only returns results in JSON format, so if you query the database this way you will always get valid JSON.
For making the HTTP request I used the request module.
This is a sample that worked for me:
var request = require("request");
var auth = "Basic " + new Buffer("root" + ":" + "root").toString("base64")

request(
  {
    url: encodeURI('http://localhost:2480/query/tech_graph/sql/' + queryInput + '/20'),
    headers: {
      "Authorization": auth
    }
  },
  function (error, response, body) {
    console.log(body);
    return body;
  }
);
I am trying to modify the HTTP status code of create.
POST /api/users
{
  "lastname": "wqe",
  "firstname": "qwe"
}
Returns 200 instead of 201
I can do something like this for errors:
var err = new Error();
err.statusCode = 406;
return callback(err, info);
But I can't find how to change status code for create.
I found the create method:
MySQL.prototype.create = function (model, data, callback) {
  var fields = this.toFields(model, data);
  var sql = 'INSERT INTO ' + this.tableEscaped(model);
  if (fields) {
    sql += ' SET ' + fields;
  } else {
    sql += ' VALUES ()';
  }
  this.query(sql, function (err, info) {
    callback(err, info && info.insertId);
  });
};
In your call to remoteMethod you can add a function to the response directly. This is accomplished with the rest.after option:
function responseStatus(status) {
  return function(context, callback) {
    var result = context.result;
    if (testResult(result)) { // testResult is some method for checking that you have the correct return data
      context.res.statusCode = status;
    }
    return callback();
  }
}

MyModel.remoteMethod('create', {
  description: 'Create a new object and persist it into the data source',
  accepts: {arg: 'data', type: 'object', description: 'Model instance data', http: {source: 'body'}},
  returns: {arg: 'data', type: mname, root: true},
  http: {verb: 'post', path: '/'},
  rest: {after: responseStatus(201) }
});
Note: It appears that StrongLoop will force a 204 "No Content" if the context.result value is falsy. To get around this I simply pass back an empty object {} with my desired status code.
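For example, a sketch with a hypothetical custom remote method (the method name and logic are assumptions, only there to show where the empty object goes):

MyModel.archive = function (id, callback) {
  MyModel.destroyById(id, function (err) {
    if (err) { return callback(err); }
    // Return {} instead of nothing so context.result stays truthy and the
    // rest.after hook above can still apply the custom status code.
    return callback(null, {});
  });
};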
You can specify a default success response code for a remote method in the http parameter.
MyModel.remoteMethod(
  'create',
  {
    http: {path: '/', verb: 'post', status: 201},
    ...
  }
);
For loopback version 2 and 3+: you can also use the afterRemote hook to modify the response:
module.exports = function(MyModel) {
  MyModel.afterRemote('create', function(
    context,
    remoteMethodOutput,
    next
  ) {
    context.res.statusCode = 201;
    next();
  });
};
This way, you don't have to modify or touch original method or its signature. You can also customize the output along with the status code from this hook.
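A minimal sketch of customizing the output in the same hook (the shape of the extra fields is an assumption, just to show where the change goes):

module.exports = function(MyModel) {
  MyModel.afterRemote('create', function(context, remoteMethodOutput, next) {
    context.res.statusCode = 201;
    // context.result is what gets serialized into the response body
    context.result = {
      data: remoteMethodOutput,
      message: 'created' // hypothetical extra field
    };
    next();
  });
};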