AWS Lambda return a file - Scala

I'm reading a CSV file from S3 using Lambda and want to return that file to the caller of the Lambda function. I don't want to print the file through the outputStream - I want to return the actual file. Is there a way to do this? I get the S3 file like this:
override def handleRequest(inputStream: InputStream,
                           outputStream: OutputStream,
                           context: Context): Unit = {
  ...
  val s3File = s3Client.getObject(bucketName, bucketKey)
}
How can I return the actual file (as opposed to converting it to String and printing via outputStream)?

The above can be done in a Node.js AWS Lambda fronted by an Express server, using the AWS-provided aws-serverless-express package:
// lambda.js
'use strict'
const awsServerlessExpress = require('aws-serverless-express')
const app = require('./app')
const server = awsServerlessExpress.createServer(app)
exports.handler = (event, context) => { awsServerlessExpress.proxy(server, event, context) }

// app.js
// refer to the examples for complete code
// https://github.com/awslabs/aws-serverless-express/blob/master/examples/basic-starter/app.js
...
router.get('/download', (req, res) => {
  s3.getObject({ Bucket: myBucket, Key: myFile }, function (err, data) {
    if (err) {
      // report the error and return a server error response
      return res.status(500).send("Error!");
    }
    // set the HTTP headers
    res.set("Content-Length", data.ContentLength)
       .set("Content-Type", data.ContentType);
    // send the data
    res.send(data.Body);
  });
});
...
PS: I have not tested the above code.

Related

How can I mock aws-sdk with jest?

I am trying to mock aws-sdk with jest. Actually I only care about one function. How can I do this? I have read the docs about mocking classes with jest, but the docs are complicated and I don't quite understand them.
Here is my best attempt:
handler.test.js
'use strict';
const aws = require('aws-sdk');
const { handler } = require('../../src/rotateSecret/index');

jest.mock('aws-sdk');

const event = {
  SecretId: 'test',
  ClientRequestToken: 'ccc',
  Step: 'createSecret',
};

describe('rotateSecret', () => {
  it.only('should not get or put a secret', async () => {
    aws.SecretsManager.mockImplementation(() => ({
      getSecretValue: () => ({}),
    }));
    expect.assertions(1);
    await handler(event);
    // You can see what I am trying to do here but it doesn't work
    expect(aws.SecretsManager.getSecretManager).not.toHaveBeenCalled();
  });
});
handler.js
const aws = require('aws-sdk'); // import added; the handler uses aws below

exports.handler = async (event) => {
  const secretsManager = new aws.SecretsManager();
  const secret = await secretsManager.describeSecret({ SecretId: event.SecretId }).promise();
  if (someCondition) {
    console.log("All conditions not met");
    return;
  }
  return secretsManager.getSecretValue(someParams);
};
Okay so the way I would approach this is as follows:
AWS-SDK Mock
Create an actual mock for aws-sdk and put it in a __mocks__/aws-sdk.js file at the root of your project:
// __mocks__/aws-sdk.js
class AWS {
  static SecretsManager = class {
    describeSecret = jest.fn(() => {
      return { promise: () => Promise.resolve({ ARN: "custom-arn1", Name: "describeSec" }) };
    });
    getSecretValue = jest.fn(() => {
      return { promise: () => Promise.resolve({ ARN: "custom-arn2", Name: "getSecretVal" }) };
    });
  };
}

module.exports = AWS;
I have used static before SecretsManager because the AWS class itself is never instantiated, yet the code needs to access the SecretsManager class on it.
Inside SecretsManager, I have defined two functions and stubbed them using jest.fn.
Now, the same as you already have in your test file:
jest.mock('aws-sdk');
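To sanity-check that Jest picks up the manual mock, a quick sketch (the expected value comes from the stub above):
// Sketch: confirm jest.mock('aws-sdk') resolves to __mocks__/aws-sdk.js.
const aws = require('aws-sdk');
jest.mock('aws-sdk');

it('uses the manual mock', async () => {
  const sm = new aws.SecretsManager();
  const res = await sm.describeSecret().promise();
  expect(res.Name).toBe('describeSec'); // value stubbed in __mocks__/aws-sdk.js
});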
How to Test
Testing whether your mock functions are called is the tricky part (so I will detail that at the end of this post).
A better approach is to assert against the end result of your main function after all processing has finished.
Assertions
Back in your test file, I would simply invoke the handler with await (as you already do) and then assert against the final result, like so:
// test.js
describe("rotateSecret", () => {
  it.only("should not get or put a secret", async () => {
    const event = { name: "event" };
    const result = await handler(event);
    expect(result).toEqual("whatever-your-function-is-expected-to-return");
  });
});
Testing Secret Manager's function invocations
For this you will need to tweak handler.js itself, moving the instantiation of SecretsManager out of the main function body, like so:
const aws = require('aws-sdk');

const secretsManager = new aws.SecretsManager(); // <---- Declare it in the outer scope

exports.handler = async (event) => {
  const secret = await secretsManager
    .describeSecret({ SecretId: event.SecretId })
    .promise();
  if (someCondition) {
    console.log("All conditions not met");
    return;
  }
  return secretsManager.getSecretValue(someParams);
};
Then, back in your test.js file, similarly instantiate SecretsManager before you invoke your handler function, like so:
//test.js
describe("rotateSecret", () => {
  const secretsManager = new aws.SecretsManager(); // <---- Declare it in the outer scope
  it.only("should not get or put a secret", async () => {
    const event = { name: "event" };
    await handler(event);
    // Now you can make assertions on function invocations
    expect(secretsManager.describeSecret).toHaveBeenCalled();
    // OR check that the passed args were correct
    expect(secretsManager.describeSecret).toHaveBeenCalledWith({
      SecretId: event.SecretId,
    });
  });
});
This will allow you to make assertions on function invocations as well as the args that were passed.
The reason I declare it outside the function scope is to tell Jest that secretsManager already exists in an outer scope and should be used from there.
Previously we had it declared inside the function scope, so Jest would invoke it but we weren't able to get access to it.
We couldn't directly reference it as AWS.SecretsManager.getSecretManager, because the getSecretManager method is only available after you instantiate the SecretsManager class (and even if you did that, you would get a new instance of the class, which wouldn't help with any assertions).
Downside of the __mocks__/aws-sdk.js fake module
The obvious issue is that you are stubbing the functions for every single test, and maybe you won't want that.
Perhaps you only want to stub it out once for a specific test, but for the rest of them you want it to run normally.
In that case, you should not create the __mocks__ folder.
Instead, create a one-time fake, BUT make sure the SecretsManager instantiation is in the outer scope of your test file, as before.
//test.js
const aws = require("aws-sdk");
const { handler } = require("../../src/rotateSecret/index");

describe("rotateSecret", () => {
  // Declare it in the outer scope
  const secretsManager = new aws.SecretsManager();
  it.only("should not get or put a secret", async () => {
    const event = { name: "event" };
    // Create a mock for this instance ONLY; the handler calls .promise() on the result
    secretsManager.describeSecret = jest.fn().mockImplementationOnce(() => ({
      promise: () => Promise.resolve("fake-values"),
    }));
    await handler(event);
    expect(secretsManager.describeSecret).toHaveBeenCalled();
    expect(secretsManager.describeSecret).toHaveBeenCalledWith({
      SecretId: event.SecretId,
    });
  });
});
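As an alternative sketch (not part of the answer above): jest.spyOn on the class prototype stubs a single method so that every instance, including the one the handler creates, shares the same mock function:
// Sketch: stub describeSecret on the prototype; all instances share the mock.
const aws = require('aws-sdk');

const describeSpy = jest
  .spyOn(aws.SecretsManager.prototype, 'describeSecret')
  .mockReturnValue({ promise: () => Promise.resolve('fake-values') });

// After running the handler:
// expect(describeSpy).toHaveBeenCalledWith({ SecretId: event.SecretId });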

Pg-promise - How to stream binary data directly to response

Forgive me, I'm still learning. I'm trying to download some mp3 files that I have stored in a table. I can download files directly from the file system like this:
if (fs.existsSync(filename)) {
  res.setHeader('Content-disposition', 'attachment; filename=' + filename);
  res.setHeader('Content-Type', 'application/audio/mpeg3');
  var rstream = fs.createReadStream(filename);
  rstream.pipe(res);
}
I have stored the data in the table using the pg-promise example from the docs, like so:
const rs = fs.createReadStream(filename);

function receiver(_, data) {
  function source(index) {
    if (index < data.length) {
      return data[index];
    }
  }

  function dest(index, data) {
    return this.none('INSERT INTO test_bin (utterance) VALUES($1)', data);
  }

  return this.sequence(source, { dest });
} // end receiver func

rep.tx(t => {
    return streamRead.call(t, rs, receiver);
  })
  .then(data => {
    console.log('DATA:', data);
  })
  .catch(error => {
    console.log('ERROR: ', error);
  });
But now I want to take that data out of the table and download it to the client. The example in the docs for reading binary data back out converts it to JSON and then prints it to the console, like this:
db.stream(qs, s => {
  s.pipe(JSONStream.stringify()).pipe(process.stdout);
});
and that works, so the data is coming out of the database OK. But I can't seem to send it to the client. Since the data is already a stream, I have tried:
db.stream(qs, s => {
  s.pipe(res);
});
But I get a TypeError: First argument must be a string or Buffer.
Alternatively, I could take that stream and write it to the file system, and then serve it as in the top step above, but that seems like a workaround. I wish there was an example of how to save to a file in the docs.
What step am I missing?
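One plausible missing step (a sketch, not a tested answer; it assumes the bytea column is named utterance, as in the INSERT above): db.stream emits row objects rather than raw bytes, so the Buffer has to be pulled out of each row before it is written to the response:
// Sketch only: pg-promise's db.stream() emits row objects, so piping the
// stream straight into res fails. Extract the bytea Buffer from each row.
// The column name 'utterance' is taken from the INSERT statement above.
db.stream(qs, s => {
  res.setHeader('Content-Type', 'audio/mpeg');
  s.on('data', row => res.write(row.utterance)); // row.utterance is a Buffer
  s.on('end', () => res.end());
  s.on('error', err => res.destroy(err));
});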

Fromname in mail using sendgrid mail api

I'm trying to send emails using the SendGrid mail API.
Everything works fine. However, I want my emails to have a specific name,
not the prefix of the sender's address, which comes up by default.
I changed the From value to "MY_email_name <sender@example.com>", but it didn't work.
I have set the From_Name field to "MY_email_name". That too didn't work.
However, it works when I don't read the HTML content from an external file and instead give it inline. In that case the email name is sent.
Any idea how I can do this while still reading the content from a file?
Thanks.
var sendgrid = require('sendgrid')('MY_APP_SECRET');
var fs = require('fs');
var content;

// First I want to read the file
fs.readFile(__dirname + '/email.html', function read(err, data) {
  if (err) {
    throw err;
  }
  content = data;
  // Invoke the next step here however you like
  //console.log(content); // Put all of the code here (not the best solution)
  processFile(); // Or put the next step in a function and invoke it
});

function processFile() {
  console.log(content);
}

module.exports = function sendMail(mailObject) {
  return new Promise(function (resolve, reject) {
    // create a new email instance
    var email = new sendgrid.Email();
    email.addTo('some1@example.com');
    email.setFrom('sender@example.com');
    email.setSubject('My-Email-body');
    email.setFromName("Email-Name");
    email.setHtml(content);
    email.addHeader('X-Sent-Using', 'SendGrid-API');
    email.addHeader('X-Transport', 'web');
    email.setASMGroupID(835);
    // send mail
    sendgrid.send(email, function (err, json) {
      // if something went wrong
      if (err) {
        return reject({
          error: err,
          res: json,
        });
      }
      resolve({
        statusText: 'OK',
        res: json,
      });
    });
  });
};
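One plausible cause (a guess, not confirmed by the thread): sendMail can run before fs.readFile has finished, so setHtml may receive undefined. A minimal sketch that reads the file (as a UTF-8 string) inside the promise, so the ordering problem cannot occur:
// Sketch: build the email only once the HTML content is actually available.
module.exports = function sendMail(mailObject) {
  return new Promise(function (resolve, reject) {
    fs.readFile(__dirname + '/email.html', 'utf8', function (err, html) {
      if (err) { return reject(err); }
      var email = new sendgrid.Email();
      email.addTo('some1@example.com');
      email.setFrom('sender@example.com');
      email.setFromName('Email-Name');
      email.setSubject('My-Email-body');
      email.setHtml(html); // html is now guaranteed to be a string
      sendgrid.send(email, function (err, json) {
        if (err) { return reject({ error: err, res: json }); }
        resolve({ statusText: 'OK', res: json });
      });
    });
  });
};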

Not able to encrypt a string with a public key in Protractor

I am trying to call the encrypt function mentioned below:
var encryptor = require("./jsencrypt.js");
this.encrypt = function () {
var key="LxVtiqZV6g2D493gDBfG0BfV6sAhteG6hOCAu48qO00Z99OpiaIG5vZxVtiqZV8C7bpwIDAQAB";
encryptor = new JSEncrypt();
encryptor.setPublicKey(key);
var newString = encryptor.encrypt('Password');
console.log("Encrypted password =",newString);
}
Initially I was getting a ReferenceError for JSEncrypt being undefined.
So I downloaded the jsencrypt.js file and added var encryptor = require("./jsencrypt.js"); at the beginning. Now I am getting the following error:
Message:
ReferenceError: navigator is not defined
Stacktrace:
ReferenceError: navigator is not defined
at e:\Praveen Data\Projects\ECP\CentralRegistryUI\TestScripts\Utils\jsencrypt.js:73:13
at Object.<anonymous> (e:\Praveen Data\Projects\ECP\CentralRegistryUI\TestScripts\Utils\jsencrypt.js:4342:3)
at require (module.js:385:17)
I tried using windows.navigator in jsencrypt.js, but it didn't work.
Protractor tests are not run in a browser environment but in Node.js, so the navigator object is not available there. JSEncrypt relies on it to work on the client side across different browsers and versions.
It's referenced in many places in the JSEncrypt code, so my best bet would be to either switch to a server-side encryption library that works for you or, if that's not possible, mock a global navigator object with all the expected properties/methods as if it were a Chrome browser - Node.js runs on Chrome's JS engine, so this should work fine.
One of my colleagues helped me with the solution.
So here I have a function for encryption:
this.initializeEncryptedPassword = () => {
  //console.log("before calling encrypt... ");
  browser.executeScript(() => {
    //console.log("Starting to return encryptor...");
    return window.loginEncryptor.encrypt(window.loginPassword);
  }).then((encryptedPassword) => {
    this.encryptedPassword = encryptedPassword;
  });
  //console.log("after calling encrypt...");
}
This function works together with the application's Encryptor class, which exposes the encryptor on the window:
export default class Encryptor {
  constructor($window, $http) {
    'ngInject';
    this.encryptor = new $window.JSEncrypt();
    //Need to use HTTP here instead of resource since the resource does not return plain text.
    //Getting Public Key by hitting a rest uri.
    $http({method: "GET", url: "/xyz/authenticate"}).success((item) => {
      this.encryptor.setPublicKey(item);
      //set the current encryptor on the window so that testing can use it
      $window.loginEncryptor = this.encryptor;
    });
  }

  encryptPassword(credentials) {
    credentials.password = this.encryptor.encrypt(credentials.password);
  }
}
Hope this helps others.
Before require('jsencrypt'), you can first write:
const { JSDOM } = require('jsdom');
const jsdom = new JSDOM('<!doctype html><html><body></body></html>');
const { window } = jsdom;
global.window = window;
global.document = window.document;
global.navigator ={userAgent: 'node.js'};
const { JSEncrypt } = require('jsencrypt')
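With those globals in place, JSEncrypt can run under Node. A small usage sketch (the public key below is a placeholder, not a real key):
// Sketch: encrypt a value under Node once the jsdom globals are set up.
const encryptor = new JSEncrypt();
encryptor.setPublicKey('-----BEGIN PUBLIC KEY-----...'); // placeholder key
const encrypted = encryptor.encrypt('Password');
console.log('Encrypted password =', encrypted);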
You can also mock it by doing the following:
global.navigator = { appName: 'protractor' };
global.window = {};
const JSEncrypt = require('JSEncrypt').default;

How to use GridFS to store images using Node.js and Mongoose

I am new to Node.js. Can anyone provide me with an example of how to use GridFS for storing and retrieving binary data, such as images, using Node.js and Mongoose? Do I need to directly access GridFS?
I was not satisfied with the highest-rated answer here, so I'm providing a new one:
I ended up using the Node module 'gridfs-stream' (great documentation there!), which can be installed via npm.
With it, and in combination with Mongoose, it could look like this:
var fs = require('fs');
var mongoose = require("mongoose");
var Grid = require('gridfs-stream');

var GridFS = Grid(mongoose.connection.db, mongoose.mongo);

function putFile(path, name, callback) {
  var writestream = GridFS.createWriteStream({
    filename: name
  });
  writestream.on('close', function (file) {
    callback(null, file);
  });
  fs.createReadStream(path).pipe(writestream);
}
Note that path is the path of the file on the local system.
As for reading the file back, in my case I just need to stream it to the browser (using Express):
try {
  var readstream = GridFS.createReadStream({ _id: id });
  readstream.pipe(res);
} catch (err) {
  log.error(err);
  return next(errors.create(404, "File not found."));
}
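For completeness, a hypothetical Express route wiring putFile into an upload handler (it assumes multer's disk storage has already written the upload to the local file system, matching the path parameter above; the field name 'image' and dest folder are assumptions):
// Hypothetical usage of the putFile() helper defined above.
var express = require('express');
var multer = require('multer');

var app = express();
var upload = multer({ dest: 'uploads/' }); // writes the upload to disk first

app.post('/upload', upload.single('image'), function (req, res) {
  putFile(req.file.path, req.file.originalname, function (err, file) {
    if (err) { return res.status(500).send(err.message); }
    res.json({ id: file._id }); // the stored GridFS file's id
  });
});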
Answers so far are good; however, I believe it would be beneficial to document here how to do this using the official MongoDB Node.js driver, instead of relying on further abstractions such as "gridfs-stream".
One previous answer did utilize the official mongodb driver; however, they used the Gridstore API, which has since been deprecated (see here). My example will use the new GridFSBucket API.
The question is quite broad, so my answer will be an entire Node.js program, including setting up the Express server and MongoDB driver, defining the routes, and handling the GET and POST routes.
Npm Packages Used
express (nodejs web application framework to simplify this snippet)
multer (for handling multipart/form-data requests)
mongodb (official mongodb nodejs driver)
The GET photo route takes a Mongo ObjectID as a parameter to retrieve the image.
I configure multer to keep the uploaded file in memory. This means the photo file is never written to the file system, and is instead streamed straight from memory into GridFS.
/**
 * NPM Module dependencies.
 */
const express = require('express');
const photoRoute = express.Router();

const multer = require('multer');
var storage = multer.memoryStorage()
var upload = multer({ storage: storage, limits: { fields: 1, fileSize: 6000000, files: 1, parts: 2 }});

const mongodb = require('mongodb');
const MongoClient = require('mongodb').MongoClient;
const ObjectID = require('mongodb').ObjectID;
let db;

/**
 * NodeJS Module dependencies.
 */
const { Readable } = require('stream');

/**
 * Create Express server && Routes configuration.
 */
const app = express();
app.use('/photos', photoRoute);

/**
 * Connect Mongo Driver to MongoDB.
 */
MongoClient.connect('mongodb://localhost/photoDB', (err, database) => {
  if (err) {
    console.log('MongoDB Connection Error. Please make sure that MongoDB is running.');
    process.exit(1);
  }
  db = database;
});

/**
 * GET photo by ID Route
 */
photoRoute.get('/:photoID', (req, res) => {
  try {
    var photoID = new ObjectID(req.params.photoID);
  } catch (err) {
    return res.status(400).json({ message: "Invalid PhotoID in URL parameter. Must be a single String of 12 bytes or a string of 24 hex characters" });
  }

  let bucket = new mongodb.GridFSBucket(db, {
    bucketName: 'photos'
  });

  let downloadStream = bucket.openDownloadStream(photoID);
  downloadStream.on('data', (chunk) => {
    res.write(chunk);
  });
  downloadStream.on('error', () => {
    res.sendStatus(404);
  });
  downloadStream.on('end', () => {
    res.end();
  });
});

/**
 * POST photo Route
 */
photoRoute.post('/', (req, res) => {
  upload.single('photo')(req, res, (err) => {
    if (err) {
      return res.status(400).json({ message: "Upload Request Validation Failed" });
    } else if (!req.body.name) {
      return res.status(400).json({ message: "No photo name in request body" });
    }
    let photoName = req.body.name;

    // Convert buffer to Readable Stream
    const readablePhotoStream = new Readable();
    readablePhotoStream.push(req.file.buffer);
    readablePhotoStream.push(null);

    let bucket = new mongodb.GridFSBucket(db, {
      bucketName: 'photos'
    });

    let uploadStream = bucket.openUploadStream(photoName);
    let id = uploadStream.id;
    readablePhotoStream.pipe(uploadStream);

    uploadStream.on('error', () => {
      return res.status(500).json({ message: "Error uploading file" });
    });
    uploadStream.on('finish', () => {
      return res.status(201).json({ message: "File uploaded successfully, stored under Mongo ObjectID: " + id });
    });
  });
});

app.listen(3005, () => {
  console.log("App listening on port 3005!");
});
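A hypothetical smoke test for the GET route above (pass an ObjectID printed by the POST route as the first CLI argument):
// Sketch: download a stored photo and save it locally.
const http = require('http');
const fs = require('fs');

const photoId = process.argv[2]; // an ObjectID returned by the POST route
http.get('http://localhost:3005/photos/' + photoId, (res) => {
  res.pipe(fs.createWriteStream('photo.jpg')); // stream the image to disk
});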
I wrote a blog post on this subject; it is an elaboration of my answer. Available here.
Further Reading/Inspiration:
NodeJs Streams: Everything you need to know
Multer NPM docs
Nodejs MongoDB Driver
I suggest taking a look at this question: Problem with MongoDB GridFS Saving Files with Node.JS
Copied example from the answer (credit goes to christkv):
// You can use an object id as well as filename now
var gs = new mongodb.GridStore(this.db, filename, "w", {
  "chunk_size": 1024 * 4,
  metadata: {
    hashpath: gridfs_name,
    hash: hash,
    name: name
  }
});

gs.open(function (err, store) {
  // Write data and automatically close on finished write
  gs.writeBuffer(data, true, function (err, chunk) {
    // Each file has an md5 in the file structure
    cb(err, hash, chunk);
  });
});
It looks like writeBuffer has since been deprecated; the driver's HISTORY file says:
/Users/kmandrup/private/repos/node-mongodb-native/HISTORY:
  * Fixed dereference method on Db class to correctly dereference Db reference objects.
  * Moved connect object onto Db class(Db.connect) as well as keeping backward compatibility.
  * Removed writeBuffer method from gridstore, write handles switching automatically now.
  * Changed readBuffer to read on Gridstore, Gridstore now only supports Binary Buffers no Strings anymore.
Remove the fileupload library,
and if it is giving some multipart-header-related error, then remove the Content-Type from the headers.