StitchServiceError "aws: "aws_service" is a required string", errorCodeName: InvalidParameter - mongodb

I'm setting up an AWS S3 bucket to upload audio files to using MongoDB Stitch (here are the docs: MongoDB S3 docs). After following the instructions and authenticating my user, I keep getting this error when trying to upload the selected file (see the error screenshot from the console).
On line 119, where the error comes from, I'm just catching the error after running the AWS request build:
const aws = stitchClient.getServiceClient(AwsServiceClient.factory, "AWS");
convertAudioToBSONBinaryObject(file).then((result) => {
  const audiofile = mongodb.db("data").collection("audiofile");
  //now we need an instance of AWS service client
  const key = `${stitchClient.auth.user.id}-${file.name}`;
  // const key = `${stitchClient.auth.user.id}-${file.name}`;
  const bucket = "myBucketName";
  const url =
    "http://" + bucket + ".s3.amazonaws.com/" + encodeURIComponent(key);
  const args = {
    ACL: "public-read",
    Bucket: bucket,
    ContentType: file.type,
    Key: key,
    Body: result,
    // aws_service: "s3",
  };
  // building the request
  const request = new AwsRequest.Builder()
    .withService("s3")
    .withAction("PutObject")
    .withRegion("us-east-1")
    .withArgs(args);
  aws
    .execute(request.build)
    .then((result) => {
      console.log(result);
      console.log(url);
      return audiofile.insertOne({
        owner_id: stitchClient.auth.user.id,
        url,
        file: {
          name: file.name,
          type: file.type,
        },
        Etag: result.Etag,
        ts: new Date(),
      });
    })
    .then((result) => {
      console.log("last result", result);
    })
    .catch((err) => {
      console.log(err);
    });
});
My Stitch rule for S3 looks like this (screenshot: Stitch rule for AWS S3).
So it seems to me that everything is set up the way it's intended to be, but the error tells me I'm not passing all of the required args. I'd really appreciate any thoughts on how to fix this error.
P.S. If I change "AWS" to "AWS_S3" in this line:
const aws = stitchClient.getServiceClient(AwsServiceClient.factory, "AWS");
The error message changes to this:
StitchServiceError {message: "service not found: 'AWS_S3'", name: "StitchServiceError", errorCode: 18, errorCodeName: "ServiceNotFound",
And the Stitch log shows this information for both errors (screenshot: Stitch Logs).

The answer to this is a simple typo in this line:
aws
  .execute(request.build)
  .then((result) => {
build is a function, so I just needed to call it: aws.execute(request.build()).then((result) => ...).
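For reference, the corrected call (same request object as above) looks like this:
aws
  .execute(request.build())
  .then((result) => {
    console.log(result);
  });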
Issue solved, thanks all!

Related

Connection between MongoDB Atlas and AWS giving timeout error?

I'm new to AWS so I apologize for any newbie stuff.
I'm trying to connect a MongoDB Atlas M0 cluster to our AWS EC2 instance, which is running a Node.js/React stack. The problem is that I can't make these two instances connect - AWS and MongoDB, that is. When trying to use the backend sign-in function (our Node.js API), it just gives this error:
Operation `user_profile.findOne()` buffering timed out after 10000ms
This is our index / connection:
import config from './config';
import app from './app';
import { connect } from 'mongoose'; // MongoDB
import { ServerApiVersion } from 'mongodb';
import https from 'https';
import AWS from 'aws-sdk';

const makeLogger = (bucket: string) => {
  const s3 = new AWS.S3({
    accessKeyId: <ACCESS_KEY_ID>,
    secretAccessKey: <SECRET_ACCESS_KEY>
  });
  return (logData: any, filename: string) => {
    s3.upload({
      Bucket: bucket, // pass your bucket name
      Key: filename, // file will be saved as testBucket/contacts.csv
      Body: JSON.stringify(logData, null, 2)
    }, function (s3Err: any, data: any) {
      if (s3Err) throw s3Err
      console.log(`File uploaded successfully at ${data.Location}`)
    });
    console.log(`log (${filename}): ${logData}`);
  };
};

const log = makeLogger('xxx-xxxx');
log(config.MONGO_DB_ADDRESS, 'mongo_db_address.txt');

const credentials = <CREDENTIALS>

connect(config.MONGO_DB_ADDRESS, {
  sslKey: credentials,
  sslCert: credentials,
  serverApi: ServerApiVersion.v1
}) //, { useNewUrlParser: true })
  .then(() => console.log('Connected to MongoDB'))
  .catch((err) => console.error('Failed connection to MongoDB', err));

app.on('error', error => {
  console.error('app error: ' + error);
});

app.listen(config.WEB_PORT, () => {
  console.log(`Example app listening on port ${config.WEB_PORT}`);
});
One of the endpoints giving the timeout error:
router.post('/signin', async (req, res) => {
  var form_validation = signin_schema.validate({
    email: req.body.email,
    password: req.body.password,
  });

  if (form_validation.error) {
    console.log('form validation sent');
    //return res.status(400).send(form_validation);
    return res.status(400).send({
      kind: 'ERROR',
      message: 'Sorry - something didn\'t go well. Please try again.'
    });
  }

  var User = model('model', UserSchema, 'user_profile');

  User.findOne({ email: req.body.email }, (err: any, the_user: any) => {
    if (err) {
      return res.status(400).send({
        kind: 'ERROR',
        message: err.message
      });
    }
    if (!the_user) {
      return res.status(400).send({
        kind: 'ERROR',
        message: 'the_user undefined',
      });
    }
    compare(req.body.password, the_user.password)
      .then((result) => {
        if (result == true) {
          const user_payload = { name: the_user.name, email: the_user.email };
          const access_token = sign(user_payload, config.SECRET_TOKEN);
          res.cookie('authorization', access_token, {
            httpOnly: true,
            secure: false,
            maxAge: 3600000,
          });
          return res.send({ kind: "LOADING" });
          // return res.send(access_token);
        } else {
          return res.status(400).send({
            kind: 'ERROR',
            message: 'Sorry - wrong password or email used.'
          });
        }
      })
  })
});
The strange thing is that I can connect from my local developer machine when running our frontend, just as I can connect from the WSL2 Ubuntu CLI.
On the Mongo side, I have whitelisted every possible IP address. On the AWS side, I have created the required outbound security group policy. Regarding the inbound rules, I think they are correct; I've allowed access on ports 27000-28018.
Again - I'm new to AWS, so if anyone can tell me what it is I'm simply not understanding here, I would be very grateful.
Thanks
Open MongoDB Atlas Network Access and add 0.0.0.0/0 to the IP access list (this includes your current IP address).
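If the timeout persists after opening network access, it can also help to make Mongoose fail fast instead of buffering queries for 10 seconds, so the underlying connectivity error is surfaced. A minimal sketch, assuming Mongoose 6+ and a placeholder connection string:
import mongoose from 'mongoose';

// Disable command buffering so queries fail immediately when there is no connection,
// instead of reporting "buffering timed out after 10000ms".
mongoose.set('bufferCommands', false);

mongoose.connect('<MONGO_DB_ADDRESS>', { serverSelectionTimeoutMS: 5000 })
  .then(() => console.log('Connected to MongoDB Atlas'))
  .catch((err) => console.error('Could not reach Atlas from this instance:', err));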

How to solve Vercel 500 Internal Server Error?

I have created a project that uses MongoDB to store user info and NextAuth to authenticate users. On localhost this is all working seamlessly. Previously I had a couple of errors with my next-auth config, but that seems to be working fine now on the Vercel live site. Once the user logs in, they are redirected to "my-project/suggestions". On this page I am using getServerSideProps to check whether there is a valid session token. If so, data is pulled from a local JSON file.
On the live site, when the user logs in, the page is redirected to "/suggestions", yet I am receiving a 500 Internal Server Error page. In the function logs I am getting this error message:
[GET] /_next/data/KpsnuV9k44lUAhQ-0rK-B/suggestions.json
10:10:57:12
2022-05-05T14:10:59.270Z 5b7a7375-045f-4518-864b-7968c3c9385f ERROR [Error: ENOENT: no such file or directory, open '/var/task/public/data/data.json'] {
errno: -2,
syscall: 'open',
path: '/var/task/public/data/data.json',
page: '/suggestions'
}
RequestId: 5b7a7375-045f-4518-864b-7968c3c9385f Error: Runtime exited with error: exit status 1
Runtime.ExitError
This is my first project using MongoDB and NextAuth, so I'm not sure what the issue is in this case. In my .env.local file I only have these two variables:
NEXTAUTH_SECRET="MUNKNATION"
NEXTAUTH_URL=http://localhost:3000
How I am pulling the data on local host:
export const getServerSideProps = async (context) => {
  const session = await getSession({ req: context.req });
  if (!session) {
    return {
      redirect: {
        destination: "/",
        permanent: false,
      },
    };
  } else {
    let filePath = path.join(process.cwd(), "public", "data", "data.json");
    let jsonData = await fs.readFile(filePath);
    const data = JSON.parse(jsonData);
    const inProgressStatusData = data.productRequests.filter(
      (item) => item.status == "in-progress"
    );
    const liveStatusData = data.productRequests.filter(
      (item) => item.status == "live"
    );
    const plannedStatusData = data.productRequests.filter(
      (item) => item.status == "planned"
    );
    let filterData = filteredData(data, "suggestion");
    let feedbackData = {
      suggestions: filterData,
      progress: inProgressStatusData,
      planned: plannedStatusData,
      live: liveStatusData,
    };
    return {
      props: { session, feedbackData },
    };
  }
};
Folder structure:
A simple solution to this problem would be, inside your getServerSideProps, to use readFileSync instead of readFile, as follows:
export const getServerSideProps = async (context) => {
  ...
  const file = readFileSync(
    join(process.cwd(), "public", "data", "data.json"),
    "utf8"
  );
  const data = JSON.parse(file);
  ...
I have tested this solution with Vercel and it works correctly, in development and production mode.
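For completeness, readFileSync and join in the snippet above come from Node's built-in fs and path modules:
import { readFileSync } from "fs";
import { join } from "path";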

HashiCorp Vault + node-vault + write = 404

I'm trying a simple operation using node-vault but it is not working. Here is my attempt:
Configuration
var options = {
  apiVersion: 'v2', // default
  endpoint: 'http://127.0.0.1:8200', // default
};

// get new instance of the client
var vault = require("node-vault")(options);
vault.token = "<<MY TOKEN>>";
Usage
vault.write('secret/data/new', {"data": {"foo": "bar"}}).then(
  function (value: any) {
    console.log(value);
  })
  .catch((err: any) => {
    console.log(err);
  });
Response
{ statusCode: 404, body: { errors: [] } }
But if I run vault kv put secret/data/new foo=bar, it does work and the value is there.
What is going on?
Thank you all and I wish a happy new year!
OK, here is what I did:
Reinstall Vault - something happened to the storage because of all the attempts and commands I had run against it.
Enable the secrets engine at a specific path: vault secrets enable -path=testPath kv
Write to this path.
Configure:
export const VAULT_OPTIONS = {
  apiVersion: 'v1',
  endpoint: 'http://127.0.0.1:8200',
  token: '<<YOUR TOKEN>>'
};

vault = require("node-vault")(VAULT_OPTIONS);
Write:
this.vault.write('test/data/mykey', {"data": {"tests": {"test1": "test1-value", "test2": "test2-value"}}}).then(
  (result: any) => {
    console.log(result.data);
  }, (error: any) => {
    console.log(error);
  });
Please note that the path must contain data, and the payload must be wrapped in a data object ({ data: { key: value } }) as well.
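To confirm the write landed, you can read the same path back; a minimal sketch using node-vault's read with the same client and path as above:
this.vault.read('test/data/mykey').then(
  (result: any) => {
    console.log(result.data); // for this mount, the wrapped payload written above comes back under data
  }, (error: any) => {
    console.log(error);
  });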

rest-hapi standalone endpoint not returning handler results

Forgive me if it's a silly question, but the last time I coded in JavaScript was almost 20 years ago... I'm re-learning JavaScript these weeks and I'm not sure I've got it all.
I'm using hapi with rest-hapi and want to add some standalone endpoints, basically translating the backend portion of this Autodesk tutorial from Express.
I'm using the basic rest-hapi example main script, and tried to add a route with the following code:
//api/forge.js
module.exports = function(server, mongoose, logger) {
  const Axios = require('axios')
  const querystring = require('querystring')
  const Boom = require('boom')

  const FORGE_CLIENT_ID = process.env.FORGE_CLIENT_ID
  const FORGE_CLIENT_SECRET = process.env.FORGE_CLIENT_SECRET
  const AUTH_URL = 'https://developer.api.autodesk.com/authentication/v1/authenticate'

  const oauthPublicHandler = async(request, h) => {
    const Log = logger.bind('User Token')
    try {
      const response = await Axios({
        method: 'POST',
        url: AUTH_URL,
        headers: {
          'content-type': 'application/x-www-form-urlencoded',
        },
        data: querystring.stringify({
          client_id: FORGE_CLIENT_ID,
          client_secret: FORGE_CLIENT_SECRET,
          grant_type: 'client_credentials',
          scope: 'viewables:read'
        })
      })
      Log.note('Forge access token retrieved: ' + response.data.access_token)
      return h.response(response.data).code(200)
    } catch(err) {
      if (!err.isBoom){
        Log.error(err)
        throw Boom.badImplementation(err)
      } else {
        throw err
      }
    }
  }

  server.route({
    method: 'GET',
    path: '/api/forge/oauth/public',
    options: {
      handler: oauthPublicHandler,
      tags: [ 'api' ],
      plugins: {
        'hapi-swagger': {}
      }
    }
  })
}
The code works and I can see the access_token in the Node.js console, but Swagger doesn't get the response:
At first I thought that an async function cannot be used as a handler, but my hapi version is 17.4.0, and it supports async handlers.
What am I doing wrong?
It turns out it was an easy fix: I just needed to specify the Hapi server hostname in my main script!
The problem was with CORS, since Hapi used my machine name instead of localhost. Using
let server = Hapi.Server({
  port: 8080,
  host: 'localhost'
})
solved my problem.
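If cross-origin requests are still an issue, CORS can also be enabled explicitly in the server's route defaults; a sketch, assuming hapi 17+ (the wildcard origin is only illustrative):
let server = Hapi.server({
  port: 8080,
  host: 'localhost',
  routes: {
    cors: {
      origin: ['*'] // tighten this to your front-end origin in production
    }
  }
})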

Image returned from REST API always displays broken

I am building a content management system for an art portfolio app with React. The client POSTs to the API, which uses Mongoose to insert the image into MongoDB. The API then queries the DB for the newly inserted image and returns it to the client.
Here's my code to connect to MongoDB using Mongoose:
mongoose.connect('mongodb://localhost/test').then(() =>
  console.log('connected to db')).catch(err => console.log(err))

mongoose.Promise = global.Promise

const db = mongoose.connection
db.on('error', console.error.bind(console, 'MongoDB connection error:'))

const Schema = mongoose.Schema;

const ImgSchema = new Schema({
  img: { data: Buffer, contentType: String }
})

const Img = mongoose.model('Img', ImgSchema)
I am using multer and fs to handle the image file. My POST endpoint looks like this:
router.post('/', upload.single('image'), (req, res) => {
  if (!req.file) {
    res.send('no file')
  } else {
    const imgItem = new Img()
    imgItem.img.data = fs.readFileSync(req.file.path)
    imgItem.contentType = 'image/png'
    imgItem
      .save()
      .then(data =>
        Img.findById(data, (err, findImg) => {
          console.log(findImg.img)
          fs.writeFileSync('api/uploads/image.png', findImg.img.data)
          res.sendFile(__dirname + '/uploads/image.png')
        }))
  }
})
I can see in the file structure that writeFileSync is writing the image to the disk. res.sendFile grabs it and sends it down to the client.
Client side code looks like this:
handleSubmit = e => {
  e.preventDefault()
  const img = new FormData()
  img.append('image', this.state.file, this.state.file.name)
  axios
    .post('http://localhost:8000/api/gallery', img, {
      onUploadProgress: progressEvent => {
        console.log(progressEvent.loaded / progressEvent.total)
      }
    })
    .then(res => {
      console.log('responsed')
      console.log(res)
      const returnedFile = new File([res.data], 'image.png', { type: 'image/png' })
      const reader = new FileReader()
      reader.onloadend = () => {
        this.setState({ returnedFile, returned: reader.result })
      }
      reader.readAsDataURL(returnedFile)
    })
    .catch(err => console.log(err))
}
This does successfully place both the returned file and the img data URL on state. However, in my application, the image always displays broken.
Here are some screenshots:
How do I fix this?
Avoid sending back base64-encoded images (multiple images + large files + large encoded strings = very slow performance). I'd highly recommend creating a microservice that only handles image uploads and any other image-related GET/POST/PUT/DELETE requests. Separate it from your main application.
For example:
I use multer to create an image buffer
Then use sharp or fs to save the image (depending upon file type)
Then I send the filepath to my controller to be saved to my DB
Then, the front-end does a GET request when it tries to access: http://localhost:4000/uploads/timestamp-randomstring-originalname.fileext
In simple terms, my microservice acts like a CDN solely for images.
For example, a user sends a post request to http://localhost:4000/api/avatar/create with some FormData:
It first passes through some Express middlewares:
libs/middlewares.js
...
app.use(cors({ credentials: true, origin: "http://localhost:3000" })) // allows receiving of cookies from front-end

app.use(morgan(`tiny`)); // logging framework

app.use(multer({
  limits: {
    fileSize: 10240000,
    files: 1,
    fields: 1
  },
  fileFilter: (req, file, next) => {
    if (!/\.(jpe?g|png|gif|bmp)$/i.test(file.originalname)) {
      req.err = `That file extension is not accepted!`
      next(null, false)
    }
    next(null, true);
  }
}).single(`file`))

app.use(bodyParser.json()); // parses header requests (req.body)

app.use(bodyParser.urlencoded({ limit: `10mb`, extended: true })); // allows objects and arrays to be URL-encoded
...etc
Then, hits the avatars route:
routes/avatars.js
app.post(`/api/avatar/create`, requireAuth, saveImage, create);
It then passes through some user authentication, then goes through my saveImage middleware:
services/saveImage.js
const createRandomString = require('../shared/helpers');
const fs = require("fs");
const sharp = require("sharp");

// saveImage middleware (Express signature: req, res, next)
module.exports = (req, res, next) => {
  const randomString = createRandomString();

  if (req.err || !req.file) {
    return res.status(500).json({ err: req.err || `Unable to locate the requested file to be saved` })
    next();
  }

  const filename = `${Date.now()}-${randomString}-${req.file.originalname}`;
  const filepath = `uploads/${filename}`;

  const setFilePath = () => { req.file.path = filepath; return next(); };

  (/\.(gif|bmp)$/i.test(req.file.originalname))
    ? fs.writeFile(filepath, req.file.buffer, (err) => {
        if (err) {
          return res.status(500).json({ err: `There was a problem saving the image.` });
          next();
        }
        setFilePath();
      })
    : sharp(req.file.buffer).resize(256, 256).max().withoutEnlargement().toFile(filepath).then(() => setFilePath());
};
If the file is saved, it then sends a req.file.path to my create controller. This gets saved to my DB as a file path and as an image path (the avatarFilePath or /uploads/imagefile.ext is saved for removal purposes and the avatarURL or [http://localhost:4000]/uploads/imagefile.ext is saved and used for the front-end GET request):
controllers/avatars.js (I'm using Postgres, but you can substitute for Mongo)
create: async (req, res, done) => {
  try {
    const avatarurl = `${apiURL}/${req.file.path}`;
    await db.result("INSERT INTO avatars(userid, avatarURL, avatarFilePath) VALUES ($1, $2, $3)", [req.session.id, avatarurl, req.file.path]);
    res.status(201).json({ avatarurl });
  } catch (err) {
    return res.status(500).json({ err: err.toString() });
    done();
  }
}
Then when the front-end tries to access the uploads folder via <img src={avatarURL} alt="image" /> or <img src="[http://localhost:4000]/uploads/imagefile.ext" alt="image" />, it gets served up by the microservice:
libs/server.js
const express = require("express");
const path = app.get("path");
const PORT = 4000;
//============================================================//
// EXPRESS SERVE AVATAR IMAGES
//============================================================//
app.use(`/uploads`, express.static(`uploads`));
//============================================================//
/* CREATE EXPRESS SERVER */
//============================================================//
app.listen(PORT);
What it looks when logging requests:
19:17:54 INSERT INTO avatars(userid, avatarURL, avatarFilePath) VALUES ('08861626-b6d0-11e8-9047-672b670fe126', 'http://localhost:4000/uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png', 'uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png')
POST /api/avatar/create 201 109 - 61.614 ms
GET /uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png 200 3027 - 3.877 ms
What the user sees upon successful GET request: