Uploading images to S3 through Stitch AWS service fails - MongoDB

Sorry, I am a noob, but I am building a Quasar frontend using MongoDB Stitch as the backend.
I am trying to upload an image using the Stitch JavaScript SDK and the AwsRequest.Builder.
Quasar gives me an image object with base64-encoded data.
I remove the header from the base64 string (the part that says "data:image/jpeg;base64,"), convert the rest to binary, and upload it to the AWS S3 bucket.
I can get the data to upload just fine, and when I download it again I get the exact bytes that I uploaded, so the round trip through Stitch to AWS S3 and back seems to work.
However, the uploaded image can neither be opened in S3 nor once downloaded.
The problem seems to lie in the conversion of the base64 string to binary and/or in the choice of the proper upload parameters for Stitch.
Here is my code:
var fileSrc = file.__img.src // valid base64-encoded image with header string
var fileData = fileSrc.substr(fileSrc.indexOf(',') + 1) // strip out the header string
var body = BSON.Binary.fromBase64(fileData, 0) // here I get the BSON error

const args = {
  ACL: 'public-read',
  Bucket: 'elever-erp-document-store',
  ContentType: file.type,
  ContentEncoding: 'x-www-form-urlencoded', // not sure about the need to specify encoding for a binary file
  Key: file.name,
  Body: body
}

const request = new AwsRequest.Builder()
  .withService('s3')
  .withRegion('eu-west-1')
  .withAction('PutObject')
  .withArgs(args)

aws.execute(request.build())
  .then(result => {
    alert('OK ' + result)
    return file
  }).catch(err => {
    alert('error ' + err)
  })
In the snippet above I try to use BSON.Binary.fromBase64 for the conversion to binary, as per Haley's suggestion below, but I get the following error:
boot_stitch__WEBPACK_IMPORTED_MODULE_3__["BSON"].Binary.fromBase64 is not a function.
I have also tried other ways to convert the base64 string to binary, like the vanilla atob() function and the buffer npm module, but with no joy.
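For illustration, the atob() route would look something like this (a sketch; fileData as above; atob() decodes a base64 string into a binary string with one byte per character):

// decode the base64 payload into raw bytes in the browser
var binaryString = atob(fileData)
var bytes = new Uint8Array(binaryString.length)
for (var i = 0; i < binaryString.length; i++) {
  bytes[i] = binaryString.charCodeAt(i)
}
// bytes now holds the raw image data, but passing it as Body did not work either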
I must be doing something stupid somewhere, but I cannot find my way out.

I had a similar issue and solved it by creating a buffer from the base64 data and then using new BSON.Binary(new Uint8Array(fileBuffer), 0) to create the BSON Binary object.
Using the OP's code it would look something like this:
var fileSrc = file.__img.src // valid base64 encoded image with header string
var fileData = fileSrc.substr(fileSrc.indexOf(',') + 1) // stripping out header string
var fileBuffer = new Buffer(fileData, 'base64');
var body = new BSON.Binary(new Uint8Array(fileBuffer), 0)
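Note: on current Node versions the new Buffer(...) constructor is deprecated; Buffer.from is the modern equivalent and produces the same bytes:

// modern replacement for the new Buffer(...) line above
var fileBuffer = Buffer.from(fileData, 'base64');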

You should be able to convert the base64 image to BSON.Binary and then upload the actual image that way (I have some of the values hard-coded, but you can replace those):
context.services.get("<aws-svc-name>").s3("<your-region>").PutObject({
  Bucket: 'myBucket',
  Key: "hello.png",
  ContentType: "image/png",
  Body: BSON.Binary.fromBase64("iVBORw0KGgoAA... (rest of the base64 string)", 0),
})
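In a real Stitch function the base64 payload would typically arrive as an argument from the client rather than being hard-coded; a sketch along those lines (the argument names are assumptions):

// Hypothetical Stitch function receiving the file details from the client
exports = function(fileName, contentType, base64Data) {
  const s3 = context.services.get("<aws-svc-name>").s3("<your-region>");
  return s3.PutObject({
    Bucket: 'myBucket',
    Key: fileName,
    ContentType: contentType,
    Body: BSON.Binary.fromBase64(base64Data, 0)
  });
};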

Related

Using uppy.io to send base64-encoded data rather than specifying a file input

Is there a way to send base64-encoded data using uppy.io? I already have it working for 'soft-copy' document uploads using the Dashboard component, but I can't seem to work out a way to pass the file bytes directly instead of using an input file tag to provide the data to be uploaded.
Context:
I have a page that uses a JavaScript component to access local scanner hardware. It scans, shows a preview, all working. The user then hits an upload button to push it to the server; the scanning component outputs the scan as base64-encoded data. I can send this up to the server using XMLHttpRequest like so:
var req = new XMLHttpRequest();
var formData = new FormData();
formData.append('fileName', uploadFileName);
formData.append('imageFileAsBase64String', imageFileAsBase64String);
req.open("POST", uploadFormUrl);
req.onreadystatechange = __uploadImages_readyStateChanged;
req.send(formData);
but I would really like to use uppy because scan files can be quite large and I get resumable uploads, a nice progress bar, etc., and I already have tusdotnet on the back end set up and ready to receive it.
All the examples rely on input tags, so I don't really know what approach to take. Thanks for any pointers.
I eventually figured this out; here it is in case it's useful to anyone else.
You can use fetch to convert the base64 string, then turn it into a blob, and finally add it to the uppy files via the addFile API.
I referenced this article:
https://ionicframework.com/blog/converting-a-base64-string-to-a-blob-in-javascript/
The code below works with my setup, with tusdotnet handling the tus service server-side.
var uppy = new Uppy.Core({
  autoProceed: true,
  debug: true
})
  .use(Uppy.Tus, { endpoint: 'https://localhost:44302/files/' })
  .use(Uppy.ProgressBar, {
    target: '.UppyInput-Progress',
    hideAfterFinish: false,
  })

uppy.on('upload', (data) => {
  uppy.setMeta({ md: 'value' })
})

uppy.on('complete', (result) => {
  // do completion stuff
})

fetch(`data:image/jpeg;base64,${imageFileAsBase64String}`)
  .then((response) => response.blob())
  .then((blob) => {
    uppy.addFile({
      name: 'image.jpg',
      type: 'image/jpeg',
      data: blob
    })
  });

Flutter Dio multipart image, Cannot read property '0' of undefined

I am trying to send multiple images to the backend server.
I have tried Dio and http, but the server responds with an error, and all of the data except the images gets saved.
The documentation says that the key values must be 'images_0' 'images_1' and so on.
Code -
The images are converted from XFile to MultipartFile.
int imgNum = 0;
for (int i = 0; i < data.images.length; i++) {
  if (data.images[i] != null) {
    File file = File(data.images[i]!.path);
    String fileName = file.path.split('/').last;
    http.MultipartFile mFile = http.MultipartFile(
      'images_$imgNum',
      file.readAsBytes().asStream(),
      await file.length(),
      filename: fileName,
    );
    imgNum++;
    request.files.add(mFile);
  }
}
(data.images is a List<XFile?> and request is the http.MultipartRequest)
All the other fields and headers are working and being saved correctly.
I have tried using Dio too, which gives the same response. The response from the server is:
{data: {error: Cannot read property '0' of undefined}, message: Something went wrong. Please try again later.}
Our project also has a website where the form works correctly; in Google Chrome the HTTP request there shows the images under Form Data as follows:
[screenshot: Form Data in Chrome DevTools]
The API team has also shared the Swagger UI API docs:
[screenshot: Swagger UI API docs]
What am I doing wrong? How can I fix this?
Thank you for responding.

How to upload a file to MongoDB with Mongoose using NestJS?

Hello, please, does somebody know how to upload a file to MongoDB (Mongoose) using NestJS?
I can already #Post an uploaded file to my NestJS project and #Get it back, but now I want to post it to MongoDB using Mongoose. Please help.
I don't recommend storing images in your database, but you can do this:
async function saveFile(file: Express.Multer.File) {
  // Convert the file to a base64 string
  const fileB64 = file.buffer.toString('base64')
  // userModel is a mongoose model
  // Store the string
  await this.userModel.create({ file: fileB64 })
}

async function getFile(userId: string) {
  // Get the user from the database
  const user = await this.userModel.findOne({ _id: userId }).lean()
  if (!user) throw new NotFoundException('User not found')
  const file = user.file
  // Convert the string back to a buffer
  return Buffer.from(file, 'base64')
}
First you have to convert the file to a base64-encoded string; then you can save that string in your database with the create method or by updating a document.
If you want to get that file back, just look it up in your database, convert the string to a buffer, and return it.
Like I said before, I don't recommend this; it is better to upload the buffer to S3 and save the link in your database, as sketched below.
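A minimal sketch of that alternative, assuming an aws-sdk v2 S3 client and a fileUrl field on the user model (both are assumptions, not part of the answer above):

// Upload the raw buffer to S3 and store only the resulting link
const { S3 } = require('aws-sdk')
const s3 = new S3()

async function saveFileToS3(file, userId) {
  const result = await s3.upload({
    Bucket: 'my-bucket',                   // assumed bucket name
    Key: `${userId}/${file.originalname}`, // multer file object, as above
    Body: file.buffer,
    ContentType: file.mimetype
  }).promise()
  // userModel is a mongoose model, as above; store only the URL
  await this.userModel.updateOne({ _id: userId }, { fileUrl: result.Location })
}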
Thanks, it worked, but all I get is a buffer and I can't see the image! Please, is there any other option to get it as an image? Here is what I'm getting:
{"type":"Buffer","data":[255,216,255,224,0,16,74,70,73,70,0,1,1,0,0,72,0,72,0,0,255,225,0,120,69,120,105,102,0,0,73,73,42,0,8,0,0,0,4,0,18,1,3,0,1,0,0,0,1,0,0,0,49,1,2,0,7,0,0,0,62,0,0,0,18,2,3,0,2,0,0,0,2,0,2,0,105,135,4,0,1,0,0,0,70,0,0,0,0,0,0,0,71,111,111,103,108,101,0,0,3,0,0,144,7,0,4,0,0,0,48,50,50,48,2,160,4,0,1,0,0,0,208,2,0,0,3,160,4,0,1,0,0,0,0,5,0,0,0,0,0,0,255,192,0,17,8,5,0,2,208,3,1,34,0,2,17,1,3,17,1,255,196,0,31,0,0,1,5,1,1,1,1,1,1,0,0,0,0,0,0,0,0,1,2,3,4,5,6,7,8,9,10,11,255,196,0,181,16,0,2,1,3,3,2,4,3,5,5,4,4,0,0,1,125,1,2,3,0,4,17,5,18,33,49,65,6,19,81,97,7,34,113,20,50,129,145,161,8,35,66,177,193....
Angular service file
postFile(fileToUpload: File): Observable<any> {
  const formaData: FormData = new FormData();
  formaData.append('fileKey', fileToUpload, fileToUpload.name);
  return this.http.post(`${this.backEndURL}/api/prods/upload/two/tree/`, JSON.stringify(formaData));
}
but my NestJS backend throws this error:
[ExceptionsHandler] Cannot read property 'buffer' of undefined +80859ms
TypeError: Cannot read property 'buffer' of undefined

Storing data URLs in MongoDB

I am considering storing data URLs in my MongoDB instead of storing a reference to a file or using GridFS.
Data URL:
data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAMcAAAEsCAYAAAB38aczAAAgAElEQV
All of the files I am storing are JPG or PNG, and all are less than 1 MB in size.
I am wondering whether this is considered bad practice, and what the performance implications are for both read and write operations when storing the data URLs 1) in a separate collection or 2) as metadata in a collection.
I'm open to any other suggestions for small file storage.
First, I wouldn't store base64-encoded data in a database that is perfectly capable of storing binary data. That's just a waste of space. Store the image itself, not its base64 representation, i.e. not data : "VBORw0KGgoAAAANSUhEUgAAA...", but data : BinData("VBORw0KGgoAAAANSUhEUgAAA...") (the former is a string for MongoDB, the latter is binary data). Base64 increases the size by 33%.
Other than that, I think this is fine. The trade-off is one request that grabs all the data vs. multiple requests. The downside of storing larger chunks of data is that all of it must be in RAM for a moment, but at 1 MB that's probably a non-issue.
You should, however, make sure that you don't fetch the document in situations where you don't need the image. 1 MB isn't much, but for a read-heavy collection it's a disaster.
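A minimal sketch of that advice with the Node.js driver (the collection and field names are assumptions):

// Store the raw bytes as BinData instead of base64 text
// (inside an async function, with `db` already connected)
const { Binary } = require('mongodb')
const fs = require('fs')

const bytes = fs.readFileSync('image.png') // raw binary data
await db.collection('images').insertOne({
  contentType: 'image/png',
  data: new Binary(bytes)                  // stored as BinData, ~25% smaller than the base64 text
})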
I just finished a solution for this. It works with Ajax, so you can use fetch calls in JavaScript with it. The strange thing is that this solution could be found nowhere on the whole internet, and that's why I put it on here to help others who want to work with images as data URIs :-)
Model:
cmsImage: { data: Buffer, contentType: String }
Storing in MongoDB:
let rawData = fs.readFileSync(`${root}public/uploads/` + file.originalname);
let base64Data = Buffer.from(rawData).toString('base64');
// upload this image
let image = {
  cmsImage: {
    data: base64Data,
    contentType: file.mimetype
  }
};
// into this record in the database
await this.model.findByIdAndUpdate(body.id, image);
Retrieving from MongoDB and creating the image element dynamically:
// the stored value is base64 text inside a Buffer; decode it back to a string
let imageArray = new Int8Array(image.data.data);
let decodedImage = new TextDecoder().decode(imageArray);
// build the image element with a data URI as its source
let cmsImage = document.createElement("img");
cmsImage.src = "data:" + image.contentType + ";base64," + decodedImage;
cmsImage.alt = "image";
cmsContent.appendChild(cmsImage);
Multer - use the original file name for the upload to the database.
let storage = multer.diskStorage({
  destination: function (req, file, cb) {
    cb(null, './public/uploads')
  },
  filename: function (req, file, cb) {
    cb(null, file.originalname)
  }
});
this.upload = multer({ storage: storage });
Upload to directory:
this.upload.single(uploadImage)
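Wired into an Express route, that middleware would be used something like this (a sketch; the route path and field name are assumptions):

// Hypothetical route: multer writes the upload to ./public/uploads first
app.post('/upload', this.upload.single('uploadImage'), async (req, res) => {
  // req.file.originalname is now on disk and can be read with fs.readFileSync as above
  res.sendStatus(200)
})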

Meteor: Saving images from urls to AWS S3 storage

I am trying, server-side, to take an image from the web by its URL (e.g. http://www.skrenta.com/images/stackoverflow.jpg) and save this image to my AWS S3 bucket using Meteor, the aws-sdk meteorite package, and the http meteor package.
This is my attempt, which does put a file in my bucket (someImageFile.jpg), but the image file is corrupted and cannot be displayed by a browser or a viewer application.
Probably I am doing something wrong with the encoding of the file. I tried many combinations and none of them worked. I also tried adding ContentLength and/or ContentEncoding with different encodings like binary, hex, and base64 (also in combination with Buffer.toString("base64")); none of them worked. Any advice will be greatly appreciated!
This is in my server-side-code:
var url="http://www.skrenta.com/images/stackoverflow.jpg";
HTTP.get(url, function(err, data) {
if (err) {
console.log("Error: " + err);
} else {
//console.log("Result: "+JSON.stringify(data));
//uncommenting above line fills up the console with raw image data
s3.putObject({
ACL:"public-read",
Bucket:"MY_BUCKET",
Key: "someImageFile.jpg",
Body: new Buffer(data.content,"binary"),
ContentType: data.headers["content-type"], // = image/jpeg
//ContentLength: parseInt(data.headers["content-length"]),
//ContentEncoding: "binary"
},
function(err,data){ // CALLBACK OF HTTP GET
if(err){
console.log("S3 Error: "+err);
}else{
console.log("S3 Data: "+JSON.stringify(data));
}
}
);
}
});
Actually I am trying to use the filepicker.io REST API via HTTP calls, e.g. for storing a converted image in my S3 bucket, but this is the minimal example that demonstrates the actual problem.
After several trial-and-error runs I gave up on Meteor.HTTP and put together the code below; maybe it will help somebody who runs into encoding issues with Meteor.HTTP.
Meteor.HTTP seems to be meant for just fetching some JSON or text data from remote APIs and such; it is not quite the right choice for binary data. The Npm http module, however, definitely does support binary data, so this works like a charm:
var http = Npm.require("http");
var url = "http://www.whatever.com/check.jpg";
var req = http.get(url, function(resp) {
  var buf = new Buffer("", "binary");
  resp.on('data', function(chunk) {
    buf = Buffer.concat([buf, chunk]);
  });
  resp.on('end', function() {
    var thisObject = {
      ACL: "public-read",
      Bucket: "mybucket",
      Key: "myNiceImage.jpg",
      Body: buf,
      ContentType: resp.headers["content-type"],
      ContentLength: buf.length
    };
    s3.putObject(thisObject, function(err, data) {
      if (err) {
        console.log("S3 Error: " + err);
      } else {
        console.log("S3 Data: " + JSON.stringify(data));
      }
    });
  });
});
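On current Node versions the new Buffer(...) constructor used above is deprecated; Buffer.alloc and Buffer.from are the modern equivalents:

// modern replacements for the deprecated constructor
var buf = Buffer.alloc(0);         // instead of new Buffer("", "binary")
buf = Buffer.concat([buf, chunk]); // concatenation is unchanged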
The best solution is to look at what has already been done in this regard:
https://github.com/Lepozepo/S3
Also filepicker.io seems pretty simple:
Integrating Filepicker.IO with Meteor