Filepicker.io — Stop .pickMultiple From Automatically Storing File In S3 Bucket - filepicker.io

I'm trying to use Filepicker.io as an uploader but in order to fire an onSuccess event for each file in the payload I'm using a combination of the .pickMultiple and .store methods. Like so:
filepicker.pickMultiple(function(fpfiles){
    for(var i = 0; i < fpfiles.length; i++){
        //Clean the filename
        //Check duplicate
        //Store the file on S3
        filepicker.store(
            fpfiles[i].url,
            {location: 'S3', path: 'filepicker/' + fpfiles[i].filename},
            function(my_uploaded_file){
                //Do some other cool stuff ...
            }
        );
    }
});
(This is as opposed to using the .pickAndStore method, which only fires an onSuccess event after the entire payload has finished transmitting.)
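For reference, the .pickAndStore call being contrasted here looks roughly like the following; treat it as a sketch based on the legacy Filepicker docs, since option names may differ between API versions:

filepicker.pickAndStore(
    //Picker options
    {multiple: true},
    //Store options
    {location: 'S3', path: 'filepicker/'},
    //Fires once, with the array of stored Blobs, after the whole payload has been transmitted
    function(fpfiles){
        console.log(JSON.stringify(fpfiles));
    }
);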
The problem I'm having with the .pickMultiple + .store combination is that it seems as though the .pickMultiple method is 'auto-magically' saving a copy of the file in the root of my S3 bucket, so I'm ending up with two copies of the same file.
For example:
If I upload my_file.png to a folder in my bucket called IMAGES I should get the result of http://s3.amazonaws.com/my_bucket/IMAGES/my_file.png
Which is happening, but I'm also getting:
http://s3.amazonaws.com/my_bucket/UNIQUE_ID_my_file.png
Anyone know how to prevent .pickMultiple from automatically adding the file to my S3 bucket?
Thanks for any help.

For anyone else who may come across this same problem, the .pickMultiple() -> .store() approach is a dead end. The (only) way to get an onSuccess event to fire for each file in the payload is to use a vanilla <input type="file" /> onChange event to get the element's FILES array and then loop through FILES and call .store() for each of the files in the array.
The Example:
$('#BTN_upload').change(function(){
    var files = $(this)[0].files;

    //So you can see what should be uploading
    console.log(JSON.stringify(files));

    //Loop the files array to store each file on S3
    for(var i = 0; i < files.length; i++){
        //All good. Now execute the .store call
        filepicker.store(
            //The file to upload
            files[i],
            //The file options
            //(I'm storing the files in a specific folder within my S3 bucket called 'my_folder')
            //This is also where you'll rename your file to whatever you'd like
            {path: 'my_folder/' + files[i].name},
            //OnSuccess
            function(FPFile){
                console.log("Store successful: ", JSON.stringify(FPFile));
                //Now possibly call .remove() to remove the 'temp' file from FP.io
            },
            //OnError
            function(FPError){
                console.log(FPError.toString());
            },
            //OnProgress
            function(progress){
                console.log("Loading: " + progress + "%");
            }
        );
    }
});

filepicker.setKey('MY_FP.IO_KEY');
And the HTML:
<input id="BTN_upload" type="file" />
This example is not a finished product. You'll still have to roll your own user feedback (like a queue display with progress bars), duplicate checking, renaming, etc. But that's all pretty simple stuff.
NOTE: This is only for local-to-S3 uploads. I'm not sure how to integrate the other sources that FP.io has on tap. Maybe you do?

Related

How to upload to a specific folder with the Google Drive API (googleapis v3) in Dart?

I want to upload images to a specific folder.
This is the function I'm using right now to upload my files; it uploads them to the main folder:
import 'dart:io';

import 'package:googleapis/drive/v3.dart' as driveV3;
import 'package:path/path.dart' as p;

Future upload(File file) async {
  // getHttpClient() is my own helper that returns an authenticated client
  var client = await getHttpClient();
  var drive = driveV3.DriveApi(client);
  var response = await drive.files.create(
    driveV3.File()..name = p.basename(file.absolute.path),
    uploadMedia: driveV3.Media(file.openRead(), file.lengthSync()),
  );
}
What I want is a way to always upload to a folder that can be created the first time, or something like that. What do I need to modify in the above function so I can specify a folder name, e.g. MyFolder, and always upload to it?
You are looking for the parents property of the File class.
This property takes a list of Strings: the IDs of the folders you want to upload the file into. As it seems, a file can be uploaded into multiple folders.
So your code should look something like this:
Future upload(File file) async {
  var client = await getHttpClient();
  var drive = driveV3.DriveApi(client);
  // Set parents on the Drive file metadata (use the folder's ID, not its name)
  var driveFile = driveV3.File()
    ..name = p.basename(file.absolute.path)
    ..parents = ["folder_id"];
  var response = await drive.files.create(
    driveFile,
    uploadMedia: driveV3.Media(file.openRead(), file.lengthSync()),
  );
}
From the documentation:
The IDs of the parent folders which contain the file. If not specified
as part of a create request, the file will be placed directly in the
user's My Drive folder. If not specified as part of a copy request,
the file will inherit any discoverable parents of the source file.
Update requests must use the addParents and removeParents parameters
to modify the parents list.

Read Uint8Array buffer from Collection and download as pdf

I saved a pdf file in a collection using this function:
/*** client.js ***/

// assign a change event to the input tag
'change input' : function(event, template){
  var file = event.target.files[0]; //assuming 1 file only
  if (!file) return;

  var reader = new FileReader(); //create a reader according to HTML5 File API
  reader.onload = function(event){
    var buffer = new Uint8Array(reader.result); // convert to binary
    Meteor.call('saveFile', buffer);
  };
  reader.readAsArrayBuffer(file); //read the file as arraybuffer
}

/*** server.js ***/

Files = new Mongo.Collection('files');

Meteor.methods({
  'saveFile': function(buffer){
    Files.insert({data: buffer});
  }
});
How can I read it back from the collection and provide a download link so the user can download the file as a PDF and save it on their local computer?
It depends on what the data type ends up being on the front end when you see that document record in your MiniMongo collection. What you want to do is convert that Uint8Array data to a base64-encoded data URL and provide a Download PDF link once you have the data in the browser.
Meteor does not support serving files from the server out of the box, so you'll likely have to publish that file's blob via the mongo -> minimongo publication/subscription mechanism and then build a data URI for it as just described.
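A minimal sketch of that conversion, assuming the Files collection is also declared on the client, the document is published/subscribed as described, and its data field arrives as a Uint8Array (the helper and template names are illustrative):

function pdfDataUrl(bytes) {
  var binary = '';
  for (var i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]); // build a binary string byte by byte
  }
  return 'data:application/pdf;base64,' + btoa(binary); // base64-encode into a data URI
}

Template.download.helpers({
  pdfUrl: function () {
    var doc = Files.findOne();
    return doc && pdfDataUrl(doc.data);
  }
});

And in the template markup:

<a href="{{pdfUrl}}" download="file.pdf">Download PDF</a>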

Get undefined variable trying to update the path to an uploaded file within MongoDB using angular-file-upload and MEAN.js

If someone could help me I would be eternally grateful. I have been slamming my head against a brick wall for weeks trying to get images to upload the way it is demonstrated out of the box with the MEAN.js users module. In the generated users module the file is uploaded into a directory and the path to that file is stored in a field in the mongodb document. I can get the file to upload to where it needs to go using multer and the fileupload function. However, I cannot save the path to the field within the document. I cannot figure out how to avoid getting an 'undefined' variable. I've tried creating a $window service and passing data to it as a global variable and a bunch of other things and I'm totally stuck.
I have commented the code below to demonstrate what is going awry in my server controller changeShoePicture function.
// This is the boilerplate code from the mean.js "users" module.
// I can not create a $window service or global variable to store the
// shoe data below so that I can update the shoe.shoeImageURL field
// in MongoDB with path to the successfully uploaded file.
exports.changeShoePicture = function (req, res) {
  var message = null;
  var shoe = req.shoe;
  var upload = multer(config.uploads.shoeUpload).single('newProfilePicture');
  var profileUploadFileFilter = require(path.resolve('./config/lib/multer')).profileUploadFileFilter;

  console.log('i am here', shoe); // shoe is defined here.

  // Filtering to upload only images. This works and proceeds to the else condition!
  upload.fileFilter = profileUploadFileFilter;

  upload(req, res, function (uploadError) {
    if (uploadError) {
      return res.status(400).send({
        message: 'Error occurred while uploading profile picture'
      });
    } else {
      // shoe image file is successfully uploaded to the location on the server,
      // However the following fails because the shoe variable is undefined.
      shoe.shoeImageURL = config.uploads.shoeUpload.dest + req.file.filename;
    }
  });
};
To make sure I've got this right:
The upload function is being called with the parameters passed in by your route, req and res. You set the shoe var from req.shoe.
What are the chances that upload() is messing with your req?
Drop a console.log(req) in right after you call upload and report back
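A sketch of roughly where that logging would go, reusing the handler and the same requires/config from the question (nothing new beyond the two console.log calls):

exports.changeShoePicture = function (req, res) {
  var shoe = req.shoe;
  var upload = multer(config.uploads.shoeUpload).single('newProfilePicture');

  upload(req, res, function (uploadError) {
    // Diagnostic logging: inspect what multer has done to req by this point
    console.log('req inside upload callback:', req);
    console.log('req.shoe inside upload callback:', req.shoe);

    if (uploadError) {
      return res.status(400).send({
        message: 'Error occurred while uploading profile picture'
      });
    }
    shoe.shoeImageURL = config.uploads.shoeUpload.dest + req.file.filename;
  });
};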

Upload Data to Meteor / Mongo DB

I have a Meteor app and would like to upload data (from csv) to a meteor collection.
I have found:
solutions (e.g. Collectionfs) which deal with file uploads
methods for uploading directly to the underlying mongo db from the shell
references to meteor router - but I am using the excellent iron-router, which does not appear to provide this functionality
My requirement is that the app user be able to upload csv data to the app from within the app. I do not need to store the csv file anywhere within the app file structure, I just need to read the csv data to the collection.
It is possible that I cannot figure out how to do this because my terms of reference ('upload data to meteor') are ambiguous or incorrect. Or that I am an idiot.
ChristianF's answer is spot on and I have accepted it as the correct answer. However, it provides even more than I need at this stage, so I am including here the code I have actually used - which is largely taken from Christian's answer and other elements I have found as a result:
HTML UPLOAD BUTTON (I am not including drag and drop at this stage)
<template name="upload">
<input type="file" id="files" name="files[]" multiple />
<output id="list"></output>
</template>
JAVASCRIPT
Template.upload.events({
  "change #files": function (e) {
    var files = e.target.files || e.dataTransfer.files;
    for (var i = 0, file; file = files[i]; i++) {
      if (file.type.indexOf("text") == 0) {
        var reader = new FileReader();
        reader.onloadend = function (e) {
          var text = e.target.result;
          console.log(text);
          var all = $.csv.toObjects(text);
          console.log(all);
          _.each(all, function (entry) {
            Members.insert(entry);
          });
        };
        reader.readAsText(file);
      }
    }
  }
});
NB there is a jquery-csv library for Meteor here: https://github.com/donskifarrell/meteor-jquery-csv
I've solved this problem in the past using this gist of mine, together with this code (using the jquery-csv plugin to parse the csv data). This is done on the client side and is independent of using iron-router or not. It would be fairly straightforward to move the insertion code into a Meteor method, uploading the csv file first and then parsing and inserting the data on the server. I've tried that, too, but didn't see any performance improvement.
$(document).ready(function() {
  var dd = new dragAndDrop({
    onComplete: function(files) {
      for (var i = 0; i < files.length; i++) {
        var file = files[i];
        // Only process csv files.
        if (!file.type.match('text/csv')) {
          continue;
        }
        var reader = new FileReader();
        reader.onloadend = function(event) {
          var all = $.csv.toObjects(event.target.result);
          // do something with file content
          _.each(all, function(entry) {
            Items.insert(entry);
          });
        };
        reader.readAsText(file);
      }
    }
  });

  dd.add('upload-div'); // add to an existing div, turning it into a drop container
});
Beware though that if you are inserting a lot of entries, then you are better off turning all reactive rerendering off for a while, until all of them are inserted. Otherwise, both node on the server and the browser tab will get really slow. See my suggested solution here: Meteor's subscription and sync are slow
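If you do move the inserts into a Meteor method as mentioned above, a minimal sketch might look like this (the method name importRows is illustrative; the csv is still parsed on the client with jquery-csv and the resulting objects are sent to the server):

// server
Meteor.methods({
  importRows: function (rows) {
    check(rows, [Object]); // basic validation: an array of plain objects
    _.each(rows, function (entry) {
      Items.insert(entry);
    });
  }
});

// client, inside the reader.onloadend handler shown above
// Meteor.call('importRows', $.csv.toObjects(event.target.result));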

ajaxcontroltoolkit setting hidden value after asyncfileupload has completed

I have an AsyncFileUpload control that I'm using from the AjaxControlToolkit. In the upload-complete handler in the code-behind I process the file and write the information from the file to a database. I get the id of the record from the database, and this needs to be written to an ASP hidden field. I've tried just setting the value:
fldImageID.Value = pimg.IdImageGroup.ToString();
I've tried registering a script like I've seen in an example on a website:
ScriptManager.RegisterClientScriptBlock(
    ImageFileUploader,
    ImageFileUploader.GetType(),
    "script1",
    "alert('hi'); top.document.getElementById('"
        + fldImageID.ClientID
        + "').value='"
        + pimg.IdImageGroup.ToString()
        + "'; top.document.getElementById('"
        + lblError.ClientID
        + "').innerHTML = 'image uploaded'",
    true);
I've also tried embedding JavaScript in a Response.Write call from the method I've set to process the uploaded file. Nothing I've done has worked so far. After all of this, the hidden field still does not contain the required value.
This is pretty easy with jQuery. Place an HTML hidden input control in your page, not the asp:hidden input control. Add a class, let's say "hiddenPhoto", to your HTML hidden control.
So let's say our control HTML is like this:
<input type="hidden" class="hiddenPhoto" runat="server" id="hdPhotoName" />
Now access it using a class selector in your OnClientUploadComplete JS method and set its value. Have it declared runat="server" in order to access its value on the server side.
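For instance, a minimal sketch of that handler, assuming you want to store the uploaded file's name and that the toolkit exposes it via args.get_fileName() as in its samples:

function onClientUploadComplete(sender, args) {
    // select the hidden input by its class and set the value to store
    $('.hiddenPhoto').val(args.get_fileName());
}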
Regards
I found an acceptable solution back when I was working on this. And since then I've received emails from people who have had the same problem, asking if I found a solution. So I'm presenting it here, stripping out any extraneous code:
From the user control that has the FileUpload control, I first set the session variable on the server side in the FileUploadComplete handler:
In the ascx file (upload_chart.ascx) I have the AsyncFileUpload; what is important is the OnUploadedComplete and the OnClientUploadComplete:
<ajaxToolkit:AsyncFileUpload
    OnUploadedComplete="FileUploadComplete1"
    OnClientUploadComplete="UploadComplete1"
    ID="ImageFileUploader"
    runat="server" />
In the code-behind of the ascx file (upload_chart.ascx.cs) I handle the FileUploadComplete:
public void FileUploadComplete1(object sender, EventArgs e)
{
    try
    {
        if (ImageFileUploader.FileBytes.Length > 0)
        {
            // File data is in ImageFileUploader.FileBytes
            // Save it however you need to
            // I saved it to a database, in a DBImage object class I created
            // DBImage is specific to my application
            ODS.Entity.DBImage pimg =
                ODS.Data.DataRepository.SaveImageBytes(ImageFileUploader.FileBytes);
            // Set the ImageID1 in the session
            Session["ImageID1"] = pimg.IdImageGroup.ToString();
        }
        else
        {
            // error handling for an empty file, however you want to handle it
        }
    }
    catch (Exception Ex)
    {
        // error handling for an unhandled exception, whatever you want to do here
    }
}
JavaScript and page methods are used to set the value on the page; here is my code-behind for the script method:
// on the aspx page code behind (chartofthedayadmin.aspx.cs) I have the webmethod:
[System.Web.Services.WebMethod]
public static string GetImageID1()
{
    System.Web.SessionState.HttpSessionState Session = System.Web.HttpContext.Current.Session;
    String retval = Session["ImageID1"].ToString();
    Session["ImageID1"] = null;
    return retval;
}
Here is the JavaScript:
// on the aspx front end (chartofthedayadmin.aspx) I have the javascript
// to call the web method and the javascript failed message:
function UploadComplete1() {
    var str = PageMethods.GetImageID1(uploadSuccess1, uploadFailed);
}

function uploadFailed() {
    alert('error occurred or some meaningful error stuff');
}

// javascript on the user control (upload_chart.ascx) to set the value of the hidden field
function uploadSuccess1(result) {
    document.getElementById('<%= fldImageID.ClientID %>').value = result;
}
Note: make sure your ScriptManager has EnablePageMethods="true".
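For reference, that attribute goes on the ScriptManager declaration, e.g. (the ID is illustrative):

<asp:ScriptManager ID="ScriptManager1" runat="server" EnablePageMethods="true" />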
A better and simpler solution is to do it from the code-behind:
string script = String.Format("top.document.getElementById('hdnFilename').value='{0}';", safeFilename);
ScriptManager.RegisterClientScriptBlock(this, this.GetType(), "hdnFilenameFromCodeBehind", script, true);
In my case, safeFilename is the unique filename after handling duplicate filenames, e.g. sample_5.png for the 5th upload of sample.png.
See http://forums.asp.net/t/1503989.aspx