I want to upload files from different file inputs in SailsJS

await sails.upload(inputs.logo);
await sails.upload(inputs.thumbnail);
I use sails-hook-uploads. Only the first incoming file (logo, in my case) is uploaded; for thumbnail, it shows this message:
Upstream (file upload: thumbnail) emitted an error: { Error: EMAXBUFFER: An upstream (thumbnail) timed out before it was plugged into a receiver. It was still unused after waiting 4500ms. You can configure this timeout by changing the maxTimeToBuffer option.
Is there any way I can fix that, or an alternative simple way to upload files from different file inputs in Sails.js?

Since you are uploading files one by one, try using uploadOne.
await sails.uploadOne(inputs.logo);
await sails.uploadOne(inputs.thumbnail);

Getting pdf data from cloudinary and saving it in flutter

I have been using Cloudinary to store my PDF files, with Firebase as my backend. However, I'm having an issue with it. When I use http.get on the link provided by Cloudinary (https://res.cloudinary.com/cloudname/raw/upload/vxxxxxxxxxxx/xxxxxxxxxxxxxxxxxxxx) I get ASCII data instead of the PDF file. I have tried writing the data as a string to a file with a .pdf extension; that has not worked. I have also tried converting it to bytes and writing the bytes to the file, and that has not worked either. Any help on how I can save this file on Android using Flutter would be greatly appreciated.
await http.post('https://api.cloudinary.com/v1_1/${DotEnv().env['cloud_name']}/raw/upload',body: { "file": "data:raw/pdf;base64,$pdf", "upload_preset": DotEnv().env['upload_preset']} );
var response = await http.get('https://res.cloudinary.com/<cloudname>/raw/upload/v1590958359/gsvxe4zp7bb6yyldrccu');
final directory = await getApplicationDocumentsDirectory();
final path2 = directory.path;
File x = File('$path2/trial.pdf');
x.writeAsString(response.body);
Here is the code as I currently have it. pdf in this case is the base64 encoding of the PDF. The link I have here in http.get is from a file I had already uploaded.
Thanks for adding the additional details.
Firstly, I would recommend you change the resource_type from raw to image. In addition, please update the MIME type in your Base64 Data URI to application/pdf.
The URL you then get from Cloudinary upon a successful upload will be to a PDF resource which means you can download it with a simple cURL request or via any HTTP library.
For example, using the secure_url to the asset returned in the Upload API response to download the file to my_test_file.pdf:
curl -s https://res.cloudinary.com/demo/image/upload/multi_page_pdf.pdf --output my_test_file.pdf
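The same download works with any HTTP library as long as the response body is written as raw bytes, which is exactly where the original Dart code went wrong (writeAsString). A minimal Python sketch, using the demo URL from the curl example above; the function name is made up for illustration:

```python
# Sketch: download a Cloudinary-hosted PDF and write the body as raw
# bytes. Writing in binary mode ("wb") is the key point -- decoding the
# body to text corrupts the PDF.
import urllib.request

def download_file(url: str, dest_path: str) -> int:
    """Fetch `url` and write the body as bytes; returns the byte count."""
    with urllib.request.urlopen(url) as resp:
        data = resp.read()              # raw bytes, no text decoding
    with open(dest_path, "wb") as f:    # binary mode preserves the PDF
        f.write(data)
    return len(data)

# download_file("https://res.cloudinary.com/demo/image/upload/multi_page_pdf.pdf",
#               "my_test_file.pdf")
```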

How to catch a client disconnection when sending a large file in play framework?

I am sending a file in a Play application response. When the client downloads the file, I clean the file from the local server. I achieve this using the code below:
val fileToServe = TemporaryFile(new File(fileName))
Ok.sendFile(fileToServe.file, onClose = () => { fileToServe.clean })
But when the client disconnects, the temporary file remains on the local server. I want to handle this disconnection and clean up the
temporary file. I heard about onDoneEnumerating() but couldn't get it to work.
Can anyone point me to the easiest way to handle the disconnection and clean up the temporary file from the local server?
TemporaryFile is for when Play receives a data stream that has to be kept in a temporary file location, rather than for sending a file out. It removes the file on finalization (pre-2.6) or via a phantom file reference (2.6.x).
The easiest way to catch the disconnection is to call Files.deleteIfExists
https://docs.oracle.com/javase/8/docs/api/java/nio/file/Files.html#deleteIfExists-java.nio.file.Path-
in the onClose block. If that doesn't seem to be working for some reason, you can use the temporary file reaper:
https://www.playframework.com/documentation/2.6.x/ScalaFileUpload#Cleaning-up-temporary-files
that will clean out the temp files directory every so often.

Google Cloud Storage InvalidPolicyDocument caused by the submit field

For the past couple of years, I've been using Google Cloud Storage to handle storing files for a project that runs for a couple months every year. In the process of testing this year, I've been running into issues failing to upload certain files.
This is the response I've been getting:
<Error>
<Code>InvalidPolicyDocument</Code>
<Message>The content of the form does not meet the conditions specified in the policy document.</Message>
<Details>Missing upload</Details>
</Error>
As additional background, I use plupload, and its flash runtime, to handle the upload functionality and send the form submit to GCS.
The headers for the request do include an upload field
------------KM7gL6Ij5KM7KM7cH2Ij5cH2GI3cH2
Content-Disposition: form-data; name="Upload"
Submit Query
and the upload field is specified in the Policy Document I send to GCS
{
"expiration": "2014-09-25T11:32:54.000Z",
"conditions": [
...
["eq", "$Upload", "Submit Query"]
]
}
The only time I get this error is when I try to upload files greater than 100KB in size (file types tested: jpg, png, mp3).
In short, in cases where the file size is greater than 100KB, I get an error telling me that there is no form field for the "upload" variable referenced in the Policy Document, but if the file size is less than 100KB, it accepts the transaction.
I have tried not supplying the "upload" variable as part of the policy document and get the opposite result.
Has anyone encountered something similar or can anyone shed some light on the issue? I'm unsure at this point if plupload is causing the problem or if GCS in fact ignores the "Upload" (submit) field in certain cases but not in others.
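For reference, the policy document itself is just JSON that gets base64-encoded into the form's policy field (and then signed with your key to produce the signature field). A minimal sketch using the condition from the question; the expiration value is copied from above and the other conditions are omitted:

```python
# Sketch: base64-encoding a GCS POST policy document. The "$Upload"
# condition mirrors the one in the question; signing the encoded string
# (to produce the form's `signature` field) is not shown here.
import base64
import json

def encode_policy(policy: dict) -> str:
    """Return the base64 string that goes in the form's `policy` field."""
    raw = json.dumps(policy).encode("utf-8")
    return base64.b64encode(raw).decode("ascii")

policy = {
    "expiration": "2014-09-25T11:32:54.000Z",
    "conditions": [
        ["eq", "$Upload", "Submit Query"],
    ],
}
encoded = encode_policy(policy)
```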

GoodData Export Reports API Call results in incomplete file

I've developed a method that does the following steps, in this order:
1) Get a report's metadata via /gdc/md//obj/
2) From that, get the report definition and use that as payload for a call to /gdc/xtab2/executor3
3) Use the result from that call as payload for a call to /gdc/exporter/executor
4) Perform a GET on the returned URI to download the generated CSV
So this all works fine, but the problem is that I often get back a blank CSV or an incomplete CSV. My workaround has been to put a sleep() in between getting the URI back and actually calling a GET on the URI. However, as our data grows, I have to keep increasing the delay on this, and even then it is no guarantee that I got complete data. Is there a way to make sure that the report has finished exporting data to the file before calling the URI?
The problem is that the export runs as an asynchronous task: the result at the URL returned in the payload of the POST to /gdc/exporter/executor (in the form /gdc/exporter/result/{project-id}/{result-id}) is only available after the exporter task finishes its job.
If the task has not finished yet, a GET to /gdc/exporter/result/{project-id}/{result-id} should return status code 202, which means "we are still exporting, please wait".
So you should periodically poll the result URL until it returns status 200, whose payload is the exported file (or 40x/50x if something went wrong).
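The poll-until-done loop can be sketched as follows. This is a minimal illustration, not GoodData-specific code: fetch stands in for any HTTP GET on the result URL, and the interval and retry limit are made-up defaults.

```python
# Sketch of the poll-until-200 loop. `fetch` is any callable returning
# (status_code, body) for a GET on the result URL; injecting it keeps
# the loop library-agnostic and testable.
import time

def poll_export(fetch, interval=2.0, max_attempts=150):
    """Poll an async export result URL until the export finishes."""
    for _ in range(max_attempts):
        status, body = fetch()
        if status == 200:          # export finished; body is the CSV
            return body
        if status != 202:          # 40x/50x: something went wrong
            raise RuntimeError("export failed with HTTP %d" % status)
        time.sleep(interval)       # 202: still exporting, wait and retry
    raise TimeoutError("export did not finish in time")
```

With the requests library, for example, fetch could wrap requests.get(result_url) and return (r.status_code, r.text).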

NSURLConnection (downloading large files of more than 500 MB)

I want to download a large file (> 500 MB) to my application from the server. I used NSURLConnection, and it works well when the network is good, but on a poor connection only 100 MB or 200 MB of a 500 MB file may be downloaded, and connectionDidFinishLoading is still called even though the task is not complete. Someone suggested setting a timeout to avoid this, but setting the timeout to 30 s did not work. Should I set 60 s or more? Does someone have a better idea? Please help me.
In the connectionDidFinishLoading method, check the length of the downloaded data against the total length of the data to be downloaded.
The total length is given by [response expectedContentLength] in the didReceiveResponse method.
You should download such a big file in parts. Set the Range field in the header of your HTTP request and ask for only a small portion of the file at a time. When you have all the portions, you can assemble the file.
You can set the header with [request setValue:@"bytes=0-1023" forHTTPHeaderField:@"Range"]; (where request is an NSMutableURLRequest); this example requests only the first kilobyte of the file. The server replies with 206 Partial Content and a Content-Range response header. See also Range in http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html
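The split-and-reassemble idea is language-neutral; here is a minimal Python sketch (URL, total size, and chunk size are placeholders). The request header is Range; Content-Range is what the server sends back in its 206 response.

```python
# Sketch of ranged downloading: split the file into byte ranges, request
# each range with a `Range` header, and concatenate the parts in order.
import urllib.request

def byte_ranges(total_size, chunk_size):
    """Inclusive (start, end) byte pairs covering total_size bytes."""
    return [(start, min(start + chunk_size, total_size) - 1)
            for start in range(0, total_size, chunk_size)]

def download_in_parts(url, dest_path, total_size, chunk_size=1 << 20):
    """Download `url` in chunk_size pieces and write them in order."""
    with open(dest_path, "wb") as out:
        for start, end in byte_ranges(total_size, chunk_size):
            req = urllib.request.Request(url)
            req.add_header("Range", "bytes=%d-%d" % (start, end))
            with urllib.request.urlopen(req) as resp:  # expect HTTP 206
                out.write(resp.read())
```

Splitting the range arithmetic into its own helper makes it easy to verify the boundaries (the last range is shorter when the size is not a multiple of the chunk).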