I have a requirement that file upload should work in the following manner:
1) A file upload dialog (in the browser) is presented to the user. The user picks a file.
2) The application should load only the first x records (e.g. if there are 100 records in total, get the first 10), and the user gets a chance to do a visual review of those records (read-only view).
3) The user then does one of two things: clicks "Submit", which takes in all the data and streams it to the server, or clicks "Next" to review the next 10 records, and so on.
Is scalaz-stream a good fit as an overall solution, and in particular for doing 2) and 3) above: getting only partial data, pausing the stream, then continuing, consuming, and repeating the process?
No, scalaz-stream is not a good fit here. The Play! framework has its own streaming framework built around the Enumerator, Enumeratee, and Iteratee classes, which can be used for asynchronous processing of streams, and the file upload code is already built to use it.
You have two options:
One, use HTML5 and front-end JavaScript to get access to the file. This will only work in the newest browsers. It is the only option if you don't want any of the file uploaded until the user clicks "Submit".
Two, incrementally parse the upload as it arrives at the server using the Enumerator framework, and respond over long-polling AJAX/Comet/WebSocket to the front-end JavaScript with a subset of records as they are parsed. The Iteratee that is parsing the incoming upload would have to pause and wait for further input from the front-end. This solution would be complex and would suffer from browser timeouts.
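For the first option, the HTML5 File API lets you read and preview records entirely client-side, before anything is sent. A minimal sketch, assuming newline-delimited records; `FileReader` is a browser-only API, while the splitting helper is plain JavaScript:

```javascript
// Pure helper: split raw text into records and keep the first n.
// (Assumes one record per line; real CSV parsing needs more care.)
function firstRecords(text, n) {
  return text.split(/\r?\n/).filter((line) => line.length > 0).slice(0, n);
}

// Browser-only wiring: read the chosen file locally; nothing is uploaded.
function previewChosenFile(fileInput, showPreview) {
  const reader = new FileReader();
  reader.onload = () => showPreview(firstRecords(reader.result, 10));
  reader.readAsText(fileInput.files[0]);
}
```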
Neither of these is a very good idea. It would be much simpler to upload the entire file at once, feed the parsed records back to the front-end afterwards, and have the "Submit" button actually function as a "Save" button that tells the server to keep the received upload. Unless you are shoving 100 MiB+ Excel files up a mobile connection, this is likely the easiest and most compatible solution.
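However the records reach the front-end, the "first 10 / Next" review step is plain paging over an array. A minimal sketch:

```javascript
// Return page `pageIndex` (0-based) of `records`, `pageSize` at a time.
function page(records, pageSize, pageIndex) {
  const start = pageIndex * pageSize;
  return records.slice(start, start + pageSize);
}

// Whether a "Next" button should be enabled after the current page.
function hasNextPage(records, pageSize, pageIndex) {
  return (pageIndex + 1) * pageSize < records.length;
}
```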
I am uploading a large number of photos and annotations to Dropbox using the SwiftyDropbox SDK. I want to update the UI to reflect the upload status of each item, which is stored in Core Data. My understanding of batchUpload is that you pass it an array of URLs and it uploads them asynchronously. I would like to use batch upload, but I am not sure how to tell when a particular item is finished, since batchUpload operates on an array of URLs. Is there a way I can use batchUpload, versus just iterating over the array with the upload function?
It seems that upload would be the correct solution, as I can just start each item asynchronously on a background thread and update each one as it finishes. Looking for arguments to persuade me either way.
The batchUploadFiles method in SwiftyDropbox is advantageous in that it only has to take one lock to upload the entire batch of files. However, it only calls the response block once the entire batch is done, and the files are committed as a batch, so you wouldn't see individual uploads completing one by one. You instead get the results for all of them together at the end.
If you do need to see individual file uploads complete one by one for whatever reason, you would need to use individual upload calls, but that has the disadvantage of not batching the uploads, so you're more likely to run into lock contention.
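If individual upload calls win out, per-item status tracking is simple to layer on top. A sketch in JavaScript for brevity (the `uploadOne` function is a hypothetical wrapper around a single SDK upload call returning a promise; with SwiftyDropbox the same shape applies using completion handlers):

```javascript
// Track per-item upload status; `uploadOne(url)` is injected by the caller.
// `onStatusChange` is where a UI / Core Data update would go.
async function uploadAllTracking(urls, uploadOne, onStatusChange) {
  const status = new Map(urls.map((u) => [u, "pending"]));
  await Promise.all(
    urls.map(async (u) => {
      try {
        await uploadOne(u);
        status.set(u, "done");
      } catch (err) {
        status.set(u, "failed");
      }
      onStatusChange(u, status.get(u)); // fires as each item finishes
    })
  );
  return status;
}
```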
I am currently working on a GWT screen with a requirement to browse for a file once but submit it to the server many times.
But with gwt-upload, after clicking submit, or even submitting via the singleUploader.submit() method, the file chosen in the file input gets cleared.
Can you suggest any way to upload a single file many times using gwt-upload?
Not sure if it is possible. I would try to use https://developer.mozilla.org/en/docs/Web/API/XMLHttpRequest/Using_XMLHttpRequest
and would create (using native JavaScript) two instances of XMLHttpRequest and try to send them both.
The anticipated problem here is that the input element on the page would receive incoming events as a result of the upload process (loadstart, progress, etc.). I doubt it can properly handle two parallel flows of those events.
Another way is to send the upload requests consecutively, but then you would have to generate a second form submit, which is not trivial; browsers do not support that at a high level.
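A sketch of the "send it twice" idea outside of gwt-upload, assuming the server accepts multipart posts at a hypothetical /upload endpoint. The same File object can be appended to two independent request bodies, and since no form submit happens, the file input is never cleared:

```javascript
// Build a fresh multipart body around the same Blob/File each time;
// a Blob can be read any number of times.
function buildUploadBody(file, fieldName) {
  const form = new FormData();
  form.append(fieldName, file, "data.bin");
  return form;
}

// Two independent requests carrying the same file (browser or Node 18+).
function uploadTwice(file, url) {
  return Promise.all([
    fetch(url, { method: "POST", body: buildUploadBody(file, "file") }),
    fetch(url, { method: "POST", body: buildUploadBody(file, "file") }),
  ]);
}
```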
I have the following situation. We are using Zend Framework to create a web application that communicates with its database through REST services.
The problem I'm facing is that when a user uploads a big video file, the service can take some time (sometimes a few minutes) to receive the request (which carries the video file encoded with PHP's base64_encode function) and return a success or error response.
My idea is to track how much of the data has been sent and show the user a JS progress bar, which would be useful in these cases.
Does anyone have an idea how I can track how much of the data has been sent through the service, so that I can show a progress bar?
Zend provides progress bar functionality that can be paired with a JavaScript/jQuery client.
You will easily find some example implementations like this one:
https://github.com/marcinwol/zfupload
However, I don't think REST services are the best solution for uploading videos, as base64 encoding makes files roughly a third bigger and therefore slower to upload.
Check out Zend_File_Transfer, which might be better suited to your needs:
http://framework.zend.com/manual/1.12/en/zend.file.transfer.introduction.html
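On the client side, upload progress needs no server support at all if the post goes through XMLHttpRequest: the browser fires progress events on xhr.upload as the request body is transmitted. A sketch (the upload URL and field name are hypothetical):

```javascript
// Pure helper: convert a progress event's counters into a percentage.
function uploadPercent(loaded, total) {
  if (!total) return 0;
  return Math.min(100, Math.round((loaded / total) * 100));
}

// Browser-only wiring: report the percentage as bytes leave the machine.
function uploadWithProgress(file, url, onPercent) {
  const xhr = new XMLHttpRequest();
  xhr.upload.addEventListener("progress", (e) => {
    if (e.lengthComputable) onPercent(uploadPercent(e.loaded, e.total));
  });
  const form = new FormData();
  form.append("video", file);
  xhr.open("POST", url);
  xhr.send(form);
}
```

Sending the raw file as multipart form data this way also avoids the base64 size penalty entirely.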
This is my first question so I will do my best to conform to the question guidelines.
I am developing an iPhone app that parses an XML feed to be displayed in a table. Parsing is not a problem, but I am not sure of the best way to optimize loading times after the initial run of the app.
Here are the different approaches I am considering:
Parse the XML feed each time the application is loaded. The easy way, but possibly a longer loading time on each run of the app.
Grab the feed and store it locally (as .xml), then parse locally. Then, each time the app is opened, make an HTTP call to see if the feed has changed. If not, parse the local copy. If so, download the new feed and parse that locally. The initial loading time will be longer but could be cut down on later runs (if the feed has not been updated). This option is beneficial if the user has a bad signal but needs to see the data.
Parse the feed and store it in a local SQLite db. Then, each time the app is opened, make an HTTP call to see if the feed has changed. If so, detect which objects have been added/removed and alter the local db accordingly. If not, load the data from the local db. This might be the best option, but I am not sure how long finding the changes will take.
My feed is only about 100 or so items, each with roughly 20 fields.
Initial parsing time:
Roughly 4-5sec with full bars.
Roughly 5-7sec with 3 bars.
Any insight as to which option would work best would be very much appreciated.
I think the frequency of the XML data changing should be a factor. If it's only going to change once a day or week, I'd load it, save it, and check for updates. If an update exists, download the new feed and overwrite the old one.
The third solution is clearly the best, and it will allow your app to work offline and start quickly. To detect a change, you can simply store an MD5 of the XML file in the database and match it against the MD5 of the new XML file. If the data has changed, simply discard all the previous data.
I need to allow users to download multiple images in a single download. This download will include an SQL file and images. Once the download completes, the SQL will execute, inserting text into an SQLite database. This text will include references to the downloaded images. The text and images are rendered in a UIWebView.
What is the best way to download everything in a single download? I was thinking of using a bundle, since it can be loaded at runtime, but I'm not sure of any limitations/restrictions in this scenario. I have tested putting a bundle into the Documents folder and then accessing resources inside it. That seems to work fine in a simple test.
You're downloading everything through a socket, which only knows about bytes, so a bundle, or even a file, doesn't "naturally" transfer through: the server side opens files, encodes them, and sends them into the connection; the client reads from the socket and reconstructs the original file structure.
Assuming the application has a UI for picking which items need to be transferred, it could then request all the items from the server, and the server could send all of them through the single connection with some delimitation you invent, so that the iPhone app can split the stream back into the individual files.
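The "delimitation you invent" can be as simple as length-prefixed framing: before each file, send its byte length, then the bytes. A sketch of both sides, in JavaScript with 4-byte big-endian length prefixes (the framing scheme itself is an assumption, not any standard):

```javascript
// Encode payloads as [4-byte BE length][payload] frames (server side).
function frame(payloads) {
  const parts = [];
  for (const p of payloads) {
    const header = Buffer.alloc(4);
    header.writeUInt32BE(p.length, 0);
    parts.push(header, p);
  }
  return Buffer.concat(parts);
}

// Split a received buffer of frames back into payloads (client side).
function splitFrames(buffer) {
  const files = [];
  let offset = 0;
  while (offset + 4 <= buffer.length) {
    const len = buffer.readUInt32BE(offset);
    offset += 4;
    files.push(buffer.subarray(offset, offset + len));
    offset += len;
  }
  return files;
}
```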
Another option is for the client to just perform individual HTTP requests for the different files, through pretty straightforward use of NSURLConnection.
The former sounds like an attempt to optimize the latter. Have you already tested and verified that the latter is too slow/inefficient? It definitely is more complex to implement.
There is a latency issue with multiple HTTP connections run in sequence; however, you can mitigate it by running several downloads in parallel, for example through an NSOperationQueue with a limit of 2 to 5 concurrent download operations.
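The NSOperationQueue idea is a bounded worker pool; a language-agnostic sketch in JavaScript, where each task is an injected function that starts one download and returns a promise:

```javascript
// Run async tasks with at most `limit` in flight at once, preserving
// result order (the same idea as an NSOperationQueue with a cap).
async function runWithConcurrency(tasks, limit) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++; // claim the next task index
      results[i] = await tasks[i]();
    }
  }
  const workers = Array.from({ length: Math.min(limit, tasks.length) }, worker);
  await Promise.all(workers);
  return results;
}
```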