S3 upload with Amazon iOS SDK - iPhone

I know it's a known issue, but has anyone found a way to "fix" the connection failure on iPhone over 3G for relatively large files?
My application depends heavily on S3 for uploads and keeps failing on files larger than 200KB.

It depends on what's causing the failure.
An easy, albeit imperfect, solution is to increase the timeout on your AmazonS3Client:
s3 = [[AmazonS3Client alloc] initWithAccessKey:S3_ACCESS_KEY_ID withSecretKey:S3_SECRET_KEY];
s3.timeout = 240;

I figured this out some time ago but forgot to update the reply. What was actually happening was that I was uploading over a plain HTTP connection, and some mobile operators run transparent "optimizer" proxies (I don't know the proper name for them) that intercept media uploads and recompress them - for instance, taking your JPEG and "optimizing" it for mobile devices (this also applies to other media types). Since the file is modified in transit, it no longer matches the hash S3 expects from the request headers, and the upload fails. The way I worked around the problem was to use an HTTPS connection, which prevents those intermediary servers from modifying my upload.
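For reference, a minimal sketch of what an HTTPS upload with the old AWS SDK for iOS could look like - the endpoint property name is an assumption about that SDK version, and the bucket, key and data are placeholders, so check them against your own setup:

// Point the client at S3 over TLS so carrier proxies cannot rewrite the body.
// NOTE: the `endpoint` property is an assumption about this SDK version - verify
// it exists on AmazonS3Client / AmazonWebServiceClient in your copy of the SDK.
AmazonS3Client *s3 = [[AmazonS3Client alloc] initWithAccessKey:S3_ACCESS_KEY_ID
                                                 withSecretKey:S3_SECRET_KEY];
s3.endpoint = @"https://s3.amazonaws.com";   // the https:// scheme is the important part
s3.timeout  = 240;                           // keep the longer timeout for slow 3G links

S3PutObjectRequest *put = [[S3PutObjectRequest alloc] initWithKey:@"photo.jpg"    // placeholder key
                                                          inBucket:@"my-bucket"]; // placeholder bucket
put.contentType = @"image/jpeg";
put.data        = imageData;                 // the NSData you want to upload
[s3 putObject:put];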

Related

Best Practices for serving dynamic files in a backend

Does anyone know of best practices or common strategies in backend design for serving dynamic images and videos to client applications?
Background: I'm currently building an application that allows users to upload their own images and videos. I'm not really sure about how to serve these media files back to the client in the most efficient way. Do I store the files on the same VPS that my application server is running on? Do I need to save the files in different qualities / densities to better adjust for the clients' screen resolution? (I'll have mostly mobile clients)
I tried googling these questions but apparently I'm asking the wrong questions :-)
I would really appreciate maybe a reference or professional vocabulary on these topics.
Thanks in advance.
1) You need to split the web server and the application server.
First of all, do not try to stream media files from your application backend unless you can offload the low-level work to the OS - most likely you will do it wrong.
Use a proxy server as the web server to serve such files.
nginx will do (a minimal config sketch is below).
Also, you need to back up your media files the same way you back up your database.
Storing huge static media files on the same box as the application server is the wrong move - it will not scale at all.
You can add a cron task to move files to a CDN server - when the move is complete, replace the URL in the database to match the new location.
So by using nginx you save precious CPU and RAM while a file is being moved to the external server.
And the CDN will help you dedicate bandwidth and CPU/RAM resources to the application server.
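As a rough illustration of that split, here is a minimal nginx sketch - the domain, paths and port are placeholders, not a production config:

server {
    listen 80;
    server_name example.com;

    # nginx serves uploaded media straight from disk; sendfile lets the OS do the heavy lifting
    location /media/ {
        root /var/www;          # files live under /var/www/media/...
        sendfile on;
        expires 30d;            # let clients and the CDN cache aggressively
    }

    # everything else is proxied to the application server
    location / {
        proxy_pass http://127.0.0.1:8000;
    }
}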
2) Regarding image resolution and downsampling:
Screens of modern handsets have the same or even better resolution than a typical office workstation.
Link speed has a much bigger impact on UX.
If the client has a smartphone with a huge screen but a slow link, you still have to deliver the image or video as fast as possible, even if the media quality does not match the resolution of the handset.
It makes sense to downsample images on demand and store the result on disk for nginx/CDN to serve again (see the sketch below).
For videos, it makes sense to produce a "bad" version with heavy compression (quality loss) for slow links - the device will scale it back up itself during playback.
And you can keep client statistics (screen sizes / downlink speeds) and generate optimized versions of such a video file later, once you see that it is "popular".
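A minimal sketch of the downsample-on-demand idea in Python, assuming Pillow is available and something like nginx sits in front to serve cache hits - the directories and sizes are placeholders:

from pathlib import Path
from PIL import Image

ORIGINALS = Path("/var/www/media/originals")
CACHE = Path("/var/www/media/resized")   # nginx serves files from here directly

def resized_path(name: str, max_side: int) -> Path:
    """Return a cached downsampled copy, generating it on the first request."""
    out = CACHE / f"{max_side}_{name}"
    if not out.exists():                       # cache miss: resample once, reuse forever
        img = Image.open(ORIGINALS / name)
        img.thumbnail((max_side, max_side))    # preserves aspect ratio, never upscales
        out.parent.mkdir(parents=True, exist_ok=True)
        img.save(out, quality=80)              # modest quality for slow links
    return out

# e.g. resized_path("photo.jpg", 640) -> /var/www/media/resized/640_photo.jpg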
FYI: Several years ago, some social media giant dropped the idea of preparing all possible versions of the same media file in favour of an FPGA on-the-fly resampler.
I do not remember the name of the company or the URL of the article. It was probably Instagram.
Some cloud providers have offerings with FPGAs or CUDA on board to do the heavy lifting.
So in some cases you can trade storage for raw horsepower and do the conversion on the fly.

Filepicker.io - Picasa size limitation

We're noticing some issues with Filepicker.io and the Picasa integration. It seems that large images (over ~2MB) aren't being processed - the POST to www.filepicker.io/api/store/ returns "The specified bucket is not valid." for those files. Smaller files process just fine.
Not sure where the issue lies. The response would suggest an issue with our S3 bucket, but we've been able to process large Computer uploads without issue. Could it be a limitation in the Picasa Web API? Any information would be helpful. Thanks.

How to troubleshoot streaming video (rtmp) performance?

I'm streaming videos via rtmp from Amazon Cloudfront. Videos are taking a loooong time to start playing, and I don't have any way of figuring out why. Normally I'd use the "Net" panel in Firebug or Web Inspector to get a good first impression of when an asset starts to load and how long it takes to be sent (which can indicate whether the problem is on the server end or network versus the browser rendering). But since the video is played within a Flash player (Flowplayer in this case), it's not possible to glean any info about the status of the stream. Also since it's served from Amazon Cloudfront, I can't put any kind of debugging or measuring tools on the server (if such a tool even exists).
So... my question is: what are some ways I can go about investigating this problem? I'm hoping there would be some settings I can tweak on either the front-end (flowplayer) or back-end (Cloudfront), but without being able to measure anything or even understand where the problem is, I'm at a loss as to what those could be.
Any ideas for how to troubleshoot streaming video performance?
You can use Wireshark (it can dissect RTMP) or Fiddler to check what is going on... another point (besides the client and the server) to keep in mind is your ISP.
To dig deeper you can use rtmpdump (http://rtmpdump.mplayerhq.hu/), FluorineFx (http://www.fluorinefx.com/), or http://www.broccoliproducts.com/softnotebook/rtmpclient/rtmpclient.php.
Keep in mind that RTMP isn't ideal, since it usually bypasses proxies and tries to make a direct connection... if that doesn't work it can fall back, but by then some time has already passed (it waits for a connection timeout, etc.). If you have the option to set CloudFront/Flowplayer to RTMPT, I would recommend doing so, since that tunnels the connection over port 80.
Presumably, if you go and attempt to view a video, then come back 20 minutes later and hit it again, it loads quickly?
CloudFront works roughly like this: SAN -> Edge servers -> Client.
This is all well and good in a specific use case (i.e. a small amount of origin content and a large, long-running cache) - but it becomes an issue when scaled out, with lots of media hosts running content through the system, i.e. CloudFront.
The media cache kept on the edge servers gets dumped fairly often - once the cache fills, the oldest files are evicted first - so large video files that are not viewed often won't be sitting in the edge server cache, and they take a long time to transfer out to the edges, giving an utterly horrific end-user experience.
The same is true of YouTube, for example - go and watch some randomly obscure, long video, and try it through a couple of proxies so you hit different edge servers; you'll see exactly the same thing occur.
I noticed a very noticeable lag when streaming RTMP from CloudFront. I found that switching to plain HTTP progressive download from the Amazon S3 bucket made the lag go away.

Need advice on improving ftp upload performance in C#

I'm writing software that uploads and downloads a number of files using FTP with a remote server. Download speeds are fine and stay consistent at upwards of 4mb/s. Small uploads are instantaneous. The problem I'm experiencing is that when my program uploads a large 40MB zip file, I'm getting extremely poor performance. It uploads in bursts (100-200Kb/s), then stalls for a second, and repeats this until the file eventually finishes uploading. Programmatically downloading the same file from the same server takes 30 seconds tops; uploading the same file to the same server using FileZilla takes about the same amount of time. Uploading through my software can take up to 15 minutes. Something is clearly wrong.
I am using the starksoft ftp library to handle uploads/downloads from here: http://starksoftftps.codeplex.com/
Here is an example of the problematic code:
FtpClient ftp = new FtpClient(sourcecfg.Host);
ftp.MaxUploadSpeed = 0;
ftp.MaxDownloadSpeed = 0;
ftp.FileTransferType = TransferType.Binary;
ftp.DataTransferMode = TransferMode.Passive;
ftp.Open(sourcecfg.FtpUserName, sourcecfg.FtpPassword);
ftp.PutFile(backupTempPath, targetcfg.getFullPath() + "wordpress-backup.zip", FileAction.Create);
I've also tried using an overloaded version of PutFile that takes a Stream object instead of a path string. The results were unchanged.
Incidentals: I'm compiling in Visual C# Express 2008 on Windows XP inside a VirtualBox instance. I've tried both the debug and release executables with no change in results.
The issue feels like a buffering or throttling problem, but looking at the internal code of the FTP classes I don't see anything unusual, and I'm specifically setting it not to throttle. Any suggestions or comments about this particular FTP component library?
It might be interesting to know whether the connection FileZilla makes is an Active mode or a Passive mode connection. It would also be worth transferring the file with an FTP client that shows the dialogue between the client and the server; I am not sure whether FileZilla will show you this information or not. A couple of things to try along those lines are sketched below.
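If the data connection mode does turn out to matter, two hedged things to try: switch the Starksoft client to Active mode (assuming its TransferMode enum has an Active member alongside the Passive one shown above), and cross-check the upload speed with the framework's built-in FtpWebRequest to rule out the component itself. The host, credentials and remote path below are placeholders:

// 1) One-line change with the Starksoft library (assumes TransferMode.Active exists):
ftp.DataTransferMode = TransferMode.Active;

// 2) Cross-check with the BCL (namespaces: System.Net, System.IO):
FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://example.com/wordpress-backup.zip");
request.Method = WebRequestMethods.Ftp.UploadFile;
request.Credentials = new NetworkCredential("user", "password");
request.UsePassive = false;                // flip to true to compare Active vs Passive
request.UseBinary = true;

using (FileStream src = File.OpenRead(backupTempPath))
using (Stream dst = request.GetRequestStream())
{
    byte[] buffer = new byte[64 * 1024];   // use a reasonably large copy buffer
    int read;
    while ((read = src.Read(buffer, 0, buffer.Length)) > 0)
        dst.Write(buffer, 0, read);
}

using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    Console.WriteLine(response.StatusDescription);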

UIPasteboard size

This is no longer relevant since iOS 4+ - so please stop downvoting! Or at least explain your downvotes.
I am trying to build an upgrade path from the lite to the full version of an application that can store an indefinite amount of data (I don't want to do in-app purchase).
I would like to be able to upgrade using a custom URL without needing an online presence to cache the data to.
So I was thinking of using a UIPasteboard object.
Does anyone know, or has anyone done any investigation into, the maximum possible size of data stored to a UIPasteboard? There seems to be no Apple documentation that I can find regarding this.
Will this vary from device to device? i.e. is it RAM-limited?
I tried a 50MB file and know this fails (even in the Simulator), though a 5MB file is OK. There is no way of knowing whether it has failed until you come to get the data back with dataForPasteboardType:.
Also, has anyone done two-app custom URLs that do a kind of request/response inter-app comms? I was thinking that I could support arbitrarily sized data this way...
I'll answer my own question: after doing some investigation, it's 8MB, and it is device-independent.
I have also managed to support upgrading arbitrarily large amounts of data using two custom URLs and the openURL method recursively between the apps (a rough sketch is below). Tested up to 100MB - it doesn't look too pretty because it opens each app 12 times!
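A rough sketch of how that ping-pong could look - this is my reading of the approach rather than the exact code, and the pasteboard name, pasteboard type, URL schemes and 8MB chunk size are all assumptions:

// Sender (lite app): write the next chunk to a named pasteboard, then bounce
// over to the full app via its custom URL scheme.
static const NSUInteger kChunkSize = 8 * 1024 * 1024;   // stay at the 8MB limit found above

- (void)sendChunkAtIndex:(NSUInteger)index ofData:(NSData *)payload {
    NSUInteger offset = index * kChunkSize;
    if (offset >= payload.length) return;                // nothing left to send
    NSUInteger len = MIN(kChunkSize, payload.length - offset);
    NSData *chunk = [payload subdataWithRange:NSMakeRange(offset, len)];

    UIPasteboard *pb = [UIPasteboard pasteboardWithName:@"com.example.upgrade" create:YES];
    pb.persistent = YES;
    [pb setData:chunk forPasteboardType:@"com.example.upgrade.chunk"];

    // Read it back to confirm the write actually took (writes can fail silently)
    if ([pb dataForPasteboardType:@"com.example.upgrade.chunk"].length != len) {
        NSLog(@"Pasteboard write failed - chunk too large?");
        return;
    }

    NSString *url = [NSString stringWithFormat:@"fullapp://chunk?index=%lu", (unsigned long)index];
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:url]];
}

// Receiver (full app): in application:handleOpenURL:, read the chunk from the same
// named pasteboard, append it to the data being rebuilt, then call openURL: on
// liteapp://ack?index=N so the lite app sends the next chunk.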
I'm surprised there's any limit. I don't see why/how there would be. You give UIPasteboard a dictionary with a retained pointer to your data, so who cares how big it is? I suspect that if you are detecting a limit, it is really a limit on the size of an NSString, or NSData, or whatever it is you're supplying to UIPasteboard.