We have HTTP Live Streaming running in our iOS app. We want to get thumbnail images every minute. I tried using the MPMoviePlayerController methods
thumbnailImageAtTime:timeOption:
and
requestThumbnailImagesAtTimes:timeOption:
Both of these return nil. The documentation doesn't say whether these methods work for HTTP Live Streaming. Any ideas what the issue could be?
The documentation for this method now says:
"This method is not called when the source URL is an HTTP Live Streaming (HLS) content source. See HTTP Live Streaming Overview."
Try registering for MPMoviePlayerThumbnailImageRequestDidFinishNotification before calling this method. When the notification arrives, check the value of the MPMoviePlayerThumbnailImageKey key; if the capture was successful, it will contain a valid UIImage for you to use.
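To illustrate that suggestion, here is a minimal sketch (not from the original answer), assuming self.player is an existing MPMoviePlayerController and MediaPlayer.framework is linked:

    #import <MediaPlayer/MediaPlayer.h>

    // Sketch: ask for one thumbnail per minute and collect them via the notification.
    - (void)requestThumbnails
    {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(thumbnailRequestDidFinish:)
                                                     name:MPMoviePlayerThumbnailImageRequestDidFinishNotification
                                                   object:self.player];

        // One thumbnail per minute for the first five minutes (adjust as needed).
        NSArray *times = [NSArray arrayWithObjects:
                          [NSNumber numberWithDouble:0.0],
                          [NSNumber numberWithDouble:60.0],
                          [NSNumber numberWithDouble:120.0],
                          [NSNumber numberWithDouble:180.0],
                          [NSNumber numberWithDouble:240.0], nil];
        [self.player requestThumbnailImagesAtTimes:times
                                        timeOption:MPMovieTimeOptionNearestKeyFrame];
    }

    - (void)thumbnailRequestDidFinish:(NSNotification *)notification
    {
        UIImage *thumbnail = [[notification userInfo] objectForKey:MPMoviePlayerThumbnailImageKey];
        if (thumbnail) {
            // Capture succeeded; use the image.
        } else {
            // On failure, the userInfo dictionary carries an error instead.
            NSLog(@"Thumbnail failed: %@",
                  [[notification userInfo] objectForKey:MPMoviePlayerThumbnailErrorKey]);
        }
    }

Note that, per the documentation excerpt above, this may still return nothing for an HLS source; it is worth testing against a progressive-download file first.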
I have an HTML5 audio control whose src property points to my middleware, an Express/Node server that delivers a streaming mp3 file.
On the middleware I'm using res.pipe() to output the mp3 file.
It's working great with one caveat: I can't send my authorization header.
So I want to use axios to call my middleware, which works fine, but I can't figure out how to "feed" the audio element.
If I do:
const response = axios.get('/api/stream',{requestHeaders:'stream'});
myAudio.src = response;
It throws an error and I'm blocked from there...
Thanks for any help :)
I finally managed to pass a token as a GET parameter instead of trying to use axios.
When the server returns the JSON response, the webViewDidFinishLoad method does not get called, and I don't know how to read this stream from the server. For reference, see the screenshot:
http://www.flickr.com/photos/97186141@N05/9388886673/
I'm looking for a solution for reading the HTTP status code with a UIWebView.
I have found this page on the topic: How do I get the last HTTP Status Code from a UIWebView? But I cannot use ASIHTTPRequest in my case.
So I was wondering if somebody has found a solution since 2009, and if something similar to NSLog(@"Status code = %@", [webView stringByEvaluatingJavaScriptFromString:@"document.status"]); could possibly work.
Thanks,
I don't think you can get it from the UIWebView, but I think it would work to have the result of an HTTP request put into an NSString, read the status code from the response headers, and then feed that string to a UIWebView.
See the NSURL Class Reference and the URL Loading Programming Guide.
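For what it's worth, here is a rough sketch of that approach (not part of the original answer), using a synchronous NSURLConnection call for brevity; in practice you would run it off the main thread or use the asynchronous API. The status code is available directly from the NSHTTPURLResponse, so no header parsing is needed:

    // Hypothetical helper: load the URL manually, read the status code,
    // then hand the downloaded data to the web view.
    - (void)loadURL:(NSURL *)url intoWebView:(UIWebView *)webView
    {
        NSURLRequest *request = [NSURLRequest requestWithURL:url];
        NSHTTPURLResponse *response = nil;
        NSError *error = nil;

        NSData *data = [NSURLConnection sendSynchronousRequest:request
                                             returningResponse:&response
                                                         error:&error];

        NSLog(@"Status code = %ld", (long)[response statusCode]);

        if (data) {
            NSString *encoding = [response textEncodingName] ? [response textEncodingName] : @"utf-8";
            [webView loadData:data
                     MIMEType:[response MIMEType]
             textEncodingName:encoding
                      baseURL:url];
        }
    }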
A possible alternative would be to implement an HTTP proxy directly inside your App, then feed a localhost URL to UIWebView. Your proxy would just make an HTTP connection with the web server and sit passively by while UIWebView drives the HTTP protocol. You then snoop on the incoming data before passing it on to UIWebView from your proxy. That would avoid the need to buffer the whole page in an NSString before feeding it to your UIWebView.
I noticed a couple of threads on this already, and they even provided sample code.
http://brunofuster.wordpress.com/2010/11/03/uploading-an-image-from-iphone-with-uiimagepicker-and-asihttprequests3/
But what baffled me is that there was no response being handled. Is it because S3 doesn't return any response? I am expecting to receive at least a URL to the image on S3; how can I get that?
If you look at the S3 REST object PUT documentation you will see the response that is returned from S3.
When you upload to S3 you know the bucket name you are putting the image into, plus you know the filename. These two pieces of information should be all you need to build a URL to the image.
The documentation states that in addition to the PUT response header(s) you can see some of the common headers too.
"This implementation of the operation can include the following response headers in addition to the response headers common to all responses. For more information, see Common Response Headers."
If you look at the ASIHTTPRequest Amazon Simple Storage Service (S3) support you will see how to get a response from the ASIS3ObjectRequest object.
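As a rough sketch of what that looks like (the bucket name, key, file path and credentials below are placeholders, and the synchronous call is only for brevity):

    #import "ASIS3ObjectRequest.h"

    // Sketch: upload a local file, then inspect S3's response and build the URL
    // from the bucket and key you already know.
    NSString *bucket = @"my-bucket";          // placeholder
    NSString *key = @"images/photo.jpg";      // placeholder
    NSString *filePath = @"/path/to/photo.jpg";

    ASIS3ObjectRequest *request = [ASIS3ObjectRequest PUTRequestForFile:filePath
                                                             withBucket:bucket
                                                                    key:key];
    [request setAccessKey:@"MY-ACCESS-KEY"];
    [request setSecretAccessKey:@"MY-SECRET-KEY"];
    [request startSynchronous];

    if ([request error]) {
        NSLog(@"Upload failed: %@", [request error]);
    } else {
        // The PUT response body is essentially empty; the interesting parts are
        // the status code and the headers (ETag etc.).
        NSLog(@"Status %d, response: %@", [request responseStatusCode], [request responseString]);
        NSString *imageURL = [NSString stringWithFormat:@"http://%@.s3.amazonaws.com/%@", bucket, key];
        NSLog(@"Image URL: %@", imageURL);
    }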
Tom,
If you just want your S3 image URL, you don't need the response information, since you already know the image name and the bucket (assuming there was no error).
Anyway, you can get the response from a synchronous request by using [request responseString] or [request responseData].
But the right thing to do is an asynchronous call, using operation queues and delegates, to handle the response's success or error. My blog post only provided a minimal sample; I will look into that and improve the post itself.
Thanks!
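To sketch the asynchronous, delegate-based variant mentioned above (again with placeholder names and credentials; this is not code from the blog post):

    // The delegate (self) is told about success or failure when the upload completes.
    - (void)uploadImageAtPath:(NSString *)filePath
    {
        ASIS3ObjectRequest *request = [ASIS3ObjectRequest PUTRequestForFile:filePath
                                                                 withBucket:@"my-bucket"
                                                                        key:@"images/photo.jpg"];
        [request setAccessKey:@"MY-ACCESS-KEY"];
        [request setSecretAccessKey:@"MY-SECRET-KEY"];
        [request setDelegate:self];
        [request startAsynchronous];
    }

    - (void)requestFinished:(ASIHTTPRequest *)request
    {
        NSLog(@"Upload done, status %d", [request responseStatusCode]);
    }

    - (void)requestFailed:(ASIHTTPRequest *)request
    {
        NSLog(@"Upload failed: %@", [request error]);
    }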
In addition to the answers already provided, you might also want to look at Amazon's recently released AWS SDK for iOS, which includes sample code for uploading images, etc. to S3.
On the iPhone, when trying to play a broken URL using MPMoviePlayerController, the user gets an alert box with the message "The server is not correctly configured".
Is there any way to change this to something more user-friendly? Alternatively, is there any way to get an error status from the player instead of this message?
Thanks in advance..
MPMoviePlayerController provides two notifications for the case of a broken / invalid movie URL.
From the MPMoviePlayerController initWithContentURL: documentation:
"To check for errors in URL loading, register for the MPMoviePlayerContentPreloadDidFinishNotification or MPMoviePlayerPlaybackDidFinishNotification notifications. On error, these notifications contain an NSError object available using the @"error" key in the notification's userInfo dictionary."
You should be able to hook into one of these notifications and perform the needed actions if a broken URL is about to be loaded.
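A minimal sketch of that (assuming self.player is the MPMoviePlayerController in use):

    - (void)registerForPlaybackErrors
    {
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(playbackDidFinish:)
                                                     name:MPMoviePlayerPlaybackDidFinishNotification
                                                   object:self.player];
    }

    - (void)playbackDidFinish:(NSNotification *)notification
    {
        // Per the documentation quoted above, the userInfo dictionary carries
        // an NSError under the @"error" key when loading fails.
        NSError *error = [[notification userInfo] objectForKey:@"error"];
        if (error) {
            NSLog(@"Playback failed: %@", [error localizedDescription]);
            // Present your own, friendlier alert here instead of relying on the default one.
        }
    }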
Usually this error message means that the web server that's serving the file does not support HTTP byte ranges.
iPhone OS uses HTTP byte ranges for streaming audio and video content. This makes it possible to "scrub" forwards and backwards in the content without downloading the entire content first.
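One way to check this (a sketch, not from the original answer; the URL is a placeholder and the synchronous call is for brevity) is to send a request with a Range header and see whether the server answers with 206 Partial Content:

    // Ask for just the first two bytes; a server that supports byte ranges
    // should reply 206 Partial Content rather than 200 OK.
    NSMutableURLRequest *request =
        [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://example.com/video.mp4"]];
    [request setValue:@"bytes=0-1" forHTTPHeaderField:@"Range"];

    NSHTTPURLResponse *response = nil;
    [NSURLConnection sendSynchronousRequest:request returningResponse:&response error:NULL];
    NSLog(@"Status: %ld (206 means byte ranges are supported)", (long)[response statusCode]);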
I once got this error when trying to play a URL-encoded URL string. I removed the URL-encoding method call before asking MPMoviePlayerController to play the URL and everything was OK.
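For illustration only (urlString is a placeholder for a string that is already a valid URL): escaping it a second time mangles it, so the fix is to pass it through as-is.

    // What caused the error (sketch): percent-escaping an already valid URL string.
    // NSString *escaped = [urlString stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    // NSURL *movieURL = [NSURL URLWithString:escaped];

    // What worked: use the string as-is.
    NSURL *movieURL = [NSURL URLWithString:urlString];
    MPMoviePlayerController *player =
        [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    [player play];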