I am trying to implement adaptive bitrate streaming in iOS.
I have an m3u8 URL that references around 8-10 other URLs, one per bandwidth tier.
My question is how to implement this in iOS. Is there a specific player that will change bandwidth automatically, or do we have to do it manually?
If manually, how do we do it?
iOS devices natively support HLS, which is Apple's adaptive bitrate streaming protocol.
You can find up-to-date information about the specific codecs and formats supported here:
https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/FrequentlyAskedQuestions/FrequentlyAskedQuestions.html
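In practice this means you can hand the master .m3u8 URL straight to AVPlayer and it will select and switch between the variant streams automatically as bandwidth changes; no manual logic is needed. A minimal Swift sketch (the URL is a placeholder for your own master playlist):

```swift
import AVKit
import AVFoundation
import UIKit

final class StreamViewController: UIViewController {
    // Placeholder URL -- substitute your own master playlist.
    private let masterPlaylist = URL(string: "https://example.com/master.m3u8")!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // AVPlayer parses the master playlist and switches between the
        // variant streams on its own as network conditions change.
        let player = AVPlayer(url: masterPlaylist)
        let playerController = AVPlayerViewController()
        playerController.player = player
        present(playerController, animated: true) {
            player.play()
        }
    }
}
```

The same code also plays a progressive-download .mp4 if you point it at one, and if you want to watch the switching happen, player.currentItem?.accessLog() records the indicated bitrate of each variant the player selects.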
Is it possible to play RTSP video and audio in an Ionic (Cordova) app?
If so, how can we accomplish it?
I want to stream live RTSP in my Ionic app.
With many audio and video features you are dependent on the underlying device's capabilities and rules.
Specifically, iOS devices require you to use HLS at this time if your app is to work on a mobile network (https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html#//apple_ref/doc/uid/TP40008332-CH102-SW5):
Requirements for Apps
Warning: iOS apps submitted for distribution in the App Store must conform to these requirements.
If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)
If your app uses HTTP Live Streaming over cellular networks, you are required to provide at least one stream at 64 Kbps or lower bandwidth (the low-bandwidth stream may be audio-only or audio with a still image).
These requirements apply to iOS apps submitted for distribution in the App Store for use on Apple products. Non-compliant apps may be rejected or removed, at the discretion of Apple.
There do exist apps that appear to be able to play RTSP on iOS (e.g. https://itunes.apple.com/us/app/rtsp-player/id1070125481?mt=8 ), so it is not clear whether they comply with the duration and size rules above or whether this is just an example of Apple's 'discretion'.
Android devices should support RTSP (depending on version, and possibly model, etc.) - https://developer.android.com/guide/appendix/media-formats.html - although Android media players can be tricky (look through Stack Overflow questions about video playback on Android).
If streaming video to an iOS device, do I have to use HTTP Live Streaming? Is HDS supported? The problem is we have limited storage space, and HTTP Live Streaming would require us to host more video files. Can someone clarify these points?
If your app will stream more than 10 minutes of video, Apple requires that you use HTTP Live Streaming to deliver it; otherwise your app will be rejected when you submit it to the App Store. (This happened to me the first time I submitted my app, before I knew about this requirement.)
From the HTTP Live Streaming Overview:
Warning: iOS apps submitted for distribution in the App Store must conform to these requirements.

If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)

If your app uses HTTP Live Streaming over cellular networks, you are required to provide at least one stream at 64 Kbps or lower bandwidth (the low-bandwidth stream may be audio-only or audio with a still image).

These requirements apply to iOS apps submitted for distribution in the App Store for use on Apple products. Non-compliant apps may be rejected or removed, at the discretion of Apple.
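To make the second rule concrete, a compliant master playlist just needs one variant at or below 64 Kbps. A sketch with made-up URIs and bitrate figures might look like this, where the first entry is the mandated low-bandwidth (here audio-only) stream:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.5"
audio_only/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2000000,RESOLUTION=1280x720
high/index.m3u8
```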
iOS devices support HTTP progressive download for .mp4 files; the server can simply be Apache or Nginx, and the user experience is quite similar to HTTP Live Streaming.
RTSP is also possible. You can port live555 to iOS as the RTSP client and use Darwin Streaming Server as the RTSP server.
I think HTTP progressive download is the simpler alternative; we have already done it through a plain HTTP server. For RTSP or other protocols you have to implement the client yourself.
Hello all. I use Windows Media Encoder to stream video online and have a server that I use to broadcast this stream. I am trying to make an app that streams video to the iPhone/iPad using a unique link. I have seen apps that stream from their own DVR cameras, so there must be some type of converter or encoder to use. Any suggestions?
The short answer is no, not at this time. The iPhone/iPad/iPod Touch work natively with Apple's HTTP adaptive segmented streaming protocol. MMS (Windows Media) streams are not compatible with "i" devices and will not play. You will need to look into encoding your video in this other format; check out the Apple specs for a full description of the protocol. Future versions of Windows Media Services (4.0) claim that they will support the Apple protocols, but this is only a preview/beta at this time and may not fully support the Apple specs.
If you're trying to do on-demand iPhone video, you can use a service such as Encoding.com to pre-encode your files in the adaptive segmented format for your users to view. For live encoding, Telestream has a product called Wirecast which can encode in an Apple-approved H.264 baseline format that can be sent to a service such as Akamai, Multicast Media, or Wowza Server for distribution to your clients.
Will the ALAC format support live streaming on the iPhone? The ALAC audio recording is streamed to a server machine; will I be able to play back the audio chunk data? Does the ALAC format support this?
Thank You.
Assuming you mean "Apple Lossless" audio...
I don't see why it wouldn't, but I don't know the details. You'll probably need to embed it in a transport stream instead of an MPEG-4 container (but then, I don't know how HTTP Live Streaming works either).
I don't think streaming lossless audio is sensible, though.
Streaming lossless audio is possible; we stream FLAC using Icecast and it works beautifully. However, we are not using HTTP Live Streaming (HLS) to do it. We stream FLAC from the source generator to a number of servers, and they create HLS streams from there.
It is technically possible to mux ALAC into MPEG-TS (ffmpeg can do this) and to play it back (again using ffmpeg), but there isn't a format identifier for other clients. Adding this feature to HLS would be as simple as calling or writing Apple and asking them to add ALAC to this list:
http://www.smpte-ra.org/mpegreg/mpegreg.html
and updating their products accordingly. If you've purchased an Apple product less than 90 days ago, or you have AppleCare, give them a call; they have to work on the issue for you if you are covered. The more requests that get escalated to their engineers, the more likely they are to add support for ALAC in HLS.
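For what it's worth, the ffmpeg round trip mentioned above might look like the following. This is a sketch with hypothetical file names, assuming an ffmpeg build with ALAC support:

```
# Remux an existing ALAC track from an MPEG-4 container into MPEG-TS.
ffmpeg -i input.m4a -c:a copy -f mpegts output.ts

# Play it back with ffmpeg's own player; other clients won't
# recognize the stream, since ALAC has no registered identifier.
ffplay output.ts
```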
We are creating a video-intensive app and want to be sure that adaptive streaming is required (given that we would need to write an automated transcoding and segmentation system to support it). Does anyone know if the YouTube app uses adaptive streaming?
Yes. From the documentation you can read:
Important: iPhone and iPad apps that send large amounts of audio or video data over cellular networks are required to use HTTP Live Streaming.
I don't think the YouTube app uses adaptive streaming; it definitely doesn't look like it does on the iPad. It does have Wi-Fi and 3G modes, but I don't think it does fully adaptive streaming.
For your app, if it is allowed to stream over 3G then yes, you are required to make it adaptive. But if you are only making an app for Wi-Fi use, then you don't have to.
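Whichever way YouTube does it, AVPlayer adapts automatically once you feed it an HLS playlist. If you do want different behaviour on 3G versus Wi-Fi, one knob worth knowing about is AVPlayerItem's preferredPeakBitRate, which caps which variants the player will choose. A minimal Swift sketch with a placeholder URL and an arbitrary cap:

```swift
import AVFoundation

// Placeholder URL -- substitute your own master playlist.
let item = AVPlayerItem(url: URL(string: "https://example.com/master.m3u8")!)

// Cap the bitrate AVPlayer may select, e.g. when on a cellular
// connection. A value of 0 means "no limit" (the default).
item.preferredPeakBitRate = 600_000

let player = AVPlayer(playerItem: item)
player.play()
```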