iPhone MPMoviePlayerController: download files while streaming and play them locally

I have an m3u8 file with all the TS files. MPMoviePlayerController plays them fine via HTTP requests to the streaming server, but I'd like to save the files locally in order to play them again later without any connection.
I managed to download the m3u8 file and all the TS files locally to my device, and I edited the m3u8 file to point to the local .ts files instead of the HTTP ones, but I can't play them from that location.
(VLC can do it fine.)
Is there a way to download the segments while playing (to avoid downloading twice) and then play them locally with MPMoviePlayerController or something else?

.m3u8 is Apple HTTP Live Streaming, right? I think what you're trying to do simply goes against the design of that technology. You should expose the original file and allow it to be downloaded.
From what I understand, it's in the design of streaming that you don't get explicit access to the pieces in order to put them back together. For instance, Netflix uses streaming via Silverlight, and one of the benefits (to Netflix) is that it protects the data from being saved as if it were downloaded. Also, since HTTP Live Streaming allows a stream to switch bitrates on the fly, it's designed such that each time slice can be encoded at any number of bitrates, and none of them is canonical.
In theory, there might be a way to collect all the slices for a particular bitrate and re-encode them into a single video. But Apple's playback APIs are not going to give you that opportunity.
Instead of HTTP Live Streaming, consider progressive download. Just serve the original video file (transcoded to something the iPhone likes, if necessary). If your server is configured properly, the playback APIs will make small byte-range requests for particular chunks of the file rather than fetching the whole thing in one go, which is a close second to proper streaming. Amazon S3 is set up to serve this way, if you need a quick solution.
But beware, Apple's docs say: "If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)"

Related

Saving m3u8 video to disk in swift cocoa

I am trying to save any m3u8 stream playlist to disk as one complete video file, similar to VLC. I can create an AVAsset and play it in an AVPlayer fine; however, the m3u8 links I have tried all return false from asset.isExportable, so using AVAssetExportSession does not work. I thought it might be possible to open the link as an InputStream and then write it to an OutputStream, but I was lost on how to do this. Is this a viable option, or will it only return the actual m3u8 file instead of the .ts video links? Any guidance in the right direction would be appreciated. I am fine doing the research on how to use the different classes; I'm just kind of lost on where to go from here.
Thank you,
Phil
Building a single video from all the streams in a m3u8 playlist may not actually give you what you want, depending on the m3u8 file.
This is because m3u8 playlists can contain multiple bit-rate versions of a single video, so if you added them all together you would get the same video at different quality levels (bit rates), one after another.
It's also worth noting that some video streams will be encrypted; in fact, most high-value streams such as Netflix's will be, so downloading them will not allow you to play them back unless you do so as part of the provider's own 'download and go' service.
Finally, some services may make it hard for you to access the streams by requiring some form of authentication alongside the video stream URL.
Assuming all of the above is fine or does not apply in your case, the video files themselves can be downloaded as files using an HTTP download function; good examples exist, such as: https://stackoverflow.com/a/35510812/334402
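For instance, here is a minimal sketch, assuming an unencrypted playlist and a made-up segment URL, of pulling one .ts segment down to the Documents folder with URLSession; you would repeat this for every segment URI listed in the playlist:

```swift
import Foundation

// Hedged sketch: download a single HLS segment to Documents.
// The segment URL and file layout below are assumptions for illustration.
func downloadSegment(_ remoteURL: URL, completion: @escaping (URL?) -> Void) {
    let task = URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else {
            completion(nil)
            return
        }
        let documents = FileManager.default.urls(for: .documentDirectory,
                                                 in: .userDomainMask)[0]
        let destination = documents.appendingPathComponent(remoteURL.lastPathComponent)
        try? FileManager.default.removeItem(at: destination)   // overwrite any stale copy
        do {
            try FileManager.default.moveItem(at: tempURL, to: destination)
            completion(destination)
        } catch {
            completion(nil)
        }
    }
    task.resume()
}

// Usage (hypothetical URL):
// downloadSegment(URL(string: "https://example.com/stream/segment0.ts")!) { localURL in
//     print("saved to", localURL as Any)
// }
```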

Streaming audio/video to iPhone other than http server

I find that most streaming-audio discussions are about streaming media from an HTTP server, e.g. AudioStreamer from Cocoa with Love, or MPMoviePlayerController. They both init with an NSURL. But my case is different: I use SMB to access the media files on a Windows shared server. The media content is retrieved with SMB messages (through a socket) and accumulated in memory (NSMutableData).
So is there a way to play them (that NSMutableData) before the download is finished?
Update: for streaming audio I understand I need Audio Queue Services.
What about streaming video other than over HTTP? I think it is doable, because there is a free app called TIOD which streams not only audio but also video from an SMB server.
BTW, I never expect others to do the work for me. I checked all the documentation I could find and couldn't find a way to do it (for video). I had thought that might mean it can't be done, but then I found that TIOD can do it. That's why I raised the question in the first place, to see if others have experience with it.
Yes, you can stream that as well; it's the same thing as getting the data from an NSURL. If you look at the audio streaming example by Matt Gallagher here, you'll see that he is getting data from some URL, but ultimately, when he calls the parse function, he is giving it bytes of data. The same thing should apply to your situation: with the data you get, you should be able to call the parse function and have the audio player stream your audio file.
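To make that concrete, here is a rough sketch, assuming the bytes arrive from your SMB socket as Data chunks, of handing them to Audio File Stream Services (the parser that Matt Gallagher's AudioStreamer wraps); the function name and the MP3 type hint are assumptions:

```swift
import AudioToolbox
import Foundation

// Sketch only: open a parser once, then feed it whatever bytes arrive from SMB.
var streamID: AudioFileStreamID?

AudioFileStreamOpen(nil,
    { _, _, _, _ in
        // Property-found callback: e.g. kAudioFileStreamProperty_ReadyToProducePackets
        // tells you the stream format is known and an Audio Queue can be created.
    },
    { _, _, _, _, _ in
        // Packets callback: parsed audio packets arrive here; enqueue them on
        // an Audio Queue buffer to actually play them.
    },
    kAudioFileMP3Type,   // assumption: the shared file happens to be an MP3
    &streamID)

// Call this every time another chunk of data arrives over the SMB socket.
func didReceive(_ chunk: Data) {
    guard let streamID = streamID, !chunk.isEmpty else { return }
    _ = chunk.withUnsafeBytes { raw in
        AudioFileStreamParseBytes(streamID, UInt32(chunk.count), raw.baseAddress!, [])
    }
}
```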

iPhone: HTTP live streaming without any server side processing

I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a Thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration. But I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, and it is not actually what I am searching for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save it to a video file; see the AV Foundation guide on how to do that. Once saved, you can use the HTTP Live Streaming segmenter to generate the proper segments. Apple has applications for Mac OS X, but there's an open-source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments; there are plenty of HTTP servers out there you could adapt.
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264; for that you want ffmpeg. Basically, you shove the images into ffmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live H.264 source. I'm not sure how to do that, and it sounds like serious work. Once you've done that, you need an HTTP server serving that stream.
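For the first step, collecting frames, a minimal sketch might look like the following; the class and queue names are mine, and the hand-off to the encoder is left as a stub:

```swift
import AVFoundation

// Sketch: capture raw frames with AVCaptureVideoDataOutput; each sample buffer
// is what you would then push into your H.264 encoder (ffmpeg as suggested
// above, or a hardware encoder).
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let frameQueue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: frameQueue)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // encode(pixelBuffer)  // hypothetical hand-off to the encoder
        _ = pixelBuffer
    }
}
```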
What might actually be easier would be to use an RTP/RTSP-based stream instead. That approach is covered by open-source RTP implementations, and ffmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.

streaming video FROM an iPhone

I can get individual frames from the iPhone's camera just fine. What I need is a way to package them up with sound for streaming to the server. Sending the files once I have them isn't much of an issue; it's the generation of the files for streaming that I am having problems with. I've been trying to get FFmpeg to work without much luck.
Does anyone have any ideas on how I can pull this off? I would like a known working API, or instructions on getting FFmpeg to compile properly in an iPhone app.
You could divide your recording into separate files of, say, 10 seconds each, then send them separately. If you use AVCaptureSession's beginConfiguration and commitConfiguration methods to batch your output change, you shouldn't drop any frames between the files (see the sketch after this list). This has many advantages over frame-by-frame upload:
The files can be used directly for HTTP Live Streaming without any server-side processing.
The gap between data transfers allows the antennas to sleep in between, if the connection is fast enough, saving battery life.
Conversely, if the connection is slow and upload is slower than recording, managing delayed upload of a set of files is much easier than managing a stream of bytes.
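Here is a rough sketch of the batched output change described above. The rollover trigger and the segment URLs are assumptions, and whether the switch is truly gapless is something you would have to verify on a device:

```swift
import AVFoundation

// Sketch: swap in a fresh AVCaptureMovieFileOutput inside a single
// begin/commitConfiguration pair, then start recording the next segment.
func rollToNextSegment(session: AVCaptureSession,
                       current: AVCaptureMovieFileOutput,
                       index: Int,
                       delegate: AVCaptureFileOutputRecordingDelegate) -> AVCaptureMovieFileOutput {
    // Hypothetical destination for the next ~10 s clip.
    let nextURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("segment-\(index).mov")

    let next = AVCaptureMovieFileOutput()
    session.beginConfiguration()
    session.removeOutput(current)          // stop writing to the current file
    if session.canAddOutput(next) { session.addOutput(next) }
    session.commitConfiguration()          // both changes applied as one atomic change

    next.startRecording(to: nextURL, recordingDelegate: delegate)
    return next
}
```

The finished clip is then handed to you in the delegate's didFinishRecordingTo callback, which is where you would kick off the upload.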

Simultaneously stream and save a video?

I'm writing an app, part of which allows the user to stream/play videos. I want to restrict the functionality so that they can only stream videos if they have a WiFi connection. I will then save the video so that when they have only a 3G (or lesser) connection they can't stream videos and can only replay videos that are saved on the phone.
Ideally, I'd like to get MPMoviePlayerController to stream/play the movie and then access the movie data and save it. However, the MPMoviePlayerController API doesn't seem to provide access to the movie data.
I'd like to avoid a download-then-play scenario. Any ideas?
Two solutions come to mind.
Both of these solutions require that the file is in a format that can be played progressively, i.e. that you don't need the whole file to be able to play it (but that would be a prerequisite anyway).
1. Use a thread to download the data and append it to a file, and play the file from another thread. This requires that you can handle EOF events in MPMoviePlayerController and pause playback until the cache file has been appended to, then resume from the same point (a rough sketch follows this answer).
From what I've seen of people trying this, it doesn't work because MPMoviePlayerController can't handle the EOF event (I haven't tested it myself yet); see "Caching videos to disk after successful preload by MPMoviePlayerController".
2. Skip playing from a file and set up a local HTTP server instead, streaming from that (on localhost). This is also untested.
The idea is that MPMoviePlayerController would handle missing data better from an HTTP stream than from reading a file directly.
The downside might be that it is less efficient, but I think that is only a minor increase in CPU usage. I don't know whether the network interface would handle it, but I'm assuming it's not an issue.
I'm leaving this answer as a wiki, because I don't have a working solution but I too want one.
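For what it's worth, a bare-bones sketch of option 1's cache file (names assumed, and with the EOF caveat above very much in force) could look like this:

```swift
import Foundation

// Sketch: one queue appends downloaded chunks to a cache file while a player
// reads the same file. MPMoviePlayerController reaching the current end of
// the file too early is exactly the unresolved problem described above.
let cacheURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("cache.mp4")          // assumed progressive-friendly file
FileManager.default.createFile(atPath: cacheURL.path, contents: nil)
let writeHandle = try! FileHandle(forWritingTo: cacheURL)
let writeQueue = DispatchQueue(label: "cache.writer")

// Call this for every chunk the download thread receives.
func append(_ chunk: Data) {
    writeQueue.async {
        writeHandle.seekToEndOfFile()
        writeHandle.write(chunk)
    }
}
```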
There is a way to make this work, but you have to write your own HTTP Live Streaming downloader.
Basically, you parse the .m3u8 file (it's a pretty simple format, but it can get tricky with alternate streams and the possibility that the stream will simply drop out and need a new playlist to continue) and then download the chunks in .ts format to your local storage, say the Documents folder or Caches, etc.
Then you'll have to set up a local HTTP server to allow MPMoviePlayerController or AVPlayer to access the files over HTTP (since they won't touch a local file path), including a rewritten playlist file pointing to the local files, which you'll have to create yourself from the original playlist(s).
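As a concrete (and deliberately simplified) example, here is a sketch of that playlist rewrite, assuming a single media playlist with plain segment URIs and no encryption keys or byte ranges, and assuming the rewritten playlist sits in the same directory as the downloaded .ts files that the local HTTP server serves:

```swift
import Foundation

// Sketch: turn a remote media playlist into a local one by keeping only the
// file name of each segment URI. Tag lines (#EXTM3U, #EXTINF, ...) pass
// through untouched. Master playlists, #EXT-X-KEY, and byte ranges are
// ignored here on purpose.
func rewritePlaylist(_ remotePlaylist: String) -> String {
    remotePlaylist
        .components(separatedBy: .newlines)
        .map { line -> String in
            guard !line.isEmpty, !line.hasPrefix("#") else { return line }
            // A segment URI: reference the already-downloaded copy by name,
            // relative to the directory the local HTTP server exposes.
            return URL(string: line)?.lastPathComponent ?? line
        }
        .joined(separator: "\n")
}
```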
CocoaHTTPServer works great for this.
Once you've done all that, it works great. You unavoidably get a little delay while you download the first chunk or two before presenting your local HTTP URL to the movie player, but after that you get seamless download, recording and preview playback.
Good luck!
The iPhone uses progressive download, so the video will not be saved on the device. For that, you need to download it explicitly and then play it from your local folder.