We are developing an open-source streaming server and are running into some trouble with our implementation of HLS.
We've been able to convert any stream we want into TS and segment it into HLS segments, and it plays back beautifully on almost every player... except the Apple players (iPad, iPhone, Safari, QuickTime). On those, the H.264 encoding settings need to be picked very carefully, and even when sticking to Baseline/3.0 some glitching is visible.
The AAC audio sounds choppy as well, no matter how we encode it (we tried both ffmpeg's aac and the libfdk_aac encoders in nearly every possible configuration). Again, all of these versions play back just fine on non-Apple players. Changing the encoding settings sometimes yields better results, but we've not found any combination that works for every video we've been testing with.
This leads us to conclude that perhaps the Apple players require something in the TS stream itself that we're not doing correctly. Is there anything that could cause this kind of behavior? For reference, an HLS test stream output by our packager/segmenter can be found here: link
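For context, a representative encode from our tests looks roughly like this (a sketch, not our exact pipeline; the Baseline/3.0 video and AAC-LC audio settings follow Apple's HLS recommendations, and the TS segmenting happens in a separate step):

    ffmpeg -i input.mp4 \
        -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p \
        -c:a aac -b:a 128k -ar 44100 \
        -f mpegts output.ts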
We appreciate any feedback!
Related
I am attempting to stream a video in a format Unity3D can access, like MJPEG. I have gone through several possible solutions, including GStreamer (which only does the client side, as far as I could tell from the examples), Yawcam (I couldn't find a way to access the image directly), and Silverlight (I simply could not find out how webcam streaming was supposed to be done). I am now looking for any other methods of getting video from one side to the other. Could I simply read the images into a byte array and send them over a socket? Or did I miss something in the previous three solutions? A sketch of what I have in mind follows below.
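Reading images into a byte array and pushing them over a socket is essentially what MJPEG-over-HTTP already is: one long multipart/x-mixed-replace response with one JPEG per part. A minimal sketch of the wire format (Swift here only for concreteness; sendFrame and write are made-up names, and the same bytes apply in whatever language the sender is written in):

    import Foundation

    // MJPEG-over-HTTP: one multipart response, one JPEG per part.
    // `write` stands in for whatever socket-send call the server uses.
    let boundary = "frame"

    // Sent once, when the client connects.
    let responseHeader =
        "HTTP/1.0 200 OK\r\n" +
        "Content-Type: multipart/x-mixed-replace; boundary=\(boundary)\r\n\r\n"

    // Sent for every captured frame.
    func sendFrame(_ jpeg: Data, using write: (Data) -> Void) {
        let partHeader =
            "--\(boundary)\r\n" +
            "Content-Type: image/jpeg\r\n" +
            "Content-Length: \(jpeg.count)\r\n\r\n"
        write(Data(partHeader.utf8))
        write(jpeg)
        write(Data("\r\n".utf8))
    }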
If you are looking to stream video from a server, you can use Ogg encoding + WWW.movie to map it to a texture, assuming you have a Pro license; I think this is a Pro-only feature. If it is a local file, either bundled with the app or in an external folder, we use the brilliant AVPro Windows Media or AVPro QuickTime. MJPEG does offer super smooth scrubbing with AVPro, but it generates enormous files, so it's definitely not ideal for streaming or even download!
Finally, RenderHead also has a Live Camera Capture plugin that could meet your needs.
I'm working on getting a streaming video solution implemented for a client, with iOS devices targeted (mostly iPad).
I have diced my video files up into TS segments, and I have the accompanying m3u8 file. They are hosted by a generic web host and served through Amazon CloudFront as a CDN, so on paper speed should be fine.
I am noticing that, pretty much no matter what, the iPad still has substantial buffering problems at predetermined points (presumably where one segment ends and another begins).
My lowest bitrate for the TS files is 600 kb/s, which seems plenty low for typical WiFi streaming, but playback still stops pretty hard.
I'm trying to figure out what is going wrong. I don't think it's the file hosting, since once a segment starts downloading it downloads fast. I suspect my m3u8 is somehow incomplete or inadequate.
As a side note, these videos are only 30-35 seconds long, and the media segmenter slices them into 3-4 pieces.
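For comparison, this is the shape of playlist I would expect for clips like these (segment names and durations are made up; the point is the complete VOD structure, including the closing EXT-X-ENDLIST tag, without which players treat the stream as live and buffer differently):

    #EXTM3U
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10,
    segment-0.ts
    #EXTINF:10,
    segment-1.ts
    #EXTINF:10,
    segment-2.ts
    #EXTINF:5,
    segment-3.ts
    #EXT-X-ENDLIST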
Has anyone seen anything like this?
I'd like to get real-time video from the iPhone to another device (either desktop browser or another iPhone, e.g. point-to-point).
NOTE: It's not one-to-many, just one-to-one at the moment. Audio can be part of the stream or carried over a telephone call on the iPhone.
There are four ways I can think of:
1. Capture frames on the iPhone, send the frames to a media server, and have the media server publish real-time video using the host web server.
2. Capture frames on the iPhone, convert them to images, send the images to an HTTP server, and have JavaScript/AJAX in the browser reload the images from the server as fast as possible.
3. Run an HTTP server on the iPhone, capture 1-second-duration movies on the iPhone, create M3U8 files on the iPhone, and have the other user connect directly to the HTTP server on the iPhone for live streaming.
4. Capture 1-second-duration movies on the iPhone, create M3U8 files on the iPhone, send them to an HTTP server, and have the other user connect to the HTTP server for live streaming. This is a good answer; has anyone gotten it to work?
Is there a better, more efficient option?
What's the fastest way to get data off the iPhone? Is it ASIHTTPRequest?
Thanks, everyone.
Sending raw frames or individual images will never work well enough for you (because of the amount of data and number of frames). Nor can you reasonably serve anything from the phone (WWAN networks have all sorts of firewalls). You'll need to encode the video, and stream it to a server, most likely over a standard streaming format (RTSP, RTMP). There is an H.264 encoder chip on the iPhone >= 3GS. The problem is that it is not stream oriented. That is, it outputs the metadata required to parse the video last. This leaves you with a few options.
1. Get the raw data and use FFmpeg to encode on the phone (will use a ton of CPU and battery).
2. Write your own parser for the H.264/AAC output (very hard).
3. Record and process in chunks (will add latency equal to the length of the chunks, and drop around 1/4 second of video between each chunk as you start and stop the sessions).
"Record and process in chunks (will add latency equal to the length of the chunks, and drop around 1/4 second of video between each chunk as you start and stop the sessions)."
I have just wrote such a code, but it is quite possible to eliminate such a gap by overlapping two AVAssetWriters. Since it uses the hardware encoder, I strongly recommend this approach.
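A minimal sketch of the overlap idea (the class name, file names, and 5-second chunk length are mine, and error handling and the audio input are omitted; the AVFoundation calls themselves are the standard API):

    import AVFoundation

    // Rotates between two AVAssetWriters so the next chunk starts
    // recording before the current one is finalized, avoiding the
    // gap you get by stopping one session before starting the next.
    final class ChunkedRecorder {
        private var activeWriter: AVAssetWriter?
        private var activeInput: AVAssetWriterInput?
        private var chunkStart = CMTime.invalid
        private let chunkDuration = CMTime(seconds: 5, preferredTimescale: 600)
        private var chunkIndex = 0

        // Call from the AVCaptureVideoDataOutput delegate with each frame.
        func append(_ sampleBuffer: CMSampleBuffer) {
            let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

            if activeWriter == nil {
                startWriter(at: pts)
            } else if CMTimeSubtract(pts, chunkStart) >= chunkDuration {
                // Start the new chunk first, then finalize the old one:
                // the two writers overlap, so no frames are dropped.
                let (oldWriter, oldInput) = (activeWriter!, activeInput!)
                startWriter(at: pts)
                oldInput.markAsFinished()
                oldWriter.finishWriting {
                    // Hand the finished chunk to the uploader/segmenter here.
                }
            }

            if activeInput?.isReadyForMoreMediaData == true {
                activeInput?.append(sampleBuffer)
            }
        }

        private func startWriter(at time: CMTime) {
            chunkIndex += 1
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("chunk\(chunkIndex).mp4")
            let writer = try! AVAssetWriter(outputURL: url, fileType: .mp4)
            let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
                AVVideoCodecKey: AVVideoCodecType.h264, // hardware encoder
                AVVideoWidthKey: 640,
                AVVideoHeightKey: 480,
            ])
            input.expectsMediaDataInRealTime = true
            writer.add(input)
            writer.startWriting()
            writer.startSession(atSourceTime: time)
            (activeWriter, activeInput, chunkStart) = (writer, input, time)
        }
    }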
We have similar needs; to be more specific, we want to implement streaming video & audio between an iOS device and a web UI. The goal is to enable high-quality video discussions between participants using these platforms. We did some research on how to implement this:
We decided to use OpenTok and managed to pretty quickly implement a proof-of-concept style video chat between an iPad and a website using the OpenTok getting started guide. There's also a PhoneGap plugin for OpenTok, which is handy for us as we are not doing native iOS.
Liblinphone also seemed to be a potential solution, but we didn't investigate further.
iDoubs also came up, but again, we felt OpenTok was the most promising one for our needs and thus didn't look at iDoubs in more detail.
I have a web service returning a .flv file that has to be played in an iPhone application. How do I play a .flv (Flash) file on the iPhone?
Has anyone faced this scenario? Is it possible to programmatically convert it to some other format and play that on the iPhone?
Thanks.
The iPhone doesn't support Flash content and, judging by Apple's official statements, won't ever (or at least not in the foreseeable future).
Converting the content to another format on the server side should be easy to do and would allow content playback on an iDevice.
Since the video is probably already H.264-encoded inside the FLV container, you may want to try FLV Extract on the server to avoid recompression:
http://www.videohelp.com/tools/FLV_Extract
Basically you just need to run it once for each of the videos on the server and keep the results around.
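If ffmpeg happens to be available on the server, a plain stream copy achieves the same thing, remuxing into an MP4 container without re-encoding (assuming the FLV really does carry H.264 video and AAC audio; file names illustrative):

    ffmpeg -i input.flv -c:v copy -c:a copy output.mp4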
I would recommend setting up your web service to use something like ffmpeg (http://www.ffmpeg.org/) to convert the .flv file to an MP4 file, which can be played directly from the iPhone's web browser. An invocation along the lines sketched below would do it.
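Something like this (flags and bitrates illustrative; older ffmpeg builds may need -strict experimental for the built-in AAC encoder):

    ffmpeg -i input.flv -c:v libx264 -c:a aac -b:a 128k output.mp4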
Pioto and Josaih are on the right track in suggesting that you should convert the video server-side using a tool like FFMpeg. As far as I know there is zero support for flv in any part of iOS, so you'd be unable to transcode it locally. Even if you could, it would make your users angry, since transcoding is a resource-intensive process that would kill their battery life and take a significant amount of time.
So, your solution is to transcode your videos to H.264 server-side. However, I'd caution against transcoding from FLV to H.264 if any other option is available. If you have the original, uncompressed (or at least less-compressed) source video, you'll get higher-quality results by transcoding that to H.264 instead. Each time lossy compression (e.g., Squeeze or H.264) is applied to a file, you lose some information and quality. If you've ever seen a 3rd- or 4th-generation copy of a VHS tape, you understand what I'm getting at.
Once you have an H.264-formatted video, you can play it on iOS. I'm not sure about the exact details of this.
You may be able to use ffmpeg or something on your server to transcode it to H.264. I'm not so sure you would really want to do that transcoding on the phone. Given Apple's current stance on Flash, this is probably your best option.
For FLV files, what I do is upload them to Google Drive and watch them from the Google Drive app.
Using FFMPEG, Live555, JSON
I'm not sure exactly how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they use a combination of open-source projects like FFmpeg, Live555, JSON, etc. Using Wireshark to sniff the packets sent from one of the public cameras that can be viewed with the free "Dropcam For Iphone" app on the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which suggests part of the stream is tunneled.
Maybe someone could take a look at the open-source files and explain how they got RTSP to work on the iPhone.
Thanks for the info TinC0ils. After digging a little deeper, I've read that they have modified the Axis camera with custom firmware to limit the streaming to a single 320x240 H.264 feed, to provide more consistent video quality across different networks and, as you point out, to be less of a draw on the phone's hardware. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the latency inherent in HTTP Live Streaming. I think Dropcam has done an excellent job with their hardware/software combo; I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: RTSP DOES WORK ON THE IPHONE!
They are using open-source projects to receive the frames and decode them in software instead of using the hardware decoder. This will work; however, it runs counter to Apple's requirement that you use their HTTP Live Streaming. It also demands more CPU, so it may not decode video at the desired fps/resolution on older devices, and it will decrease battery life compared to HTTP streaming.