Is there any available open-source (preferred) or commercial library for on-the-fly segmenting and streaming of video to iPhone / iPad?
Also, is there any open-source/commercial server (alternative to Wowza) which supports this?
Apple offers mediastreamsegmenter:
https://developer.apple.com/library/content/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/UsingHTTPLiveStreaming/UsingHTTPLiveStreaming.html
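For reference, a typical invocation looks something like this (flag names recalled from Apple's man page, so treat them as assumptions and verify with man mediastreamsegmenter):

    # Listen for an MPEG-TS stream on UDP port 9988 and write segments plus a
    # sliding-window index (10 entries; -D deletes old segments) into the web root.
    # The port and paths are placeholders.
    mediastreamsegmenter -f /Library/WebServer/Documents/stream -s 10 -D 127.0.0.1:9988

Any web server can then serve the resulting .ts segments and .m3u8 index as static files.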
You might also want to peek at Best Practices for Creating and Deploying HTTP Live Streaming Media for the iPhone and iPad:
https://developer.apple.com/library/content/technotes/tn2224/_index.html
There's also Darwin Streaming Server, but you may not need it.
Your first priority should be to pick a good segmenter (video-wise): Apple's is fine.
Then, if you want in-memory segmenting, mount the input source folder on a RAM disk...
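On Mac OS X a RAM disk is a one-liner; a sketch (ram:// takes 512-byte sectors, so 1048576 sectors is 512 MB; the volume name is arbitrary):

    # Create and mount a 512 MB RAM disk at /Volumes/SegmentRAMDisk
    diskutil erasevolume HFS+ "SegmentRAMDisk" $(hdiutil attach -nomount ram://1048576)

Point the segmenter at a folder on that volume and the segments never touch the physical disk.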
Take a look at the following link:
http://developer.apple.com/library/ios/#samplecode/MoviePlayer_iPhone/Introduction/Intro.html
This example shows how to play videos from a network-based URL.
You can check Red5 (Java & open source).
Check the latest beta, or the code in SVN (a lot has changed since the last official release).
It may or may not do that out of the box; if not, you can code your own Red5 app and/or check other people's code built around Red5...
Generally, I want to make an iPhone app with video chat functionality, but after many searches I am still not able to find any successful results. Is there any public, or for that matter private, API available for doing this on iPhone? If you have a YES answer, please help me.
Basically, what I want is to read the video streams on both of the devices connected for chatting. Thanks a lot in advance, and please help me if you can.
P.S. I have already checked iDoubs, but it failed: it always shows some unknown problem and therefore doesn't let me connect to anyone.
ALSO: The suggested method I have found is HTTP Live Streaming. But there, too, I have multiple doubts.
1.) How do I upload my video from the iPhone to the HTTP server from which I would be broadcasting?
2.) Can you please post something about setting up the server? How do I feed the video to the FFmpeg server?
Mainly, I need to find the upload method. Right now I am simply sending hex code in the form of NSData to the server, and I am stuck there. The main problem is that it is live. How do I handle that?
It would be best if you could help me make iDoubs work properly.
Thank you so much for any kind of support!
Have a look at this: how to implement video chat on iPhone. But before starting, you must have an IMS server up and running.
Here is the live video chat framework you are looking for. It is easy and simple to implement for face-to-face video chat; I have already tried it, and it works very well. A great thing about this framework is its multi-platform support.
TokBox: https://tokbox.com/platform
https://tokbox.com/opentok/tutorials/
Sample Code:
https://github.com/opentok/opentok-ios-sdk-samples/
Edit:
Here is an article explaining OpenTok used together with Parse.
http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
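To give a feel for the API, here is a rough Swift sketch of connecting and publishing with the OpenTok iOS SDK. The class and method names are recalled from the 2.x SDK, so check them against the current docs; the key, session ID, and token are placeholders you generate from your TokBox account:

    import UIKit
    import OpenTok

    class VideoChatViewController: UIViewController, OTSessionDelegate {
        var session: OTSession?
        var publisher: OTPublisher?

        override func viewDidLoad() {
            super.viewDidLoad()
            // Placeholders: generate these server-side via the TokBox API.
            session = OTSession(apiKey: "YOUR_API_KEY", sessionId: "YOUR_SESSION_ID", delegate: self)
            var error: OTError?
            session?.connect(withToken: "YOUR_TOKEN", error: &error)
        }

        // OTSessionDelegate
        func sessionDidConnect(_ session: OTSession) {
            // Once connected, publish the local camera and microphone.
            publisher = OTPublisher(delegate: nil, settings: OTPublisherSettings())
            var error: OTError?
            if let publisher = publisher {
                session.publish(publisher, error: &error)
            }
        }
        func sessionDidDisconnect(_ session: OTSession) {}
        func session(_ session: OTSession, didFailWithError error: OTError) {}
        func session(_ session: OTSession, streamCreated stream: OTStream) {}
        func session(_ session: OTSession, streamDestroyed stream: OTStream) {}
    }

Subscribing to the other party's stream works the same way, with an OTSubscriber created inside streamCreated.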
HTTP Live Streaming is primarily an approach for adaptive streaming from server to client. For client to server, rather go for traditional streaming. There is an open library for streaming; see this question.
Whilst FaceTime does two-way chat, it is not certain that you will be able to do the same using public iOS APIs. That said, I have implemented one-way live streaming for iPhone, and the difficult part was not the core streaming itself but the encoding of the payload. You will be able to do H.264 in hardware and AAC / iLBC in software.
How you feed this to FFmpeg depends on your transport, possibly changing from 'file' H.264 frames to 'streaming' H.264. Check out the H.264 frame types if you implement frame dropping; reconfiguring the H.264 encoder on the fly is not possible to my knowledge, but restarting it with fresh parameters typically takes no more than a second or so.
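As a concrete, hedged illustration of going from 'file' H.264 to 'streaming' H.264: you can pipe raw H.264 NAL units into FFmpeg's stdin and have it re-wrap them as an MPEG-TS stream that a segmenter can consume. Exact flags vary by FFmpeg build, so verify against your version:

    # Read raw H.264 from stdin, copy it without re-encoding, and push MPEG-TS
    # over UDP. The address is a placeholder for wherever your segmenter listens.
    ffmpeg -f h264 -i pipe:0 -c:v copy -f mpegts udp://127.0.0.1:9988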
Did you attempt to play back a live resource while capturing? That is a good starting point. If you come across an open API for H.264 encoding, please post it here ;-)
I just started working on live streaming on iPhone, so I'd welcome any help on how to do live streaming on iPhone. I think that if I add a video tag in HTML5 and then load that HTML in a UIWebView, it will work.
Am I right? If not, what is your suggestion for doing live streaming? I want to embed some news channels' live streaming links in the application, so where can I find those links?
You have to go through the HTTP Live Streaming documentation provided by Apple. It includes some sample live streaming URLs; the file extension will be .m3u8. If you want to run your own web server, you have to set up FFmpeg on it (a hedged example invocation follows the list below). Links that will help you:
1) Apple document
2) stackoverflow
3) stackoverflow
4) stackoverflow
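As promised above, a hedged example of making FFmpeg produce HLS output directly; option names vary between FFmpeg versions, so check ffmpeg -h muxer=hls for your build:

    # Transcode a source to H.264/AAC and emit 10-second .ts segments plus a
    # playlist; the paths are placeholders for your web server's document root.
    ffmpeg -i input.mp4 -c:v libx264 -c:a aac \
           -f hls -hls_time 10 -hls_list_size 0 \
           /var/www/stream/prog_index.m3u8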
If you're making a web app in HTML5, then the video tag is a good choice.
But if you're developing a native app, then MPMoviePlayerController would be a much better choice. There are many examples of how to use it online.
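A minimal sketch of the MPMoviePlayerController route in Swift (the stream URL is a placeholder; note that Apple later deprecated this class in favor of AVPlayerViewController):

    import UIKit
    import MediaPlayer

    class PlayerViewController: UIViewController {
        // Keep a strong reference, or playback stops when the player is deallocated.
        var moviePlayer: MPMoviePlayerController?

        override func viewDidLoad() {
            super.viewDidLoad()
            // Placeholder URL: any HLS .m3u8 playlist (or progressive file) works here.
            guard let url = URL(string: "https://example.com/stream/prog_index.m3u8") else { return }
            let player = MPMoviePlayerController(contentURL: url)
            player.view.frame = view.bounds
            view.addSubview(player.view)
            player.prepareToPlay()
            player.play()
            moviePlayer = player
        }
    }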
iOS doesn't support RTMP or RTSP, so your stream would need to be an HTTP Live stream. From memory, the codec choice is very limited too; e.g. if you supply H.264+MP3 you won't get any sound, despite iOS supporting MP3.
Also remember that streams from other people (such as the BBC) will normally be protected by international copyright law, so unless you have prior permission to use their stream in your app you may be breaking the law.
Apple has some nice resources on HTTP Live Streaming.
I want to be able to live-stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration, but I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, which is not what I am searching for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save the footage to a video file; see the AV Foundation guide on how to do that. Once it's saved, you can then use the HTTP Live Streaming segmenter to generate the proper segments. Apple has tools for Mac OS X, but there's an open-source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments; there are lots of HTTP servers out there you could adapt.
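A sketch of that capture-to-file step with AV Foundation, using the modern Swift names (the Objective-C API of the time is analogous):

    import AVFoundation

    final class Recorder: NSObject, AVCaptureFileOutputRecordingDelegate {
        private let session = AVCaptureSession()
        private let fileOutput = AVCaptureMovieFileOutput()

        func start() throws {
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))
            session.addOutput(fileOutput)
            session.startRunning()
            let url = URL(fileURLWithPath: NSTemporaryDirectory())
                .appendingPathComponent("capture.mov")
            fileOutput.startRecording(to: url, recordingDelegate: self)
        }

        func fileOutput(_ output: AVCaptureFileOutput,
                        didFinishRecordingTo outputFileURL: URL,
                        from connections: [AVCaptureConnection],
                        error: Error?) {
            // Hand the finished file to the segmenter / HTTP server from here.
        }
    }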
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264; for that you want FFmpeg. Basically you shove the images into FFmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live streaming H.264 device. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, you need an HTTP server serving that stream.
What might actually be easier would be to use an RTP/RTSP-based stream instead. That approach is covered by open-source RTP implementations, and FFmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.
I have a video stream that I used in an iPhone application. I'm now working to port the application to Android, so I want to use the same stream.
As Apple requires, I created an HTTP Live Stream (media segmenter, m3u8 file, etc.). You can find the stream here: http://envue.insa-lyon.fr/smartphone/aloun_stream/prog_index.m3u8 .
I want to use this same stream on Android. Has anyone had a similar experience?
Honeycomb/Android 3.0 has limited support for HLS. Anything before that has no built-in support, but there are supposed to be third-party SDKs that will do it; searching, though, shows a lot of people who can't get hold of those third-party developers.
Check the Android dev docs to find out what is not supported.
I've given up on the m3u8 stream. I just used MP4s with Android's streaming capabilities.
You have to use a WebSocket to continuously fetch the TS files as Apple defines them, and send them to a player to decode the H.264+AAC within the TS packets.
Check Android 4.0: it claims to support HTTP Live Streaming 3.0 fully, including HTTPS. For older versions I've seen some people recommending it, but I haven't tried it myself.
I have run into a bit of a problem. I built an iPhone app that streams my podcasts via MPMoviePlayerController. Apple will not approve it because it can use too much bandwidth over the carrier network, so their workaround is to use a stream segmenter. I am unable to install a stream segmenter on my server. Are there ANY other solutions people have come up with that can help me stream my podcast to iPhone devices? Even if I have to make it a web application as opposed to a native application.
Thanks,
John
You could use a simple service like Encoding.com to create iPhone-segmented, on-demand versions of your files for multi-bitrate adaptive playback. You could also provide a high-quality and a low-quality version and only play the high one when the Reachability class shows that you're on WiFi (a sketch of that check follows). I had to do the second option to get one of my apps past approval. Hope this helps!
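A hedged Swift sketch of that WiFi check, using SystemConfiguration directly (Apple's Reachability sample class wraps these same calls; the host name is a placeholder):

    import SystemConfiguration

    func isOnWiFi(host: String = "example.com") -> Bool {
        guard let target = SCNetworkReachabilityCreateWithName(nil, host) else { return false }
        var flags = SCNetworkReachabilityFlags()
        guard SCNetworkReachabilityGetFlags(target, &flags) else { return false }
        // Reachable and not via the cellular (WWAN) radio -> treat as WiFi.
        return flags.contains(.reachable) && !flags.contains(.isWWAN)
    }

    // Pick the playlist variant accordingly, e.g.:
    // let streamURL = isOnWiFi() ? highQualityURL : lowQualityURL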
Well, if you don't want a native app, I think you can just put a video link on a webpage; when the user clicks it, QuickTime will take over and play the file, and it will play the file as it downloads it.
I don't have any experience streaming large files to the iPhone, so I can't guide you on alternatives that keep it a native app.