Basically, I want to stream and play MMS video with the iOS SDK. I can stream some videos with MPMoviePlayer, but not MMS or RTSP streams.
I researched this, but I couldn't find a clear solution. Can anybody help me?
I tried VLC Mobile: http://wiki.videolan.org/MobileVLC
Dropcam: https://github.com/dropcam/dropcam_for_iphone
But I am unable to use these options.
You should use the ffmpeg library: it can connect to almost any streaming server (RTSP, MMS, TCP, UDP, RTMP, ...) and then draw the pictures to the screen (for drawing you can use OpenGL ES; a UIImage also works).
First of all, use avformat_open_input to connect to your IP address, then use avcodec_find_decoder & avcodec_open2 to find the codecs and open them (you should call them for both audio & video).
Then, in a while loop, read packets from the server using av_read_frame. When you get a frame, if it is audio, send it to an AudioUnit or AudioQueue; if it is video, convert it from YUV to RGB format using sws_scale and draw the picture to the screen.
That's all.
Also look at this wrapper (http://www.videostreamsdk.com); it's built on the ffmpeg library and supports iOS.
You can take a look at Apple HTTP Live Streaming. Some docs here.
Related
I'm looking for a way to create an app that will allow captured camera video to be streamed on a computer. For example, one person could be walking an iPhone around a room and another person could have that video streamed on their computer. Something kind of like a one-way Facetime except the receiver is on a computer. Also, I can't just use an existing app as later I would like to change the program to do some computer vision processing on the incoming data.
At the moment, I've found that AV Foundation should be the correct option for the video capture (from this question). However, I'm having difficulty finding the method by which I can actually stream this data. In particular, searching for how to create the apps on the iPhone frequently results in existing apps that do the task, but not how to create the app.
Can anyone give me a pointer to the information on how to stream the video capture from the iPhone? Thank you much.
You can use Wowza Media Server for the streaming side.
To download Wowza Media Server:
Wowza Download
After installing Wowza, you need to configure a live application; for that, see:
Setting Up Live Application
On the iOS side, there is a library for video streaming over an RTMP connection.
You can get the library at:
RTMP library for Streaming
And a library example at:
RTMP library for Streaming example
This is a good example of streaming from the iOS side.
I had success with the ANGL lib and Wowza Media Server. It gives a smooth RTMP stream.
I need to implement iPhone video streaming to a server. I've googled a lot, but I found only how to receive video streams from a server, using UIWebView or MPMoviePlayer.
But how can I stream my captured media to a server in real time?
How can it be done?
Check out this Apple sample code; it uses AVFoundation:
StitchedStreamPlayer
I know how to get frames from the iOS SDK:
How to capture video frames from the camera as images using AV Foundation (http://developer.apple.com/library/ios/#qa/qa1702/_index.html)
That gives me raw pixel data, and I can convert it to JPEG.
The way I want to transfer the video is like this:
On iOS device A:
Get the pixels or JPEG from the callback
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
Encode to H.264 using existing technology (ffmpeg)
Encapsulate the video in a TS stream
Run an HTTP server and wait for requests
On the other iOS device B:
Make an HTTP request to A (using plain HTTP instead of RTP/RTSP)
So my question is: do I need to use ffmpeg to get an H.264 stream, or can I get it from an iOS API?
If I use ffmpeg to encode to H.264 (libx264), how do I do that? Is there any sample code or guideline?
I've read the post What's the best way of live streaming iphone camera to a media server?
It's a pretty good discussion, but I want to know the details.
The license for ffmpeg is incompatible with iOS applications distributed through the App Store.
If you want to transfer real-time video and have any kind of usable frame rate, you won't want to use HTTP or TCP.
Although this doesn't directly answer your question about which video format to use, I would suggest looking into some 3rd-party frameworks like TokBox or QuickBlox. There is a fantastic tutorial using Parse and OpenTok here:
http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
I'm capturing CGImageRef frames using UIGetScreenImage() on an iPhone 3GS 3.1.3 and want to stream this over the interwebs in any appropriate, well known format.
What video codec libraries are available that would allow me to encode the raw pixel data into video? (I'm not too familiar with video codecs, but I am familiar with the network programming needed to actually send the data out.)
I wouldn't recommend encoding on the client. Have the video available on the server already encoded and use MPMoviePlayerController to present it:
A good tutorial on how to stream video:
http://buildmobilesoftware.com/2010/08/09/how-to-stream-videos-on-the-iphone-or-ipad/
I'm trying to stream video from a static IP: http://38.117.88.148/GemTVLink
via iPhone. Can you show me some information on how I could implement this? I see an Apple video streaming app, but it seems it can show only .mp4 movies? Am I right?
I want my app to load the HTTP address and play the movie, that's it.
The link works in a media player.
The iPhone does not support all streaming video formats.
You should start by reading the HTTP Live Streaming Overview.
The native iOS video players (AVPlayer, MPMoviePlayerViewController, ...) can stream from an HTTP server using the m3u8 playlist format.
I looked at the link that you mentioned (GemTVLink); it's an MMS stream, and iOS cannot stream from Microsoft streaming servers (MMS). If you want to do that, you should use the ffmpeg library: it can connect to almost any streaming server (RTSP, MMS, TCP, UDP, RTMP, ...) and then draw the pictures to the screen (for drawing you can use OpenGL ES; a UIImage also works).
First of all, use avformat_open_input to connect to your IP address, then use avcodec_find_decoder & avcodec_open2 to find the codecs and open them (you should call them for both audio & video).
Then, in a while loop, read packets from the server using av_read_frame. When you get a frame, if it is audio, send it to an AudioUnit or AudioQueue; if it is video, convert it from YUV to RGB format using sws_scale and draw the picture to the screen.
That's all.
Also look at this wrapper (http://www.videostreamsdk.com); it's built on the ffmpeg library and supports iOS.
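If you can instead host the content as HTTP Live Streaming, the m3u8 playlist the native players consume is just a small text file; a minimal example (segment names and durations are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXT-X-ENDLIST
```

Pointing AVPlayer or MPMoviePlayerViewController at the playlist's URL is enough to start playback.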