I need to develop an app that is capable of receiving an RTSP stream.
I have been trying to find solutions/tutorials on the internet for the whole day now, but without any success.
I have read a lot about using FFmpeg or Live555 (mostly FFmpeg; I also read that Live555 is not necessary when using the newest version of FFmpeg), but nowhere was it described in a form I could understand, and the answers I found on Stack Overflow were so short that I could not figure out what they were trying to explain.
So now I need to ask here myself.
I used Homebrew to download and install FFmpeg; when I look at my /usr/local/ directory I can see that the installed files are contained in subfolders of "Cellar".
I also tried to have a look at these projects: RTSPPlay by Mooncatventures and kxmovie by kolyvan.
I could not really figure out how to work with these projects; the documentation is vague and murky.
When I tried to compile them, kxmovie failed with errors like "missing avformat.h".
I added the dylibs from /usr/local/Cellar/ffmpeg/1.2.1/lib to the project, but it seems that this is not the right method.
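Concretely, I experimented with build settings along these lines (paths taken from the Cellar install above; I am not even sure these are the right knobs):

    HEADER_SEARCH_PATHS = /usr/local/Cellar/ffmpeg/1.2.1/include
    LIBRARY_SEARCH_PATHS = /usr/local/Cellar/ffmpeg/1.2.1/lib
    OTHER_LDFLAGS = -lavformat -lavcodec -lavutil -lswscale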
Nearly the same issue with the RTSPPlay Xcode project: it reports that an "Entitlements.plist" is missing, and after removing all references to that file I get 99+ Apple Mach-O linker errors; honestly, I could not understand why.
I wanted to try Live555 too, but I can't see through all of its obscure and confusing files; again, I could not make sense of the documentation or figure out how to build the libraries for iphoneos. (I read that it is the easiest way to receive an RTSP stream, but it presented the same stack of confusing files as the other projects.)
Maybe someone who has tried these projects or has developed such an application could help me with his/her source code, or somebody who understands the contents of the FFmpeg/Homebrew directories could explain to me how to use them; that would probably help me and all the other desperate developers who are searching for a solution.
Just a little edit: I am trying to receive an RTSP H.264 encoded video stream.
Thanks in advance, Maurice Arikoglu.
(If you need any kind of source code, links, screenshots, etc., please let me know.)
For anyone who is searching for a working RTSP player library: have a look at Durfu's Mooncat fork that I found online; it works perfectly fine on iOS 8. Sample code is included, and if you struggle with the implementation I can help you.
I've examined many of the projects that you mentioned and have hands-on experience with them.
You can use the FFmpeg libraries to build an RTSP streaming client, since FFmpeg supports the RTSP protocol. But there is an important issue I have seen in my tests: FFmpeg's RTSP implementation has problems when using UDP as the transport layer. Even with the latest version (2.0.1), many RTP packets are dropped during streaming, so the picture sometimes becomes glitchy. If you use the TCP or HTTP transport layer instead, it works well.
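Forcing TCP transport is a one-liner via FFmpeg's "rtsp_transport" demuxer option; a minimal sketch (the URL is a placeholder):

    AVDictionary *opts = NULL;
    av_dict_set(&opts, "rtsp_transport", "tcp", 0);

    AVFormatContext *fmt_ctx = NULL;
    // Interleaves RTP over the RTSP TCP connection, avoiding the UDP packet loss
    if (avformat_open_input(&fmt_ctx, "rtsp://example.com/stream", NULL, &opts) < 0) {
        // handle the error
    }
    av_dict_free(&opts);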
As for the Live555 project: that library works well with both UDP and TCP transports when streaming RTSP. But FFmpeg is much more powerful and has many more capabilities than Live555.
If you decide to use FFmpeg, then, basically, you should follow the steps below (a condensed sketch tying them together follows the list):
make a connection - avformat_open_input
find audio/video codecs - avcodec_find_decoder
open audio/video codecs - avcodec_open2
read encoded packets from file/network stream in a loop - av_read_frame
A. for audio packets:
a. decode the packets - avcodec_decode_audio4
b. fill audio buffers with the AudioUnit/AudioQueue framework
B. for video packets:
a. decode the packets - avcodec_decode_video2
b. convert YUV pictures to RGB pictures - sws_scale or OpenGL ES shaders
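Here is that condensed sketch (FFmpeg 1.2/2.0-era API; error handling mostly omitted, and the audio path is analogous):

    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>

    int play(const char *url) {
        av_register_all();
        avformat_network_init();

        // 1. make a connection
        AVFormatContext *fmt = NULL;
        if (avformat_open_input(&fmt, url, NULL, NULL) < 0) return -1;
        if (avformat_find_stream_info(fmt, NULL) < 0) return -1;

        // 2./3. find and open the video decoder
        int vstream = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        if (vstream < 0) return -1;
        AVCodecContext *ctx = fmt->streams[vstream]->codec;
        AVCodec *dec = avcodec_find_decoder(ctx->codec_id);
        if (avcodec_open2(ctx, dec, NULL) < 0) return -1;

        // 4. read encoded packets in a loop and decode them
        AVFrame *frame = avcodec_alloc_frame(); // av_frame_alloc() in newer FFmpeg
        AVPacket pkt;
        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == vstream) {
                int got = 0;
                avcodec_decode_video2(ctx, frame, &got, &pkt);
                if (got) {
                    // frame now holds a decoded YUV picture;
                    // hand it to sws_scale or an OpenGL ES shader here
                }
            }
            av_free_packet(&pkt);
        }

        avcodec_close(ctx);
        avformat_close_input(&fmt);
        return 0;
    }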
good luck...
Related
I'm writing a Chrome Packaged App that needs to be able to play a lot of local video files. I can use the <video> tag to play files encoded in H.264 and MP3, but not much else. I'll require playback of at least DivX video and AC3 audio. Is there any way to do this using the HTML5 platform, or otherwise using some kind of plugin?
There are alternatives, but in my opinion the final solution is not going to be very good.
1 - You can try to use a plug-in, for example:
VLC Plug-in - sorry, I don't have enough reputation to post more than 2 links :(
DivX Web Player - sorry, I don't have enough reputation to post more than 2 links :(
But then you need to rely on the user installing the plug-in. For VLC, the plug-in is not compatible with the latest versions of Mac OS X.
2 - Encode to H.264 or VP8 on a server with ffmpeg, or use a cloud video provider.
3 - Encode on the client side using JavaScript! There is a port of ffmpeg to JavaScript (http://bgrins.github.io/videoconverter.js/). I haven't tried this method with large files.
4 - Encode on the client side using a Native Client component (https://developers.google.com/native-client/dev/). But that seems like a daunting task to me.
If you are going to go with the first option, make sure that your audience is willing to install/configure your player and that their OSes are supported.
VLC ported to NaCl would be a great first step.
According to a poster on https://forum.videolan.org/viewtopic.php?f=5&t=107178, libVLC has been ported to NaCl, but I am not familiar with VLC internals, so I cannot say how far this gets you in terms of being able to decode different streams.
I am attempting to stream video, in a format Unity3D can access, such as MJPEG. I have gone through several possible solutions, including GStreamer (it only does the client side, as far as I could tell from the examples), Yawcam (I couldn't find a way to access the image directly), and Silverlight (I simply could not find out how webcam streaming is done). I am currently just looking for any more methods of getting video over from one side to the other. Could I possibly simply read the images into a byte array and send them over a socket? Maybe I missed something in the previous three possible solutions?
If you are looking to stream video from a server, then you can use Ogg encoding + WWW.movie to map it to a texture (assuming you have a Pro license, as I think this is a Pro-only feature). If it is a local file, either bundled with the app or in an external folder, we use the brilliant AVPro Windows Media or AVPro QuickTime. MJPEG does offer super-smooth scrubbing with AVPro, but it generates enormous files; definitely not ideal for streaming, or even for download!
Finally, RenderHeads also has a Live Camera capture plugin that could meet your needs.
So, generally, I want to make an app with video chat functionality for the iPhone. But after many searches, I am still not able to find any successful results. Is there any public, or even private, API available for doing this on the iPhone? If your answer is yes, please help me.
Basically, what I want is to read the video streams on both of the devices connected for chatting. Thanks a lot in advance, and please help me if you can.
P.S. - I have already checked iDoubs, but it failed; it always shows some unknown problem and, for that reason, doesn't let me connect to anyone.
ALSO: The suggested method I have found is HTTP Live Streaming. But there, too, I have multiple doubts:
1.) How do I upload my video from the iPhone to the HTTP server from which I would be broadcasting?
2.) Can you please post something related to setting up the server? How do I feed the video to the FFmpeg server?
Mainly, I need to find the upload method. Right now I am simply sending hex code in the form of NSData to the server, and I am stuck there. The main problem is that it is live. How do I handle that?
It would be best if you could help me make iDoubs work properly.
Thank you so much for any kind of support!
Have a look at this on how to implement video chat on the iPhone. But before starting, you must have an IMS server up and running.
Here is the live video chat framework you are looking for. It is easy and simple to implement for face-to-face video chat. I have already tried it, and it works very well. The great thing about this framework is its multi-platform support.
Tokbox : https://tokbox.com/platform
https://tokbox.com/opentok/tutorials/
Sample Code:
https://github.com/opentok/opentok-ios-sdk-samples/
Edit:
Here is an article explaining OpenTok using Parse.
http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
HTTP Live Streaming is primarily an approach for adaptive streaming from server to client. For client-to-server, rather go for traditional streaming. There is an open library for streaming; see this question.
While it is possible for FaceTime to do two-way chat, it is not certain that you will be able to do the same using public iOS APIs. That said, I have implemented one-way live streaming for the iPhone, and the difficult part was not the core streaming itself but the encoding of the payload. You will be able to do H.264 in hardware and AAC/iLBC in software.
How you feed this to FFmpeg depends on your transport, possibly changing from 'file' H.264 frames to 'streaming' H.264. Check out the H.264 frame types if you implement frame dropping; reconfiguring the H.264 encoder on the fly is not possible to my knowledge, but restarting it with fresh parameters typically takes no more than a second or so.
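To make the 'file' vs 'streaming' distinction concrete: MP4-style ("AVCC") H.264 typically prefixes each NAL unit with a 4-byte big-endian length, while stream-oriented ("Annex B") H.264 uses 00 00 00 01 start codes. A rough sketch of that conversion, assuming the common 4-byte length field (verify this against your encoder's output):

    #include <stddef.h>
    #include <stdint.h>

    // Rewrite a length-prefixed (AVCC) frame into Annex B, in place.
    static void avcc_to_annexb(uint8_t *buf, size_t size) {
        size_t pos = 0;
        while (pos + 4 <= size) {
            uint32_t nal_len = ((uint32_t)buf[pos]     << 24) |
                               ((uint32_t)buf[pos + 1] << 16) |
                               ((uint32_t)buf[pos + 2] <<  8) |
                                (uint32_t)buf[pos + 3];
            // Overwrite the 4-byte length field with an Annex B start code
            buf[pos] = 0x00; buf[pos + 1] = 0x00;
            buf[pos + 2] = 0x00; buf[pos + 3] = 0x01;
            pos += 4 + nal_len;
        }
    }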
Did you attempt to play back a live resource while capturing? That is a good starting point. If you come across an open API for H264 encoding, please post it here ;-)
I want to be able to (live) stream the frames/video FROM the iPhone camera to the internet. I've seen in a thread (streaming video FROM an iPhone) that it's possible using AVCaptureSession's beginConfiguration and commitConfiguration, but I don't know how to start designing this task. There are already a lot of tutorials about how to stream video TO the iPhone, but that is not what I am searching for.
Could you guys give me any ideas which could help me further?
That's a tricky one. You should be able to do it, but it won't be easy.
One way that wouldn't be live (not answering your need, but worth mentioning) is to capture from the camera and save the footage to a video file; see the AV Foundation guide on how to do that. Once it is saved, you can then use the HTTP Live Streaming segmenter to generate the proper segments. Apple has tools for Mac OS X, but there's an open source version as well that you could adapt for iOS. On top of that, you'd also have to run an HTTP server to serve those segments; there are lots of HTTP servers out there that you could adapt.
But to do it live, first, as you have already found, you need to collect frames from the camera. Once you have those, you want to convert them to H.264; for that you want FFmpeg. Basically, you shove the images into FFmpeg's AVPicture, making a stream. Then you'd need to manage that stream so that the live streaming segmenter recognizes it as a live streaming H.264 device. I'm not sure how to do that, and it sounds like some serious work. Once you've done that, you need an HTTP server serving that stream. (A rough sketch of the frames-to-H.264 step follows.)
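Roughly, the frames-to-H.264 step would look something like this. In practice you hand the encoder an AVFrame, of which AVPicture is the data/linesize prefix; the dimensions and frame rate below are placeholders, and this assumes an FFmpeg build with libx264 enabled:

    #include <libavcodec/avcodec.h>

    // Open an H.264 encoder; call avcodec_register_all() once at startup.
    static AVCodecContext *open_h264_encoder(void) {
        AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
        AVCodecContext *ctx = avcodec_alloc_context3(codec);
        ctx->width = 640;                      // placeholder
        ctx->height = 480;                     // placeholder
        ctx->pix_fmt = AV_PIX_FMT_YUV420P;
        ctx->time_base = (AVRational){1, 30};  // placeholder: 30 fps
        if (avcodec_open2(ctx, codec, NULL) < 0)
            return NULL;
        return ctx;
    }

    // Feed one raw YUV frame in; returns 1 when *out holds an encoded packet.
    static int encode_frame(AVCodecContext *ctx, AVFrame *yuv, AVPacket *out) {
        int got_packet = 0;
        av_init_packet(out);
        out->data = NULL;   // let the encoder allocate the output buffer
        out->size = 0;
        if (avcodec_encode_video2(ctx, out, yuv, &got_packet) < 0)
            return -1;
        return got_packet;
    }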
What might actually be easier would be to use an RTP/RTSP-based stream instead. That approach is covered by open source implementations of RTP, and ffmpeg supports it fully. It's not HTTP Live Streaming, but it will work well enough.
Using FFmpeg, Live555, JSON
Not sure how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they are using a combination of open source projects like FFmpeg, Live555, JSON, etc. Using Wireshark to sniff the packets sent from one of the public cameras that can be viewed with the free "Dropcam For iPhone" app on the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which looks like maybe some of the stream is tunneled?
Maybe someone could take a look at the open source files and explain how they got RTSP to work on the iPhone.
Thanks for the info, TinC0ils. After digging a little deeper, I've read that they have modified the Axis camera with custom firmware to limit the streaming to just a single 320x240 H.264 feed, to better provide consistent-quality video over different networks and, as you point out, to be less of a draw on the phone's hardware, etc. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the latency inherent in HTTP Live Streaming. I think Dropcam have done an excellent job with their hardware/software combo; I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: the RTSP protocol DOES work on the iPhone!
They are using open source projects to receive the frames and decode them in software instead of using hardware decoders. This will work; however, it runs counter to Apple's requirement that you use their HTTP Live Streaming. It will also require greater CPU resources, such that it may not decode video at the desired fps/resolution on older devices, and/or it will decrease battery life compared to HTTP streaming.