H.264 call setup SDP parameters negotiation - sip

I am working on a video call application that uses the H.264 codec for video packets. Session setup is done using good old SIP. For convenience I am using a UVC webcam with a built-in H.264 encoder that works on the H.264 baseline profile (which, if I am not wrong, is the profile preferred for webcams). The camera also supports various resolutions.
My question is: is it possible to negotiate a particular resolution using, say, SDP parameters or any other mechanism during call setup?
I have gone through the SDP parameters used for an H.264-based video call but have not found any parameter that negotiates the resolution.
Can anyone suggest how to negotiate the resolution?
But first, is it really possible to negotiate the resolution parameters at all?
If it is not possible, do I need to decode the received frame first and only then check what the resolution of the received frame is?
Any help is really welcome and is deeply appreciated.
Regards,
gs

SDP by itself is a session description format; the actual negotiation happens through the SIP offer/answer exchange (RFC 3264) that carries the SDP.
If you are using SIP, the caller's INVITE carries an SDP offer and the callee's response carries the SDP answer, and the media parameters are agreed upon in that exchange.
The following links should help you:
http://www.hjp.at/doc/rfc/rfc4317.txt
http://www.ietf.org/mail-archive/web/sip/current/msg27863.html
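To make the offer/answer exchange concrete, here is a minimal sketch of the video part of an SDP offer for H.264; the port, payload type and fmtp values are illustrative, not taken from the question. Note that the standard H.264 fmtp parameters (RFC 6184) express capability limits such as profile-level-id, max-fs (maximum frame size in macroblocks) and max-mbps, rather than an explicit width and height, which is why no plain "resolution" parameter shows up in the SDP.
    m=video 49170 RTP/AVP 96
    a=rtpmap:96 H264/90000
    a=fmtp:96 profile-level-id=42e01e; packetization-mode=1; max-fs=3600; max-mbps=108000
In the answer, the far end echoes the payload format with values reduced to what it is actually willing to receive, and both sides stay within the agreed limits.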

Related

Can a SIP provider control SIP video quality?

I am working on a SIP video call using pjsip on Android. I have tried every possible way to improve video quality, but I still get the same bad video quality. Does the SIP provider control video quality as well?
It can have an impact, since the provider sits in the middle of the signaling path and is involved in codec negotiation, but it is also quite likely that even if the operator changes something, the video media parameters are ignored anyway.
You can try direct IP calls to remove provider from the equation.
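As a rough sketch of such a test, assuming the stock pjsua sample application that ships with pjsip (the address and the --video option below are illustrative and should be checked against your pjsip build):
    # Call the other device directly by IP, bypassing the SIP provider.
    # --video asks pjsua to add a video stream to the outgoing call.
    pjsua --video sip:192.168.1.20
If the quality is good on a direct IP call but bad through the provider, the problem is somewhere in the provider's path.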

Live Video Chat for iPhone and HTTP Live Streaming

So generally, I want to make an app with video chat functionality for iPhone. But after many searches, I am still not able to find any successful results. Is there any public, or for that matter private, API available for doing this on iPhone? If the answer is yes, please help me.
Basically, what I want is to read the video streams on both of the devices connected for chatting. Thanks a lot in advance, and please help me if you can.
P.S. - I have already checked iDoubs, but it failed: it always shows some unknown problem and for that reason doesn't allow me to connect to anyone.
ALSO: The suggested method I have found is via HTTP Live Streaming. But I have multiple doubts about that too.
1.) How do I upload my video from the iPhone to the HTTP server from which I would be broadcasting?
2.) Can you please post something related to setting up the server? How do I feed the video to the FFMPEG server?
Mainly, I need to find the upload method. Right now I am simply sending hex code in the form of NSData to the server, and I am stuck there. The main problem is that it is live. How do I handle that?
It would be best if you could help me make iDoubs work properly.
Thank you so much for any kind of support!
Have a look at this: how to implement video chat in iPhone. But before starting you must have an IMS server up and running.
Here is a live video chat framework that does what you are looking for. It is easy and simple to implement for face-to-face video chat. I have already tried it and it works very well. A great thing about this framework is its multi-platform support.
Tokbox : https://tokbox.com/platform
https://tokbox.com/opentok/tutorials/
Sample Code:
https://github.com/opentok/opentok-ios-sdk-samples/
Edit:
Here is an article explaining OpenTok used together with Parse.
http://www.iphonegamezone.net/ios-tutorial-create-iphone-video-chat-app-using-parse-and-opentok-tokbox/
HTTP Live Streaming is primarily an approach for adaptive streaming from server to client. For client-to-server, rather go for traditional streaming. There is an open library for streaming; see this question.
Whilst FaceTime shows that two-way chat is possible, it is not certain that you will be able to do it using public iOS APIs. That said, I have implemented one-way live streaming for iPhone, and the difficult part was not the core streaming itself but the encoding of the payload. You will be able to do H.264 in hardware and AAC / iLBC in software.
How you feed this to FFMPEG depends on your transport, possibly changing from "file" H.264 frames to "streaming" H.264. Check out the H.264 frame types if you implement frame dropping; reconfiguring the H.264 encoder on the fly is not possible to my knowledge, but restarting it with fresh parameters typically does not take more than a second or so.
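As a rough illustration of the "feed it to FFMPEG" step, assuming the phone pushes a raw Annex-B H.264 elementary stream to the machine running ffmpeg (the address, port and container choice below are illustrative, not from this answer):
    # Read a raw H.264 elementary stream from stdin and repackage it
    # into an MPEG-TS stream without re-encoding.
    ffmpeg -f h264 -i - -c:v copy -f mpegts udp://127.0.0.1:1234
From there the MPEG-TS output could be segmented for HTTP Live Streaming or forwarded to another server.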
Did you attempt to play back a live resource while capturing? That is a good starting point. If you come across an open API for H264 encoding, please post it here ;-)

Mac/iPhone:Streaming video file to iPhone

I have an HTTP streaming link which gives me an .flv streaming feed. I want to convert that and access it in my iPhone program. How can I do that? I want to have desktop software like VLC, feed it this streaming URL, and have it convert and re-stream in an iPhone-supported format. I tried VLC with H.264 and MPEG-1 audio, but it does not seem to produce a supported format, so the iPhone program does not play the video.
Could someone please guide me on how to set up desktop software which can stream an iPhone-supported file?
Thanks in advance.
I think even the great VLC can't convert FLV on the fly... (or even do anything with FLV). As far as streaming goes, you'll probably be limited to the local network (Wi-Fi). I'd start with the simple way: create an ad-hoc file server on the desktop, then use AVPlayer's initWithURL: method to open that video.
On the desktop, you could query the IP address of the computer and ask the user to enter that URL (along with an optional port assignment and file component, like http://192.168.0.2:2234/streamingVideo.mp4) on the iDevice, then convert it to an NSURL.
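A minimal sketch of the playback side in Swift (the modern counterpart of initWithURL:); the address is the illustrative one from above, not something the app should hard-code:
    import UIKit
    import AVKit
    import AVFoundation

    final class StreamViewController: UIViewController {
        // Illustrative address; in practice this comes from whatever the user typed in.
        private let streamURL = URL(string: "http://192.168.0.2:2234/streamingVideo.mp4")!

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            let player = AVPlayer(url: streamURL)          // Swift equivalent of initWithURL:
            let playerController = AVPlayerViewController()
            playerController.player = player
            present(playerController, animated: true) {
                player.play()
            }
        }
    }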
What exactly is the http streaming link? This matters a lot as in order to stream to the iPhone you need to use HTTP Live Streaming which requires some different bits than a typical flash media, or more properly RTMP, server. Typically you need two different streaming architectures or some expensive boxes.
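If you do go the HTTP Live Streaming route, a reasonably recent ffmpeg build can do the segmenting itself; a rough sketch (file names, segment length and codec settings are illustrative):
    # Transcode the FLV feed to baseline H.264 + AAC and cut it into
    # HLS segments plus a playlist that an iPhone can play natively.
    ffmpeg -i input.flv -c:v libx264 -profile:v baseline -c:a aac \
           -f hls -hls_time 10 playlist.m3u8
Serve the resulting playlist and segments from any plain HTTP server and point the device at the .m3u8 URL.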

Streaming live H.264 video via RTSP to iphone does work! w/example

Using FFMPEG, Live555, JSON
Not sure how it works, but if you look at the source files at http://github.com/dropcam/dropcam_for_iphone you can see that they are using a combination of open source projects like FFMPEG, Live555, JSON etc. Using Wireshark to sniff the packets sent from one of the public cameras that are available to view with the free "Dropcam For iPhone" app on the App Store, I was able to confirm that the iPhone was receiving H.264 video via RTP/RTSP/RTCP, and even RTMPT, which looks like maybe some of the stream is tunneled?
Maybe someone could take a look at the open source files and explain how they got RTSP to work on the iphone.
Thanks for the info TinC0ils. After digging a little deeper I've read that they have modified the Axis camera with custom firmware to limit the streaming to just a single 320x240 H.264 feed, to better provide consistent-quality video over different networks and, as you point out, be less of a draw on the phone's hardware etc. My interest was driven by a desire to use my iPhone to view live video and audio from a couple of IP cameras that I own, without the jerkiness of MJPEG or the inherent latency that is involved with "HTTP live streaming". I think Dropcam has done an excellent job with their hardware/software combo, I just don't need any new hardware at the moment.
Oh yeah, I almost forgot the reason for this post: THE RTSP PROTOCOL DOES WORK ON THE IPHONE!
They are using open source projects to receive the frames and are decoding in software instead of using the hardware decoders. This will work; however, it runs counter to Apple's requirement that you use their HTTP streaming. It also requires greater CPU resources, so it may not decode video at the desired fps/resolution on older devices, and it will drain the battery faster than HTTP streaming.
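If you want to verify that an RTSP feed from your own IP camera is usable before building anything on the phone, a quick desktop-side check with ffplay (part of FFMPEG) is one option; the URL below is a placeholder:
    # Force RTP over TCP to avoid firewall/NAT issues with UDP.
    ffplay -rtsp_transport tcp rtsp://user:pass@192.168.0.10/stream1
If ffplay can render the stream, the same demuxing/decoding path is what the open source stack on the phone has to reproduce in software.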

Is that possible to stream mms,ASX,RTSP stream on iPhone?

I am developing a music streaming application.
I can stream mp3 using a method described here. Does anybody know an approach to streaming other formats (ASX, RTSP or mms) using Core Audio or another framework?
Thanks in advance.
mms, ASX, and RTSP are historically somewhat proprietary protocols (by microsoft and real, in particular), so you may have trouble finding an official apple implementation.
There's a LGPL implementation of the mms protocol here: https://launchpad.net/libmms
Or you can get the documentation for the protocol from microsoft here: http://download.microsoft.com/download/9/5/E/95EF66AF-9026-4BB0-A41D-A4F81802D92C/%5BMS-MMSP%5D.pdf
ASX is just a metadata format in XML; you'd use it to get a mms or http URL to stream from. The official reference for it is on microsoft's site: http://msdn.microsoft.com/en-us/library/bb249663.aspx
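For illustration, a minimal ASX file is just a short XML wrapper around the real stream URL (the address below is a placeholder):
    <asx version="3.0">
      <entry>
        <title>Live feed</title>
        <ref href="mms://media.example.com/live" />
      </entry>
    </asx>
So handling ASX mostly means parsing this XML and then opening the referenced mms or http URL with whatever streaming code you already have.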
RTSP has an LGPL implementation here: http://www.live555.com/liveMedia/
It's a standard protocol (RFC 2326 and RFC 3550) but is apparently often used with proprietary extensions such as Real's RDT transport, so again it might be easier to just use a library if you're able.
Try the free FStream iPhone app (http://www.sourcemac.com/?page=fstream), which can handle mms, asf, wmv, asx and ogg.
FStream is good for audio. You can also use Streamer for video streaming. It is a good app, except that it is not user-friendly at all. Type the URI mms://server/ into your favorites, then tap it. You will find a button that says "Pause"; tap it again so it reads "Unpause". Then wait 10-15 seconds and the video will start streaming. Make sure that you choose a URI that you know works for sure.