I found this article:
www.codeproject.com/Articles/1077937/Possible-ways-to-organize-interaction-between-co,
and I know that there is code available for the Flash player.
Can I use only the code for managing connections (as in the article's examples) plus the free Flash player code, and thereby get rid of the integration software?
You need to be more specific, but in reality the idea of the integration software is the following:
Session management
Multi-codec/format support
Interface Resource
Scheduling
Normally an IP PBX supports SIP only, hence you need transcoding between the SIP world (audio + video) and the webcast world (web browser/client/camera). Integration software like the one described does a pretty good job, and some of it is open source (Wowza). If you want to replace it, I would do it with an MCU that supports RTMP/Flash; take a look at the McuWeb project. Otherwise you need to write SIP client code as well to integrate with the SIP world.
These applications stream video from a client app to their own server. I am interested in knowing what type of protocol they use. I am planning on building a similar application, but I don't know how to go about the video streaming. Once I get the stream to my server, I will use OpenCV to do some processing and return the result to the client.
I recommend sending only a minimum of data and doing as much of the processing as possible on the client, since sending the whole video stream is a huge waste of traffic (and probably cannot be done in real time).
I would use a TCP connection to send an intermediate result to the server, which the server can process further. The design of that communication depends on what you are sending and what you want to do with it.
You can wrap it in XML, for instance, or serialize an object, and so on.
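Here is a minimal Java sketch of that idea (the host, port and the FrameFeatures fields are made-up placeholders): the client computes a small intermediate result, serializes it, and ships only that over TCP.

    import java.io.ObjectOutputStream;
    import java.io.Serializable;
    import java.net.Socket;

    // Hypothetical intermediate result: the client extracts features locally
    // and sends only this small object instead of the raw video frames
    class FrameFeatures implements Serializable {
        long frameTimestamp;
        double[] descriptor;   // e.g. a feature vector computed on the client
    }

    public class FeatureSender {
        public static void main(String[] args) throws Exception {
            FrameFeatures features = new FrameFeatures();
            features.frameTimestamp = System.currentTimeMillis();
            features.descriptor = new double[] {0.12, 0.87, 0.45};

            try (Socket socket = new Socket("myserver.example.com", 9000);   // made-up host/port
                 ObjectOutputStream out = new ObjectOutputStream(socket.getOutputStream())) {
                out.writeObject(features);   // server reads it back with ObjectInputStream.readObject()
            }
        }
    }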
I am working on a small iOS application which has to download some files from a server using HTTP. In some cases, it is desirable to do this over another interface than the default (i.e., use 3G rather than Wifi).
Currently, I use the NSURLConnection (and the other NSURL-classes) to retrieve the files. However, I have not found a way to bind this connection to a specific interface/socket. I have successfully created active sockets for each interface (using the normal BSD socket-calls).
Is there any way to configure NSURLConnection to use these sockets, or another way to force it to use a specific network interface? Or is my only option to use CFNetwork (I want to do as little work as possible related to HTTP :))?
Check out the Reachability example; it can tell you whether your target host is reachable via WWAN, Wi-Fi, or not at all.
I am working on a P2P instant messenger project, like iChat, but just for the LAN.
I use the jmdns library for service discovery and test with Pidgin, logged in as a Bonjour user. So far the service _presence._tcp.local. is discovered correctly,
so we know the user's information on the LAN, like name@host:port. How do I start a conversation with a Bonjour client?
I looked into XMPP, but it does not support P2P, and I can hardly find a library for the Jingle extension, which does support P2P.
Maybe I should use SIP to make a conversation? But is its packet format compatible with Bonjour, or do I have to study the structure of the packets exchanged?
Can anyone explain a little about how iChat works on a LAN?
Many thanks for your kind help!
I think you're a little confused.
Bonjour is a mechanism for finding a service. It is not for communicating with a service. Once you have found the name@host:port information, you are finished with Bonjour.
The next step will require you to talk a protocol that the service understands. The token _presence in the service string indicates that this is an XMPP service. You will need to talk XMPP to it. You cannot talk SIP to it. Have you tried opening an XMPP connection to the host and port you have found?
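For example, a rough sketch with the jmdns library mentioned in the question (method names from a recent jmdns release; adjust to your version). Note that Bonjour's job really does end once the record is resolved:

    import java.net.InetAddress;
    import javax.jmdns.JmDNS;
    import javax.jmdns.ServiceEvent;
    import javax.jmdns.ServiceInfo;
    import javax.jmdns.ServiceListener;

    public class PresenceBrowser {
        public static void main(String[] args) throws Exception {
            JmDNS jmdns = JmDNS.create(InetAddress.getLocalHost());
            jmdns.addServiceListener("_presence._tcp.local.", new ServiceListener() {
                public void serviceAdded(ServiceEvent event) {
                    // Ask jmdns to resolve host/port for the discovered instance
                    event.getDNS().requestServiceInfo(event.getType(), event.getName());
                }
                public void serviceRemoved(ServiceEvent event) { }
                public void serviceResolved(ServiceEvent event) {
                    ServiceInfo info = event.getInfo();
                    // Discovery ends here; the next step is an XMPP stream to host:port
                    System.out.println(info.getName() + " @ "
                            + info.getHostAddresses()[0] + ":" + info.getPort());
                }
            });
            Thread.sleep(30_000);   // browse for a while, then clean up
            jmdns.close();
        }
    }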
You talk about SIP and Jingle. These are used to set up an audio or video call. If you are writing an instant messaging program, you do not need to do this. XMPP alone is enough.
If you do want to support audio or video, then you will need one of those protocols. Because the service you have found is an XMPP service, you will need to use Jingle. If you don't have a library that can speak Jingle, you will have to write the code yourself. There is nothing in the Bonjour information that identifies a SIP service, so you cannot use SIP - unless you can make a different Bonjour query and find a SIP service.
I infer that you are working in Java. The most popular XMPP library for Java seems to be Smack.
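As a rough idea of what talking XMPP with Smack looks like, a sketch against the Smack 4.x API (host, domain and credentials are made up; note this is the ordinary client-to-server mode, which matters for the follow-up below):

    import org.jivesoftware.smack.AbstractXMPPConnection;
    import org.jivesoftware.smack.chat2.Chat;
    import org.jivesoftware.smack.chat2.ChatManager;
    import org.jivesoftware.smack.tcp.XMPPTCPConnection;
    import org.jivesoftware.smack.tcp.XMPPTCPConnectionConfiguration;
    import org.jxmpp.jid.impl.JidCreate;

    public class SmackHello {
        public static void main(String[] args) throws Exception {
            XMPPTCPConnectionConfiguration config = XMPPTCPConnectionConfiguration.builder()
                    .setXmppDomain("example.org")    // hypothetical domain
                    .setHost("192.168.1.10")         // hypothetical server on the LAN
                    .setPort(5222)
                    .build();
            AbstractXMPPConnection connection = new XMPPTCPConnection(config);
            connection.connect().login("alice", "secret");

            Chat chat = ChatManager.getInstanceFor(connection)
                    .chatWith(JidCreate.entityBareFrom("bob@example.org"));
            chat.send("Hello from Smack");
            connection.disconnect();
        }
    }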
Thanks for everybody's attention; I have now found something. XMPP doesn't support P2P mode, only client-server-client. But there is another standard, "XEP-0174: Serverless Messaging", which is exactly right for P2P chat on a local network. DNS-SD + XEP-0174 is how iChat works.
I used the Smack library, and it does not support P2P, but someone made some changes; here is the link:
http://issues.igniterealtime.org/browse/SMACK-262 .
I didn't try this XMPPLLConnection. I have looked into the source code of Smack; it is based on a socket connection. Unfortunately there is no Java library for XEP-0174, so I have to work with an XML stream over a socket.
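Something like the following minimal sketch is what "XML stream over socket" boils down to (the peer address and names are placeholders; 5298 is the port conventionally used for link-local XMPP, and a real client must also read and parse the peer's answering stream):

    import java.io.OutputStreamWriter;
    import java.io.Writer;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public class ServerlessChat {
        public static void main(String[] args) throws Exception {
            // Peer host/port would come from the resolved _presence._tcp record
            try (Socket socket = new Socket("192.168.1.42", 5298);
                 Writer out = new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8)) {
                // XEP-0174 opens a plain XMPP stream directly between the two peers,
                // with from/to set to the mDNS instance names instead of server JIDs
                out.write("<?xml version='1.0'?>"
                        + "<stream:stream xmlns='jabber:client'"
                        + " xmlns:stream='http://etherx.jabber.org/streams'"
                        + " from='alice@mylaptop' to='bob@otherlaptop' version='1.0'>");
                out.write("<message from='alice@mylaptop' to='bob@otherlaptop' type='chat'>"
                        + "<body>Hello over the LAN</body></message>");
                out.write("</stream:stream>");
                out.flush();
            }
        }
    }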
You may use SIP for that. mDNS will be your discovery mechanism; then you'd use plain SIP for calling, once you have learned the URI you wish to dial.
SIPSIMPLE SDK (http://sipsimpleclient.com) implements this feature using this expired draft: https://datatracker.ietf.org/doc/html/draft-lee-sip-dns-sd-uri-03; it could be a good start.
Basically, your client would generate a URI like sip:random_stuff@ip:port and then publish it, along with a display name, using mDNS. The application also browses mDNS for peers on the LAN: _sipuri._udp, for example. Once you get some URI, you can just dial it using SIP.
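A rough sketch of the publishing half with jmdns (the service type follows the _sipuri._udp suggestion above; the instance name, port and TXT contents are made up):

    import java.net.InetAddress;
    import javax.jmdns.JmDNS;
    import javax.jmdns.ServiceInfo;

    public class SipUriPublisher {
        public static void main(String[] args) throws Exception {
            JmDNS jmdns = JmDNS.create(InetAddress.getLocalHost());
            // Publish a SIP URI plus a display name via DNS-SD, as in the expired draft
            ServiceInfo info = ServiceInfo.create(
                    "_sipuri._udp.local.",   // service type to browse for
                    "alice",                 // instance name
                    5060,                    // SIP listening port
                    "name=Alice Example");   // TXT record carrying the display name (assumed key)
            jmdns.registerService(info);
            System.out.println("Published sip:alice@"
                    + InetAddress.getLocalHost().getHostAddress() + ":5060");
            Thread.sleep(60_000);            // stay registered for a while
            jmdns.unregisterAllServices();
            jmdns.close();
        }
    }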
I'm thinking about developing a streaming server, and I have the following question: should I do it over RTSP (example URL: rtsp://192.168.0.184/myvideo.mpg) or RTP (example URL: rtp://192.168.0.184)?
As I have understood it, an RTSP server is mainly used for streaming files that already exist, i.e., not live, while an RTP server is used to broadcast.
Somebody correct me if I'm wrong: am I right?
What I want is to develop a server that broadcasts the computer screen live, i.e., the content is displayed at the same time it is being streamed.
You are getting something wrong... RTSP is a real-time streaming protocol, meaning you can stream whatever you want in real time. So you can use it to stream LIVE content (no matter what it is: video, audio, text, a presentation...). RTP is a transport protocol which is used to carry the media data that is negotiated over RTSP.
You use RTSP to control media transmission over RTP: to set up, play, pause and tear down the stream.
So, if you want your server to just start streaming when the URL is requested, you can implement some sort of RTP-only server. But if you want more control, and if you are streaming live video, you should use RTSP, because it transmits the SDP and other data that is important for decoding. A minimal RTSP request is sketched below.
Read the documents I linked here; they are a good starting point.
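To get a feel for the control channel, here is a minimal Java sketch that sends a DESCRIBE request and dumps the response, whose body is the SDP the client needs to decode the RTP stream. The address and path reuse the example URL from the question; a real client would go on to SETUP and PLAY:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.io.OutputStream;
    import java.net.Socket;
    import java.nio.charset.StandardCharsets;

    public class RtspDescribe {
        public static void main(String[] args) throws Exception {
            try (Socket socket = new Socket("192.168.0.184", 554)) {   // 554 is the default RTSP port
                String request = "DESCRIBE rtsp://192.168.0.184/myvideo.mpg RTSP/1.0\r\n"
                        + "CSeq: 1\r\n"
                        + "Accept: application/sdp\r\n"
                        + "\r\n";
                OutputStream out = socket.getOutputStream();
                out.write(request.getBytes(StandardCharsets.UTF_8));
                out.flush();
                // Dump the status line, headers and SDP body; a real client would
                // parse the headers and honor Content-Length instead
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(socket.getInputStream(), StandardCharsets.UTF_8));
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }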
AFAIK, RTSP does not transmit streams at all; it is just an out-of-band control protocol with functions like PLAY and STOP.
Raw UDP or RTP over UDP are transmission protocols for streams, just like raw TCP or HTTP over TCP.
To be able to stream a certain program over a given transmission protocol, an encapsulation method has to be defined for your container format. For example, a TS container can be transmitted over UDP, but Matroska cannot.
Pretty much everything can be transported through TCP, though.
(Which codec you use also matters, indirectly, as it restricts the container formats you can use.)
Some basics:
An RTSP server can be used for a recorded ("dead") source as well as for a live source. The RTSP protocol provides commands (like your VCR remote), and the functionality depends on your implementation.
RTP is the real-time protocol used for transporting audio and video in real time. The transport used can be unicast, multicast or broadcast, depending on the transport address and port. Besides transporting, RTP does lots of things for you, like packetization, reordering, jitter control, QoS and support for lip sync (see the sketch below for the header fields that make this possible).
In your case, if you want a broadcasting streaming server, then you need both RTSP (for control) and RTP (for broadcasting the audio and video).
To start with, you can go through the sample code provided by live555.
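To see concretely what RTP gives you, here is a small sketch that grabs one RTP packet off a UDP port and pulls out the fixed-header fields used for reordering and lip sync (the port is arbitrary; point an RTP sender such as FFmpeg or live555 at it first):

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    public class RtpSniffer {
        public static void main(String[] args) throws Exception {
            try (DatagramSocket socket = new DatagramSocket(4588)) {   // arbitrary RTP port
                byte[] buf = new byte[2048];
                DatagramPacket packet = new DatagramPacket(buf, buf.length);
                socket.receive(packet);
                byte[] d = packet.getData();
                int version = (d[0] >> 6) & 0x03;                     // RTP version, should be 2
                int payloadType = d[1] & 0x7F;                        // identifies the codec
                int seq = ((d[2] & 0xFF) << 8) | (d[3] & 0xFF);       // for reordering and loss detection
                long ts = ((d[4] & 0xFFL) << 24) | ((d[5] & 0xFFL) << 16)
                        | ((d[6] & 0xFFL) << 8) | (d[7] & 0xFFL);     // media clock, basis for lip sync
                long ssrc = ((d[8] & 0xFFL) << 24) | ((d[9] & 0xFFL) << 16)
                        | ((d[10] & 0xFFL) << 8) | (d[11] & 0xFFL);   // identifies the stream source
                System.out.printf("v=%d pt=%d seq=%d ts=%d ssrc=%08x%n",
                        version, payloadType, seq, ts, ssrc);
            }
        }
    }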
I hear your pain. I'm going through this right now (years later).
From what I've learned, you can think of RTSP as a "VCR controller": the protocol allows you to specify which streams (presentations) you want to play; it will then send you a description of the media, and you can use RTSP to play, stop, pause, and record the remote stream. The media itself goes over RTP. RTSP is normally implemented over a different socket or communication layer. Although it is simply a protocol, most often it's implemented by a server over a socket.
For live streams, the RTSP stream you request is simply the name of a stream. It doesn't need to refer to a file on the server; the server's RTSP implementation can parse that name, put together a live graph, and then provide the SDP (description) for that stream name. But this is, of course, specific to the way the RTSP server has been implemented. For "live" streams, it's probably simpler to just use RTP, but then you'll need a way to transfer the SDP from the RTP server to the client that wants to play that stream.
I think that's correct. RTSP may use RTP internally.
RTSP is widely used in IP cameras, running as an RTSP server in the camera, so that users can play (pull) the RTSP stream from the camera. It is a low-cost solution, because we don't need a central media server (think about thousands of camera streams). The architecture is below:

    IP Camera ---- RTSP (pull) ---> Player
    (RTSP server)                   (User Agent)
The RTSP protocol actually contains:
Signaling over TCP, on port 554, used to exchange the SDP (also used in WebRTC) describing the media capabilities.
UDP/TCP streams over several ports, generally two ports: one for RTCP and one for RTP (also used in WebRTC). A sample exchange negotiating these ports is shown below.
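The exchange is representative only (the track path, ports and session ID are made up; the shape follows RFC 2326). The client proposes an even/odd RTP/RTCP port pair, and the server answers with its own:

    C->S: SETUP rtsp://192.168.0.184/myvideo.mpg/track1 RTSP/1.0
          CSeq: 3
          Transport: RTP/AVP;unicast;client_port=4588-4589

    S->C: RTSP/1.0 200 OK
          CSeq: 3
          Session: 12345678
          Transport: RTP/AVP;unicast;client_port=4588-4589;server_port=6256-6257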
Compared to WebRTC, which is now available in HTML5:
Signaling over HTTP/WebSocket (or exchanged via any other protocol), used to exchange the SDP.
UDP streams (RTP/RTCP) over one or many ports, generally bound to one port, to make cloud load balancers happy.
From a protocol point of view, RTSP and WebRTC are similar, but the usage scenarios are very different. Since the details are off topic, grossly simplified: WebRTC is designed for web conferencing, while RTSP is used for IP camera systems.
So it's clear that RTSP and WebRTC are both solutions and protocols, used in different scenarios, while RTP is a transport protocol, which is also used by WebRTC for live streaming.
Note: RTSP is not available in HTML5 or for internet live streaming, but we could convert it with FFmpeg and a gateway server; please see here.
RTP is the transport protocol for real-time data. It provides timestamp, sequence number, and other means to handle the timing issues in real-time data transport.
RTSP is a control protocol that initiates and directs delivery of streaming multimedia data from media servers. It is the "Internet VCR remote control protocol." Its role is to provide the remote control; however, the actual data delivery is done separately, most likely by RTP.
Also, RTCP is the control part of RTP that helps with quality of service and membership management.
These three related protocols are used for real-time multimedia data over the Internet. Read the excellent full documentation at this link: RTP, RTCP & RTSP
RTSP (actually RTP) can be used for streaming video, but also many other types of media, including live presentations. RTSP is just the protocol used to set up the RTP session.
For all the details you can check out my open-source RTSP server implementation at the following address: https://net7mma.codeplex.com/
Or my article at http://www.codeproject.com/Articles/507218/Managed-Media-Aggregation-using-Rtsp-and-Rtp
It supports re-sourcing streams as well as the dynamic creation of streams. Various RFCs are implemented, and the library achieves better performance and lower memory use than FFmpeg and just about any other solution in the transport layer, which makes it a good candidate for use as a centralized point of access in most scenarios.