Send text/strings/char* with Appsrc in gstreamer - plugins

Can the appsrc plugin send text to matroskamux?
I need to send some strings that are read from the serial port.
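For context, this is roughly what I am trying to do (a minimal sketch using the GStreamer 1.x Python bindings; I am assuming matroskamux accepts text/x-raw,format=utf8 on a subtitle request pad, which gst-inspect-1.0 matroskamux should confirm, and I don't know yet whether it is happy writing a subtitle-only file; the two strings stand in for data read from the serial port):

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

# Caps are set on appsrc up front so the parser links it to matroskamux's
# subtitle request pad rather than a video/audio pad (my assumption).
pipeline = Gst.parse_launch(
    'appsrc name=src format=time caps="text/x-raw,format=utf8" '
    '! matroskamux ! filesink location=out.mkv')
src = pipeline.get_by_name('src')
pipeline.set_state(Gst.State.PLAYING)

lines = ['first line from the serial port', 'second line']  # placeholder data
timestamp = 0
for text in lines:
    data = text.encode('utf-8')
    buf = Gst.Buffer.new_allocate(None, len(data), None)
    buf.fill(0, data)
    buf.pts = timestamp          # each string becomes a one-second subtitle
    buf.duration = Gst.SECOND
    timestamp += Gst.SECOND
    src.emit('push-buffer', buf)

src.emit('end-of-stream')
bus = pipeline.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)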

Related

Audio + visual through TCP/IP

I'm in the process of making an application similar to Skype to interact with another computer, and I have a few questions.
I know the basics, such as how to send data over TCP in the form of an image or audio.
How do applications like Skype send live audio? Do they literally record one byte of audio, send it, play it, and then repeat the process? For me it's not instant, so I don't see how that would be possible.
How would you send a string and an image through TCP at the same time (video call + chat)? Would you use multiple ports? I can see how that would be very bad. The way I'm doing it at the moment is that when I click to receive an image, I set things up to receive an image so it is handled properly; if a string got sent at that time, it wouldn't work, because it can't be converted to an image, if you see what I'm saying. I'm not sure how else I would do it. I could send each thing with its type at the beginning, for example "string Hello how are you", and then work out the data type from that, but that seems a bit tedious and slow.
If anyone could give me some insight, that would be fantastic.
I can't speak for how skype does it, but this should be a starting point:
Streaming audio/video is usually transported over UDP sockets, not TCP. TCP guarantees delivery whereas UDP is best effort. If you have a temporary connection loss you care more that the video you're receiving is current, not that you receive the whole stream.
The data is usually compressed (and sometimes encrypted) using a standard compression algorithm after being received from a camera/microphone. Have a look at H264, which is commonly used to compress video.
RTP is often used to transmit audio/video. It allows multiple types of stream to be combined over a single socket.
Control traffic is usually sent separately over a different socket, usually TCP. For example, SIP, which is used by VoIP phones, initiates a control connection over a TCP or UDP port (usually 5060). The two ends then negotiate which types of stream will be supported, and how those streams will be sent. For SIP, this will be an RTP stream which is set up on a different UDP port.
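For the string + image part of your question, a common approach is to keep chat/control on its own TCP socket and prefix every message with a small fixed header, so the receiver always knows what is arriving. A minimal Python sketch (the type codes and helper names are my own invention for illustration, not how Skype actually does it):

import socket
import struct

MSG_TEXT, MSG_IMAGE = 1, 2  # made-up message type codes

def send_message(sock, msg_type, payload):
    # 1-byte type + 4-byte big-endian length, then the payload itself
    sock.sendall(struct.pack('!BI', msg_type, len(payload)) + payload)

def recv_exactly(sock, n):
    data = b''
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError('socket closed mid-message')
        data += chunk
    return data

def recv_message(sock):
    msg_type, length = struct.unpack('!BI', recv_exactly(sock, 5))
    return msg_type, recv_exactly(sock, length)

# usage: send_message(sock, MSG_TEXT, 'Hello how are you'.encode('utf-8'))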

Send RAW network packets in Delphi (WinPCap)

I'd like to send RAW network packets from our Delphi application (XE3 to XE5, if it matters), but I can't find any information about using PCap in Delphi.
How would one do that?
There is the Magenta library (sockmon and sockstat, a monitoring and statistics library for observing raw incoming packets on a selected interface), which works nicely for receiving RAW packets; however, there is no option to send RAW packets out.
Thanks.

Streaming live H264 video via RTP/UDP

I want to do live video streaming and encoding. I am using a Leopardboard DM365. I can capture and encode live video into H264 and then stream it using GStreamer plugins, but how do I capture the RTP packets on Windows? I can view the stream in VLC using an SDP file, but I do not want to just view it with VLC. I need to capture the buffer and then pass it on to my application. How can I do this?
I am using the following GStreamer pipeline on the server side:
gst-launch -v -e v4l2src always-copy=FALSE input-src=composite chain-ipipe=true \
    ! 'video/x-raw-yuv,format=(fourcc)NV12,width=640,height=480' \
    ! queue ! dmaiaccel \
    ! dmaienc_h264 encodingpreset=2 ratecontrol=2 intraframeinterval=23 \
      idrinterval=46 targetbitrate=3000000 \
    ! rtph264pay \
    ! udpsink port=3000 host=192.168.1.102 sync=false enable-last-buffer=false
Thank you,
Maz
In your application, if you know the exact parameters that you are going to receive, why do you need the SDP file?
The SDP file is needed to get the streaming parameters. The RTSP protocol allows the exchange of SDP because the receiver does not know what the sender will send.
If your application knows what the sender will send, you just need to capture the data and start decoding it. You may want to configure rtph264pay with config-interval=1 to send the SPS/PPS every second, so that your application can decode the content that is coming in. Feel free to change config-interval to match your intraframeinterval.
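As a rough sketch of the receiving side, assuming GStreamer 1.x and its Python bindings are installed on the Windows machine (the caps mirror rtph264pay's default payload type 96 and 90 kHz clock; adjust them to whatever your sender actually emits):

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.parse_launch(
    'udpsrc port=3000 caps="application/x-rtp,media=video,clock-rate=90000,'
    'encoding-name=H264,payload=96" '
    '! rtph264depay ! h264parse ! appsink name=sink sync=false')
sink = pipeline.get_by_name('sink')
pipeline.set_state(Gst.State.PLAYING)

while True:
    sample = sink.emit('pull-sample')   # blocks until the next H264 access unit
    if sample is None:                  # None means end of stream
        break
    buf = sample.get_buffer()
    ok, info = buf.map(Gst.MapFlags.READ)
    if ok:
        # info.data holds the depayloaded H264 bytes; hand them to your application here
        print('received %d bytes of H264' % info.size)
        buf.unmap(info)

pipeline.set_state(Gst.State.NULL)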

Receive Matlab/Simulink UDP blocks in other software

Can we send data from Matlab/Simulink UDP blocks to other software like Eclipse? Could Eclipse read/use that for building Android apps? How should I do that?
When sending data through the UDP Send block of an xPC target, it is packed in binary, I think. How can we unpack that in Eclipse?
I simply built a Simulink model and used UDP Send blocks to send the data. Then, in Eclipse, I wrote a client side to get the data from the server and convert it from binary to ASCII.
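The unpacking step amounts to reading each datagram and decoding the raw bytes. A minimal sketch of that step, shown here in Python (the port number is a placeholder and the format string assumes the UDP Send block transmits little-endian doubles; both must match the real block configuration):

import socket
import struct

PORT = 25000  # placeholder; must match the UDP Send block settings

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(('', PORT))

while True:
    packet, addr = sock.recvfrom(65535)
    # The UDP Send block transmits the raw bytes of the signal; here we assume
    # a vector of little-endian doubles. Change the format string if the block
    # is configured for a different data type or byte order.
    values = struct.unpack('<%dd' % (len(packet) // 8), packet)
    print(addr, values)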

Is there any way of sending DTMF tones during a call with linphone?

I'm trying to send a sequence of DTMF tones during a SIP call from linphone, compiled for the iPhone, in order to do some call management at a local exchange I've set up. I see from the code that the individual digits send DTMF (without audio on the line), but I can't seem to send a string of digits manually.
When I try, I just get a single digit sent. I could put in a delay and timer, but that just doesn't seem the way to go about it - and a long string of tones would take a long time to send with the necessary acknowledgements.
I've read that you can send DTMF as part of a SIP INFO message, but can't find the facility in linphone to construct a SIP INFO message.
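For reference, the SIP INFO approach I have read about carries the digit in an application/dtmf-relay body, roughly like the fragment below (this is my understanding of the common convention, not something taken from the linphone sources):

INFO sip:exchange@192.0.2.10 SIP/2.0
...other SIP headers...
Content-Type: application/dtmf-relay
Content-Length: 24

Signal=5
Duration=250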
Has anyone been able to do this or have any suggestions as to what I could try?
For me, changing the audio codec to Speex at 32000 Hz solved the problem. I'm not sure exactly why it solved it, but beforehand the DTMF signals were not being recognized by the server, whereas now they are.
For reference, I'm using the recent Linphone 3.8.1 build.