RTSP RTP client streaming, timestamp, live555 - streaming

I have an IP camera that is located in a different country (with a different time zone) and that has its own date-time value set (for example, ~2012-04-16 11:30:00), different from the one where my PC is located (so my PC's time, for example, is ~2012-04-16 06:10:00).
My purpose:
When streaming, I need to get this date-time value that is set in the camera ("11:30:00").
(I'm not interested in the current local time of my PC.)
Is there any way to calculate the camera's date-time value from the RTP timestamp?
Is there any other approach?
I'm using the Live555 library, and for retrieving each frame's date-time I was using the "presentation time" value, but this gives me the local time of my PC (not the time that is set in my camera).
So I'm stuck here.

Read the RFC on RTP packet layout.
Note that the timestamp is the 32-bit big-endian field at byte offsets 4–7 of the RTP fixed header. This is the timestamp the camera applied when it encoded the stream, but it counts media-clock ticks (e.g. 90 kHz for video), not wall-clock time; the mapping from RTP timestamps to the camera's wall clock is carried in RTCP Sender Reports, which is also what Live555 uses to synchronize its presentation times once the first Sender Report has arrived.
For a C++ implementation that processes RTP packets and headers, including the timestamp, see the link.
A Java implementation of an RTP packet handler is here.
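For illustration, here is a minimal sketch of pulling that timestamp out of a raw RTP packet; the surrounding socket and buffer handling is assumed. Per RFC 3550, the timestamp is the big-endian 32-bit word at byte offsets 4–7 of the fixed header.

    #include <cstdint>
    #include <cstddef>

    // Extract the 32-bit RTP timestamp from a raw RTP packet.
    // The RTP fixed header is 12 bytes; the timestamp is the
    // big-endian word at byte offsets 4-7 (RFC 3550, section 5.1).
    // Returns false if the buffer is too short to hold a header.
    bool rtp_timestamp(const uint8_t* pkt, size_t len, uint32_t& ts) {
        if (len < 12) return false;
        ts = (uint32_t(pkt[4]) << 24) | (uint32_t(pkt[5]) << 16) |
             (uint32_t(pkt[6]) << 8)  |  uint32_t(pkt[7]);
        return true;
    }

Note that this value is in units of the media clock, so on its own it cannot be converted to the camera's date-time without the RTP-to-NTP mapping from an RTCP Sender Report.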

Related

How do I know when a byte is the first of a sequence of bytes representing MIDI delta time?

I want to read and encode MIDI .mid files at the byte (and bit) level. When reading through a .mid file, how do I recognize that a specific byte is the first byte of a delta time value?
For example, below is Figure 2.12 from Mandal's Multimedia Signals and Systems, a diagram of a track chunk of a .mid file:
How do I know that the 01, 01, 78, 00, 00, and 00 are delta time bytes, given that the events they are attached to have varying byte lengths? (For example, the instrument change appears to be two bytes beyond the delta time byte, while the first Note On event appears to contain 3 bytes beyond the delta time byte.) Is it based on interpreting what follows the delta time byte? If so, the second Note On event is throwing me: why does it appear to have only two bytes following the delta time byte?
It does not appear in Mandal's example, but I would appreciate an answer that clarifies this for multi-byte delta times as well.
And of course, I appreciate input on how to improve my question.
Finally, I am not asking for libraries that will automate reading .mid files. Those are good (yay!) but I want to understand how to read the file format down to the byte level.
You indeed have to decode a MIDI message before you know where the next delta time begins.
The actual Standard MIDI Files specification says:
<MTrk event> = <delta-time> <event>
<delta-time> is stored as a variable-length quantity. It represents the amount of time before
the following event. If the first event in a track occurs at the very beginning of a track, or if
two events occur simultaneously, a delta-time of zero is used. Delta-times are always present.
[…]
<event> = <MIDI event> | <sysex event> | <meta-event>
<MIDI event> is any MIDI channel message. Running status is used: status bytes of MIDI
channel messages may be omitted if the preceding event is a MIDI channel message with the
same status. The first event in each MTrk chunk must specify status. Delta-time is not
considered an event itself: it is an integral part of the syntax for an MTrk event. Notice that
running status occurs across delta-times.
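In the figure you quote, the second Note On is exactly this case: running status lets it omit its status byte because the preceding event was a Note On with the same status, so only the two data bytes (key and velocity) follow its delta time.

To make the variable-length quantity rule concrete, here is a minimal sketch of a delta-time decoder in C++ (the buffer handling around it is assumed): each byte contributes 7 bits, most significant first, and a set high bit means another byte follows.

    #include <cstdint>

    // Decode one MIDI variable-length quantity (e.g. a delta time).
    // Each byte carries 7 bits of the value, most significant first;
    // the high bit is set on every byte except the last, so a VLQ is
    // at most 4 bytes long (values up to 0x0FFFFFFF).
    bool read_vlq(const uint8_t*& p, const uint8_t* end, uint32_t& value) {
        value = 0;
        for (int i = 0; i < 4 && p < end; ++i) {
            uint8_t byte = *p++;
            value = (value << 7) | (byte & 0x7F);
            if ((byte & 0x80) == 0) return true;  // high bit clear: last byte
        }
        return false;  // malformed or truncated quantity
    }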

Timestamp for Ethernet frames

I am receiving CAN and Ethernet messages, and I would like to compare them according to their timestamps.
I am using SocketCAN and the ioctl(s, SIOCGSTAMP, &tv); call to get timestamps for CAN frames, and it works great (where s is the CAN socket).
But when I use it for Ethernet frames it sometimes gives a negative value and, more importantly, every timestamp is the same. What might be wrong?
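For reference, a minimal sketch of the pattern described above for the CAN side, assuming a SocketCAN socket that is already bound to an interface; SIOCGSTAMP reports the timestamp of the last packet received on the socket, so it must be called after each successful read:

    #include <linux/can.h>
    #include <linux/sockios.h>   // SIOCGSTAMP
    #include <sys/ioctl.h>
    #include <sys/time.h>
    #include <unistd.h>
    #include <cstdio>

    // Read one CAN frame, then ask the kernel for its receive timestamp.
    void read_frame_with_timestamp(int s) {
        can_frame frame{};
        if (read(s, &frame, sizeof(frame)) < 0) {
            perror("read");
            return;
        }
        timeval tv{};
        // Timestamp of the *last* packet received on this socket.
        if (ioctl(s, SIOCGSTAMP, &tv) < 0) {
            perror("ioctl(SIOCGSTAMP)");
            return;
        }
        std::printf("can_id=0x%X received at %ld.%06ld\n",
                    (unsigned)frame.can_id, (long)tv.tv_sec, (long)tv.tv_usec);
    }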

Missing packets after merging two files with Wireshark/mergecap

I have two pcapng files. Each one is a traffic capture that occurred at the same router but on different interfaces.
Since I want to study the behavior of the router's protocols globally I thought on merging these two files into one, so it would be easier to study the different protocols.
I've used the tool mergecap, like this:
mergecap -w new_file.pcapng file1.pcapng file2.pcapng
According to the manual of mergecap, the files will be merged chronologically, based on the timestamp of each packet within each file1.pcapng and file2.pcapng.
The problem I'm facing now is that, after the merge has taken place, packets that I had in file1.pcapng are not found with the same timestamp in new_file.pcapng.
Has anyone done something like this before? I'm using mergecap 2.0.2.
Thanks!
Lucas
By default Wireshark orders the packets chronologically starting from the first captured packet. Since you merged two capture files, you have two packets that each started a capture, but only one of them can be the first packet in the merged file. Aligning packets by time relative to the first captured packet does not make sense for a merged capture.
To be fair, it could make sense if Wireshark sorted all packets chronologically before picking which packet counts as the first. Currently, the first packet in the file is the time reference (see time references) by default.
Thankfully, Wireshark stores each packet time as a timestamp since the epoch. This makes it possible to align the packets of a merged file chronologically using the options in View > Time Display Format.
Captures from different machines
The above has one limitation: since the timestamps are based on the epoch, if you capture packets on different machines you need to make sure that the clocks of those machines are aligned.
If your capture files originate from different machines whose clocks are not aligned, you need to shift the timestamps of one of the captures before merging. That, in turn, can be accomplished with Wireshark's Edit > Time Shift.
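If you prefer to do the shift on the command line before merging, editcap (shipped with Wireshark) can apply a fixed offset to every packet; the 3600 seconds below is a placeholder for whatever skew you actually measured between the two clocks:

    editcap -t 3600 file2.pcapng file2_shifted.pcapng
    mergecap -w new_file.pcapng file1.pcapng file2_shifted.pcapng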

RTP packet identification

I'm working on an implementation of an RTSP/RTP client and thinking about how to deal with client sockets per RTP session. Is it possible to reuse the same socket pairs for different RTP sources? Say I have many IP cameras and I wish to receive media data from all of them on a single set of RTP client ports (two ports for video (data and control) and two for audio). In other words, I don't want to have as many client socket sets as there are IP cameras. If I receive many data streams on one socket, how do I differentiate which RTP packet belongs to which camera?
That's what the SSRC field of the RTP header is for. From RFC3550:
Synchronization source (SSRC): The source of a stream of RTP
packets, identified by a 32-bit numeric SSRC identifier carried in
the RTP header so as not to be dependent upon the network address.
All packets from a synchronization source form part of the same
timing and sequence number space, so a receiver groups packets by
synchronization source for playback. Examples of synchronization
sources include the sender of a stream of packets derived from a
signal source such as a microphone or a camera, or an RTP mixer
(see below). A synchronization source may change its data format,
e.g., audio encoding, over time. The SSRC identifier is a
randomly chosen value meant to be globally unique within a
particular RTP session (see Section 8). A participant need not
use the same SSRC identifier for all the RTP sessions in a
multimedia session; the binding of the SSRC identifiers is
provided through RTCP (see Section 6.5.1). If a participant
generates multiple streams in one RTP session, for example from
separate video cameras, each MUST be identified as a different
SSRC.
As long as you only bind the sockets but don't connect them, this is possible. You then have to call recvfrom, which also tells you the sender of each packet.
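A minimal sketch of that pattern, assuming plain RTP over UDP on one bound (not connected) socket: read each datagram with recvfrom and demultiplex on the SSRC, which is the big-endian 32-bit word at byte offsets 8–11 of the RTP header.

    #include <cstdint>
    #include <map>
    #include <netinet/in.h>
    #include <sys/socket.h>

    // Receive RTP packets on a single bound UDP socket and group
    // them by SSRC (RFC 3550: header byte offsets 8-11).
    void demux_loop(int sock) {
        uint8_t buf[2048];
        std::map<uint32_t, uint64_t> packets_per_ssrc;  // SSRC -> count
        for (;;) {
            sockaddr_in from{};
            socklen_t fromlen = sizeof(from);
            ssize_t n = recvfrom(sock, buf, sizeof(buf), 0,
                                 reinterpret_cast<sockaddr*>(&from), &fromlen);
            if (n < 12) continue;  // error or too short to be RTP
            uint32_t ssrc = (uint32_t(buf[8]) << 24) | (uint32_t(buf[9]) << 16) |
                            (uint32_t(buf[10]) << 8) | uint32_t(buf[11]);
            ++packets_per_ssrc[ssrc];  // dispatch to the right camera here
        }
    }

The sender address from recvfrom gives you a second way to tell cameras apart, but the SSRC is the identifier RTP itself defines for the job.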

When listening for messages from a device, what is the unit of AbsoluteTime?

When listening for MidiEvents in NAudio from a MidiDevice, we get the long "AbsoluteTime" property on each event. But what unit is this time in, and from what starting point is it measured?
In a MIDI file, each event has a delta in "ticks" since the last event. To make MIDI files easier to work with, NAudio keeps a running total, storing the value in AbsoluteTime. The meaning of this depends on delta ticks per quarter note (which is a property on the MidiFile class), and the tempo (MIDI files ought to include at least one TempoEvent).
When listening for MIDI events from a device, the AbsoluteTime of the MIDI Event created will be 0. However, you can use the TimeStamp property of the MidiInMessageEventArgs which I believe is in milliseconds since MidiInStart was called.
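Independent of NAudio, converting a tick count to wall time is simple arithmetic once you know the ticks per quarter note and the tempo; here is a sketch with illustrative numbers (a real file may change tempo mid-track, which forces a segment-by-segment conversion):

    #include <cstdint>
    #include <cstdio>

    // Convert an absolute MIDI tick count to milliseconds, assuming
    // a single constant tempo for the whole stretch of ticks.
    double ticks_to_ms(int64_t absolute_ticks,
                       int ticks_per_quarter_note,  // from the file header
                       int tempo_us_per_quarter) {  // from the TempoEvent
        double us_per_tick =
            double(tempo_us_per_quarter) / ticks_per_quarter_note;
        return absolute_ticks * us_per_tick / 1000.0;
    }

    int main() {
        // 480 ticks at 480 PPQ and 500000 us per quarter (120 BPM) -> 500 ms
        std::printf("%.1f ms\n", ticks_to_ms(480, 480, 500000));
    }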