I want to use an RTSP server for serving video from camera(s). In my solution I need to provide the exact capture timestamp for each video frame. For this I wanted to use the GstReferenceTimestampMeta struct. On the server side the code looks like this:
/* The caps name is arbitrary, but must match the caps passed on the lookup side. */
GstCaps *imCaps = gst_caps_new_empty_simple("stream/x-timestamp");
/* Attach the capture time; GST_CLOCK_TIME_NONE means the duration is unknown. */
gst_buffer_add_reference_timestamp_meta (buffer, imCaps, timeStamp, GST_CLOCK_TIME_NONE);
Everything seems fine - I'm able to read the reference timestamp metadata correctly further down the server's GStreamer pipeline, but on the client side that metadata appears to be missing (other fields like PTS or DTS have correct values):
gst_buffer_get_reference_timestamp_meta(buffer, imCaps); // returns NULL on client side
In the GStreamer logs I didn't see any obvious messages indicating a problem with sending the GstReferenceTimestampMeta.
Do you know whether this is expected, i.e. is the GstReferenceTimestampMeta dropped somewhere on the RTSP server <-> client path? I would appreciate any suggestions on this topic.
Thanks,
Tomasz
I have a PostGIS + Debezium/Kafka + Debezium/Connect setup that is streaming changes from one database to another. I have been watching the messages via Kowl and everything is flowing as expected.
My problem arises when I'm reading the messages from my Kafka topic, the geometry (wkb) column in particular.
This is my Kafka message:
{
  "schema": {
    "type": "struct",
    "fields": [...],
    "optional": false,
    "name": "ecotx_geometry_kafka.ecotx_geometry_impo..."
  },
  "payload": {
    "before": null,
    "after": {
      "id": "d6ad5eb9-d1cb-4f91-949c-7cfb59fb07e2",
      "type": "MultiPolygon",
      "layer_id": "244458fa-e6e0-4c6c-a7e1-5bf0afce2fb8",
      "geometry": {
        "wkb": "AQYAACBqCAAAAQAAAAEDAAAAAQAAAAUAAABwQfUo...",
        "srid": 2154
      },
      "custom_style": null,
      "style_id": "default_layer_style"
    },
    "source": {...},
    "op": "c",
    "ts_ms": 1618854994546,
    "transaction": null
  }
}
As can be seen, the WKB information is something like "AQAAAAA...", despite the value inserted into my database being "01060000208A7A000000000000" (or "LINESTRING(0 0,1 0)" as WKT).
And I don't know how to parse/transform it to a ByteArray or a Geometry in my consumer app (Kotlin/Java) for further use with GeoTools.
I don't know if I'm missing an import that could translate this information.
I have found only a few questions from people posting their JSON messages, and in every message that has a geom field (streamed with Debezium) it got changed to this "AAAQQQAAAA" form.
Having said that, how can I parse/decode/translate it into something that can be used by GeoTools?
Thanks.
#UPDATE
Additional info:
After an insert, when I analyze my slot changes (querying the database using the pg_logical_slot_get_changes function), I'm able to see my changes in WKB:
{"change":[{"kind":"insert","schema":"ecotx_geometry_import","table":"geometry_data","columnnames":["id","type","layer_id","geometry","custom_style","style_id"],"columntypes":["uuid","character varying(255)","uuid","geometry","character varying","character varying"],"columnvalues":["469f5aed-a2ea-48ca-b7d2-fe6e54b27053","MultiPolygon","244458fa-e6e0-4c6c-a7e1-5bf0afce2fb8","01060000206A08000001000000010300000001000000050000007041F528CB332C413B509BE9710A594134371E05CC332C4111F40B87720A594147E56566CD332C4198DF5D7F720A594185EF3C8ACC332C41C03BEDE1710A59417041F528CB332C413B509BE9710A5941",null,"default_layer_style"]}]}
Which would be useful in the consumer app. The difference definitely lies in the Kafka message content itself; I'm just not sure what is transforming this value, Kafka or DBZ/Connect.
I think it is just a different way to represent binary columns in PostGIS and in JSON. The WKB is a binary field, meaning it holds bytes with arbitrary values, many of which have no corresponding printable character. PostGIS prints it using HEX encoding, so it looks like '01060000208A7A...' - hex digits - but internally it is just bytes. Kafka's JSON uses BASE64 encoding instead for exactly the same binary content.
Let's test with a prefix of your string (using PostgreSQL's encode/decode to convert HEX to BASE64):
select encode(decode('01060000206A080000010000000103000000010000000500', 'hex'), 'base64');
AQYAACBqCAAAAQAAAAEDAAAAAQAAAAUA
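In a Kotlin/Java consumer that means: Base64-decode the string, then hand the raw bytes to JTS (the geometry library GeoTools is built on), whose WKBReader parses the PostGIS EWKB variant, embedded SRID included. A minimal Java sketch, assuming the org.locationtech.jts:jts-core artifact is on the classpath:

import java.util.Base64;

import org.locationtech.jts.geom.Geometry;
import org.locationtech.jts.io.ParseException;
import org.locationtech.jts.io.WKBReader;

public final class WkbDecoder {

    /** Turns the Base64 "geometry.wkb" value from the Debezium message into a JTS Geometry. */
    static Geometry fromBase64Ewkb(String wkbBase64) throws ParseException {
        // Kafka Connect's JSON converter writes binary fields as Base64, so undo that first.
        byte[] ewkb = Base64.getDecoder().decode(wkbBase64);
        // WKBReader parses PostGIS EWKB directly, including the embedded SRID flag.
        return new WKBReader().read(ewkb);
    }

    public static void main(String[] args) throws ParseException {
        // Pass the complete "wkb" string from your message (the one here is truncated).
        Geometry g = fromBase64Ewkb("AQYAACBqCAAA...");
        System.out.println(g.getGeometryType() + ", SRID=" + g.getSRID());
    }
}

From there it is an ordinary JTS Geometry that GeoTools can consume directly.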
I want to send data generated by my application to OPC UA clients using an OPC UA server. I have gone through the Eclipse Milo project, which is a great resource for this, but I don't know how to integrate it into our application. The application produces output in JSON format: the data of multiple nodes is stored, processed, and sent as JSON like the following.
{"deviceId":"36860","timestamp":"2019-03-07 10:37:20+05:30","1":"228.6","2":"237.65","3":"237.21","4":"0.13","5":"0.0","6":"0.11","7":"-2.95","8":"0.0","9":"4.03","10":"22.2","11":"0.0","12":"16.43","13":"-21.83","14":"0.0","15":"-15.72","16":"-0.13","17":"1.0","18":"0.25","19":"262.35","20":"0.0","21":"284.18","22":"234.49","23":"703.47","24":"0.08","25":"0.24","26":"0.36","27":"1.08","28":"12.87","29":"38.62","30":"-12.52"}
where 36860 is the nodeId/deviceId, timestamp is the time when the data was captured from the node, and the rest are the parameter ids and their actual reading values as key-value pairs.
How do I use the ExampleServer to send this data, and how would a client receive it?
If possible, can anyone provide an example?
I think the first thing you should decide is how to model the data in the server.
You could of course stuff that string into a single VariableNode with a DataType of String, but then why bother using OPC UA?
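If you instead model each parameter as its own Variable node, a rough sketch of a custom namespace is below. This is only a sketch against Milo 0.6.x (package paths and base classes differ between versions; see the ExampleNamespace in the milo-examples module for the real pattern). The class name DeviceNamespace, the node and browse names, and the Map-of-strings input are all invented for illustration:

import java.util.Map;

import org.eclipse.milo.opcua.sdk.server.OpcUaServer;
import org.eclipse.milo.opcua.sdk.server.api.ManagedNamespace;
import org.eclipse.milo.opcua.sdk.server.nodes.UaVariableNode;
import org.eclipse.milo.opcua.stack.core.Identifiers;
import org.eclipse.milo.opcua.stack.core.types.builtin.DataValue;
import org.eclipse.milo.opcua.stack.core.types.builtin.LocalizedText;
import org.eclipse.milo.opcua.stack.core.types.builtin.Variant;

// Sketch only: assumes a custom namespace extending Milo's ManagedNamespace,
// as the ExampleNamespace in milo-examples does.
public class DeviceNamespace extends ManagedNamespace {

    public DeviceNamespace(OpcUaServer server, String namespaceUri) {
        super(server, namespaceUri);
    }

    // Creates one Variable node per parameter id from the parsed JSON message;
    // "Device36860/Param1" etc. are invented names, not anything Milo mandates.
    void addDeviceNodes(Map<String, String> json) {
        String deviceId = json.get("deviceId"); // "36860" in the sample message

        json.forEach((key, value) -> {
            if (key.equals("deviceId") || key.equals("timestamp")) {
                return; // not a parameter reading
            }

            UaVariableNode node = new UaVariableNode.UaVariableNodeBuilder(getNodeContext())
                .setNodeId(newNodeId("Device" + deviceId + "/Param" + key))
                .setBrowseName(newQualifiedName("Param" + key))
                .setDisplayName(LocalizedText.english("Param" + key))
                .setDataType(Identifiers.Double)
                .setTypeDefinition(Identifiers.BaseDataVariableType)
                .build();

            node.setValue(new DataValue(new Variant(Double.parseDouble(value))));
            getNodeManager().addNode(node);
        });
    }
}

On each subsequent JSON message you would call node.setValue(...) on the existing nodes rather than recreating them; subscribing clients then pick up the changes through ordinary monitored items.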
I started an OPC UA project using the Milo project to create an OPC UA client. I am still very new to OPC UA. Right now I am stuck looking for the best practice for reading values from several nodes after a data change of one specific node.
The information model looks like this:
RfidSensorType
On my server I will have several objects of this RfidSensorType. The client creates a subscription on the CurrentAtTag node to listen for data changes.
My Question:
When the value of CurrentAtTag changes, a callback function is invoked in my client; it receives the UaMonitoredItem and the DataValue of CurrentAtTag.
In my application I need to process (at the same time) the values of Station, IOLPort and CurrentValue as well, which change at that moment too.
How can I access those values from within the CurrentAtTag callback?
My only solution so far: using a synchronous read request within that callback.
-> Is that a legitimate approach?
My Research:
1) TriggeringService
I've seen that a TriggeringService exists, where monitored items send reports only when one specific (triggering) node changes its value.
Problem: this fires several callbacks and not just one.. I need all the information at the same time to process it further.
2) Event Monitoring
In event monitoring one can select "event fields" which will be returned with each event notification. I am not sure whether I could select CurrentAtTag, Station, IOLPort and CurrentValue...
Just like you can subscribe to the server's ServerStatus (nodeid "i=2256"), you should be able to subscribe to the nodeid corresponding to 'RfidSensor_Station1'. The server will send a PublishResponse with data of type 'RfidSensorType' encoded as an ExtensionObject. The trick is decoding the ExtensionObject.
As Kevin pointed out, because 'RfidSensor_Station1' is not of node class 'Variable', it doesn't have a Value attribute and you cannot monitor it for data changes. If you are using a PLC, I might combine all properties of the sensor into a string or byte array, then monitor that new variable and parse the string in the client.
Or you could make a ReadRequest as you describe. That will work just fine.
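For what it's worth, here is a condensed sketch of that approach with the Milo client SDK, modeled on the SubscriptionExample from the milo-examples module (0.6.x assumed; every NodeId below is a placeholder to be replaced with the ids from your server's model):

import java.util.Arrays;
import java.util.List;

import org.eclipse.milo.opcua.sdk.client.OpcUaClient;
import org.eclipse.milo.opcua.sdk.client.api.subscriptions.UaMonitoredItem;
import org.eclipse.milo.opcua.sdk.client.api.subscriptions.UaSubscription;
import org.eclipse.milo.opcua.stack.core.AttributeId;
import org.eclipse.milo.opcua.stack.core.types.builtin.DataValue;
import org.eclipse.milo.opcua.stack.core.types.builtin.NodeId;
import org.eclipse.milo.opcua.stack.core.types.builtin.QualifiedName;
import org.eclipse.milo.opcua.stack.core.types.enumerated.MonitoringMode;
import org.eclipse.milo.opcua.stack.core.types.enumerated.TimestampsToReturn;
import org.eclipse.milo.opcua.stack.core.types.structured.MonitoredItemCreateRequest;
import org.eclipse.milo.opcua.stack.core.types.structured.MonitoringParameters;
import org.eclipse.milo.opcua.stack.core.types.structured.ReadValueId;

import static org.eclipse.milo.opcua.stack.core.types.builtin.unsigned.Unsigned.uint;

public class RfidSubscriber {

    // Placeholder NodeIds; substitute the ids from your server's information model.
    private final NodeId currentAtTag = new NodeId(2, "RfidSensor_Station1.CurrentAtTag");
    private final List<NodeId> siblings = Arrays.asList(
        new NodeId(2, "RfidSensor_Station1.Station"),
        new NodeId(2, "RfidSensor_Station1.IOLPort"),
        new NodeId(2, "RfidSensor_Station1.CurrentValue"));

    private final OpcUaClient client;

    public RfidSubscriber(OpcUaClient client) {
        this.client = client;
    }

    public void subscribe() throws Exception {
        UaSubscription subscription =
            client.getSubscriptionManager().createSubscription(500.0).get();

        ReadValueId readValueId =
            new ReadValueId(currentAtTag, AttributeId.Value.uid(), null, QualifiedName.NULL_VALUE);

        MonitoringParameters parameters =
            new MonitoringParameters(uint(1), 500.0, null, uint(10), true);

        MonitoredItemCreateRequest request =
            new MonitoredItemCreateRequest(readValueId, MonitoringMode.Reporting, parameters);

        // Register the value consumer when the monitored item is created.
        subscription.createMonitoredItems(
            TimestampsToReturn.Both,
            Arrays.asList(request),
            (item, id) -> item.setValueConsumer(this::onCurrentAtTagChanged)
        ).get();
    }

    private void onCurrentAtTagChanged(UaMonitoredItem item, DataValue value) {
        // Synchronous read inside the callback, as proposed in the question:
        // one service call fetches Station, IOLPort and CurrentValue together.
        List<DataValue> values =
            client.readValues(0.0, TimestampsToReturn.Both, siblings).join();

        System.out.println("CurrentAtTag=" + value.getValue()
            + " Station=" + values.get(0).getValue()
            + " IOLPort=" + values.get(1).getValue()
            + " CurrentValue=" + values.get(2).getValue());
    }
}

The single readValues call fetches all three sibling values in one service request, so the extra round trip per tag change is usually acceptable.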
The Firebase REST API describes how to write server values (currently only timestamps are supported) at a location, but it appears that one must submit a separate request to do this. Is there (or is there planned) any way of setting timestamps (like createdAt) at the same time one submits other data? It seems like this would really help reduce traffic and improve performance.
Sure, this is possible. The documentation is admittedly a little unclear, but all you need to do is include the {".sv": "timestamp"} object as part of your JSON payload. Here's an example that saves it under the key timestamp.
curl -X PUT -d '{"something":"something", "timestamp":{".sv": "timestamp"}}' https://abc.firebaseio-demo.com/.json
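The same single-request trick works from any HTTP client, not just curl. A plain-JDK Java sketch, reusing the demo URL placeholder from the example above:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class FirebaseTimestampPut {
    public static void main(String[] args) throws Exception {
        // Same payload as the curl example: regular data plus the {".sv":"timestamp"}
        // placeholder, which the Firebase server replaces with its own epoch-millis time.
        String json = "{\"something\":\"something\",\"timestamp\":{\".sv\":\"timestamp\"}}";

        // Demo URL from the example above; substitute your own Firebase URL.
        URL url = new URL("https://abc.firebaseio-demo.com/.json");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }

        // On success the response echoes the data with the resolved timestamp.
        System.out.println(conn.getResponseCode());
    }
}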
I have read all the discussions and mailing lists that propose how to standardize the inclusion of a timestamp in a pubsub item, but there is no answer on how it is done in practice today. Do I have to tweak my server to include the creation timestamp (because each server stores that information for some reason ;) or are there plugins or known source-code modifications for Openfire or ejabberd?
My entries are published in JSON format, not in Atom. BTW, where do the "published" and "updated" timestamps in XEP-0060 come from? Will the timestamp be added automatically if I configure the node to publish Atom?
You would have to tweak your server, since the timestamp is not part of the spec.
The only way to have timestamps right now is to add them at the application level, which means the publisher inserts them into the payload.
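So if your publisher already emits JSON payloads, it can simply stamp each item itself before publishing, e.g. (the "published" field name here is just an illustration, not something XEP-0060 mandates):

{
  "body": "my item data",
  "published": "2014-05-20T14:30:00Z"
}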