I want to retrieve data every Publishing Interval, whether the data has changed or not. As I read the OPC UA documentation, I can only use a DataChangeFilter to retrieve values when they change.
Is there any way to do that?
If creating your item with a DataChangeTrigger of StatusValueTimestamp doesn't get you all the changes you want, there is nothing more you can do.
Alternatives would be periodically calling the Read service instead of using a MonitoredItem, or calling the ResendData method every time you want to force the subscription to send the last value for all its items.
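For the polling alternative, here is a minimal sketch with Eclipse Milo (Java); the node id and interval are placeholders, and the scheduler is just one way to drive the periodic Read:

```java
import org.eclipse.milo.opcua.sdk.client.OpcUaClient;
import org.eclipse.milo.opcua.stack.core.types.builtin.DataValue;
import org.eclipse.milo.opcua.stack.core.types.builtin.NodeId;
import org.eclipse.milo.opcua.stack.core.types.enumerated.TimestampsToReturn;

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PollingExample {

    // Poll the node once per "publishing interval" regardless of whether it changed.
    static void pollEveryInterval(OpcUaClient client, NodeId nodeId, long intervalMs) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        scheduler.scheduleAtFixedRate(() -> {
            // maxAge = 0.0 asks the server for a fresh value rather than a cached one
            client.readValue(0.0, TimestampsToReturn.Both, nodeId)
                .thenAccept((DataValue value) ->
                    System.out.println(nodeId + " = " + value.getValue().getValue()));
        }, 0, intervalMs, TimeUnit.MILLISECONDS);
    }
}
```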
I have a materialized view that is updated from many streams. Each one enriches it partially. Order doesn't matter and updates come in at unspecified times. Is the following algorithm a good approach:
An update comes in and I check via get() what is stored in the materialized view; it is the initial one, so I enrich and save it.
The second one comes in and get() shows that a partial update exists, so I add the next piece of information.
... and I continue in the same style.
If there is a query/join, the stored object has a method isValid() that shows whether the update is complete, which could be used in KafkaStreams#filter().
Could you please tell me whether this is a good plan? Is there any pattern in the Kafka Streams world that handles this case?
Please advise.
Your plan looks good; you have the general idea, but you'll have to use the lower-level Kafka Streams API: the Processor API.
There is a .transform operator that allows you to access a KeyValueStore; inside this operator's implementation you are free to decide whether your current aggregated value is valid or not.
You can then either send it downstream or return null while waiting for more information.
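A rough sketch of that shape (EnrichedView, its merge()/isValid() methods, the Serde, and the topic and store names are all placeholders for your own types):

```java
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.Transformer;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.state.KeyValueStore;
import org.apache.kafka.streams.state.Stores;

public class EnrichmentTopology {

    static final String STORE = "partial-view-store";   // store name is made up

    /** Placeholder for your own aggregate type. */
    public interface EnrichedView {
        EnrichedView merge(EnrichedView partial);
        boolean isValid();
    }

    public static void build(StreamsBuilder builder, Serde<EnrichedView> viewSerde) {
        // Local state store that keeps the partially enriched view per key
        builder.addStateStore(Stores.keyValueStoreBuilder(
            Stores.persistentKeyValueStore(STORE), Serdes.String(), viewSerde));

        builder.stream("updates", Consumed.with(Serdes.String(), viewSerde))
            .transform(EnrichTransformer::new, STORE)
            .to("complete-views", Produced.with(Serdes.String(), viewSerde));
    }

    static class EnrichTransformer
            implements Transformer<String, EnrichedView, KeyValue<String, EnrichedView>> {

        private KeyValueStore<String, EnrichedView> store;

        @Override
        @SuppressWarnings("unchecked")
        public void init(ProcessorContext context) {
            store = (KeyValueStore<String, EnrichedView>) context.getStateStore(STORE);
        }

        @Override
        public KeyValue<String, EnrichedView> transform(String key, EnrichedView partial) {
            EnrichedView current = store.get(key);                        // null for the first update
            EnrichedView merged = (current == null) ? partial : current.merge(partial);
            store.put(key, merged);
            // Returning null emits nothing downstream, so incomplete views stay local
            return merged.isValid() ? KeyValue.pair(key, merged) : null;
        }

        @Override
        public void close() { }
    }
}
```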
I am new to OPC UA. I have a simple python3 client that I want to use to monitor a few voltages and currents from the OPC UA server.
I can subscribe to them and when they change, I can see the changed value, but I don't know what the value is for.
I am trying to figure out how to use the info I know I can get, like node.nodeid.Identifier, and use that to somehow get the path associated with the id. That should tell me what the value is(?)
I thought it might be in the browse_name but that got me nowhere.
Any push in the right direction would be greatly appreciated. Thanks!
When your OPC UA client wants to be notified of value updates of a Node, it actually subscribes to the Value attribute of that Node. You could try to subscribe to the BrowseName or DisplayName attribute of the Node to get notified of the name.
You would then send a CreateMonitoredItems request and set the corresponding AttributeId for each ItemToMonitor.
However, not every OPC UA server supports this feature.
Most OPC UA clients use the Browse and Read services just before sending CreateSubscription/CreateMonitoredItems in order to get the BrowseName/DisplayName or other attribute values they want.
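The question is about python-opcua, but the Read-before-Subscribe pattern is the same in any stack; here is a hedged sketch of reading BrowseName and DisplayName once via the Read service, using Eclipse Milo (Java), with a placeholder node id:

```java
import org.eclipse.milo.opcua.sdk.client.OpcUaClient;
import org.eclipse.milo.opcua.stack.core.AttributeId;
import org.eclipse.milo.opcua.stack.core.types.builtin.DataValue;
import org.eclipse.milo.opcua.stack.core.types.builtin.LocalizedText;
import org.eclipse.milo.opcua.stack.core.types.builtin.NodeId;
import org.eclipse.milo.opcua.stack.core.types.builtin.QualifiedName;
import org.eclipse.milo.opcua.stack.core.types.enumerated.TimestampsToReturn;
import org.eclipse.milo.opcua.stack.core.types.structured.ReadResponse;
import org.eclipse.milo.opcua.stack.core.types.structured.ReadValueId;

import java.util.List;

public class NameLookupExample {

    // Read BrowseName and DisplayName once, keep the mapping around,
    // then monitor the Value attribute as usual.
    static void printNames(OpcUaClient client, NodeId nodeId) throws Exception {
        ReadResponse response = client.read(
            0.0,
            TimestampsToReturn.Neither,
            List.of(
                new ReadValueId(nodeId, AttributeId.BrowseName.uid(), null, QualifiedName.NULL_VALUE),
                new ReadValueId(nodeId, AttributeId.DisplayName.uid(), null, QualifiedName.NULL_VALUE)
            )
        ).get();

        DataValue[] results = response.getResults();
        QualifiedName browseName = (QualifiedName) results[0].getValue().getValue();
        LocalizedText displayName = (LocalizedText) results[1].getValue().getValue();

        System.out.println(nodeId + " -> " + browseName.getName() + " / " + displayName.getText());
    }
}
```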
I started an OPC UA project using the Milo project to create an OPC UA client. I am still very new to OPC UA. Right now I am stuck looking for the best practice for reading values from several nodes after a data change of one specific node.
The information model looks like this:
RfidSensorType
On my server I will have several objects of this RfidSensorType. The client creates a subscription on the CurrentAtTag Node to listen for data changes.
My Question:
When the value of CurrentAtTag changes, a callback function is called in my client which contains the UaMonitoredItem and the DataValue of CurrentAtTag.
In my application I also need to process (at the same time) the values of Station, IOLPort and CurrentValue, which change at that moment too.
How can I access those values within the callback for CurrentAtTag?
My only solution is: using a synchronous read request within that callback
-> Is that a legitimate approach?
My Research:
1) TriggeringService
I've seen that a TriggeringService exists, where monitored items send reports only if one specific node changes its value.
Problem: This will trigger several callbacks and not just one... I need all the information at the same time to process it further.
2) Event Monitoring
In event monitoring one can select "Event fields" which will be returned with each Event notification. I am not sure if I could select CurrentAtTag, Station, IOLPort and CurrentValue...
Just like you can subscribe to the server's ServerStatus (nodeid "i=2256"), you should be able to subscribe to the nodeid corresponding to 'RfidSensor_Station1'. The server will send a PublishResponse with data of type 'RfidSensorType' encoded as an ExtensionObject. The trick is decoding the ExtensionObject.
As Kevin pointed out, because 'RfidSensor_Station1' is not of node class 'Variable', it doesn't have a Value attribute and you cannot monitor the node for data changes. If you are using a PLC, I might combine all properties of the sensor into a string or byte array, then monitor that new variable and parse the string in the client.
Or you could make a ReadRequest as you describe. That will work just fine.
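The synchronous read inside the data-change callback would look roughly like this with Milo; the node ids are placeholders, and you may prefer to handle the returned future asynchronously instead of blocking with get():

```java
import org.eclipse.milo.opcua.sdk.client.OpcUaClient;
import org.eclipse.milo.opcua.stack.core.types.builtin.DataValue;
import org.eclipse.milo.opcua.stack.core.types.builtin.NodeId;
import org.eclipse.milo.opcua.stack.core.types.enumerated.TimestampsToReturn;

import java.util.List;

public class RfidCallbackExample {

    // Placeholder node ids for the other members of the RfidSensorType instance
    static final NodeId STATION       = new NodeId(2, "RfidSensor_Station1.Station");
    static final NodeId IOL_PORT      = new NodeId(2, "RfidSensor_Station1.IOLPort");
    static final NodeId CURRENT_VALUE = new NodeId(2, "RfidSensor_Station1.CurrentValue");

    // Called from the UaMonitoredItem value consumer for CurrentAtTag
    static void onCurrentAtTagChanged(OpcUaClient client, DataValue currentAtTag) throws Exception {
        // One Read request for all related nodes, so the values belong to the same
        // moment (as far as the server's sampling allows).
        List<DataValue> values = client.readValues(
            0.0,
            TimestampsToReturn.Both,
            List.of(STATION, IOL_PORT, CURRENT_VALUE)
        ).get();

        process(currentAtTag, values.get(0), values.get(1), values.get(2));
    }

    static void process(DataValue tag, DataValue station, DataValue port, DataValue current) {
        // your application logic here
    }
}
```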
Basically, I want to implement SYNC functionality: if an internet connection is not available, data gets stored in a local SQLite database. Whenever an internet connection is available, SYNC comes into action.
Now, say for example, 5 records are stored locally and then an internet connection becomes available. I want the server to be updated. So, what I currently do is:
Post the first record to the server.
Wait for the success of the first request.
Post a local NSNotification to a routine, saying that the first record has been updated on the server and the second request can now go.
The routine fires the second post request to the server, and so on...
Question: Is this approach right and efficient enough to implement SYNC functionality, or is there anything I should change?
NOTE: There is no limit on the number of records to be synced.
Well, it depends on the requirements for the data that you save. If it is just for backup, then you should be fine.
If the 5 records are somehow dependent on each other and you need to access this data from another device/application, you should take care on the server side that either all 5 records are written or none. Otherwise you will have an inconsistent state if only 3 get written.
If other users are also reading/writing those data concurrently on the server, then you need to implement some kind of lock on all records before writing, and also decide how to handle conflicts when someone attempts to overwrite somebody else's changes.
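If you do need that all-or-nothing behavior, one server-side sketch is a single batch endpoint that writes every queued record inside one transaction, for example with plain JDBC (the table and column names are made up):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchSyncWriter {

    /** Writes all pending records in one transaction: either every row lands or none do. */
    static void writeBatch(String jdbcUrl, List<String> payloads) throws SQLException {
        try (Connection conn = DriverManager.getConnection(jdbcUrl)) {
            conn.setAutoCommit(false);
            try (PreparedStatement ps =
                     conn.prepareStatement("INSERT INTO sync_records (payload) VALUES (?)")) {
                for (String payload : payloads) {
                    ps.setString(1, payload);
                    ps.addBatch();
                }
                ps.executeBatch();
                conn.commit();            // all records become visible together
            } catch (SQLException e) {
                conn.rollback();          // nothing is written on failure
                throw e;
            }
        }
    }
}
```

On the client side that also means the queued records can be sent in one request instead of being chained one by one.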
My requirement in DDS is: I have many subscribers but only a single publisher. A subscriber reads data from DDS and checks whether the message is for that particular subscriber. Only if that check succeeds does it take the data and remove it from DDS. The message must remain in DDS until the authenticated subscriber takes its data. How can I achieve this using DDS (in a Java environment)?
First of all, you should be aware that with DDS, a Subscriber is never able to remove data from the global data space. Every Subscriber has its own cached copy of the distributed data and can only act on that copy. If one Subscriber takes data, then other Subscribers for the same Topic will not be influenced by that in any way. Only Publishers can remove data globally for every Subscriber. From your question, it is not clear whether you know this.
Independent of that, it seems like the use of a ContentFilteredTopic (CFT) is suitable here. According to the description, the Subscriber knows the file name that it is looking for. With a CFT, the Subscriber can indicate that it is only interested in samples that have a particular value for the file_name attribute. The infrastructure will take care of the filtering process and will ensure that the Subscriber will not receive any data with a different value for the attribute file_name. As a consequence, any take() action done on the DataReader will contain relevant information and there is no need to check the data first and then take it.
The API documentation should contain more detailed information about how to use a ContentFilteredTopic.
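API details vary between DDS vendors, but a ContentFilteredTopic on the file_name field from the description above is created along these lines; the topic names and the parameter value are assumptions, and the parameter container type and QoS constants differ between the DDS.* (OpenSplice/OpenDDS) and com.rti.dds.* (RTI Connext) Java bindings:

```java
// Vendor-neutral sketch: DomainParticipant, Subscriber, Topic, ContentFilteredTopic
// and DataReader come from your vendor's Java binding, and the exact signature of
// create_contentfilteredtopic (String[] vs. StringSeq for the parameters) varies slightly.
public class FilteredReaderFactory {

    public static DataReader createFilteredReader(DomainParticipant participant,
                                                  Subscriber subscriber,
                                                  Topic fileTopic,
                                                  String myFileName) {
        // Only samples whose file_name field matches this subscriber pass the filter;
        // the filter expression uses the SQL subset defined by the DDS specification.
        ContentFilteredTopic filtered = participant.create_contentfilteredtopic(
                "FilteredFileTopic",            // name for the filtered topic (assumed)
                fileTopic,                      // the related, unfiltered Topic
                "file_name = %0",               // filter on the sample's file_name field
                new String[] { myFileName });   // value substituted for %0

        // Readers created on the filtered topic only ever see matching samples,
        // so take() no longer needs an application-level check.
        return subscriber.create_datareader(
                filtered,
                DATAREADER_QOS_DEFAULT.value,
                null,                           // no listener in this sketch
                STATUS_MASK_NONE.value);
    }
}
```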