Need to read Struct variable from OPC UA Server with OPC UA .NET from opcfoundation

I'm new to OPC UA.
I'm using
OPCFoundation.NetStandard.Opc.Ua.Client (1.4.368.58)
OPC Server opc.tcp://opcuaserver.com:48010
Node to read : ns=2;s=Demo.WorkOrder.WorkOrderVariable
This structure contains simple types (Guid, Int64, String, etc.) and a custom type, "WorkOrderStatusType".
I can read values with session.ReadValues(...) and process the result with:
var encodeable = val as ExtensionObject;
var decoder = new BinaryDecoder(encodeable.Body as byte[], ServiceMessageContext.GlobalContext);
decoder.ReadGuid("ID");
It works for simple types but not for the custom type.
I would like to know how to parse nested custom types.
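For example, this is roughly what I imagine for the nested part (a rough sketch only; the WorkOrderType field layout and the WorkOrderStatusType field names Actor/Timestamp/Comment are just my guesses from browsing the type, and I am assuming the nested values are encoded inline in declaration order):

// val is the value read via session.ReadValues(...), as above
var extensionObject = (ExtensionObject)val;
var decoder = new BinaryDecoder((byte[])extensionObject.Body, ServiceMessageContext.GlobalContext);

// simple fields of WorkOrderType, read in declaration order
var id = decoder.ReadGuid("ID");
var assetId = decoder.ReadString("AssetID");
var startTime = decoder.ReadDateTime("StartTime");

// nested custom type: I think the status comments are an array of WorkOrderStatusType,
// so there should be an Int32 length prefix followed by the fields of each element
var count = decoder.ReadInt32("NoOfStatusComments");
for (int i = 0; i < count; i++)
{
    var actor = decoder.ReadString("Actor");            // guessed field
    var timestamp = decoder.ReadDateTime("Timestamp");  // guessed field
    var comment = decoder.ReadLocalizedText("Comment"); // guessed field
}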
I've searched a lot and can't find anything specific to understand.
I would like to create an object myself and have the OPC library automatically map my object, but I don't know how.
Could you help me ?
Thank you.

Related

Q: OPC UA location of sensor data

I have done some research on OPC UA and noticed that all sensor data on the Prosys sample server is stored in subfolders of the Objects (i=85) folder.
On the OPC UA server of a machine, I have seen that sensor data such as the measured value, the unit, etc. can ONLY be accessed via the Types (i=86) folder.
The path here would be i=84 -> i=86 -> i=88 -> i=58...
There is really no other path through which you can reach these nodes.
I have never seen such an implementation. Is it normal that such data is also stored in the Types folder, or are there any guidelines that forbid this?
The machine is also a bit older.
Thanks for your help
UPDATE:
The further path of i=58 looks like this, where --(i=45)-> symbolizes the ReferenceType from the previous node to the following node (in this case i=45, HasSubtype), and the word in parentheses next to the NodeId is the NodeClass.
i=58 --(i=45)-> ns=2;i=1(ObjectType) --(i=35)-> ns=2;i=2(Object)
--(i=35)-> ns=2;i=3(Object) --(i=47)-> ns=2;s=#setPressure(Variable) --(i=46)-> ns=2;i=5(Variable)
ns=2;s=#setPressure contains the value 250.0 and ns=2;i=5 an Engineering unit
This is not normal. It sounds like a bad implementation done by somebody who didn’t know any better.
Depending on the reference types they used to build this structure you could argue it is forbidden. DataType Nodes should only be the source of HasProperty, HasSubtype, and HasEncoding references.
edit: The path you mention is Root -> Types -> ObjectTypes -> BaseObjectType. Are you sure the Nodes you're finding under here are Variable Nodes with values or are you just seeing additional types defined by this server?

Is it possible to create a function that returns a different struct based on an enum input in Unreal?

So I am working on a data-driven game in Unreal, and essentially I am working more on the back end, trying to make it easier for the objects to get all of the relevant data. I was really trying to create a single function where, by passing in a type, I could return a different output, but from what I've tried in the editor this doesn't seem possible.
I get the error: The type of Out Struct is undetermined. Connect something to Return Node to imply a specific type. I was trying to use a wildcard as an output, but it doesn't seem to be possible to do that. Any insight on this problem would be greatly appreciated.
As mentioned in the comments, just make a base struct and inherit your *Config structs from it. In the return you will send the base struct.
You can read more about this solution here:
https://en.wikipedia.org/wiki/Factory_method_pattern
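A minimal sketch of that idea in plain code (C# here rather than Blueprints/Unreal types, with hypothetical Weapon/Armor config names, just to illustrate the pattern the link describes):

using System;

public enum ConfigType { Weapon, Armor }

// common base type; the factory's return type
public abstract class BaseConfig { }

public class WeaponConfig : BaseConfig { public int Damage; }
public class ArmorConfig : BaseConfig { public int Defense; }

public static class ConfigFactory
{
    // one function, one enum input, one base-typed output
    public static BaseConfig Create(ConfigType type) => type switch
    {
        ConfigType.Weapon => new WeaponConfig { Damage = 10 },
        ConfigType.Armor => new ArmorConfig { Defense = 5 },
        _ => throw new ArgumentOutOfRangeException(nameof(type))
    };
}

The caller then casts (or pattern-matches) the returned BaseConfig to the concrete config it expects, which is the same thing the Blueprint version would do after receiving the base struct.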

Serving different media types in AWS Lambda with Java/Scala

When writing a function for AWS Lambda (in Java/Scala), the handler function can have one of several signatures:
// Raw input / output
def handleRequest(input: InputStream, output: OutputStream, context: Context): Unit = ???
// Using the AWS dependency
def handleRequest(input: APIGatewayProxyRequestEvent, context: Context): APIGatewayProxyResponseEvent = ???
These are the two possible signatures I know of, at least. Maybe there are more.
The difference is that the first one is a raw response that needs to be packed into a REST response by the API Gateway, with manually configured properties like media type and response code.
The second one seems to utilize the Lambda Proxy Integration and will extract all configuration from the APIGatewayProxyResponseEvent.
Now for my question:
The field body of APIGatewayProxyResponse is of type String.
To me it looks like this POJO is serialized to JSON before being sent to the API Gateway.
This would make it impossible to serve binary data like images or PDF files.
The raw OutputStream cannot carry the information about headers etc. (or can it?), which means I cannot serve multiple different media types.
Of course I could convert images, for example, to Base64. But that is far from optimal.
Is there a way I can serve different (binary and non-binary) media types (with correct headers etc.) in a single AWS Lambda handler? How do I have to configure the API Gateway for that?
Note: This answer is based on reading the docs. I haven't tried it in practice so it may not work.
If you open the "Output Format of a Lambda Function for Proxy Integration" section of the documentation you may see isBase64Encoded field and following text:
The output body is marshalled to the frontend as the method response payload. If body is a binary blob, you can encode it as a Base64-encoded string and set isBase64Encoded to true. Otherwise, you can set it to false or leave it unspecified.
And if you open APIGatewayProxyResponse in the .NET library, you can see the IsBase64Encoded property there. It looks like it is just the Java API that does not expose this field, but all the rest of the infrastructure should support it. You can also see that a similar field was added to APIGatewayProxyRequestEvent.java at some point, but not to APIGatewayProxyResponseEvent. So I think the following workaround should work: create your own APIGatewayProxyResponseEvent class with an isBase64Encoded field and use it, or just extend the standard com.amazonaws.services.lambda.runtime.events.APIGatewayProxyResponseEvent and add this field in your subclass. I expect that if you match the naming convention, it should work; it should be marshalled to JSON after all.
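For reference, this is roughly what the .NET library mentioned above exposes (a sketch using the Amazon.Lambda.APIGatewayEvents package, untested; a Java/Scala subclass with an isBase64Encoded field would mirror the same shape):

using System;
using System.Collections.Generic;
using Amazon.Lambda.APIGatewayEvents;
using Amazon.Lambda.Core;

public class PdfHandler
{
    // returns a binary payload through the proxy integration by Base64-encoding
    // the body and flagging it with IsBase64Encoded (the field the Java POJO lacks)
    public APIGatewayProxyResponse Handle(APIGatewayProxyRequest request, ILambdaContext context)
    {
        byte[] pdfBytes = LoadPdf(); // hypothetical helper producing the binary content

        return new APIGatewayProxyResponse
        {
            StatusCode = 200,
            Headers = new Dictionary<string, string> { ["Content-Type"] = "application/pdf" },
            Body = Convert.ToBase64String(pdfBytes),
            IsBase64Encoded = true
        };
    }

    private static byte[] LoadPdf() => Array.Empty<byte>(); // placeholder
}

As far as I know, you also need to enable binary media types (or */*) on the API Gateway side so it actually decodes the Base64 body back to bytes for the client.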

NodeId as string in ModelCompiler OPC UA

I am trying to develop an OPC UA server on my own, but since I am quite a newbie at coding, it is quite hard for me.
I have started from the QuickstartApplication found here: https://github.com/OPCFoundation/UA-.NET-Legacy
In particular, I edit the ModelDesign.xml file to customize it as I wish:
https://github.com/OPCFoundation/UA-.NET-Legacy/blob/master/ComIOP/Common/Common/ModelDesign.xml
I would like to define some nodes with NodeId as string (all the NodeId in the ModelDesign.xml in the example are numeric)
Following this XSD, I have found "StringId" and "NumericId", which look like what I was looking for:
https://github.com/OPCFoundation/UA-ModelCompiler/blob/master/ModelCompiler/UA%20Model%20Design.xsd
but changing their values in ModelDesign.xml does nothing to the NodeIds. There is no error; the compiler just assigns new NodeIds (all numeric) as if it did not take my changes into account.
As a compiler, I am using the ModelCompiler found on GitHub
https://github.com/OPCFoundation/UA-ModelCompiler
Can somebody help me, please? How can I customize the NodeId of the nodes?
Thank you
Edo
The best suggestion that I can offer at this stage is to clone UA-.NETStandard and run the NetCoreConsoleServer in
UA-.NETStandard/SampleApplications/Samples/NetCoreConsoleServer
through the debugger. The boiler node manager, if my memory serves me well, uses string IDs. The interface INodeIdFactory in ISystemContext.cs offers some insight into how IDs are generated.
IMHO, the model designer has no switch to enforce string IDs, as you know. So you'll need to programmatically allocate string IDs rather than numeric IDs to nodes when the server boots. I haven't figured it out completely yet either.
So you may set breakpoints in BoilerNodeManager.cs and see how the NodeId is actually constructed.
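If it helps, the pattern in the sample node managers looks roughly like this (a sketch from memory, not verified against the current sources; CustomNodeManager2 implements INodeIdFactory, the namespace URI is hypothetical, and NamespaceIndexes[0] assumes the manager registered a single namespace):

using Opc.Ua;
using Opc.Ua.Server;

public class MyNodeManager : CustomNodeManager2
{
    public MyNodeManager(IServerInternal server, ApplicationConfiguration configuration)
        : base(server, configuration, "http://mycompany.com/MyModel/")
    {
    }

    // called by the SDK whenever a node created from the model needs a NodeId
    public override NodeId New(ISystemContext context, NodeState node)
    {
        // use the browse name as a string identifier instead of an auto-assigned numeric one
        return new NodeId(node.BrowseName.Name, NamespaceIndexes[0]);
    }
}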

How to apply SerializationStreamWriter for storage

Is there an easy way to use SerializationStreamWriter for custom purposes (for example, HTML5 storage) without writing your own generator?
The GWT Javadoc says very little.
We are writing an implementation for our current project which does exactly what you want to do: serialize a custom object into a string, save it into local storage, and deserialize the string back into an object.
So, for me, it is possible to use SerializationStreamWriter for serialization and SerializationStreamReader for deserialization on the CLIENT SIDE.
To realize this:
You don't need a generator for SerializationStreamWriter/SerializationStreamReader, but you do need a generator for the TypeSerializer (which extends com.google.gwt.user.client.rpc.impl.SerializerBase). This is quite simple: take a look at com.google.gwt.user.rebind.rpc.TypeSerializerCreator and use it in your generator. Or, if all your custom objects are referenced in one RPC service, you can just use the generated RPC service's TypeSerializer.
You must also write a proper implementation of SerializationStreamWriter OR SerializationStreamReader, because there are two serialized string formats (the format used for requests and the format used for responses):
In GWT, you have:
ClientSerializationStreamWriter and ClientSerializationStreamReader for client-side serialization/deserialization;
ServerSerializationStreamWriter and ServerSerializationStreamReader for server-side serialization/deserialization.
ClientSerializationStreamWriter will serialize the object into FORMAT_1, and only ServerSerializationStreamReader can read it (deserialize it into an object).
ServerSerializationStreamWriter will serialize the object into FORMAT_2, and only ClientSerializationStreamReader can read it (deserialize it into an object).
So what you need to do: if you want to use ClientSerializationStreamWriter to serialize your object, write a client-side implementation similar to ServerSerializationStreamReader. Or, if you want to use ClientSerializationStreamReader to deserialize the string, write a client-side implementation similar to ServerSerializationStreamWriter. This is not difficult, because the difference between FORMAT_1 and FORMAT_2 is just the order.
No.
Because the GWT-RPC serialization is asymmetric, it cannot be used for local storage: the server understands what the client sent, the client understands what the server sent, but they won't understand what they themselves wrote.