I want to run an OpenModelica simulation on a PC running Linux and then connect to the simulation via OPC UA. I tried simulating my model in OMShell with simulate(MyTestModel). This worked.
When I add the simulation flags simulate(MyTestModel, simflags="-rt 1.0 -embeddedServer opc-ua") to start an OPC UA server, OMShell crashes and nothing happens. I tried this on both Linux and Windows.
When using OMEdit and setting the flags in the simulation setup, it seems to work. However, I also have to press the play button in the interactive plot view to start the simulation. I tested this by connecting to the server with UaExpert.
My question is: does anybody know why OMShell crashes (stops reacting) when trying to start the OPC UA server? Or does anybody have any other suggestions on how to achieve my goal of establishing an OPC UA connection with an OpenModelica simulation running on Linux?
Thanks!
My test model is:
model MyTestModel
  input Real x;
  Real y;
equation
  y = x;
end MyTestModel;
As always, I had already tried to find the solution for several days, but after posting the question I found it myself. I have to start the simulation via the OPC UA interface after starting the server. I changed the node "Run" to true and the simulation of my model started. This is probably the same as pressing the play button in OMEdit (as described above).
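For anyone who wants to script this step, here is a minimal sketch using the Python asyncua (opcua-asyncio) client. Note the assumptions: port 4841 is what I understand to be the embedded server's default (adjust it if you set -embeddedServerPort), and the boolean node is looked up by the browse name "Run" as it appeared in my case; verify both against your own server.

import asyncio
from asyncua import Client

async def main():
    # Assumed default port of the OpenModelica embedded OPC UA server; adjust if needed.
    url = "opc.tcp://localhost:4841"
    async with Client(url=url) as client:
        # Browse the Objects folder and look for the boolean node called "Run".
        for node in await client.nodes.objects.get_children():
            name = (await node.read_browse_name()).Name
            if name.lower() == "run":
                await node.write_value(True)  # setting it to true starts the simulation
                print("Simulation started")

asyncio.run(main())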
I am new to OPC UA. I need a signal simulator that connects to and sends data to an OPC UA server.
I am thinking of writing a program as the signal simulator, unless there is a recommendation for an existing simulator. The requirement for the simulator is to send signals simulating a production line. Signal examples include packing_package_started, packing_package_finished, sending_package_start, sending_package_finished, ... As you can see, these signals need to follow a specific order. I am thinking of writing a Python program (or a Node.js program; it could be another language) to generate these signals, but I don't know whether it is possible and easy to connect this Python program to the OPC UA server. Or is there an existing simulator that I can configure to send data to the OPC UA server?
I do not know of an existing simulator for that, but if Python works for you, you can use the opcua-asyncio package (asyncua). For a simple application like the one you plan, I think writing a Python script should be fine.
If you want something more "graphical", you can also use Node-RED.
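If you go the Python route, a minimal sketch with asyncua could look like the following. It assumes the server already exposes one boolean node per signal; the endpoint URL, node ids, and the one-second pacing are placeholders to replace with whatever your server and production-line timing actually use.

import asyncio
from asyncua import Client

# Placeholder endpoint and node ids -- replace them with the ones your server exposes.
URL = "opc.tcp://localhost:4840"
SIGNAL_SEQUENCE = [
    "ns=2;s=packing_package_started",
    "ns=2;s=packing_package_finished",
    "ns=2;s=sending_package_start",
    "ns=2;s=sending_package_finished",
]

async def main():
    async with Client(url=URL) as client:
        for node_id in SIGNAL_SEQUENCE:
            node = client.get_node(node_id)
            await node.write_value(True)   # raise the signal
            await asyncio.sleep(1.0)       # crude pacing between production-line steps
            await node.write_value(False)  # clear it before the next step

asyncio.run(main())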
Hello, this is my first post here.
I'm currently doing my internship at a company that uses TwinCAT 3 programs for automation in conjunction with a Yaskawa robot system. The assignment I received is to expand the simulation software they currently use for simulating the PLC software so that it can also simulate the Yaskawa interface; that way they can test whether changes in either program cause issues in the other without having to upload the program to a PLC. I've been looking online and found a couple of options, but I'm curious whether I have missed other options or whether the options I've found will not work.
The options I'm currently considering are:
Using the Ethernet port on the simulation PC to connect to the Yaskawa controller. I've found a driver (https://infosys.beckhoff.com/english.php?content=../content/1033/el7031-0030/1036996875.html&id=) that allows an Ethernet port on the PC to operate in real time, so I theorized it might be possible to connect the computer directly to the Yaskawa controller and have the PC act as a PLC, so the user would not need to upload the software.
Using ROS to simulate the robot. There are ROS-Industrial libraries for programming and simulating a Yaskawa robot. It would require using two computers or a virtual machine, since ROS works best on a Linux system. The only problem is that I can't figure out whether the ROS simulation can use regular Yaskawa jobs and functions or only the adapted ones programmed in ROS.
Using Motoman's simulation software MotoSim to simulate the Yaskawa robot. This seems like an obvious solution, but I can't figure out whether it can communicate with TwinCAT 3; I might have missed something, though.
Writing TwinCAT 3 software that emulates the Yaskawa interface. I consider this a last resort if all else fails, since it would require the code to be updated each time something in the Yaskawa software changes, which is not ideal for quick testing.
I'd really appreciate any suggestions or additional information more experienced programmers might have.
I can stream motion capture data live from Xsens MVN Studio into Unity3D and animate a character with it in real time, because the Unity3D character listens on port 9763 of the same machine and receives the motion data from there, which means Xsens MVN Studio sends the motion data to that port (the attached image shows the address localhost:9763).
Where does this port number come from? I mean, who decides the port number? I could configure this live streaming between Unity3D and Xsens MVN Studio because Xsens provides a manual for the Unity3D plug-in; that is how I knew about it.
My other question: now I would like to do the exact same thing with Matlab, i.e. get Unity3D to stream data live into Matlab and get back whatever Matlab produces. I have found a simple Matlab script online that apparently does this with a remote website:
t = tcpip('www.EXAMPLE_WEBSITE.com', 80);   % open a TCP connection to the remote host on port 80 (HTTP)
set(t, 'InputBufferSize', 30000);           % allow up to 30000 bytes of buffered input
fopen(t);
fprintf(t, 'GET /');                        % send a minimal HTTP request
pause(1)                                    % give the server time to respond
while (get(t, 'BytesAvailable') > 0)
    t.BytesAvailable
    DataReceived = fscanf(t)                % read the response line by line
end
fclose(t);
delete(t);
clear t
The code comment says I should substitute www.EXAMPLE_WEBSITE.com with an actual website or any remote application with which I wish to communicate. But firstly, what do I put in place of the website address for a Matlab application running on the same machine (localhost?), and secondly, how do I find out which port number Matlab can send data to and Unity can listen on?
I would appreciate it if someone could help me understand these concepts.
I think they (the plug-in developers) decide the port number. You can choose the port yourself (PS: only one application can use a given port at the same time).
I think Matlab needs to behave like a server, so Unity requests the calculation result from Matlab; Matlab listens for requests and gives a response. I don't know how you can do this in Matlab, but on the Unity3D side you can request the calculation result from the Matlab server using the WWW class.
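To make the server/client roles concrete, here is a minimal, purely illustrative Python sketch (the port, host, and messages are made up; the same pattern applies whichever side, Matlab or Unity, plays the server): the listening side chooses and binds the port, and the other side simply connects to localhost on that agreed port.

import socket
import threading

# The server side picks the port it listens on (9763 here, like the Xsens plug-in)
# and binds it on 127.0.0.1, i.e. localhost, this machine.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 9763))
srv.listen(1)

def serve_one_client():
    conn, _ = srv.accept()
    print("server received:", conn.recv(1024).decode())
    conn.sendall(b"result from server")  # the "calculation result" going back
    conn.close()

threading.Thread(target=serve_one_client, daemon=True).start()

# The client only needs to know the host and the port the server chose.
cli = socket.create_connection(("127.0.0.1", 9763))
cli.sendall(b"data from client")
print("client received:", cli.recv(1024).decode())
cli.close()
srv.close()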
There is a Simulink model composed of several subsystems. Each subsystem is supposed to run on its own computer, and all computers are connected to a LAN.
There are interconnections between these subsystems, and data must be transferred between them, so they should run synchronously and preferably in real time.
But the computers run Windows, so they do not have a real-time operating system.
I am searching for a solution to this problem. What I have found so far is:
Simulink Real-Time Workshop can generate executable code from Simulink models, but the code can only run in real time on real-time processors, and it does not seem to support connections between multiple computers.
xPC Target is well known, but it only makes a connection between one host PC (or hardware) and one target PC, and the target PC must have a real-time OS, so it doesn't cover this problem.
There is a library for Matlab titled "Hardware Input/Output Library for Matlab/Simulink" by Werner Zimmermann that has some good facilities: it can make Simulink run near real time and it can connect and send data via TCP/IP between two computers, but it seems it can only make the connection between two computers running Simulink.
It also has some constraints on the OS and the Matlab/Simulink version and is no longer updated, so I'm not sure it would be enough.
After all of this, does anyone know a better way of handling this problem? I would appreciate any help on any of these topics, or other ways to solve this problem.
Thanks in advance
See the following links (especially the first one):
http://www.mathworks.de/help/toolbox/instrument/brbv41k-1.html
http://www.mathworks.com/matlabcentral/fileexchange/4934
http://www.mathworks.com/matlabcentral/fileexchange/27290
I am communicating over USB, using a proprietary protocol, with some custom hardware I've built. I have a GUI that handles all the communication/interaction with that hardware and a (C#) DLL that exposes all the relevant USB functionality. I need to write a LabVIEW driver (VI) for communicating with the hardware. My thought is to use LabVIEW to open my GUI and have a socket through which I expose all the relevant control to LabVIEW. Is it possible to open a socket in LabVIEW and communicate with my GUI? Is this a bad approach, or should I instead make LabVIEW invoke the DLL and handle the hardware control itself rather than going through my GUI (polled communications, solicited/unsolicited commands, etc.)?
Is there a reason you want to use your GUI only? In terms of time, I would say build a good front panel in LabVIEW and just communicate with the hardware directly using the DLL. Adding the GUI is just an extra layer of complexity that might be difficult to maintain later on. Why not do everything in LabVIEW if you can?
Yes, LabVIEW supports sockets using both TCP/IP and UDP.
You should be able to create a program/service that runs continually, acting as a TCP/IP server. You can send commands and receive responses as strings. If you need to pack data, you can use the Flatten To String function.
Essentially, your application should be structured as one loop running the TCP/IP server and another loop that actually communicates with the instrument. This might change if you need to get data back from the devices to your TCP client. A producer-consumer model, if you will :)
To get you started, open the NI Example Finder (Help -> Find Examples) and browse to Networking -> TCP and UDP -> Simple Data Server.vi.
It depends who is going to be using the LabVIEW driver and for what. If you're handing over this hardware to someone else who is going to want to create their own application(s) for it, they would probably prefer to talk directly to the DLL rather than go through your GUI. If it's more about automating your existing software from LabVIEW to do testing or repetitive tasks on the hardware, for example, then driving your GUI from LabVIEW might be less work.