Retrieve real-world time from time saved using "Clock" → "To Workspace" in a Simulink simulation

I have made some measurements with an NI6024 acquisition card using Simulink, with the following model:
I ran the simulation with simulation time = "inf" and a fixed time step of 0.2 in order to collect real-time data from the card. But I didn't realize that the values that "Clock" gives do not correspond to real-world time. More specifically, I ran the experiment for about a minute, but the data in the variable "t" range from 0 to about 50000, which is clearly wrong. I have saved the workspace data and have access to the recorded variables ("t" and "h"), but I have no means of reproducing the experiment.
Is there any way to retrieve the real world time of the simulation?

You've basically got two choices.
Run your model in real time, using for instance something like Simulink Real-Time or another real-time OS. In this case the (wall-clock) time will represent time since the model was started.
Write an S-Function to slow down the simulation so that it fakes real time. There are multiple examples of doing this on the File Exchange. See Real-Time Pacer for Simulink for one such example.
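For the second option, the core of such a pacer is simply a block that, at every step, waits until wall-clock time has caught up with simulation time. Below is a minimal sketch of that idea as a Level-2 MATLAB S-function; the function name, the 0.2 s sample time and the use of a base-workspace variable for the start time are illustrative assumptions, not the actual Real-Time Pacer implementation.
function simple_pacer(block)
% Level-2 MATLAB S-function that paces the simulation to wall-clock time.
setup(block);
end

function setup(block)
block.NumInputPorts  = 0;
block.NumOutputPorts = 0;
block.SampleTimes    = [0.2 0];               % match the model's fixed step
block.RegBlockMethod('Start',   @Start);
block.RegBlockMethod('Outputs', @Outputs);
end

function Start(block)
assignin('base', 'pacer_t0', tic);            % remember wall-clock start time
end

function Outputs(block)
% Wait until wall-clock time has caught up with simulation time.
lag = block.CurrentTime - toc(evalin('base', 'pacer_t0'));
if lag > 0
    pause(lag);
end
end
Dropped into the model via a Level-2 MATLAB S-Function block, this touches no signals; it only throttles the solver so that simulated time roughly tracks wall-clock time.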

Related

MATLAB script node in LabVIEW with different timing

I have a DAQ for temperature measurement. I acquire samples continuously and, during this process, calculate the temperature difference per minute (cooling rate, CR). The CR and temperature values are fed into the MATLAB script, which runs a physical model (predicting the temperature drop for the next 30 seconds). Then I record and compare the predicted and experimental values in LabVIEW.
What I am trying to do is have the MATLAB model execute every 30 seconds and send out its predictions as outputs of the MATLAB script. One of these outputs is used to change the air blower motor speed until the next MATLAB run (which in turn affects the temperature drop for the next 30 seconds, so it becomes a closed loop). After 30 seconds, while the main process is still running, the CR and temperature values are sent to the MATLAB model again, and so on.
I have a case structure around this MATLAB script, and inside the case structure I applied an Elapsed Time function to control the timing of the MATLAB script, but this is not working.
Short answer: I believe (one of) the reasons the program behaves strangely when the timing is changed is that there are several race conditions in the code.
The part of the diagram presented shows several big problems with the code:
Local variables lead to race conditions; use dataflow instead. For example, you are writing to the Tinitial local variable and reading from the Tinitial local variable in a chunk of code with no data dependencies between them, so it is not defined whether the read or the write happens first. This may not manifest itself badly with small delays, while big delays may be an issue. Solution: rewrite your program along the lines of the following example:
From Bad:
To Good:
(never mind the broken wires)
The MATLAB script node executes in the main UI execution system. If it executes for a long time, it may freeze indicators and controls as well as the execution of other pieces of code. Change the execution system of the other VIs in your program (say, to "other 1") and see if the situation improves.

Simulink Set sample time the same as Data

I am trying to make a simulation in Simulink, with the fuzzy model.
As inputs, I have set four time series variables from the workspace (made compatible with Simulink via Simulink.Timeseries): every row of the variables has an associated time going from 0 to 10566 (seconds, I believe). How can I set the sample time in the Simulink source block so that it picks every exact sample without interpolation?
Thank you for your kind answers,
Phalaen
Are you using the From Workspace block? If so, it's simply a matter of specifying the sample time in the block parameters and unticking "Interpolate data". You can also display the sample time information of the model to check which sample time each block is using; see View Sample Time Information in the documentation.
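As a rough illustration, the same configuration could be set up programmatically as below; the model name 'myModel', the block name 'From Workspace' and the 1 s sample spacing are assumptions about your data, so adjust them to your case.
t  = (0:10566)';                     % time vector in seconds
u  = rand(size(t));                  % placeholder data
ts = timeseries(u, t);               % signal read by the From Workspace block

set_param('myModel/From Workspace', ...
    'VariableName', 'ts', ...
    'SampleTime',   '1', ...         % pick samples at exactly 1 s
    'Interpolate',  'off');          % same as unticking "Interpolate data"
With interpolation off and the block sample time matching the spacing of the time vector, the block outputs each stored sample exactly, without interpolating between them.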

Dymola/Modelica real-time simulation advances too fast

I want to simulate a model in Dymola in real time for HiL use. In the results I see that the simulation is advancing about 5% too fast.
Integration terminated successfully at T = 691200
CPU-time for integration : 6.57e+005 seconds
CPU-time for one GRID interval: 951 milli-seconds
I already tried to increase the grid interval to reduce the relative error, but the simulation is still advancing too fast. I have only read about approaches that reduce model complexity so that the simulation fits within the defined time steps.
Note that the simulation does keep up with real time and is even faster. How can I, in this case, match simulated time and real time?
Edit 1:
I used the Lsodar solver with the "Synchronize with realtime" option checked on the Realtime tab. I have the realtime simulation licence option. I use Dymola 2013 on Windows 7. Here is the result for a step size of 15 s:
Integration terminated successfully at T = 691200
CPU-time for integration : 6.6e+005 seconds
CPU-time for one GRID interval : 1.43e+004 milli-seconds
The deviation is still roughly 4.5%.
I did however not use inline integration.
Do I need hard real-time or inline integration to improve those results? It should be possible to get a deviation lower than 4.5% using soft real-time, shouldn't it?
Edit 2:
I took the Python27 block from the Berkeley Buildings library to read the system time and compare it with the simulation's advance. The result shows that, 36 hours after the simulation starts, the simulation slows down slightly (compared to real time). About 72 hours after the start of the simulation it becomes about 10% faster than real time. In addition, the jitter in the result increases after those 72 hours.
Any explanations?
Next steps will be:
- changing to a fixed-step solver (this might well be a big part of the solution)
- changing from a DDE server to an OPC server, which at the moment does not seem to be possible in Dymola 2013, however
Edit 3:
Nope... using a fixed-step solver does not seem to solve the problem. In the first 48 hours of simulation time the deviation seems to be equal to the deviation with a variable-step solver. In this example I used the Rkfix3 solver with an integrator step of 0.1.
Does nobody know how to get rid of those huge deviations?
If I recall correctly, Dymola has a special compilation option for real-time performance. However, I think it is a licensed option (not sure).
I suspect that Dymola is picking up the wrong clock speed.
You could use the "Slowdown factor" in the Simulation Setup, on the Realtime tab just below "Synchronize with realtime". Since the simulation is running about 5% too fast, set this to 1/0.95 ≈ 1.053 to stretch simulated time back to wall-clock time.
There is a parameter in Dymola that you can use to set the CPU speed, but I could not find it just now; I will have another look for it later.
I solved the problem by switching to an embedded OPC server. The error between real time and simulation time in this case is shown below.
Compiling Dymola problems with an embedded OPC server requires administrator rights (which I did not have before), and the active Dymola working folder must not be write-protected.

Control a Simulink model from an M-file

I am trying to control a Simulink model from an M-file.
What I want to do in the M-file is: give the Simulink model some input, run the Simulink model, change one input value at 0.6 seconds, and continue running the Simulink model with the new input.
I already know that by using set_param I can start, pause and continue the simulation, but the problem is that I don't know how to pause the Simulink model at a certain time (0.6 s). Is it possible to get the current time from the Simulink model and read it in the M-file?
Another way I already know is to use sim to run the Simulink model from 0 to 0.6 s, use SimState to save the state at 0.6 s, and then load that state to resume the simulation. I tried to change the input before the simulation resumed, but it seems that the model loads the input values from the saved state and won't take the new input value.
I have been stuck on this problem for a very long time; could someone help me with this, please?
Thank you very much.
You can get the current time of a running simulation with:
get_param('simulink_model_name', 'SimulationTime');
So, for instance, by checking this value from your M-file during the simulation (for example from a timer object's callback), you can detect when the simulation reaches 0.6 seconds.
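A rough sketch of that polling approach is shown below; the model name, the Constant block named 'Input' and the new value '2' are assumptions for illustration only, and the model is assumed to already be open.
function pause_at(mdl, tPause)
% Poll the running simulation and change an input once tPause is reached.
tmr = timer('ExecutionMode', 'fixedSpacing', 'Period', 0.01, ...
            'TimerFcn', @check);
set_param(mdl, 'SimulationCommand', 'start');
start(tmr);

    function check(~, ~)
        if get_param(mdl, 'SimulationTime') >= tPause
            set_param(mdl, 'SimulationCommand', 'pause');
            set_param([mdl '/Input'], 'Value', '2');   % new input value
            set_param(mdl, 'SimulationCommand', 'continue');
            stop(tmr); delete(tmr);
        end
    end
end
Called as pause_at('myModel', 0.6), this pauses shortly after 0.6 s of simulation time, changes the tunable parameter and resumes; note that a polling timer can only guarantee pausing just after 0.6 s, not exactly at it.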
I used a combination of simulink and m-script to achieve a similar goal.
In your model, add one Assertion block. Double-click it and uncheck 'Stop simulation when assertion fails'. In the 'Simulation callback when assertion fails' field, add three commands:
set_param(bdroot,'SimulationCommand','pause');
run('myscript.m'); %insert the script name
set_param(bdroot,'SimulationCommand','continue');
Now connect the inport of this block to a 'not equal to' relational operator. Connect the first inport of the relational operator to a clock (please set the decimation for an analog clock, or the sample time [usually -1 for inherited] for the digital clock).
The second inport is connected to a Constant block with a value of 0.6.
On simulating the model, the simulation will pause at 0.6 s, execute the M-file to change the input parameter (assuming it is tunable) and then continue with the simulation.
The Assertion block's callback fires when its input signal becomes 0, and at 0.6 s the output of the relational operator will be 0.
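For completeness, myscript.m itself only needs to change the tunable parameter while the model is paused; a minimal sketch, where the block name 'Input' and the new value are assumptions:
% myscript.m - executed by the assertion callback while the model is paused
set_param([bdroot '/Input'], 'Value', '2');   % change the tunable constant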
Let me know if it worked.
This is not currently possible from an M-file. If you want to dynamically change the input at a given time externally, it will require an S-function. Even that solution is difficult and fraught with flakiness, since MathWorks does not want to support this functionality because it defeats one of the features of another toolbox they sell. In time, I believe they will grant this privilege, but it does not exist today. Also, why not use a dynamic input block to change the input value, like a map, a Signal Builder, etc.?
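One way to realise that last suggestion without any external control is to drive the input from a time/value table whose value changes at 0.6 s; a minimal sketch, assuming the model reads its input from a From Workspace block using the default variable name simin:
t = [0; 0.6; 10];      % breakpoints: the value changes at 0.6 s
u = [1; 2;   2];       % old value before 0.6 s, new value afterwards
simin = [t u];         % From Workspace accepts a [time, value] matrix
% In the From Workspace block, untick "Interpolate data" so the output
% steps from 1 to 2 exactly at 0.6 s instead of being ramped.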

Simulink: storing data to workspace gives nonmonotonic time

I am running a Simulink simulation using the fixed-step discrete solver, and I have even specified the fixed-step size. I save some data via a "To Workspace" block (I have used the Scope to save to the workspace as well, with equivalent results). When I look at the time data in the saved object, the time is not monotonically increasing.
The time value is constant for 5-10 samples, then continues. Any ideas why this happens?
I took a screenshot of the time vector. You can see it goes flat, then continues, then is flat again. I expected a single line.
I asked on the Mathworks site as well. I'll update both if I get an answer.
EDIT: I am working with MathWorks on this now too. They showed me how to visually inspect sample times: navigate to Format > Sample Time Display > All. This will show all of the sample times in the simulation.
This problem was caused, at least in part, by the existence of algebraic loops. Since Simulink was iterating to solve the algebraic loops, the output was capturing those repeated evaluations at the same time step, which is why the time value stays constant for several samples.
I was able to tick "Minimize algebraic loop occurrences" under
Configuration Parameters > Model Referencing
and my loops were eliminated. This is a YMMV answer, but it's the best I've found while working with MathWorks support.