I'm using a Sensoray 626 card with Simulink Real-Time Windows Target (rtwin). The problem is that when I try to plot a graph with the Scope block in real time, no more than 800 points are plotted. In other words, the Scope seems to update the graph by deleting the old points and starting a new frame from zero, again and again.
I tried to export the data from Simulink to the workspace so I could plot it after the real-time simulation finished, but unfortunately the same problem occurred: I got no more than 800 points in the workspace (in some attempts, fewer than 200).
The weird thing is that this problem doesn't occur with the same MATLAB version on the same PC when using a DAS-1002 card instead; both the Scope and the save-to-workspace blocks work well.
I'm using MATLAB 2009 on Windows XP.
I would have used the DAS-1002 card, but it doesn't have an encoder input.
PS: the solver configuration was set properly and the necessary libraries were loaded.
Any help that can solve this problem would be appreciated. Thanks in advance.
Screenshots attached to the original post: solver configuration, scope properties, and a simple Simulink example.
The Scope can only display a number of samples equal to the external mode buffer length. Go to Tools > External Mode Control Panel > Signal & Triggering and check the Duration parameter there. I'd bet it is 1000, so 1000 samples at a 0.001 s sample rate give the 1 second of data you get. If you want more, increase this number.
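For reference, the same buffer length can be set from the MATLAB command line; a minimal sketch, assuming the model is named 'mymodel' and that your release exposes the Duration field as the ExtModeTrigDuration model parameter:

    % Hedged sketch: raise the external mode buffer so the Scope (and
    % to-workspace logging) keeps more than the default 1000 samples.
    set_param('mymodel', 'ExtModeTrigDuration', '10000');  % samples to buffer
    % At a 0.001 s sample time, 10000 samples cover 10 s of data instead of 1 s.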
Using MATLAB, I generated some random noise, filtered it, and then successfully saved it as a GNU Radio-readable file for a file source. In GNU Radio, I set the file source to repeat and viewed it with the QT GUI Frequency Sink. I can see the filtered noise fine, but every now and then (every 10 seconds or so) the spectrum drops in power and jumps around for about a tenth of a second, then returns to normal. The sample rate for the MATLAB filter is 320k, and my GNU Radio sample rate is the same, if that matters.
I think it may have to do with the fact that the noise generated in MATLAB is a sequence that is repeated in GNU Radio, and that the discontinuity happens right when the sequence repeats. Any idea how I can remove this discontinuity so I can transmit without having to worry about it? If I'm missing any info, please let me know and I'll edit the question. Thanks in advance.
NOTE: I needed to create a MATLAB binary file to be able to read it in GNU Radio. GNU Radio reads the binary file from my desktop and uses it as the file source.
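For context, here is a minimal sketch of the MATLAB-side export described above. GNU Radio's file source expects raw, headerless samples (float32 in this sketch); the filter design, length, and file name are illustrative assumptions:

    fs = 320e3;                         % sample rate used on both sides
    x  = randn(1, fs);                  % one second of white noise
    b  = fir1(128, 0.25);               % example low-pass FIR filter
    y  = filter(b, 1, x);
    y  = y / max(abs(y));               % normalize to avoid clipping
    fid = fopen('noise.bin', 'wb');
    fwrite(fid, single(y), 'float32');  % raw float32 samples, no header
    fclose(fid);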
I just wanted to share how I solved a bug in Simulink (found in MATLAB R2010a; the same pause instruction that causes the problem is present in MATLAB R2014a as well).
When I read serial input via the Simulink serial acquisition block, if the data input rate is moderately fast (more than about 100 samples/sec), the first 3 seconds or so of data are good, but after those few seconds very strange noise appears.
Digging into the source code of this serial acquisition block, I saw that it uses a delay instruction, pause(0.001), apparently intended to delay the code for 1 ms after each sample acquisition.
Answering my own question:
An article I found in MSDN titled "Windows Time" states: "GetTickCount and GetTickCount64 are limited to the resolution of the system timer, which is approximately 10 milliseconds to 16 milliseconds." [1]. That means the delay effectively limits the maximum number of samples per second this block can read.
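The effect is easy to verify; a quick, hedged check (timings vary by machine) of what pause(0.001) actually costs on Windows:

    t = tic;
    for k = 1:100
        pause(0.001);                   % nominally 1 ms
    end
    fprintf('Mean pause: %.1f ms\n', 1000 * toc(t) / 100);
    % On Windows this typically reports ~10-16 ms per call rather than 1 ms,
    % capping the acquisition loop at roughly 60-100 samples/sec.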
I deleted this pause line in the Simulink serial acquisition block (line no. 331 in the script of the 'Serial Receive' block, named sserialrb.m, in MATLAB R2010a) and everything worked well.
Hope this will help somebody!
[1] MSDN, "Windows Time," http://msdn.microsoft.com/en-us/library/windows/desktop/ms725496%28v=vs.85%29.aspx, 2012.
How can I import a bitstream (a binary vector) from the workspace into Simulink? I have found that I can use the simin block or the In block, but my binary vector is independent of time. I tried the Constant block and it works, but when I then fed the output into the Buffer block, it didn't work because the input is continuous rather than discrete. So I am asking: is there a way to add time to my one-dimensional binary vector without influencing the result, and how can I do it?
Or is there another way to import this data that avoids the problem with the Buffer block?
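For what it's worth, a minimal sketch of the "add time" idea from the question, assuming you are free to choose a sample time Ts; the simin (From Workspace) block accepts a [time, data] matrix, and all names here are placeholders:

    bits  = randi([0 1], 1000, 1);      % example binary vector
    Ts    = 1e-3;                       % chosen sample time
    t     = (0:numel(bits)-1)' * Ts;    % artificial time column
    simin = [t, double(bits)];          % [time, data] matrix for the simin block
    % Set the block's sample time to Ts so downstream blocks (e.g. Buffer)
    % see a discrete signal.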
Your screenshot shows that your Constant block has a sample time of Inf. As the error message suggests, you need to change that to a discrete sample time. In addition, you should also:
check that your model is using a fixed-step solver
check what step size you are using for your chosen fixed-step solver (ideally the same as your Constant block's sample time).
You can have multi-rate models, but you need to manage the rate transitions with Rate Transition blocks. For more details on sample times, see the documentation, in particular how to view sample time information in a Simulink model. You should probably also have a quick look at the Choose a Solver section.
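A hedged sketch of those settings applied from the MATLAB command line; 'mymodel' and the block path are placeholders:

    set_param('mymodel', 'SolverType', 'Fixed-step');  % fixed-step solver
    set_param('mymodel', 'FixedStep', '0.001');        % solver step size
    % Give the Constant block a matching discrete sample time instead of Inf:
    set_param('mymodel/Constant', 'SampleTime', '0.001');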
I want to simulate a model in Dymola in real time for HiL use. In the results I see that the simulation is advancing about 5% too fast:
Integration terminated successfully at T = 691200
CPU-time for integration : 6.57e+005 seconds
CPU-time for one GRID interval: 951 milli-seconds
I already tried to increase the grid interval to reduce the relative error, but the simulation still advances too fast (691200 s of simulated time in 6.57e5 s of CPU time is a ratio of about 1.05, i.e. roughly 5% fast). I have only read about approaches that reduce model complexity to allow simulation within the defined time steps.
Note that the simulation does keep up with real time; it is even faster. How can I match simulated time and real time in this case?
Edit 1:
I used the Lsodar solver with the "Synchronize with realtime" option checked on the Realtime tab. I have the real-time simulation license option. I use Dymola 2013 on Windows 7. Here is the result for a step size of 15 s:
Integration terminated successfully at T = 691200
CPU-time for integration : 6.6e+005 seconds
CPU-time for one GRID interval : 1.43e+004 milli-seconds
The deviation is still roughly 4.5%. I did not, however, use inline integration.
Do I need hard real time or inline integration to improve those results? It should be possible to get a deviation lower than 4.5% using soft real time, shouldn't it?
Edit 2:
I used the Python27 block from the Berkeley Buildings library to read the system time and compare it with the simulation's advance. The result shows that 36 hours after simulation start, the simulation slows down slightly (compared to real time). About 72 hours after the start, it becomes about 10% faster than real time. In addition, the jitter in the result increases after those 72 hours.
Any explanations?
Next steps will be:
-changing to a fixed-step solver (this might well be a big part of the solution)
-changing from a DDE server to an OPC server, which at the moment does not seem to be possible in Dymola 2013, however.
Edit 3:
Nope... using a fixed-step solver does not seem to solve the problem. In the first 48 hours of simulation time the deviation is about the same as with the variable-step solver. In this example I used the Rkfix3 solver with an integrator step of 0.1.
Does nobody know how to get rid of these huge deviations?
If I recall correctly, Dymola has a special compilation option for real-time performance. However, I think it is a licensed option (I'm not sure).
I suspect that Dymola is picking up the wrong clock speed.
You could use the "Slowdown factor" in the Simulation Setup, on the Realtime tab just below "Synchronize with realtime". Set it to 1/0.95 ≈ 1.053 to compensate for the 5% fast advance.
There is also a parameter in Dymola for setting the CPU speed, but I could not find it just now; I will look for it again later.
I solved the problem by switching to an embedded OPC server. The error between real time and simulation time in this case is shown below.
Note that compiling a Dymola problem with an embedded OPC server requires administrator rights (which I did not have before), and the active folder of Dymola must not be write-protected.
I am running a Simulink simulation using the fixed-step discrete solver, and I've specified the fixed step size. I save some data via a To Workspace block (I've also used the Scope to save to the workspace, with equivalent results). When I look at the time data in the object, the time is not monotonically increasing.
The time value is constant for 5-10 samples, then continues increasing. Any ideas why this happens?
I took a screenshot of the time vector. You can see it goes flat, then continues, then is flat again. I expected a single line.
I asked on the MathWorks site as well; I'll update both if I get an answer.
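As a quick way to spot the repeated time stamps described above, a hedged sketch assuming the To Workspace block logs a variable named simout using the Structure With Time save format:

    t  = simout.time;                   % logged time vector
    dt = diff(t);
    find(dt == 0)                       % indices where time does not advance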
EDIT: I am working with MathWorks now too. They showed me how to visually inspect sample times: navigate to Format > Sample Time Display > All. This shows all of the sample times in the simulation.
This problem was caused, at least partly, by the existence of algebraic loops. Since Simulink was recalculating the algebraic loops, the output was capturing these changes.
I was able to select "Minimize algebraic loop occurrences" under Configuration Parameters > Model Referencing, and my loops were eliminated. This is a YMMV answer, but it's the best I've found working with MathWorks support.
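For completeness, the same option can be toggled programmatically; a hedged sketch, assuming the model is named 'mymodel' and that your release exposes the checkbox as the parameter below (verify the name against your version's documentation):

    % "Minimize algebraic loop occurrences" (Model Referencing pane):
    set_param('mymodel', 'ModelReferenceMinAlgLoopOccurrences', 'on');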