How to extract an MSL model, modify the code, and use it locally?

I am interested in replacing my own PID-regulator models with MSL/Blocks/Continuous/LimPID. The problem is that this model restricts the output limits to parameters, and thus does not allow the time-varying limits that I need.
Studying the code, I see that the output limitation is implemented by the block MSL/Blocks/Nonlinear/Limiter, and I simply want to change this to the block VariableLimiter.
I can imagine that you need to ensure that the output limits vary on a time scale slower than the regulator, in order not to excite unwanted behaviour of the controller. Still, there is a class of problems where it would be very useful to allow these limits to vary slowly.
Thanks for the good input to my question; below is a very simple example to refine it. (The LimPID model is more complicated and I will come back to that.)
Let us instead just turn the block Add into a local block in MyModel.
I copy the code from Modelica.Blocks.Math.Add and call it Addb in MyModel. Since there is a dependence on Interfaces.SI2SO, I need to add an import before the extends clause. This import refers to the ordinary MSL package, instead of copying that into MyModel as well. Then I introduce a new parameter "bias" and modify the equation. The annotation may need some updating as well, but we do not bother with that now.
model MyModel
  ...
  block Addb "Output the sum of the two inputs"
    import Modelica.Blocks.Interfaces;
    extends Interfaces.SI2SO;
    parameter Real k1=+1 "Gain of input signal 1";
    parameter Real k2=+1 "Gain of input signal 2";
    parameter Real bias=0 "Bias term";
  equation
    y = k1*u1 + k2*u2 + bias;
    annotation (...);
  end Addb;
  ...
end MyModel;
This code seems to work.
My additional question is whether it is enough to look up the extends clauses and other references to MSL and add the proper imports, now that the code is local, or whether there are more aspects to think of. The LimPID code is rather complex, with procedures for initialization etc., so I just wonder if there is more to do than bringing in a number of import clauses.

The models in the Modelica Standard Library (MSL) should be seen only as exemplary models, not covering all possible applications. The MSL is write-protected, and it is not possible to replace the limiter block in LimPID (and add max/min input connectors) in place. It also wouldn't work out if you shared your simulation model with others, expecting their MSL to behave like your modified MSL.
Personally, I have my own libraries of components for cases where the MSL models are inadequate. For example, I have PID controllers with variable limits, manual/automatic switching and many other functions that are needed in my applications.
Often I create a copy of an MSL model, place it in the corresponding package of my own library, and make the necessary modifications and additions, e.g. MyLibrary.Blocks.Continuous.PID.
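As a hedged illustration of that approach for the variable-limit case, here is a minimal, self-contained Modelica sketch of a PI controller with time-varying output limits, built by wiring standard MSL blocks around a VariableLimiter rather than by copying LimPID (the block name VarLimPI and its connector names are invented for this example):

block VarLimPI "PI controller with time-varying output limits (sketch)"
  import Modelica.Blocks;
  parameter Real k=1 "Proportional gain";
  parameter Real Ti=1 "Integral time constant in seconds";
  Blocks.Interfaces.RealInput u_s "Setpoint";
  Blocks.Interfaces.RealInput u_m "Measurement";
  Blocks.Interfaces.RealInput yMax "Upper output limit (may vary in time)";
  Blocks.Interfaces.RealInput yMin "Lower output limit (may vary in time)";
  Blocks.Interfaces.RealOutput y "Limited controller output";
  Blocks.Math.Feedback feedback "Forms the control error u_s - u_m";
  Blocks.Continuous.PI PI(k=k, T=Ti) "Unlimited PI part";
  Blocks.Nonlinear.VariableLimiter limiter "Limits taken from input connectors";
equation
  connect(u_s, feedback.u1);
  connect(u_m, feedback.u2);
  connect(feedback.y, PI.u);
  connect(PI.y, limiter.u);
  connect(yMax, limiter.limit1);
  connect(yMin, limiter.limit2);
  connect(limiter.y, y);
end VarLimPI;

Copying LimPID itself and swapping its internal Limiter for a VariableLimiter (wiring two new RealInput connectors to the VariableLimiter's limit1 and limit2 inputs) follows the same pattern; when doing that, also check whether the initialization code of the copied block still refers to the fixed output-limit parameters.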

How to programmatically configure the tunability of model parameters?

I'm porting a large Simulink model from Simulink R2010a → R2017b.
The main model is basically a glue-layer for many interwoven reference models. My objective is to generate a standalone executable out of this main model using Coder.
Parameter tunability in this context is not controlled via the Signals and Parameters section on the Optimization tab of the Model Configuration Parameters dialog (as is the case in stand-alone models), but rather by constructing Simulink.Parameter objects in the base workspace and referencing those in the respective referenced models, or in their respective model workspaces.
Now, AFAIK, in R2010a it was enough to set
new_parameter.RTWInfo.StorageClass = 'Auto';
new_parameter.RTWInfo.CustomStorageClass = 'Define';
to make the parameter non-tunable and convert it into a #define in the generated code. In R2017b, this is no longer allowed; the StorageClass must be 'Custom' if you set a non-empty CustomStorageClass:
new_parameter.CoderInfo.StorageClass = 'Custom'; % <- can't be 'Auto'
new_parameter.CoderInfo.CustomStorageClass = 'Define';
But apparently, this does not make the parameter non-tunable:
Warning: Parameter 'OutPortSampleTime' of '[...]/Rate Transition1' is non-tunable but refers to tunable variables (Simulation_compiletimeConstant (base workspace))
I can't find anything in the R2017b documentation on making parameters non-tunable programmatically; I can only find how to do it in stand-alone models via the dialog, but that's not what I want here.
Can anyone point me in the right direction?
NOTE: Back in the day, Simulink Coder was called Real-Time Workshop (well, Real-Time Workshop was split into Simulink Coder and several other products), hence the difference between RTWInfo and CoderInfo. Note that RTWInfo still works in R2017b, but it issues a warning and gets converted into CoderInfo automatically.
In the generated code it should appear as a #define, the way you specified it.
https://www.mathworks.com/help/rtw/ug/choose-a-built-in-storage-class-for-controlling-data-representation-in-the-generated-code.html
By the way, yes, it's a bit confusing: in an m-file you specify CustomStorageClass = 'Define';, in the GUI you specify the storage class as Define (custom), and the documentation refers to the storage class as Defined.
I am not sure why the warning about tunability shows up.

How to pass heatPorts.T to DynamicPipe flowModel?

When implementing flow models that work with the Modelica Standard Library DynamicPipe (or a similar model that builds on PartialTwoPortFlow), there are cases where the flow takes place in an environment with heat transfer, and wall properties (e.g., heatPorts.T and/or heatPorts.Q_flow) are required in order to calculate the pressure drop.
For example, a pressure drop model may need to calculate a new viscosity or Prandtl number based on the medium pressure and the wall temperature to capture cooling/heating effects, etc.
The heat transfer model obtains properties of the medium via the "states" passed to it; however, there is no existing connection in DynamicPipe or PartialTwoPortFlow that goes the other way.
I've tried numerous variations of ideas and have had no success, including creating a new PartialTwoPortFlow that contains all the heat transfer calls that exist in DynamicPipe.
I hesitate to post this question, as I am surprised I am having so much difficulty with this and would not be surprised to find a straightforward solution. Nevertheless, I need this ability and am curious whether others have already solved this issue, as I am running short on ideas.
So my question is:
What is a proper/efficient means of passing the heatPorts.T values to the flowModel?
For those familiar with the MSL Fluid library, and more specifically the pipe models it provides, this answer should (hopefully) make sense.
Aside:
It seems DynamicPipe could be improved a little by not restricting the heat transfer area to perimeter*lengths, and instead introducing a parameter (e.g., heatTransferArea) that lets the user define it, with perimeter*lengths as the default. See below:
parameter SI.Area heatTransferArea = perimeter*lengths "Total heat transfer area";
HeatTransfer heatTransfer(
  ...
  final surfaceAreas=heatTransferArea, // perimeter*lengths <- replaced
  ...
End Aside:
In order to communicate heatPorts.T to the flowModel, and to avoid errors when checking each of the models, I had to do the following:
1. Make an "input" in the flowModel for Ts_w, not a parameter (take a look at how the medium states are passed). You might have to do some finagling with it, like with "diameters" (see DetailedPipeFlow), to make it be used the way you expect.
2. Duplicate PartialTwoPortFlow and add final Ts_w = Ts_wFM to the flowModel instantiation. Additionally, define the variable SI.Temperature[nFM+1] Ts_wFM in PartialTwoPortFlow and establish definitions for it in the equation section, similar to statesFM. This will require a HeatPorts connector to be added.
3. Duplicate DynamicPipe and change the extends clause to the new PartialTwoPortFlow. Set use_HeatTransfer to true (the way I've set it up, this has to be true for this to work, which isn't ideal but is manageable). It might be good to make it a final parameter so it can't be changed.
4. Don't forget to connect heatPorts to the heat ports added in step 2.
I believe this captures a quick version of how I was able to get the wall temperature passed to the flowModel; a rough sketch of the key declarations is given below. Perhaps there is a more elegant way, but I thought this was pretty serviceable. I now simply have one more partial model and one more pipe model, called PartialTwoPort_wTemp and GenericDynamicPipe (I also incorporated my surfaceArea correction in the new pipe).
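As a hedged sketch of steps 1 and 2 (MyDetailedPipeFlow and MyPartialTwoPortFlow are invented names for the duplicated classes, and the comments with "..." stand for code copied unchanged from the MSL), the key declarations look roughly like this:

// Step 1: the duplicated flow model gets a wall-temperature input (not a parameter)
model MyDetailedPipeFlow
  // ... declarations copied from the MSL flow model ...
  input SI.Temperature[n] Ts_w "Wall temperature per flow segment";
equation
  // ... use Ts_w here, e.g. to evaluate viscosity or Prandtl number at the wall ...
end MyDetailedPipeFlow;

// Step 2: the duplicated PartialTwoPortFlow passes the wall temperatures through
partial model MyPartialTwoPortFlow
  // ... declarations copied from PartialTwoPortFlow, plus the heat ports from step 2 ...
  SI.Temperature[nFM+1] Ts_wFM "Wall temperatures on the flow-model grid";
  FlowModel flowModel(
    // ... original modifiers copied unchanged ...
    final Ts_w=Ts_wFM);
equation
  // ... fill Ts_wFM from the heat port temperatures, analogous to statesFM ...
end MyPartialTwoPortFlow;

The equations that fill Ts_wFM depend on the selected modelStructure, which is why the description above mirrors the handling of statesFM.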

Doing Integration Testing on Simulink Models

Say I have units 1, 2, 3 and 4 (either as model references or subsystems) for which I have unit tests ready, using the matlab.unittest.TestCase framework.
What would be the easiest way to write an integration test for the entire system?
I need some way to set Global_Inputx (x = 1, 2, 3) and verify Global_Outy (y = 1, 2) in the easiest possible way (maybe by reusing the unit tests).
I can use MATLAB R2014a.
PS: I've already gone through this but it didn't help.
I think the question of integration testing in Simulink is a complex one that may involve formal methods like code and coverage analysis of the dynamic system under test, automatic test generation, etc. If you haven't already, you may want to check out the "Verification, Validation and Test" section of the MathWorks product line-up: http://www.mathworks.com/products/?s_tid=gn_ps.
However to answer your specific question of how you would set the global input and verify global output in your test:
Depending on where the global input data that feeds the Inport blocks resides (MATLAB base workspace, model workspace, etc.), you could perhaps set the external input of the model to that data. For example (with placeholder names for the model and the input variable):
set_param(modelName, 'ExternalInput', 'myExternalInput')
This could be defined in your test class setup, test method setup or in the test depending on when the data is available and where defining it is appropriate. The data could also be passed in directly to the sim() command. Parameterized testing (http://www.mathworks.com/help/matlab/matlab_prog/create-basic-parameterized-test.html) is an option to consider if you want to test the system with different sets of inputs. The external input value becomes the parameter in this context.
If you have your model set up for output logging, then once the simulation is done, you would get the logged outputs, which you could then compare against a baseline.
Does that help, or am I way off base here? If you can add more details, I can try again.

How to include a c-header with constants in Matlab Simulink

I'm developing a Simulink model with many C S-functions. For easier handling, I want to use the same constants in the C S-functions as in the Simulink model. So I have a C header with preprocessor constants like:
#define THIS_IS_A_CONSANT 10
And here is the question:
How is it possible to include this in Simulink in such a way that I can use THIS_IS_A_CONSANT, for example in a Constant source block, like a workspace variable?
Thanks and regards
Alex
There is functionality in Simulink that will allow you to include custom C header files that define constants, variables, etc.; however, as far as I know (and as one might expect) this really is only pertinent in cases where code is being generated and compiled.
So, for the most part, this particular functionality is only relevant when you are using Simulink Coder to generate a stand-alone executable from your model. For example, this link shows how to include parameters stored in an external header file during code generation through the use of Simulink.Parameter objects with Custom Storage Classes and the Code Generation - Custom Code Pane under the model's Configuration Parameters.
This link from the Simulink doc shows how to use the #define custom storage class to achieve similar results.
However, it sounds like neither of these really solve your issue, as you want to make use of the code in the header file during simulation.
That said, considering that there are elements in Simulink, such as Stateflow Charts and MATLAB Function blocks, that generate and build code "under the hood" during simulation, it's (at least hypothetically) possible that you might be able to use some of the concepts described above to access the values in your header file from one of those elements during simulation. For example, I was successfully able to access preprocessor macros in a Stateflow chart just by going to the Simulation Target->Custom Code pane under Configuration Parameters and including the text #include "header.h" under Include custom C code in generated: Header file. (In this case, header.h contained the line of C code that you included in your post)
Although it seems like you should be able to extend this functionality further, this really was the limit of what I was able to achieve as far as accessing the header file during simulation was concerned. For example, I know that running a model in Rapid Accelerator mode actually generates and builds code under the hood, so seemingly you should be able to use some combination of the techniques I described above to be able to access values from the header file during simulation. It looks like the code that Rapid Accelerator mode generates doesn't respect all of the settings defined by those techniques in the same way that Simulink/Embedded Coder do, though, so I just kept running into compilation errors. (Although maybe I'm just missing some creative combination of settings that could make that work).
Hopefully that helps explain Simulink's abilities (and limitations) regarding the inclusion of C header files. So to summarize: according to the links included above, what you are asking for is just barely possible, but in practice... not really.
So if really all you want is to be able to create workspace variables out of the preprocessor #define's in your header file, it probably is just easiest to manually parse the file with a MATLAB script, as had previously been suggested in the comments. Here is a quick-and-dirty script that loads in a header file, iterates over each line, uses a regular expression (which you can improve upon if needed) to parse #define statements, and then calls eval to create variables from the parsed input.
filename = 'header.h';
% Match lines of the form "#define NAME <number>" and capture the name and the value
pattern = '^\s*#define\s*(\w*)\s*(\d*\.?\d+)';
fid = fopen(filename);
tline = fgetl(fid);
while ischar(tline)
    tokens = regexp(tline, pattern, 'tokens', 'once');
    if (numel(tokens) == 2)
        % Create a variable named after the macro, e.g. THIS_IS_A_CONSANT = 10
        eval([tokens{1} ' = ' tokens{2} ';']);
    end
    tline = fgetl(fid);
end
fclose(fid);
You could put this code in a callback so that it executes every time you load your model. Just go to File->Model Properties->Model Properties, click on the Callbacks tab, and then place the code under whichever callback you desire (such as PreLoadFcn if you want it to run immediately before the model loads).

Good practices for formatting simulation output

This is almost a programming question, but geared towards physicists.
Suppose I am writing a piece of software that takes some system parameters as input and then calculates something from them, in my case a spectral function $A(k,\omega)$.
When I want to just take the output and feed it to gnuplot, I should make the program output a simple table with one column for the $k$-values, one for $\omega$ and one for $A(k,\omega)$.
But then I cannot store all the additional information there, such as what parameters were used. And maybe I want to store some additional debugging information in that output, such as intermediate quantities. In my example, the spectral function is obtained from the self-energy, so in some situations I might want to look at the self-energy directly.
I do not want to constantly hack the source code depending on what output I want. It would be nicer if all the relevant data of a "run" were present in a single file/entity, while still making it easy to extract tables that I can feed to gnuplot.
Not wanting to reinvent the wheel and develop a full-blown file format, are there some "standards" around that are best used when creating, processing and storing data from calculations or simulations? Maybe even in an SQL database format?
There are dozens of methods, and none is too good; I'll share two of mine:
If the program is worth it, I add a small parser for config files. Then I just make a config, say SimA.in, and the simulator produces a bunch of files with the corresponding data: SimA.paths, SimA.stats, SimA.log, etc. Provided the names are unique and I add the version of the code to the log, this makes the results fully reproducible and the simulation itself portable enough to be easily manageable.
If not, I just wrap the code a bit and use R as a host. Then I return all the arrays and scalars (R data structures are very flexible, and it is easy to convert between native R and C structs) and use R to manage, save/load and of course visualize and analyse the data. Moreover, with Sweave and cacheSweave the whole execution, analysis and reporting can be combined into an elegant bundle, fully reproducible with one command.
If you want an "enterprise" solution, try NetCDF or HDF5. But I feel that may be overkill here.
And of course version control of the simulator code is a must. But that's obvious =)
For a project I'm currently working on that uses Python and C++ (via SWIG), I'm planning to use a short Python script as the input file. So, in a way, I'll be 'hacking the source' to change parameters, but in an interpreted language, not a compiled one.
Currently, I plan to have an input file like parameters.py and use it via from parameters import params. But that might be too dependent on the user getting the syntax right.
params = {
"foods" : ["spam", "beans", "eggs"],
"costs" : [199, 4, 1],
"customerAge" : 23,
}
Another option might be to just define the variables at the script level in parameters2.py. This loses the nice dictionary packaging, but makes it a little harder for the user to mess it up. And it probably wouldn't be too hard to write a 'parser' that puts those script-level variables into a nice dictionary. A plus to this method is that the user could parameterize things that weren't originally considered: from parameters2 import * would overwrite previous definitions of those parameters. Of course, this might be bad if the user overwrites something important.
foods = ["spam", "beans", "eggs"]
costs = [199, 4, 1]
customerAge = 23
parameters3.py would use a class, though this is somewhat contraindicated by Python's persnicketiness about indentation. You would use it like from parameters3 import params:
class params:
foods = ["spam", "beans", "eggs"]
costs = [199, 4, 1]
customerAge = 23
I should also mention, for completeness, that our C++ code also defines a parameters class. That is, in our actual project, parameters.py is a SWIG wrapper for a corresponding C++ class. You'd use it like from parameters4 import params. However, this allows only parameters that are already declared in the C++ class.
import parameters
params = parameters.Parameters()
params.foods = ["spam", "beans", "eggs"]
params.costs = [199, 4, 1]
params.customerAge = 23