Doing Integration Testing on Simulink Models - matlab

Say I have UNITs 1, 2, 3, 4 (either as Model references or Subsystems) for which I have unit tests ready using the matlab.unittest.TestCase framework.
What would be the easiest way to write an integration test for the entire system?
I need some way to set Global_Inputx (x = 1, 2, 3) and verify Global_Outy (y = 1, 2) in the easiest possible way (maybe by utilizing the unit tests)?
I can use MATLAB R2014a.
PS: I've already gone through this but it didn't help.

I think the question of integration testing in Simulink is a complex one that may involve formal methods like code and coverage analysis of the dynamic system under test, automatic test generation, etc. If you haven't already, you may want to check out the "Verification, Validation and Test" section of the MathWorks product line-up: http://www.mathworks.com/products/?s_tid=gn_ps.
However, to answer your specific question of how you would set the global input and verify the global output in your test:
Depending on where the global input data that feeds the inport blocks resides (MATLAB base workspace, model workspace, etc.), you could set the external input of the model to that data. For example (with placeholder arguments):
set_param(<model name>, 'ExternalInput', <name of the input variable>)
This could be done in your test class setup, test method setup, or in the test itself, depending on when the data is available and where defining it is appropriate. The data could also be passed directly to the sim() command. Parameterized testing (http://www.mathworks.com/help/matlab/matlab_prog/create-basic-parameterized-test.html) is an option to consider if you want to test the system with different sets of inputs; the external input value becomes the parameter in this context.
If you have your model set up for output logging, then once the simulation is done, you would get the logged outputs, which you could then compare against a baseline.
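Putting those pieces together, a minimal sketch of such a parameterized test could look like this (the model name 'GlobalSystem', the input variable name and the baseline expectation are all placeholders, not taken from your setup):

classdef tGlobalSystem < matlab.unittest.TestCase
    % Minimal sketch only: substitute your own model, input names and baselines.
    properties (TestParameter)
        inputLevel = {1, 5, 10};   % external input values to parameterize over
    end
    methods (TestClassSetup)
        function loadModel(testCase)
            load_system('GlobalSystem');
            testCase.addTeardown(@() close_system('GlobalSystem', 0));
        end
    end
    methods (Test)
        function verifyGlobalOutputs(testCase, inputLevel)
            % Build the external input and point the model at it
            assignin('base', 'Global_Input1', ...
                timeseries(inputLevel * ones(11, 1), (0:10)'));
            set_param('GlobalSystem', 'LoadExternalInput', 'on', ...
                'ExternalInput', 'Global_Input1');
            % Simulate and grab the logged outputs
            simOut = sim('GlobalSystem', 'SaveOutput', 'on', ...
                'OutputSaveName', 'yout', 'ReturnWorkspaceOutputs', 'on');
            yout = simOut.get('yout');   % shape depends on the output logging format
            % Compare the logged output against a baseline (placeholder expectation)
            testCase.verifyEqual(yout(end, 1), inputLevel, 'AbsTol', 1e-6);
        end
    end
end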
Does that help, or am I way off base here? If you can add more details, I can try again.

Related

How to extract an MSL model, modify the code, and use locally?

I am interested in replacing my own PID-regulator models with MSL/Blocks/Continuous/LimPID. The problem is that this model restricts the output-signal limits to be parameters and thus does not allow time-varying limits, which I need to have.
Studying the code, I understand that the output limitation is created by the block MSL/Blocks/Nonlinear/Limiter, and I just want to change this to the block VariableLimiter.
I can imagine that you need to ensure that the output limits vary on a time scale slower than the regulator, in order not to excite unwanted behaviour of the controller. Still, there is a class of problems where it would be very useful to allow these limits to vary slowly.
Thanks for the good input to my question; below is a very simple example to refine it. (The LimPID is more complicated and I will come back to that.)
Let us instead just modify the block Add to a local block in MyModel.
I copy the code from Modelica.Blocks.Math.Add and call it Addb in MyModel. Since there is a dependency on Interfaces.SI2SO, I need to add an import before the extends-clause. This import I take from the ordinary general MSL package, instead of also copying that into MyModel. Then I introduce a new parameter "bias" and modify the equation. The annotation may need some updating as well, but we do not bother with that now.
model MyModel
  ...
  block Addb "Output the sum of the two inputs"
    import Modelica.Blocks.Interfaces;
    extends Interfaces.SI2SO;
    parameter Real k1=+1 "Gain of input signal 1";
    parameter Real k2=+1 "Gain of input signal 2";
    parameter Real bias=0 "Bias term";
  equation
    y = k1*u1 + k2*u2 + bias;
    annotation (...);
  end Addb;
end MyModel;
This code seems to work.
My added new question is whether it is enough to look up extends-clauses and other references to MSL and add the proper imports, now that the code is local, or whether there are more aspects to think of. The LimPID code is rather complex, with procedures for initialization etc., so I just wonder if there is more to do than bringing in a number of import-clauses.
The models in the Modelica Standard Library (MSL) should only be seen as exemplary models, not covering all possible applications. MSL is write-protected, and it is not possible to replace the limiter block in LimPID (and add max/min input connectors). Also, it wouldn't work out if you shared your simulation model with others, expecting their MSL to work like your modified MSL.
Personally, I have my own libraries of components where MSL models are inadequate. For example, I have PID controllers with variable limits, manual/automatic functions and many more functions which are needed in my applications.
Often, I create a copy of an MSL model, place it in the same package in my own library and make the necessary modifications and additions, e.g. MyLibrary.Blocks.Continuous.PID.
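As a sketch of the kind of change involved when the fixed Limiter is swapped for a VariableLimiter (the limit1/limit2 connector names come from MSL; the rest of the block is a bare-bones placeholder rather than the real LimPID internals):

block VariableLimitDemo "Sketch: pass a signal through a limiter with time-varying limits"
  import Modelica.Blocks;
  extends Blocks.Interfaces.SISO;
  Blocks.Interfaces.RealInput uMax "Time-varying upper limit";
  Blocks.Interfaces.RealInput uMin "Time-varying lower limit";
  Blocks.Nonlinear.VariableLimiter limiter;
equation
  connect(u, limiter.u);
  connect(uMax, limiter.limit1);
  connect(uMin, limiter.limit2);
  connect(limiter.y, y);
end VariableLimitDemo;

In a local copy of LimPID the same idea applies: replace the Limiter instance, add the two limit inputs as connectors, and connect them to the variable limiter.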

Modelica Variable size connectors

I want to do a Monte Carlo simulation with a model. This model has some external functions linked to a library that reads its inputs from a file. I have recently been experimenting with ".mos" files to run the same model over and over. With external files it is easy to change them and re-simulate the model using the .mos file (even in parallel). Yet I was unable to change the internal parameters of the model in the .mos script. I have done many experiments and have been looking for the answer for some time now. I wonder what the correct syntax is for changing the parameters of the model inside a simple .mos file like the one below.
cd(<working directory>);
loadFile(<address of the package that also loads the OpenModelica>);
simulate(<model>, stopTime=<...>, numberOfIntervals=<...>);
quit();
For instance, let's say the model has a parameter param and a submodel sub with the parameter sub_param. How can I change param and sub_param inside the .mos file?
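One hedged option (the -override simulation flag and the setParameterValue scripting call are assumptions about the OpenModelica scripting interface and may differ between versions) is:

cd(<working directory>);
loadFile(<address of the package that also loads the OpenModelica>);
// Option 1: override parameter values at simulation time (dotted names reach submodels)
simulate(<model>, stopTime=<...>, numberOfIntervals=<...>,
         simflags="-override param=1.5,sub.sub_param=2.0");  // values are placeholders
// Option 2: change the value in the loaded model before simulating
setParameterValue(<model>, param, 1.5);
simulate(<model>, stopTime=<...>, numberOfIntervals=<...>);
quit();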

Initialize buildable Matlab Simulink Model with parameters from SQLite Database

Context: I have a huge Simulink model that is going to be used for automated simulations on Debian 10. Therefore it has to be built as standalone C code using MATLAB Coder. This code is then called to start the simulation.
What I need: I need to find a way to initialize my built model with ~500 parameters. These change with each simulation run and are stored in a SQLite file. The goal is to have parameters written to the database, then start the Model which reads the parameters from SQLite during initialization (presumably using the InitFcn Model Callback, although I'm open to alternatives).
What I have tried:
Direct SQL interface: I tried to use a direct MATLAB-SQL interface such as JDBC (since I don't have access to the Database Toolbox), but those are not supported for code generation.
Write a C function that reads the SQLite file, then call that function during initialization in the InitFcn callback using coder.ceval, like this:
data = 0;
err = coder.ceval('read_function',4, 2, 12, coder.wref(data));
parameter = data;
The problem here is that coder.wref is not supported in MATLAB (outside of code generation) and therefore doesn't work in the InitFcn. (Please correct me if I'm wrong.)
It only seems to work inside a MATLAB Function block; in the InitFcn I get:
Error evaluating 'InitFcn' callback of block_diagram 'Model'.
Caused by:
The coder.wref function is not supported in MATLAB.
So my problem with the second approach is that I can't call the C function during initialization.
Using a MATLAB Function block to read the parameters isn't really an option, since I would have to route all the signals out, which makes maintaining and further developing the model really hard. Also, I suspect that the model would not even run, because the parameters are needed to initialize the model.
Questions:
Is there a way to make one of the above approaches work? If yes, how? Where is my mistake?
Is there another (simpler) option to pass the data as an array or struct to my model?
Database looks like this:
Identifier                 Default
latitude                   52.5
longitude                  13.4
electricity_consumption    4000.0
ventilation_stream         50.0
PV_peak                    30.0
PV_orientation             0.0
no_vessels                 28.0
heatpump_exists            1.0
hotwater_consumption       1000.0
...
After having spent so much time on this issue, I would like to share my experience on this problem:
SQLite: This approach did not work out for me because the direct SQL-Matlab interfaces are not supported for code generation.
It is in fact possible to write a C function that reads from SQLite and to call that function in a MATLAB Function block via coder.ceval, which allows reading in a signal during simulation. This works for code generation (Simulink Coder) as well. However, this will not work for initialization (see question).
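For reference, a minimal sketch of such a MATLAB Function block body (the C function name, its signature and the header are assumptions carried over from the question, and the C source/header must be registered under the model's custom code settings):

function params = readParamsFromDb() %#codegen
% Sketch only: wraps a hand-written C function (assumed to be called
% 'read_function' and declared in 'read_function.h') that fills a double
% array from the SQLite file.
coder.cinclude('read_function.h');
params = zeros(10, 1);                 % pre-allocate the output signal
err = int32(0);
err = coder.ceval('read_function', coder.wref(params), int32(numel(params)));
if err ~= 0
    params(:) = -1;                    % crude error indication, placeholder
end
end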
So none of my original approaches ended up working.
Workaround: I ended up switching to an approach based on the Simulink RSim target, which generates code (also for Linux) and can be parameterized via a .mat file that contains all the parameters. The .mat file can be modified to update parameters. This required some additional code which automates this step. Also, the model configuration for RSim is a bit tricky.
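A rough sketch of that parametrization step (model and parameter names are placeholders, and it assumes the parameters are tunable; check rsimgetrtp/rsimsetrtpparam and the RSim -p flag in your release):

% Build the RSim executable once (system target file set to rsim.tlc)
rtwbuild('MyModel');                          % 'MyModel' is a placeholder name
% Get the run-time parameter structure and update values read from SQLite
rtp = rsimgetrtp('MyModel');
rtp = rsimsetrtpparam(rtp, 'latitude', 52.5);
rtp = rsimsetrtpparam(rtp, 'longitude', 13.4);
save('params.mat', 'rtp');
% On the Debian target, run the executable with the updated parameter file:
%   ./MyModel -p params.mat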

#karate How to pass parameter to a feature file in gatling simulation class?

Let's consider a scenario: we have to run a performance test for a "create an account" API which takes an auth token as a header/path param and input data such as the user account name. So for the above scenario, to run the performance test for POST http://baseUrl/auth_param/create/input_data, we have 2 feature files:
1. One feature file (e.g. generateAuth.feature) which will have the auth token
2. A second feature file (createAccount.feature) which takes the auth token and input data as parameters
Here is my simulation class,
class <MyClass> extends Simulation {

  before {
    println("Simulation is about to start!")
  }

  val generateAuthTest = scenario("generateAuth").exec(karateFeature("classpath:path/generateAuth.feature"))
  val createAccountTest = scenario("test").exec(karateFeature("classpath:path/createAccount.feature"))

  setUp(
    createAccountTest.inject(rampUsers(1) over (10 seconds))
  ).maxDuration(1 minutes)

  after {
    println("Simulation is finished!")
  }
}
Here, can I read the auth token from generateAuth.feature, which is the input for createAccount.feature, so that I can pass it as a parameter?
Please suggest how to pass parameters to createAccount.feature when calling it in the karateFeature method.
Let me put a requirement here.
Let's say we have some feature files for CRUD operations on particular data. For a functional scenario, I would:
create a new feature file to write the scenario, and
just use the CRUD files to test a SINGLE flow.
Now if I go for performance test cases on the individual operations, I feel there are 2 ways:
1. Create 4 new performance-test feature files (one for each CRUD method) and call the CRUD feature files in the respective test feature files. Finally, we just call the test feature files in the respective Gatling simulation classes. (In this case, I end up creating more test feature files as well as simulation classes for performance, which I want to avoid.)
2. Just call the CRUD files in the respective Gatling simulation classes and pass the required parameters to them. (In this case, we just need to create only 4 simulation classes and run them based on the operation: create, read, delete and so on.)
Here I just wanted to know about the 2nd way of performance testing: is it achievable in Karate or not, and if yes, please let me know how?
Summary: I think it is achievable using a 3rd (extra) feature file for each individual use case, but I do not want to make an extra feature file per case, so that I can avoid maintenance work and take advantage of re-using the existing feature files from the functional tests in the performance tests.
Just use the normal Karate concepts such as karate-config.js.
You can easily switch environments by setting the karate.env system property.
For example:
mvn test -DargLine="-Dkarate.env=e2e"
EDIT: after you edited your question, it is clear you have a SINGLE flow you want to test. Please use a SINGLE feature. I suggest you move the generateAuth call into the Background of that feature, as sketched below. Also refer to the docs on callSingle() for advanced options.
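For example, a minimal sketch of such a single feature (baseUrl comes from karate-config.js; the token variable, path segments and status code are assumptions):

Feature: create account - single flow

Background:
  # generateAuth.feature is assumed to define a variable named 'token';
  # callonce runs it only once per feature even with multiple scenarios
  * def auth = callonce read('classpath:path/generateAuth.feature')
  * def accountName = 'test-account'

Scenario: create account
  Given url baseUrl
  And path auth.token, 'create', accountName
  When method post
  Then status 200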
If you are expecting 2 feature files to magically share data that is not possible and not needed if you structure your tests correctly.
If you really really need this, please create a Java singleton and access it from each feature. Totally don't recommend this though.
EDIT: In Karate 0.9.0 onwards, you can call a single scenario within a feature if it has a tag:
classpath:animals/cats/create.feature#sometagname

How to programmatically configure the tunability of model parameters?

I'm porting a large Simulink model from Simulink R2010a → R2017b.
The main model is basically a glue-layer for many interwoven reference models. My objective is to generate a standalone executable out of this main model using Coder.
Parameter tunability in this context is not done via the Signals and Parameters section on the Optimization tab in the Model Configuration Parameters dialog (as is the case in stand-alone models), but rather, via constructing Simulink.Parameter objects in the base workspace, and referencing those in the respective referenced models, or in their respective model workspaces.
Now, AFAIK, in R2010a it was enough to set
new_parameter.RTWInfo.StorageClass = 'Auto';
new_parameter.RTWInfo.CustomStorageClass = 'Define';
to make the parameter non-tunable and convert it into a #define in the generated code. In R2017b, this is no longer allowed; the StorageClass must be 'Custom' if you set a non-empty CustomStorageClass:
new_parameter.CoderInfo.StorageClass = 'Custom'; % <- can't be 'Auto'
new_parameter.CoderInfo.CustomStorageClass = 'Define';
But apparently, this does not make the parameter non-tunable:
Warning: Parameter 'OutPortSampleTime' of '[...]/Rate Transition1' is non-tunable but refers to tunable variables (Simulation_compiletimeConstant (base workspace))
I can't find anything in the R2017b documentation on making parameters non-tunable programmatically; I can only find how to do it in stand-alone models via the dialog, but that's not what I want here.
Can anyone point me in the right direction?
NOTE: Back in the day, Simulink Coder was called Real-Time Workshop (well, Real-Time Workshop split into Coder and several other things), hence the difference between RTWInfo and CoderInfo. Note that RTWInfo still works in R2017b, but it issues a warning and gets converted into CoderInfo automatically.
In the generated code it should appear as a #define, the way you specified it.
https://www.mathworks.com/help/rtw/ug/choose-a-built-in-storage-class-for-controlling-data-representation-in-the-generated-code.html
By the way, yes, it's a bit confusing: in the MATLAB file you specify CustomStorageClass = 'Define', in the GUI you specify the storage class as Define (custom), but the documentation says Storage Class as Defined.
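For reference, the full definition being discussed would then look something like this (the value is a placeholder; the parameter name is taken from the warning in the question):

Simulation_compiletimeConstant = Simulink.Parameter;
Simulation_compiletimeConstant.Value = 1;                               % placeholder value
Simulation_compiletimeConstant.CoderInfo.StorageClass = 'Custom';
Simulation_compiletimeConstant.CoderInfo.CustomStorageClass = 'Define';
% which, per the linked documentation, should show up in the generated code as
%   #define Simulation_compiletimeConstant 1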
I am not sure why the warning about tunability shows up.