I hope to save some test data based on whether the test passes or fails. Right now I use a class member (property) to do that. Is there already a variable in the test framework for this purpose?
First of all, apologies if this is a stupid question. I'm new to unit tests so I'm struggling a bit here.
I'm working on an app that queries an API, receives a JSON response and then processes that response to produce a series of complex data structures. Many of these data structures are daily time series, which means each of my functions produces a list (List<Datapoint>) containing hundreds of datapoint objects.
What I'm trying to test is that, for a given API response, each function produces the output it should.
For the input of each test, I have already grabbed a real sample JSON response from the API, and I've stored it inside a test_data folder within my root test folder.
However, for the expect part... how can I obtain a sample output from my function and store it somewhere in my test_data folder?
It would be straightforward if the output of my function were a string, but in this case we're talking about a list with hundreds of custom objects containing different values inside them. The only way to create those objects is through the function itself.
I tried running the debugger to check the value of the output at runtime, which I can do... but that doesn't help me copy it or store it anywhere as code.
Should I try to print the full contents of the output to a string at runtime and store that string? I don't think this would work, as all I see in the console is a bunch of 'Instance of' strings when I call functionOutput.toString()... I would probably need to recursively print each of the variables inside those objects.
Please tell me I'm being stupid and there's a simpler way to do this :)
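(For what it's worth, one common approach to this is golden-file testing: serialize the output to JSON once, store it under test_data, and compare against it in later runs. A minimal sketch, assuming Dart, which the 'Instance of' output suggests, and a hypothetical Datapoint class with two fields:

import 'dart:convert';
import 'dart:io';

// Hypothetical Datapoint with a toJson method so the list can be serialized.
class Datapoint {
  final DateTime date;
  final double value;
  Datapoint(this.date, this.value);
  Map<String, dynamic> toJson() =>
      {'date': date.toIso8601String(), 'value': value};
}

// Run once to capture the current output as a golden file...
void saveGolden(List<Datapoint> output) {
  final json = const JsonEncoder.withIndent('  ')
      .convert(output.map((d) => d.toJson()).toList());
  File('test/test_data/expected_output.json').writeAsStringSync(json);
}

// ...then in each test run, compare the function's output to the stored file.
bool matchesGolden(List<Datapoint> output) {
  final expected =
      File('test/test_data/expected_output.json').readAsStringSync();
  final actual = const JsonEncoder.withIndent('  ')
      .convert(output.map((d) => d.toJson()).toList());
  return actual == expected;
}

The one-time saveGolden run needs a manual review of the file, of course, since it captures whatever the function currently produces.)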
Context: I have a huge Simulink model that is going to be used for automated simulations on Debian 10. Therefore it has to be built as standalone C code using MATLAB Coder. This code is then called to start the simulation.
What I need: I need to find a way to initialize my built model with ~500 parameters. These change with each simulation run and are stored in an SQLite file. The goal is to have the parameters written to the database, then start the model, which reads the parameters from SQLite during initialization (presumably using the InitFcn model callback, although I'm open to alternatives).
What I have tried:
Direct SQL interface: I tried to use a direct MATLAB-SQL interface such as JDBC (since I don't have access to the Database Toolbox), but those are not supported for code generation.
Write a C function that reads the SQLite file, then call that function during initialization in the InitFcn callback using coder.ceval, like this:
data = 0;  % preallocate the output so coder.wref has a target to write into
err = coder.ceval('read_function', 4, 2, 12, coder.wref(data));  % call the external C read function
parameter = data;
The problem here is that coder.wref is not supported in plain MATLAB execution and therefore doesn't work in the InitFcn (please correct me if I'm wrong).
This only seems to work inside a MATLAB Function block:
Error evaluating 'InitFcn' callback of block_diagram 'Model'.
Caused by:
The coder.wref function is not supported in MATLAB.
So my problem with the second approach is that I can't call the C function during initialization.
Using a MATLAB Function block to read the parameters isn't really an option, since I would have to route all the signals out, which makes maintaining and further developing the model really hard. Also, my suspicion is that the model would not even run, because the parameters are needed to initialize the model.
Questions:
Is there a way to make one of the above approaches work? If yes, how? Where is my mistake?
Is there another (simpler) option to pass the data as an array or struct to my model?
The database looks like this:
Identifier               Default
latitude                 52.5
longitude                13.4
electricity_consumption  4000.0
ventilation_stream       50.0
PV_peak                  30.0
PV_orientation           0.0
no_vessels               28.0
heatpump_exists          1.0
hotwater_consumption     1000.0
...
After having spent so much time on this issue, I would like to share my experience with this problem:
SQLite: This approach did not work out for me, because the direct SQL-MATLAB interfaces are not supported for code generation.
It is in fact possible to write a C function that reads from SQLite and to call that function in a MATLAB Function block via coder.ceval, which allows reading in a signal during simulation. This works for code generation (Simulink Coder) as well. However, this will not work for initialization (see question).
So none of my original approaches ended up working.
Workaround: I ended up switching to an approach based on the Simulink RSIM target, which generates code (also for Linux) and can be parameterized via a .mat file that contains all the parameters. The .mat file can be modified to update the parameters. This required some additional code to automate that step. Also, the model configuration for RSIM is a bit tricky.
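For anyone going the same route, here is a minimal sketch of the parameter-update step using the documented Simulink Coder RSIM utilities rsimgetrtp and rsimsetrtpparam (the model name, parameter index, and values are placeholders; the actual SQLite read happens outside the model):

% Build the RSIM parameter structure from the model (run in MATLAB)
rtP = rsimgetrtp('Model');

% Overwrite the values of one tunable parameter entry with values
% read from the database (index and values are illustrative)
rtP = rsimsetrtpparam(rtP, 1, 'values', [52.5 13.4]);

% Save the structure; the standalone RSIM executable picks it up via:
%   ./Model -p params.mat -o results.mat
save('params.mat', 'rtP');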
Let's consider a scenario: we have to run a performance test for a "create an account" API, which takes an auth token as a header/path parameter and input data such as the user account name. So for the above scenario, to run a performance test for POST http://baseUrl/auth_param/create/input_data, we have 2 feature files:
1. One feature file (e.g. generateAuth.feature) which will have the auth token.
2. A second feature file (createAccount.feature) which takes the auth token and the input data as parameters.
Here is my simulation class,
import com.intuit.karate.gatling.PreDef._
import io.gatling.core.Predef._
import scala.concurrent.duration._

class MyClass extends Simulation { // class name is a placeholder

  before {
    println("Simulation is about to start!")
  }

  val generateAuthTest = scenario("generateAuth")
    .exec(karateFeature("classpath:path/generateAuth.feature"))

  val createAccountTest = scenario("test")
    .exec(karateFeature("classpath:path/createAccount.feature"))

  setUp(
    createAccountTest.inject(rampUsers(1) over (10 seconds))
  ).maxDuration(1 minutes)

  after {
    println("Simulation is finished!")
  }
}
Here, can I read the auth token from the generateAuth.feature file, which is an input for the createAccount.feature file, so that I can pass it as a parameter?
Please suggest how to pass parameters to createAccount.feature when calling it via the karateFeature method.
Let me put a requirement here:
let's say we have some feature files for CRUD operations on particular data. Here is how I would write a functional scenario:
I will create a new feature file to write a scenario and just use the CRUD files to test a SINGLE flow.
Now if I go for performance test cases on each individual operation, I feel there are 2 ways:
1. Create 4 new performance test feature files (one for each CRUD method) and call these CRUD feature files in the respective test feature files. Finally, we just call the test feature files in the respective Gatling simulation classes. (In this case, I will end up creating more test feature files as well as more simulation classes for performance, which I want to avoid.)
2. Just call the CRUD feature files in the respective Gatling simulation classes and pass the required parameters to them. (In this case, we just need to create only 4 simulation classes and run them on the basis of the operation, like create, read, delete and so on.)
Here I just wanted to know about the 2nd way of doing the performance test: is it achievable or not in Karate, and if yes, please let me know how?
Summary: I think it's achievable using a 3rd (extra) feature file for each individual use case, but I do not want to make an extra feature file per case, so that I can avoid maintenance work and can take advantage of reusing the existing feature files from the functional tests in the performance tests.
Just use the normal Karate concepts such as karate-config.js.
You can easily switch environments by setting the karate.env system property.
For example:
mvn test -DargLine="-Dkarate.env=e2e"
EDIT: After you edited your question, it is clear you have a SINGLE flow you want to test. Please use a SINGLE feature. I suggest you move the generateAuth call into the Background of the feature. Also refer to the docs on callSingle() for advanced options.
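For example, a minimal sketch of what createAccount.feature could look like (this assumes generateAuth.feature defines a variable named authToken; the names and URL shape are illustrative):

Feature: create account

Background:
  # callonce runs the auth feature only once, even across multiple scenarios
  * def auth = callonce read('classpath:path/generateAuth.feature')
  * header Authorization = auth.authToken

Scenario: create an account
  Given url 'http://baseUrl/' + auth.authToken + '/create/input_data'
  When method post
  Then status 200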
If you are expecting 2 feature files to magically share data, that is not possible, and it is not needed if you structure your tests correctly.
If you really really need this, please create a Java singleton and access it from each feature. Totally don't recommend this though.
EDIT: In Karate 0.9.0 onwards, you can call a single scenario within a feature if it has a tag:
classpath:animals/cats/create.feature#sometagname
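For example, from another feature file (the path and tag are the ones above; the result variable name is illustrative):

* def result = call read('classpath:animals/cats/create.feature#sometagname')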
Hi, so I'm completely new to ScalaCheck.
I'm building an obfuscator, and I want to check that the obfuscated code I generate is correct. My function changes a while loop into a switch, so is there some way I can check that the structure of the switch is how I wanted it to be made?
For all valid inputs, check that the obfuscated function produces the same output as the original function. For arbitrary functions, you'll have to write your own generator.
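A minimal sketch of such a property (original and obfuscated are stand-ins for the two versions of the code under test):

import org.scalacheck.Prop.forAll
import org.scalacheck.Properties

object ObfuscationSpec extends Properties("Obfuscation") {

  // Stand-in for the original while-loop implementation.
  def original(n: Int): Int = {
    var i = 0
    var acc = 0
    while (i < math.abs(n % 100)) { acc += i; i += 1 }
    acc
  }

  // Stand-in for the switch-based version your obfuscator generates.
  def obfuscated(n: Int): Int = original(n) // replace with the generated code

  // The property: both versions agree on every generated input.
  property("obfuscation preserves behavior") = forAll { (n: Int) =>
    obfuscated(n) == original(n)
  }
}

Note this checks behavioral equivalence, not the structure of the generated switch; for structural checks you would assert on your AST instead, but the behavioral property is usually what matters for an obfuscator.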
Say I've got units 1, 2, 3, 4 (either as model references or subsystems) for which I've got unit tests ready using the matlab.unittest.TestCase framework.
What could be the easiest way to write an integration test for the entire system?
I need some way to set Global_Inputx (x = 1, 2, 3) and verify Global_Outy (y = 1, 2) in the easiest possible way (maybe utilizing the unit tests)?
I can use MATLAB R2014a.
PS: I've already gone through this but it didn't help.
I think the question of integration testing in Simulink is a complex one that may involve formal methods like code and coverage analysis of the dynamic system under test, automatic test generation, etc. If you haven't already, you may want to check out the "Verification, Validation and Test" section of the MathWorks product lineup: http://www.mathworks.com/products/?s_tid=gn_ps.
However, to answer your specific question of how you would set the global input and verify the global output in your test:
Depending on where the global input data that feeds the inport blocks resides (MATLAB base workspace, model workspace, etc.), you could set the external input of the model to that data. For example:
set_param(modelName, 'ExternalInput', inputVariableName)  % placeholders: the model name and the name of the input data variable
This could be defined in your test class setup, test method setup, or in the test itself, depending on when the data is available and where defining it is appropriate. The data could also be passed directly to the sim() command. Parameterized testing (http://www.mathworks.com/help/matlab/matlab_prog/create-basic-parameterized-test.html) is an option to consider if you want to test the system with different sets of inputs; the external input value becomes the parameter in this context.
If you have your model set up for output logging, then once the simulation is done, you can get the logged outputs and compare them against a baseline.
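Putting that together, a minimal sketch of such an integration test (the model name 'mySystem', the input signal, and the baseline are all placeholders; it assumes the model logs its output as yout):

classdef SystemIntegrationTest < matlab.unittest.TestCase
    methods (Test)
        function verifyGlobalOutputs(testCase)
            t = (0:0.1:10)';                     % simulation time vector
            globalInput = [t, ones(size(t))];    % [t, u] external input data

            % make the input visible to the model and wire it to the inports
            assignin('base', 'globalInput', globalInput);
            load_system('mySystem');
            set_param('mySystem', 'LoadExternalInput', 'on', ...
                      'ExternalInput', 'globalInput');

            % run the simulation and grab the logged output
            simOut = sim('mySystem', 'SaveOutput', 'on', ...
                         'ReturnWorkspaceOutputs', 'on');
            yout = simOut.get('yout');

            % compare against a stored baseline (illustrative expected data)
            baseline = ones(size(yout));
            testCase.verifyEqual(yout, baseline, 'AbsTol', 1e-6);
        end
    end
end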
Does that help? Or am I way off base here? If you can add more details, I can try again.