dReal SMT solver counterexamples

Does the dReal SMT solver return counterexamples? I have seen examples where :produce-models is true, but I do not know how to generate counterexamples. Also, the dReach tool has a --visualize option, so it would seem that dReal would need to produce some model information. However, when I run it on .smt2 files, I can't seem to find a way to view counterexamples.

OK, it turns out to be trivial :). dReal does not follow the usual SMT-LIB convention of using (get-model); instead, you can get models with the command-line option --model.
E.g.: dReal --model microwave1.smt2
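If you need to run this over many .smt2 benchmarks, a small scripting sketch (assuming the dReal binary is on your PATH; the file name is just the one from the example above) could look like this:

import subprocess

# Ask dReal for a model (counterexample) along with the sat/unsat verdict.
result = subprocess.run(
    ["dReal", "--model", "microwave1.smt2"],
    capture_output=True,
    text=True,
)
print(result.stdout)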

Related

Which tool is best for converting clauses to CNF (or, even better, DIMACS CNF)?

I am automatically generating clauses like this with a C++ program:
((((((condition1#0 and not action1#0 and not action2#0 and TRUE) and (action1#1 and not action2#1 and not condition1#1 and TRUE) and TRUE)) or (not action1#0 and not action2#0 and not condition1#0 and action2#1 and not action1#1 and not condition1#1 and TRUE) or FALSE)))
I then need to check their satisfiability with some tool (something like MiniSat), but before feeding them to such a tool I need to convert them to DIMACS CNF. Do you know of any tool that can do this automatically for me?
Thank you!
Edit:
A non-CNF SAT solver would also work fine!
In pysmt there is a parser called the human-readable parser (HRParser) that should be able to parse those expressions [1].
pysmt is integrated with both PicoSAT and the CUDD BDD package, so you can check for satisfiability easily. In general, SMT solvers are more flexible about the structure of the input (non-CNF). If you can change the output of the C++ code, it might be easy enough to create an SMT-LIB file and use an SMT solver (e.g. Yices, Z3, or CVC4).
Note: I am one of the developers of pysmt
[1] http://pysmt.readthedocs.io/en/latest/_modules/pysmt/parsing.html#HRParser
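As a rough sketch of the suggested pysmt route (this builds a fragment of the formula above directly with pysmt's shortcuts API instead of going through the HR parser; the variable names and structure are only illustrative):

from pysmt.shortcuts import Symbol, And, Or, Not, TRUE, is_sat

# Boolean variables corresponding to e.g. condition1#0, action1#0, action2#0
c1_0 = Symbol("condition1#0")
a1_0 = Symbol("action1#0")
a2_0 = Symbol("action2#0")

# A small piece of the generated clause structure, kept in its natural
# non-CNF shape; no DIMACS conversion is needed.
formula = Or(
    And(c1_0, Not(a1_0), Not(a2_0), TRUE()),
    And(Not(a1_0), Not(a2_0), Not(c1_0)),
)

print(is_sat(formula))  # True if some assignment satisfies the formula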

Removing annotations from a Modelica model

I'm developing a Modelica library and need to produce a document with source code listings. I'd like to be able to include the source of the Modelica models without annotations.
I could manually edit them out, but I'm looking for a more automated strategy. I'm guessing the most convenient and straightforward approach is to use some tool to save .mo files with no annotations and include those in my document (I'm using \lstinputlisting in LaTeX).
Is it possible to do this? I have access to Dymola, OpenModelica and JModelica. Dymola is obviously capable of producing such a listing, as it's able to include it in the automatically generated documentation (File > Export > HTML...). I've been looking into scripting with Dymola and OpenModelica, but haven't found a way to do this either.
JModelica seems like it could be a good option, but I don't have experience working with Python. If this is possible and someone gives me some pointers, I'm willing to look into it myself. I found a mention of a prettyprint function that might do the job, but I'm not sure where to start. I can't even find a reference to that function in the latest documentation.
It would also be more convenient for me to find a way of doing it with Dymola/OpenModelica (whether through the UI or by using a script). Have I missed something?
I think you could use saveTotalModel("total.mo", MyModelName) in OpenModelica. This will strip most annotations (not the ones used for code generation, if I remember correctly) and pretty-print the source code, including all dependencies. Then you just copy-paste the models/packages that you want to include in the listing. Or, if you prefer, you can do something like the following to include only the code of a particular model:
loadModel(Modelica);
loadFile("MyModel.mo");
// Write out the pretty-printed total model with annotations stripped.
saveTotalModel("total.mo", MyModel.A.B);
clear();
// Reload the stripped version and list only the class of interest.
loadFile("total.mo");
str := list(MyModel.A.B);
writeFile("MyModel.A.B.listing", str);

Running Memnet model using caffe

I am trying to use the MNIST dataset, which is provided as an example on GitHub, to run the Memnet model from the command-line interface.
The model is downloaded from here
I modified its deploy.prototxt accordingly, but it keeps telling me something is wrong, as shown in the attached screenshot (not reproduced here). I have no idea what is going on. Can someone help me with this?
The command-line interface expects a solver.prototxt file as its -solver argument (the solver file references train_val.prototxt as one of its parameters). You cannot supply train_val.prototxt directly to caffe train.
You can look in the examples subfolder of caffe for sample solver.prototxt files. A simple one can be found in examples/mnist; check out lenet_solver.prototxt.
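If you would rather drive the training from Python than from the caffe binary, a minimal sketch using pycaffe (assuming caffe's Python bindings are installed and that the path below points at your local copy of the MNIST example) would be:

import caffe

# CPU mode; switch to caffe.set_mode_gpu() if you have a GPU build.
caffe.set_mode_cpu()

# The solver prototxt references the train/test net definition and holds
# the optimisation settings (learning rate, number of iterations, ...).
solver = caffe.SGDSolver("examples/mnist/lenet_solver.prototxt")

# Run the full training loop described by the solver file.
solver.solve()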

training a new model using pascal kit

I need some help with this.
Currently I am doing a project on computer vision that requires me to train a new model to detect a certain object.
In this case, I am using the system provided by P. Felzenszwalb, D. McAllester, D. Ramanan and their team: Discriminatively trained deformable part models, which is implemented in MATLAB.
Project webpage: http://www.cs.uchicago.edu/~pff/latent/.
However, I have no idea how to direct the system to use my dataset (a collection of images and annotations), which is different from the PASCAL datasets, so as to train a new model.
By directing, I meant a line of code that allows me to change the dataset the system reads from, for training a model.
E.g.
% directory for caching models, intermediate data, and results
cachedir = ['/var/tmp/rbg/YOURPATH/' VOCyear '/'];
I tried looking at their README and documentation guides, but they do not mention this. Do correct me if I am wrong.
Let me know if I have not made my problem clear enough.
I tried looking at some files such as global.m, but with no luck.
Your help is much appreciated and thanks in advance!
You can try reading pascal.m in the DPM package (voc-release5); there is similar code there for working with the VOC2007/2010 datasets.
There are plenty of parts that need to be adapted to achieve this. For example, voc_config has to be adapted in order to read from your files.
The same goes for the pascal_train.m function. Depending on the images and the way you parse them, adapting this function may take quite some time.
Other functions to consider:
imreadx
pascal_test
pascaleval

Accessing variable by string name

I need to load experimental data into ScicosLab, a (pretty badly designed) clone of Scilab which happens to support graphical modeling. The documentation on the web is pretty poor, but it's reasonably similar to Scilab and Octave.
The data I need to process is contained in a number of text files: Data_005, Data_010, …, Data_100. Each of them can be loaded using the -ascii flag of the loadmatfile command.
The problem comes from the fact that loadmatfile("foo", "-ascii") loads the file foo into a variable named foo. In order to loop over the data files, I would need to do something like:
for i = [5:5:100]
  name = sprintf("Data_%03d", i);  // e.g. "Data_005", ..., "Data_100"
  loadmatfile(name, "-ascii");
  x = read_var_from_name(name);
  do_something(x);
end
where what I am looking for is a builtin read_var_from_name which would allow me to access the internal symbol table by string.
Do you know if such a function exists?
Notes:
There's no way of overriding this behavior if your file is in ascii format;
In this phase I could also use octave (no graphical modelling is involved), although it behaves in the same way.
>> foo = 3.14; name = 'foo'; eval(name)
foo =
3.1400
The above works in MATLAB, and Scilab's documentation says it also has an eval function. I'm not sure I understood you correctly, though.
@arne.b has a good answer.
In your case you can also do this in MATLAB:
a=load('filename.mat')
x=a.('variable_name')
Let's go through your points one by one:
"ScicosLab, a (pretty badly designed) clone of Scilab": in my opinion this is an inaccurate way of introducing the software. ScicosLab is not a clone of Scilab but a fork of it. The team behind ScicosLab (INRIA) are the ones who made scicos (now called xcos in the Scilab development line). At some point (from Scilab v4) the Scilab team decided to move away from Tcl/Tk towards Java, but the ScicosLab/scicos team departed, keeping the language (Tcl) and its GUI toolkit (Tk). To give the ScicosLab community some credit, documentation and support are not very good for the whole Scilab family in general. :) (more about Scilab and the forks here)
Regarding the technical question, I'm not sure what you are trying to achieve here, but Scilab/ScicosLab still has the eval function, which basically does what you want. However, this function is to be deprecated in favor of evstr. There is also the execstr function, which is worth studying.
loadmatfile, as far as I have understood, "tries" to load the variables defined in a MATLAB .mat file (MATLAB's proprietary binary format) into the Scilab workspace. For example, if there is a variable foo, it will "try" to create the variable foo and load its value from the MATLAB file. Check this example. I would create a variable x(i) = foo in the for loop. Again, your question is not completely clear.
As a side note, maybe you could consider exporting your data as CSV instead of .mat files.