Differences between executable files generated by Dymola and OpenModelica

I am considering using the executable file generated by either Dymola (dymosim.exe) or OpenModelica (model_name.exe) to run parametric simulations on the same model.
I was wondering: is there any difference between the two .exe files and their related input files (dsin.txt for Dymola, model_name_init.xml for OpenModelica)?
Regarding file sizes, I can see that the Dymola files are smaller. But I was also wondering about speed of execution and the flexibility of the input files for scripting.
Lastly, since Dymola is commercial software, is the dymosim.exe file publicly shareable?

I will write this for OpenModelica; the Dymola people can add their own.
I would suggest using FMUs instead of executables, together with a (co-)simulation framework such as OMSimulator (via Python scripting), PyFMI, or others. See an example here:
https://www.openmodelica.org/doc/OMSimulator/master/html/OMSimulatorPython.html#example-pi
Note that resources such as tables are packed inside the FMU if you reference them with Modelica URIs (modelica://LibraryName/Resource/blah). With the generated executables you would instead need to ship the resources alongside the .exe, and they would need to be in a specific directory on the other machine. You would also need to ship any dependent DLLs for the executables; for FMUs that is mostly not needed, since they are statically compiled (the exception being external DLLs that you call from your model).
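For illustration, here is a minimal parametric-simulation sketch using PyFMI; the FMU name Model.fmu and the parameter name k are assumptions for illustration, not taken from your model:

from pyfmi import load_fmu

# Load the FMU once, then sweep one parameter over several runs.
model = load_fmu("Model.fmu")      # hypothetical FMU exported from either tool
for k in [0.5, 1.0, 2.0]:
    model.reset()                  # fresh instance state for each run
    model.set("k", k)              # hypothetical parameter name
    res = model.simulate(final_time=1.0)
    print(k, res["time"][-1])      # trajectories are accessed by variable name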
Simulation speed depends on the model; sometimes one tool or the other is faster.
To see which libraries are supported by OpenModelica, you can check the library coverage:
https://libraries.openmodelica.org/branches/overview-combined.html
If you still want to use executables, here is a list of command line parameters for them: https://www.openmodelica.org/doc/OpenModelicaUsersGuide/latest/simulationflags.html
How to do parameter sweeps via executables (a small sketch follows the link):
https://openmodelica.org/doc/OpenModelicaUsersGuide/latest/scripting_api.html#simulation-parameter-sweep
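For illustration, a minimal sweep driven from Python; the parameter name k is made up, and the exact flag spelling should be checked against the simulation-flags page above:

import subprocess

# One run per parameter value: -override changes a start value,
# -r names the result file.
for i, k in enumerate([0.5, 1.0, 2.0]):
    subprocess.run(
        ["./model_name", "-override=k=%g" % k, "-r=sweep_%d.mat" % i],
        check=True)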

For Dymola:
If you have the appropriate binary export license you can generate a dymosim.exe that can be distributed.
Parameter sweeps can be run inside Dymola (the scripts are generated automatically), or from Python etc.
However, running a parameter sweep in that way uses not only dsin.txt but also some additional files. There are two reasons for this:
Reduced overhead of starting/stopping dymosim.exe, especially for small models.
Automatic parallelization.
That part of dymosim is currently not well documented in the manual, but you can run:
dymosim -M    by default, sweeps based on two CSV files (multIn.csv, multOutHeader.csv), generating a third (multOut.csv)
dymosim -M -1 mIn.csv -2 mOutH.csv -3 mOut.csv    if you want different file names
dymosim -M -n 45    to generate normal trajectory files (dsres45.mat, dsres46.mat, ...)
dymosim -h    for help
dymosim -s    normal simulation
And if you are really bold, you can pipe to/from dymosim.exe for parameter sweeps.
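For illustration, a sketch of driving the -M mode from Python; the CSV layout shown is an assumption, so run Dymola once and inspect the generated files to confirm the exact format:

import csv
import subprocess

# Write one sweep case per row of multIn.csv (assumed layout: a header
# row with parameter names, then one row of values per run).
with open("multIn.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["k"])              # hypothetical parameter name
    for k in [0.5, 1.0, 2.0]:
        w.writerow([k])

# multOutHeader.csv (the output variables to collect) must also exist;
# dymosim -M then reads both files and writes multOut.csv.
subprocess.run(["dymosim", "-M"], check=True)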
Another possibility is to use FMUs instead.

Related

Matlab `mcc`: All m-files to include when compiling executable?

I have a Matlab script go.m that creates custom objects and runs
a suite of simulations. There is interest in porting it to
a different network where the Matlab licenses are few. Our
strategy is to compile the script into a stand-alone *.exe
so that it can run without using up licenses. Once I learn
the ropes, the Matlab Compiler Runtime will be installed
on the target system.
I managed to use command-line mcc to compile the TMW online example,
magicsquare.
Using cygwin's bash:
$ cd ~/bin
$ ln -s "/c/Program Files/MATLAB/Single_R2015b/bin/mcc.bat" mcc
$ cd ~/tmp/magicSqr
$ mcc -m magicsquare.m
# startup.m messages indicate that this launches Matlab
$ ./magicsquare.exe 5
Running C:\cygwin64\tmp\User.Name\mcrCache9.0\magics1\Users\User.Name\Documents\MATLAB\startup
m =
    17    24     1     8    15
    23     5     7    14    16
     4     6    13    20    22
    10    12    19    21     3
    11    18    25     2     9
Both the directory specification . and the file extension .exe
are needed.
My next step was to push the full-blown go.m through the
process, see what breaks, and find the least onerous way to deal with
it. By least onerous, I mean a strategy that requires the fewest
code modifications, so that I'm not maintaining separate code bases
for development versus porting to the destination.
The mcc compilation worked: mcc -m go.m. Running the *.exe
file, however, led to breakage at the very first executable statement:
profile off. As I said, tactically recoding on an individual basis
is very unpalatable, so I searched for a way to identify all the files
to include when running mcc. Two promising leads were inmem and
requiredFilesAndProducts.
However, the documentation also warns:
Do not use the Dependency Report to determine which MATLAB code
files someone else needs to run a particular file. Instead use the
matlab.codetools.requiredFilesAndProducts function.
It appears that the Dependency Report to be avoided refers to the
m-files output from inmem. This is corroborated by examination of
said m-files -- the list is extremely long, and includes functions
that befuddle even Matlab's which command:
>> which matricize
'matricize' not found.
The only other candidate for identifying m-files to include is the
fList output from requiredFilesAndProducts. It seems to include
all the methods for my custom classes, as well as all invoked m-files
residing in c:\Users\User.Name\Documents\MATLAB\ (the only
custom folder in my path). However, it certainly does not cover the
profile command that underlies the aforementioned error.
What is the best way to identify all the m-files and/or folders
thereof for mcc? Is it reasonable to then treat any remaining
error-causing statements using conditional execution, e.g., if
~isdeployed; <...problematic statements...>; end?
You may refer to the list on the documentation page below for information on functions that are not supported for compilation with the MATLAB Compiler and MATLAB Compiler SDK products:
https://www.mathworks.com/help/compiler/unsupported-functions.html
The page below also shows MATLAB Compiler compatibility for each individual toolbox:
https://www.mathworks.com/products/compiler/supported/compiler_support.html

Where do the "virtual/..." terms come from?

In BitBake I can build e.g. the Linux kernel with bitbake virtual/kernel or U-Boot with bitbake virtual/bootloader.
Where do those "virtual/..." terms come from?
I used find to search for patterns such as "virtual/kernel" in the poky directory, but there are nearly endless results and I don't know where to look.
Can I, for example, point virtual/bootloader to a custom recipe if I have programmed my own bootloader?
From the BitBake user manual:
As an example of adding an extra provider, suppose a recipe named
foo_1.0.bb contained the following:
PROVIDES += "virtual/bar_1.0"
The recipe now provides both "foo_1.0" and "virtual/bar_1.0". The "virtual/" namespace is often used to denote
cases where multiple providers are expected with the user choosing
between them. Kernels and toolchain components are common cases of
this in OpenEmbedded.
Sometimes a target might have multiple providers. A common example is
"virtual/kernel", which is provided by each kernel recipe. Each
machine often selects the best kernel provider by using a line similar
to the following in the machine configuration file:
PREFERRED_PROVIDER_virtual/kernel = "linux-yocto"
Go to your meta-layer/conf/machine/ -- there you can find the machine configuration files with such macros.
In your-meta-layer/recipes-bsp/barebox (or u-boot) you can find the bootloader recipes (.bb files).
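To answer the last part of the question: yes, you can point virtual/bootloader at your own recipe. A minimal sketch, with made-up recipe and machine names:

# my-bootloader_1.0.bb -- your custom bootloader recipe
PROVIDES += "virtual/bootloader"

# conf/machine/my-machine.conf -- select your recipe for this machine
PREFERRED_PROVIDER_virtual/bootloader = "my-bootloader"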

msysgit large installation size

I installed (extracted) msysgit portable (PortableGit-1.9.5-preview20150319.7z)
The compressed archive is 23 MB, but once extracted the contents take up 262 MB. This is mostly due to the git command binaries (under 'libexec\git-core'). Almost all of the binaries are identical; they just have different names.
Why did the developers build the project like this? I suppose they need an executable for each command to support the CLI on Windows cmd.exe.
But isn't there a way to avoid having ~100 identical binaries, each 1.5 MB in size (e.g. using batch files)?
Under Unix-like OSes, you can have symbolic links to another file that behave exactly like the original file; if you do that for your executable, the executable can look at argv[0] to find out under which name it was called. That's a very common trick for many programs.
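For illustration, the argv[0] trick in a few lines of Python (not git's actual code; the tool names are made up):

import os
import sys

# One program installed under many names; behaviour follows the name
# it was invoked under.
tool = os.path.basename(sys.argv[0])
if tool == "frobnicate":
    print("acting as frobnicate")
elif tool == "defrobnicate":
    print("acting as defrobnicate")
else:
    print("usage: invoke me through one of my link names")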
Under Windows, and especially without installers, it's (to my knowledge) impossible to get the same behaviour out of your file system -- there's just no symbolic link equivalent. Especially when you consider that programs that are meant to run from USB drives have to cope with ancient filesystems such as FAT32!
Thus, the executables were simply copied over. Same functionality, more storage. However, on a modern machine that you'd run Windows on, you really don't care about 200 MB, give or take, for a tool as versatile as git.
In conclusion: the developers had no choice here; since Windows (though it has some POSIX abstraction layer) has no proper filesystem support for symbolic links, this was the only good way to port this Unix-originating program. You either shouldn't care, or you should use an OS that behaves better in that respect. I'd recommend the latter, but OS choices often aren't made freely...

Linking MATLAB to a DLL library

I am trying to execute some example code from a MATLAB toolkit, 'oscmex'. This toolkit allows communication using the OSC protocol from MATLAB. I presume this question is non-specific; it should apply to any toolkit that is set up in the manner that this one is.
Reasons aside, I'm having some simple trouble getting the toolkit up and running. The toolkit comes with no documentation whatsoever; just a set of six DLL files (in one directory), and a set of four MATLAB '.m' example code files (in another directory). Every toolkit I've used in the past has either been a built-in kit or has had an intuitive (semi-automated) install procedure.
After downloading the toolkit, the first thing I tried was to simply run one of the '.m' example files. This failed, as the first line of the code calls the function osc(), which is not recognised by MATLAB.
So I figured maybe I needed to move the '.m' files into the same folder as the DLLs; perhaps MATLAB would see the functions inside the DLLs. No dice.
So, I realised that I have to somehow link MATLAB to the DLLs on startup. I tried adding the DLLs to a folder and adding an entry for it in the 'pathdef.m' file. This also failed.
I've read somewhere I can load a DLL file by using the loadlibrary() function. So, I tried doing this for the DLL files. This failed on the first file:
>> loadlibrary('osc_free_address.dll')
Error using loadlibrary>lFullPath (line 587)
Could not find file osc_free_address.h.
I'm starting to run out of options... How can I get this set of DLLs up and running?
Browsing this library's web page, it would seem these DLLs are just an old form of MEX files.
Therefore, they should not be used as a shared library (e.g., via loadlibrary and calllib), but rather compiled directly into MEX files.
To do so, I would suggest the following steps:
Make sure you have a working MEX compiler configured for your MATLAB.
In MATLAB, type:
>> mex -setup
This will guide you through the configuration process. I understand that you are working on a Windows machine; I usually work with the Visual Studio compiler - it works best for me.
This library's README file suggests that OSC
requires liblo 0.22 or later. See http://plugin.org.uk/liblo/
Make sure you have this library and that it is located in your LD_LIBRARY_PATH (see e.g. this question for details, or the mex docs).
Get the source code for OSC library from their repository.
Compile the sources in matlab using
>> mex -O -largeArrayDims osc_free_address.c
>> mex -O -largeArrayDims osc_free_server.c
and so on for all 7 C source files. After mex-ing the C files you'll have MEX files that you can run from MATLAB as if they were regular functions.
You may find it useful to use the library's make file, as suggested by Andrew Mao.
Good luck,
If you look at the build for that software, it is compiling mex files, not DLLs (shared libraries): http://sourceforge.net/p/oscmex/code/4/tree/trunk/src/osc_make.m.
I would try using the mex commands instead of the DLL commands (perhaps the files are just misnamed). Even better, compile the files yourself with mex, using the build file in the source.
Note that the instructions also say that you need liblo-0.22 in order to run the library, so make sure you have that accessible as well.
I took a look at your OSC toolkit. It seems the files have been compiled by MATLAB mex, but it is not mentioned for which architecture they were built. You can type mexext at the MATLAB command prompt to find the MEX file extension for your platform, then change the DLL extensions to that extension. If the original MEX files are compatible with your MATLAB, they can then be accessed easily; just make sure to add the folder to your MATLAB path.
Try changing the extension from .dll to .mexw32 (on win32) or .mexw64 (on win64). It's a long shot, but it might work.
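If you want to rename them in bulk, here is a small sketch in Python; the target extension .mexw64 is an assumption, so use whatever mexext reports on your machine:

import glob
import os

# Rename every toolkit DLL to the platform's MEX extension.
for path in glob.glob("*.dll"):
    os.rename(path, path[:-len(".dll")] + ".mexw64")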
Shared libraries cannot be used directly. As you mentioned, you need to load them into MATLAB using loadlibrary. According to the documentation, loadlibrary takes (at least) two arguments: the first is the name of the file, and the second is the header file which contains the definitions of the functions and external variables. If you do not provide the header file, MATLAB looks for a header file with the same name as the DLL. That said, you need access to the header file, or at least, if you know what the functions look like, you need to write a header for the DLL yourself.
I have worked with DLLs in MATLAB. MATLAB is not very user-friendly where DLLs are concerned; especially if the DLL is written in a language other than C (or C++), you will have trouble loading the functions into MATLAB.
Besides, MATLAB only supports certain specific DLLs. Depending on your version of MATLAB, you need to find out whether or not the shared library is supported; have a look at the loadlibrary documentation.
In a nutshell, it is not easy to load a DLL into MATLAB; you need some information about the DLL.

Automating Solaris custom software deployment and configuration for multiple nodes

Essentially, the question I'd like to ask is related to the automation of software package deployments on Solaris 10.
Specifically, I have a set of software components in tar files that run as daemon processes after being extracted and configured in the host environment. Pretty much like any server-side software package out there, I need to ensure that a list of prerequisites is met before extracting and running the software. For example:
Checking that certain users exist, and that they are associated with one or many user groups. If not, then create them and their group associations.
Checking that target application folders exist and if not, then create them with pre-configured path values defined when the package was assembled.
Checking that such folders have the appropriate access control level and ownership for a certain user. If not, then set them.
Checking that a set of environment variables is defined in /etc/profile, pointed to predefined path locations, added to the general $PATH environment variable, and finally exported into the user's environment. Other files to touch include /etc/services and /etc/system.
Obviously, doing this for many boxes (the goal in question) by hand will certainly be slow and error prone.
I believe a better alternative is to somehow automate this process. So far I have thought about the following options, and discarded them for one reason or another.
Traditional shell scripts. I've only troubleshot these before, and I don't really have much experience with them. These would be my last resort.
Python scripts using the pexpect library for analyzing system command output. This was my initial choice since the target Solaris environments have it installed. However, I want to make sure that I'm not reinventing the wheel :P.
Ant or Gradle scripts. They may be an option since the boxes also have Java 1.5 enabled, and the fileset abstractions can be very useful. However, they may fall short when it comes to checking and setting users and folder permissions.
It seems obvious to me that I'm not the first person in this situation, but I don't seem to find a utility framework geared towards this purpose. Please let me know if there's a better way to accomplish this.
I thank you for your time and help.
Most of those steps sound like things handled by use of a packaging system to install your package. On Solaris 10, that would be the SVR4 packaging system included with the OS.
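For illustration, a minimal sketch of what such an SVR4 package could look like; the package name, owner, group, and paths are all made up, so adapt them to your software:

# prototype -- maps files into the package; the 'i' lines pull in the
# metadata and install scripts (postinstall can create users/groups,
# set permissions, and edit /etc/profile, /etc/services, /etc/system)
i pkginfo
i postinstall
d none /opt/mydaemon 0755 daemon dgroup
f none /opt/mydaemon/bin/mydaemon 0755 daemon dgroup

# pkginfo -- package metadata
PKG="MYCOdaemon"
NAME="My daemon"
VERSION="1.0"
CATEGORY="application"
ARCH="sparc"

Then build and install it:
$ pkgmk -o -d /var/spool/pkg
$ pkgadd -d /var/spool/pkg MYCOdaemon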