Matlab `mcc`: All m-files to include when compiling executable?

I have a Matlab script go.m that creates custom objects and runs
a suite of simulations. There is interest in porting it to
a different network where the Matlab licenses are few. Our
strategy is to compile the script into a stand-alone *.exe
so that it can run without using up licenses. Once I've learned
the ropes, the Matlab Compiler Runtime will be installed
on the target system.
I managed to use command-line mcc to compile the TMW online example,
magicsquare.
Using cygwin's bash:
$ cd ~/bin
$ ln -s "/c/Program Files/MATLAB/Single_R2015b/bin/mcc.bat" mcc
$ cd ~/tmp/magicSqr
$ mcc -m magicsquare.m
# startup.m messages indicate that this launches Matlab
$ ./magicsquare.exe 5
Running C:\cygwin64\tmp\User.Name\mcrCache9.0\magics1\Users\User.Name\Documents\MATLAB\startup
m = 17 24 1 8 15
23 5 7 14 16
4 6 13 20 22
10 12 19 21 3
11 18 25 2 9
Both the directory prefix ./ and the file extension .exe
are needed.
My next step was to push the full-blown go.m through the
process, see what breaks, and find the least onerous way to deal with
it. By least onerous, I mean a strategy that requires fewest code
modifications so that I'm not maintaining separate code bases for
development versus for porting to the destination.
The mcc compilation worked: mcc -m go.m. Running the *.exe
file, however, led to breakage at the very first executable statement:
profile off. As I said, tactically recoding on an individual basis
is very unpalatable, so I searched for a way to identify all the files
to include when running mcc. Two promising leads were inmem and
requiredFilesAndProducts.
However, the documentation also warns:
Do not use the Dependency Report to determine which MATLAB code
files someone else needs to run a particular file. Instead use the
matlab.codetools.requiredFilesAndProducts function.
It appears that the Dependency Report to be avoided refers to the
mfiles output from inmem. This is corroborated by examination of
said m-files -- the list is extremely long, and includes functions
that befuddle even Matlab's which command:
>> which matricize
'matricize' not found.
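For context, that list was harvested with something like the following, run immediately after a development run of go.m so that everything it touched was still in memory (a sketch; the -completenames option simply makes the returned paths absolute):
% Run the full simulation once, then ask MATLAB what is currently loaded.
go;
[mFiles, mexFiles] = inmem('-completenames');   % m-files and MEX-files in memory
fprintf('%s\n', mFiles{:});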
The only other candidate for identifying m-files to include is the
fList output from requiredFilesAndProducts. It seems to include
all the methods for my custom classes, as well as all invoked m-files
residing in c:\Users\User.Name\Documents\MATLAB\ (the only
custom folder in my path). However, it certainly does not cover the
profile command that underlies the aforementioned error.
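For reference, the fList and pList outputs come from a call of roughly this form (a sketch, with go.m as the entry point):
% Static dependency analysis of the entry-point script.
[fList, pList] = matlab.codetools.requiredFilesAndProducts('go.m');
fprintf('%s\n', fList{:});   % user-written files the entry point requires
disp({pList.Name});          % MathWorks products the code relies on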
What is the best way to identify all the m-files and/or folders
thereof for mcc? Is it reasonable to then treat any remaining
error-causing statements using conditional execution, e.g., if
~isdeployed; <...problematic statements...>; end?
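To make that second part concrete, the pattern I have in mind is a guard like the one below, plus bundling my custom folder with mcc's -a switch (a sketch only; the folder path is the one from my setup above):
% In go.m: skip development-only statements when running as the compiled *.exe.
if ~isdeployed
    profile off   % profile is the statement that broke the deployed run
end
% Compile at the MATLAB prompt, bundling the custom folder into the archive:
% mcc -m go.m -a c:\Users\User.Name\Documents\MATLAB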

You may refer to the list on this documentation page for information on functions that are not supported by the MATLAB Compiler and MATLAB Compiler SDK products:
https://www.mathworks.com/help/compiler/unsupported-functions.html
This page also shows the compatibility of MATLAB Compiler with each individual toolbox:
https://www.mathworks.com/products/compiler/supported/compiler_support.html

Related

Differences between executable files generated by Dymola and OpenModelica

I am considering using the executable file generated by either Dymola (dymosim.exe) or OpenModelica (model_name.exe) to run parametric simulations on the same model.
I was wondering, is there any difference in the two .exe files and related input files? (which are dsin.txt for Dymola, and model_name_init.xml for OpenModelica).
Regarding file sizes, I can see that the Dymola files are smaller. But I was also wondering about speed of execution and flexibility of the input files for scripting.
Lastly, since Dymola is a commercial software, is the dymosim.exe file publicly shareable?
I will write this for OpenModelica, the Dymola people can add their own.
I would suggest using FMUs instead of executables, together with a (co)simulation framework like OMSimulator (via Python scripting) or another one (PyFMI, etc.). See an example here:
https://www.openmodelica.org/doc/OMSimulator/master/html/OMSimulatorPython.html#example-pi
Note that if you have resources such as tables, these will be put inside the FMU if you use Modelica URIs: modelica://LibraryName/Resource/blah. For the generated executables, however, you would need to ship the resources alongside the exe, and they would need to be in a specific directory on the other machine. You would also need to ship the dependent DLLs for the executables; for the FMUs that is mostly not needed (unless you call external DLLs from your model), since they are statically compiled.
Simulation speed depends on the model; sometimes one or the other is faster.
For what libraries are supported by OpenModelica you can check the library coverage:
https://libraries.openmodelica.org/branches/overview-combined.html
If you still want to use executables, here is a list of command line parameters for them: https://www.openmodelica.org/doc/OpenModelicaUsersGuide/latest/simulationflags.html
How to do parameter sweeps via executables:
https://openmodelica.org/doc/OpenModelicaUsersGuide/latest/scripting_api.html#simulation-parameter-sweep
For Dymola:
If you have the appropriate binary export license you can generate a dymosim.exe that can be distributed.
Parameter-sweep can be run inside Dymola (the scripts are automatically generated), or from Python etc.
However, running a parameter sweep in that way uses not only dsin.txt but also some additional files, for two reasons:
Reduced overhead of starting/stopping dymosim.exe, especially for small models.
Automatic parallelization.
That part of dymosim is currently not well documented in the manual, but you can run:
dymosim -M : by default, sweeps based on two CSV files (multIn.csv, multOutHeader.csv), generating a third (multOut.csv)
dymosim -M -1 mIn.csv -2 mOutH.csv -3 mOut.csv : the same, with different file names
dymosim -M -n 45 : generates normal trajectory files instead (dsres45.mat, dsres46.mat, ...)
dymosim -h : help
dymosim -s : normal simulation
And if you are really bold you can pipe to/from dymosim.exe for parameter sweeps
Another possibility is to use FMUs instead.

Force compiled Matlab app to use runtime rather than a Matlab license

My client has a network that doesn't have access to the internet. They intend to buy (very few) Matlab licenses just for the development efforts of me and my colleague. For the operational use by their staff, however, I should compile my Matlab code to *.exe files so that the operators don't use up licenses just to execute my Matlab "app"/"solution" (the language seems to be changing these days). They won't actually have mcc licenses, so the compiling will be done on my home organization's network.
The problem is that when the compiled executable runs, there doesn't seem to be an obvious way to force it to use the Matlab Compiler Runtime (MCR). If there are Matlab licenses on the target system, it may use those as well. The whole point of compiling, however, is to avoid using the few licenses on the client network so that the licenses are available for m-file development work when needed. So the unique feature of this situation seems to be that the target environment will eventually have both Matlab licenses and MCR, as well as the requirement that compiled executables use only the MCR rather than the Matlab licenses.
The MathWorks is looking into the problem, but the prospects of finding a solution are unclear. I am hoping that it won't involve manually rejigging login scripts to customize environment PATH variables, as that will break whenever the login scripts are updated. I'm hoping for something like a pragma-like statement in the top-level m-file, or an mcc switch. In perusing the mcc documentation, however, I find no switches that present themselves as likely candidates except for -Y license.lic, and it's not clear how to use that.
With regard to the client, another limitation I face is that I don't want to pester them with trial-and-error (it's not their job). This is complicated by the fact that there is also no efficient way to convey electronic content to them, so quick, iterative trial-and-error is out. As well, their target environment, with Matlab licenses, doesn't yet exist, though the process to get there is in the works. It's a bit of a chicken-and-egg problem: they are getting Matlab based on the assumption that we can find solutions for the challenges, but it's hard to de-risk that assumption beforehand by investigating solutions when the target environment doesn't yet exist.
On my home organization's system, I also face the limitation that I don't have rights to install MCR. Hence, I can't undertake trial-and-error to identify an incantation or recipe that ignores the presence of Matlab licenses and forces the use of MCR. Not that I have the time to do that, as it is a very inefficient way to achieve this objective.
Because of the many circumstantial challenges, trial-and-error isn't the way to go, and I'm hoping there is a canned method for forcing the use of MCR over any Matlab licenses that might be present. I am using R2015b.
I am working with this exact deployment situation for OS X / Linux with Matlab 2015b. When you compile an application for a Unix-based OS, the compiler creates a shell script that is executed at startup. My solution is to modify this script to check for the presence of the runtime libraries. For example, on OS X (macOS):
echo "Setting up environment variables"
if [ -d "/Applications/MATLAB/MATLAB_Compiler_Runtime/v90" ] ; then
echo "Using MCR v8.6 (R2015b) (_Compiler)"
MCRROOT=/Applications/MATLAB/MATLAB_Compiler_Runtime/v90
elif [ -d "/Applications/MATLAB/MATLAB_Runtime/v90" ] ; then
echo "Using MCR v8.6 (R2015b)"
MCRROOT=/Applications/MATLAB/MATLAB_Runtime/v90
elif [ -d "/Applications/MATLAB_R2015b.app" ] ; then
echo "Using MATLAB R2015b application"
MCRROOT=/Applications/MATLAB_R2015b.app
else
echo "No MATLAB libraries found! Install MCR R2015b from:"
echo " http://www.mathworks.com/products/compiler/mcr/"
echo " "
sleep 10
exit
fi
TMW's response:
Running a standalone application built with MATLAB Compiler will not check out any licenses whether running against an installed MATLAB Compiler Runtime or the runtime installed as part of a MATLAB Compiler installation. Note that end users with MATLAB installed without MATLAB Compiler will not have the compiler runtime libraries included with their installation.
If you have installed the MATLAB Compiler Toolbox, MATLAB will have a "runtime" folder with the necessary DLLs to execute the standalone application. Without the MATLAB Compiler Toolbox, these DLLs will not be available. Instead the user must install MCR to run the standalone application...the user cannot forgo the installation of MCR if they do not have the MATLAB Compiler Toolbox installed and they wish to run the standalone application.

Linking MATLAB to a DLL library

I am trying to execute some example code from a MATLAB toolkit, 'oscmex'. This toolkit allows for communication using the OSC protocol from within MATLAB. I presume this question is non-specific; it should apply to any toolkit that is set up in the manner that this one is.
Reasons aside, I'm having some simple trouble getting the toolkit up and running. The toolkit comes with no documentation whatsoever; just a set of six DLL files (in one directory), and a set of four MATLAB '.m' example code files (in another directory). Every toolkit I've used in the past has either been a built-in kit or has had an intuitive (semi-automated) install procedure.
After downloading the toolkit, the first thing I tried was to simply run one of the '.M' example codes. This failed as the first line of the code contained the function osc(), which is not (currently) recognised by MATLAB.
So, I figured maybe I need to move the '.M' files into the same folder as the DLLs; perhaps MATLAB would see the functions inside the DLLs. No dice.
So, I realise that I have to somehow link MATLAB to the DLLs on startup. I tried adding the DLLs to a folder and adding an entry to that in the 'pathdef.m' file. This also failed.
I've read somewhere I can load a DLL file by using the loadlibrary() function. So, I tried doing this for the DLL files. This failed on the first file:
>> loadlibrary('osc_free_address.dll')
Error using loadlibrary>lFullPath (line 587)
Could not find file osc_free_address.h.
I'm starting to run out of options... How can I get this set of DLLs up and running?
Browsing this library's web page, it would seem these DLLs are just an old form of mex files.
Therefore, they should not be used as shared libraries (e.g., via loadlibrary and calllib); rather, the sources should be compiled directly into mex files.
To do so, I would suggest the following steps:
Make sure you have a working mex compiler configured for your Matlab.
In matlab, type:
>> mex -setup
This will guide you through the configuration process. I understand that you are working on a Windows machine; I usually work with the Visual Studio compiler, which works best for me.
This library's README file suggests that OSC
requires liblo 0.22 or later. See http://plugin.org.uk/liblo/
Make sure you have this library and that it is located in your LD_LIBRARY_PATH (see, e.g., this question for details, or the mex docs).
Get the source code for OSC library from their repository.
Compile the sources in matlab using
>> mex -O -largeArrayDims osc_free_address.c
>> mex -O -largeArrayDims osc_free_server.c
and so on for all seven C source files. After mex-ing the C files you'll have mex files that you can run from Matlab as if they were regular functions.
You may find it useful to use the library's make file, as suggested by Andrew Mao.
Good luck,
If you look at the build for that software, it is compiling mex files, not DLLs (shared libraries): http://sourceforge.net/p/oscmex/code/4/tree/trunk/src/osc_make.m.
I would try using the mex commands instead of the dll commands (perhaps the files are just misnamed.) Even better, I would compile the files yourself with mex using the build file in source.
Note that the instructions also say that you need liblo-0.22 in order to run the library, so make sure you have that accessible as well.
I took a look at your OSC Toolkit. It seems the files have been compiled with MATLAB's mex, but it is not mentioned for which architecture they were built. You can type mexext at the MATLAB command prompt to find the MEX-file extension for your MATLAB; then change the DLL extensions to that extension. If the original mex files are compatible with your MATLAB, they can be accessed easily. Just make sure to add the folder to your MATLAB path.
Try changing the extension from .dll to .mexw32 (on win32) or .mexw64 (on win64). It's a long shot but it might work.
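If the renaming suggestion is the route you take, a minimal sketch looks like this (the toolkit folder path is hypothetical; osc_free_address.dll is one of the files from the question):
% Find the MEX-file extension for this MATLAB/architecture, e.g. 'mexw32' or 'mexw64'.
ext = mexext;
% Rename one of the toolkit's DLLs so MATLAB treats it as a MEX function.
movefile('osc_free_address.dll', ['osc_free_address.' ext]);
% The folder holding the renamed files must also be on the MATLAB path.
addpath('C:\toolkits\oscmex');   % hypothetical location of the toolkit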
Shared libraries cannot be used directly. As you have mentioned, you need to load them into MATLAB using loadlibrary. According to the documentation, loadlibrary takes (at least) two arguments: the first is the name of the library file, and the second is the header file that contains the definitions of its functions and external variables. If you do not provide the header file, MATLAB looks for a header with the same name as the DLL. That said, you need access to the header file, or, if you at least know what the functions look like, you need to write a header for the DLL.
I have worked with DLLs in MATLAB, and MATLAB is not very user-friendly where DLLs are concerned. In particular, if the DLL is written in a language other than C (or C++) you will have trouble loading its functions into MATLAB.
Besides, MATLAB only supports some specific DLLs. Based on your version of MATLAB, you need to find out whether or not the shared library is supported; have a look here.
In a nutshell, it is not easy to load a DLL into MATLAB; you need some information about the DLL.
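For completeness, the loadlibrary route described above would look something like the following, assuming a matching header can be obtained or written (the header name and the commented-out function call are hypothetical):
% Load the shared library together with a header that declares its functions.
loadlibrary('osc_free_address', 'osc_free_address.h');
% List the functions MATLAB discovered in the library.
libfunctions('osc_free_address');
% Call one of them; the name and arguments depend on the actual header.
% result = calllib('osc_free_address', 'some_osc_function', someArgument);
% Unload when finished.
unloadlibrary('osc_free_address');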

Is there a quick method to restore the PATH environment var settings for MATLAB?

I seem to have wiped out my PATH environment variable a while back, and I've been slowly restoring things. I have both MATLAB and the MATLAB Compiler installed. DLLs I've created with the MATLAB Compiler won't run because they can't find the MATLAB Compiler Runtime DLLs. While I did find the particular files that are my immediate problem, I'm wondering if there is a MATLAB *.bat file or command I can type that will restore my PATH variable to what it was after MATLAB and the Compiler were installed. I'm hoping to forestall future problems.
In case it's version-specific: I'm running MATLAB R2010b, a 32-bit version on a 64-bit machine.
[Edit]
I thought I would add that the path I need for running the compiled version was:
C:\Program Files (x86)\MATLAB\MATLAB Compiler Runtime\v714\runtime\win32
With luck, that one along with the two suggested in the answer will get me back to the original state.
restoredefaultpath might recover your MATLAB installation. Consider the use of startup.m, in order to easily undo changes to your environment.
You want to have these two directories on the PATH (I think the order is important):
C:\Program Files\MATLAB\R2010b\runtime\win32
C:\Program Files\MATLAB\R2010b\bin
Obviously you need to adjust the path to match your setup and architecture (those are on a WinXP 32-bit)
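As a quick sanity check from within MATLAB (a sketch; it only reflects the environment MATLAB itself was launched with, and the paths are the ones listed above):
% Report which of the suggested directories are missing from the PATH.
p = lower(getenv('PATH'));
needed = {'C:\Program Files\MATLAB\R2010b\runtime\win32', ...
          'C:\Program Files\MATLAB\R2010b\bin'};
for k = 1:numel(needed)
    if isempty(strfind(p, lower(needed{k})))
        fprintf('Missing from PATH: %s\n', needed{k});
    end
end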
If you have a current software maintenance contract for your MATLAB, it may well be that the easy thing to do is just upgrade to the next version of MATLAB, since R2011a is out now (unless there is some compelling reason why you must develop on R2010b). Running the software installer should recreate the default environment for MATLAB. I am, of course, assuming that you have Administrator access on your PC and have permission to install software.

What is a command line compiler?

What is a command line compiler?
Nowadays, you tend to have environments in which you develop code. In other words, you get an IDE (integrated development environment) comprising an editor, compiler, linker, debugger and many other wondrous tools (code analysis, refactoring and so forth).
You never have to type in a command at all, preferring instead a key sequence like CTRL+F5 which will build your entire project for you.
Not so in earlier days. We had to memorize all sorts of arcane commands to get our source code transformed into executables. Such beautiful constructs as:
cc -I/usr/include -c -o prog.o prog.c
cc -I/usr/include -c -o obj1.o obj1.c
as -o start.o start.s
ld -o prog -L/lib:/usr/lib prog.o obj1.o start.o -lm -lnet
Simple, no?
It was actually a great leap forward when we started using makefiles since we could hide all those arcane commands in a complex file and simply execute make from the command line. It would go away and perform all those commands for us, and only on files that needed it.
Of course, there's still a need for command-line compilers in today's world. The ability to run things like Eclipse in "headless" mode (no GUI) allows you to compile all your stuff in a batch way, without having to interact with the GUI itself.
In addition, both Borland (or whatever they're calling themselves this week) and Microsoft provide command-line compilers at no cost (Microsoft also has its free Express editions).
And gcc is also a command-line compiler. It does its one job very well and leaves it up to other applications to add a front end, if people need that sort of thing.
Don't get me wrong. I think the whole IDE thing is a wonderful idea for a quick code/debug cycle but I find that, once my applications have reached a certain level of maturity, I tend to prefer them in a form where I can edit the code with vim and just run make to produce the end product.
A command-line compiler is one that you run from the command line.
You type in gcc filename.c to compile a file (or something like that). Almost all compilers have a command-line version, and many have GUIs where you never see the command line, but the command line is still there. – Bill K Oct 5 at 16:27
(Bill K provided a nice answer in the comments... copied here and lightly edited by Mark Harrison, set to community wiki so as not to get rep.)