I wrote a GUI program which makes use of separate m-functions. What I have done thus far is add the paths of the folders where the sub-functions are stored on startup, and remove those paths on close.
Everything works fine, but when I tried to compile the project, I got an error message (attached) which suggests that ADDPATH is not a function that can be used in a standalone app.
Is there a way to overcome this without any major changes to the program? Maybe some way to include those functions without ADDPATH?
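In case it helps, here is a minimal sketch of the usual workaround (the folder name is a placeholder): guard the path calls with isdeployed so they only run in an interactive MATLAB session, and include the sub-function folders in the build itself (e.g. with mcc -a or the deployment tool's "files required" list).

if ~isdeployed
    % Only runs in a MATLAB session; a compiled app's path is fixed at build time.
    addpath(fullfile(pwd, 'subfunctions')); % placeholder folder name
end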
I wonder what is the most efficient way to run a program, given as an executable, from MATLAB many times in a for loop. At the moment I use the following code:
for i = 1:100
    [status, output] = system('MyProgram.exe'); % capture the exit code and console output
    % Do something with the output from the .exe
end
From the profiler I know that 99.9% of the time is spent in the execution of the program itself. My question is basically: is there a more efficient way to run executables in general from within MATLAB?
I have read that every time I run an exe as described above, a process is created which has to initialize the MATLAB runtime environment... Is there possibly a way to avoid this by doing the initialization only once and from there on running the program multiple times?
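One thing I have considered (just a sketch; it assumes the runs are independent, and the trailing & backgrounding works in Unix-like shells, while on Windows something like start /b would be needed) is launching the runs in the background so that the repeated runtime initializations at least overlap instead of running back to back:

n = 100;
for i = 1:n
    % Non-blocking launch; each run writes its output to its own file,
    % which can be collected once all the processes have finished.
    system(sprintf('MyProgram.exe > out_%03d.txt &', i));
end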
I am guessing you can't directly modify the .exe's you are given, so perhaps, instead of calling the .exe directly, you could call a bash shell script.
I would imagine that within that shell script you could check whether a session is already open and associate the execution of the .exe with a specific process ID, although I would guess that when the executable finishes it closes the session.
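Something like this is what I have in mind (just a sketch; the file names are placeholders, and it only logs the process IDs rather than actually reusing a session):

#!/bin/bash
# Hypothetical wrapper around the executable: record each run's PID so the
# MATLAB side can see which process handled which invocation.
echo "$(date +%T) pid $$" >> runs.log
./MyProgram.exe "$@"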
Just throwing some stuff out there :P I have had lots of trouble with how Matlab handles this kind of thing (Also things like Excel).
Hope you figure this out.
EDIT: I found some possible examples here: Example Descriptions
-Kyle
I am new to Coverity; I am using it from the command prompt with its .exe files. I want to pass specific macros to cov-build.exe so that those macros are in effect when cov-emit.exe (called by cov-build.exe) parses the .c files. So far I have tried the configuration stated below.
cov-build.exe Intermediate_folder --delete-stale-tus --preprocess-first --return-emit-failure "My_bat_file" -- -D My_macro_name=my_macro_body
Any help will be much appreciated; I am stuck on this.
Thanks and regards,
Newbie_in
cov-build wraps your existing build command, monitors it and spawns parallel compiler invocations in order to understand your code. These parallel compiler invocations will see the same command line being passed to your own compiler.
So if you want this define to take effect for your compiler as well as Coverity's, simply add it to your build the way you normally would and Coverity will see it.
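For example (a sketch only: make and CPPFLAGS stand in for whatever your build actually uses):

cov-build --dir Intermediate_folder make CPPFLAGS="-DMy_macro_name=my_macro_body"

Here the define travels through the normal build command, so both your own compiler and Coverity's parallel invocation see it.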
If you want to add a define that only Coverity's compiler can see, this is best done within the config for your compiler.
You can either edit the config directly (add
<append_arg>-Dmy_macro_name=my_macro_body</append_arg>
after the <begin_command_line_config> line), or re-configure using --xml-option.
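For concreteness, the edited region of the config would then look something like this (a fragment only; the rest of the generated config is omitted, and the element spelling is taken from the lines above):

<begin_command_line_config>
<append_arg>-Dmy_macro_name=my_macro_body</append_arg>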
For example, if you're using the shortcut gcc config this would look like this:
$ cov-configure --gcc --xml-option=append_arg>-Dmy_macro_name=my_macro_body
I noticed you're using --preprocess-first on the cov-build command line. I recommend against this: it destroys XREFs, making it much more difficult to browse defect information, and it prevents the analysis from finding some defects (e.g. ones that are due to macros). --preprocess-next behaves like --preprocess-first but only fires if the initial compilation attempt fails, so if you're using --preprocess-first to work around compilation issues, I strongly recommend using --preprocess-next instead.
If you do have compilation issues, it's always good to report them (along with a reproducer) to Coverity support so that they can be fixed in future releases.
I have compiled a MATLAB routine using the MCR and deployed it to other computers without them having MATLAB installed. So far, so good. But of course, the routine is not completely error-free, particularly the GUI part. The problem is that when the MCR tries to write the error message to the terminal, it seems to corrupt the terminal so that nothing is legible any more, not even the prompt. Sometimes I also get an extra window, vaguely resembling the MATLAB editor window, full of illegible ASCII characters. Does anyone know what is causing this, or how to avoid it?
My first attempt was a big try-catch block around everything, but whatever it is still seems to get through. The catch block just tries to divert the error to an errordlg rather than the command prompt:
catch e
    % e.stack is an array; take the top frame, otherwise all the frame
    % names get concatenated together into one string.
    errordlg({e.message; ['in: ', e.stack(1).name]})
end
MATLAB Compiler does not support command window functions.
Peter Webb explains on Loren's blog:
Certain MATLAB functions cannot be deployed because they act on objects that are not present in a deployed application. For example, since deployed applications have no command window, functions that modify the command window can't be deployed.
So, you probably need to get rid of any function that prints to the command window.
Also, you can check out the mccExcludedFiles.log file.
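As a sketch of what that can look like in practice (the entry-point name is a placeholder, and blaming hyperlink markup for the corruption is an assumption, not a confirmed cause):

try
    main_gui(); % placeholder for the application's entry point
catch e
    % Plain-text report with hyperlink markup turned off, shown in a
    % dialog instead of being printed to the nonexistent command window.
    msg = getReport(e, 'basic', 'hyperlinks', 'off');
    errordlg(msg, 'Error');
end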
I'm writing some MATLAB code, and I want to use some optimized C routines. I have the C source code, and it works just fine. I've created a MEX file and am capable of compiling it provided that it is in the same folder as the optimized C routines. However, I want to be able to distribute this code to others on various platforms. Since MEX files are binaries, each individual must (likely) recompile on their own machines. That is fine, but I want to make the process as painless as possible.
Currently, if all of the files are in the same directory, then calling something like
mex mexfile.c other1.o other2.o other3.o
from that directory works just fine as you would expect. For organizational purposes, however, I'd like for the C code to be in its own (sub)directory, say code. Unfortunately, if I structure things like this, the mex command errors out (the errors differ based on the different things that I've tried). I've tried things like
mex mexfile.c code/other1.o code/other2.o code/other3.o
and the -Ipathname and -Lfolder options on the mex command, but these haven't worked for me. I would think that there has to be a simple way to do what I'm wanting to do, but I just can't find the appropriate documentation or figure it out myself. Any help would be appreciated.
In the setup given above, using the following command works as desired for me:
mex -Icode mexfile.c code/other1.o code/other2.o code/other3.o
I stumbled across this solution while regenerating error messages.
I'm guessing that this is just the appropriate syntax for the -Ipathname option to the mex command, but I can't find documentation to support this conclusion. If anyone else can provide a link to actual documentation, that would certainly be better than a potential solution that just happened to work for me.
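Since the goal is to make recompilation painless for others, one option is to ship a one-line build script alongside the sources (a sketch; it assumes recipients get the .c sources rather than the prebuilt .o object files, since those are platform-specific):

% build.m - hypothetical one-step build for people receiving the code.
% Compiles mexfile.c together with the helper sources in ./code.
mex('-Icode', 'mexfile.c', 'code/other1.c', 'code/other2.c', 'code/other3.c');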
I use something along the lines of
emlc -I './somedir/' -o sourcefilemex -T mex sourcefile
By changing the -T argument you can generate mex code or embeddable C. That's handy.
Hi, I am using a Perl script written by another person who is no longer with the company.
If I run the script standalone, the output is as expected. But when I call the script from another script repeatedly, the output is wrong except for the first time.
I suspect some variables are not initialised properly. When it is called standalone, it exits each time and all the variable values are reset to their defaults. But when it is called from another Perl script, the modules and the variable values are probably carried over to the next call of the script.
Is there any way to flush out the called script from memory before I call it next time?
I tried enabling warnings and it threw up thousands of lines of warnings...!
EDIT: How I am calling the other script:
The code looks like this:
do "processing.pl";
...
...
...
process(params); #A function in processing.pl
...
...
...
If you want to force the module to be reloaded, delete its entry from %INC and then reload it.
For example:
sub reload_module {
    # Removing the %INC entry makes perl forget the file was ever loaded,
    # so the following require compiles it again from disk.
    delete $INC{'Your/Silly/Module.pm'};
    require Your::Silly::Module;
    Your::Silly::Module->import;
}
Note that if this module relies on globals in other modules being set, those may need to be reloaded as well. There's no easy way to know without taking a peek at the code.
Hi, I am using a Perl script written by another person who is no longer with the company.
I tried enabling warnings and it threw up thousands of lines of warnings...!
There's your problem right there. The script was not written properly, and should be rewritten.
Ask yourself this question: if it has thousands of warnings when you enable strict checking, how can you be sure that it is doing the right thing? How can you be sure that it is not clobbering files, trashing data sets, making a mess of your filesystem? Chances are it is doing all of these things, either deliberately or accidentally.
I wouldn't trust running an error-filled script written by someone no longer with the company. I'd rewrite it and be sure that it was doing what I needed it to do.
Unloading a module is a more difficult task than simply removing the %INC entry of the module. Take a look at Class::Unload from CPAN.
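A minimal sketch of that approach (the module name is reused from the example above, and the module is assumed to be loaded already):

use Class::Unload;

# Forget the loaded class entirely, then pull in a fresh copy from disk.
Class::Unload->unload('Your::Silly::Module');
require Your::Silly::Module;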
If you don't want to rewrite/fix the script, I suggest running it in a separate process, e.g. via system() or backticks (note that exec() would replace your current process and never return). While it is not very elegant, it will definitely fix your problem.
Are you sure that you need to reload the module? By using do, you are reading the source every time and executing it. What happens if you change that to require, which will only read and evaluate the source once?
Another possibility (just thinking aloud here): could it be something to do with the working directory? Are both runs starting from the same place? That probably wouldn't explain the first call working, though.
Another option is to use system('doprocessing.pl');. Lazily, we do this with a few scripts to force re-initialisation of a number of classes/variables etc., and to force the log files to rotate properly.
edit: I have just re-read your question, and it would appear that you are not calling it like this.
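For completeness, a sketch of the separate-process approach suggested above (it assumes processing.pl can take its parameters on the command line, which may not match how process() is currently invoked):

# $^X is the path of the running perl, so the same interpreter is reused;
# each call gets a fresh process, so no state leaks between runs.
for my $i (1 .. 10) {
    system($^X, 'processing.pl', $i) == 0
        or warn "processing.pl failed on run $i: $?";
}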