I need to write a script that runs an external program from MATLAB and gets its output results. The program should simulate 20 variants. I created the 20 input files in MATLAB, and I can open the external program, but I cannot write a command that makes it simulate the files and return the output results. (Input files have the .idf extension, output files .eso.)
I tried these and similar commands:
for id = 1:20
    % double the backslashes: sprintf treats '\' as an escape character
    system(sprintf('"C:\\...\\myprogram.exe" < variant_%i.idf', id));
end
or
for id = 1:20
    % sprintf is needed so the %i placeholders are actually substituted
    cmd_line = sprintf('"C:\\...\\myprogram.exe" -f variant_%i.idf -o variant_%i.eso', id, id);
    [status, result] = system(cmd_line);
end
I need this for an exam. I have had only 3 weeks of MATLAB and have never studied programming, so I'm sorry if this is a stupid question, but I don't know where else to ask.
You may change your myprogram.exe so that it reports the output data into a file.
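Putting the pieces together, a minimal sketch, assuming myprogram.exe reads the .idf from standard input and writes variant_N.eso next to it (the exe path is kept truncated as in the question):

```matlab
% Sketch only: the exe path is truncated as in the question, and the
% assumption that output lands in variant_N.eso is a guess.
exe = '"C:\...\myprogram.exe"';   % plain char array, so no escaping needed
for id = 1:20
    cmd = sprintf('%s < variant_%d.idf', exe, id);
    [status, result] = system(cmd);
    if status ~= 0
        warning('Variant %d failed: %s', id, result);
        continue
    end
    eso = fileread(sprintf('variant_%d.eso', id));   % raw simulation output
end
```

Checking `status` after each `system` call tells you which variants failed before you try to read their output files.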
My code runs fine if all the parfor code is in the same script file, but since the code is huge and I want to be able to select parallel or serial execution, I separated it into a different script file, as follows:
if useParFor
    myParforCode
else
    serialCode
end
The problem is that Matlab gives me this error:
"Using exist to check for a variable in the transparent workspace is
not supported."
But if I copy all the code from myParforCode.m and put it after the if statement instead of calling the script, the code runs. I thought I could split my code into scripts without problems, but it seems that is not the case.
What are the limitations here? What am I doing wrong?
My code is huge, but I'll try to create a running code sample and add it here.
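For what it's worth, transparency errors like this usually mean something inside the parfor body (a script call, or exist on a workspace variable) needs access to the caller's workspace, which parfor forbids. A common workaround, sketched here with assumed names, is to turn the script into a function so that every variable is explicit:

```matlab
% myParforCode.m rewritten as a function rather than a script (names
% assumed). A parfor body may not call scripts or use exist() on loop
% variables, because both rely on the "transparent" shared workspace.
function results = myParforCode(n)
    results = zeros(1, n);
    parfor i = 1:n
        results(i) = runOneCase(i);   % runOneCase is a placeholder
    end
end
```

The caller then becomes `results = myParforCode(n);` inside the if branch, with no workspace sharing between the branch and the parallel code.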
I'm looking for some pointers on using portqry, specifically piping or modifying the output into a CSV.
I have around 20,000 servers to check, each with 3-4 open ports to test, and automating portqry seems the easiest way for me. I can't use 3rd-party software, as it's an enterprise firewalled solution I'm testing.
So I can easily use a FOR loop to make portqry run across a txt file line by line, outputting into another txt file.
This is fairly useless at the numbers I'm testing, as I need to be able to filter and analyse the data easily.
I then moved on to using a couple of .bat files to redirect my output based on the errorlevel of portqry, i.e. 0, 1, 2 or 3. That kind of works, but is still a pain, as I have to hardcode the output for my CSV.
(Excuse the pseudo code)
:TOP
FOR /F %%I IN (foo.txt) DO (
    PORTQRY -n %%I -e 22
    IF ERRORLEVEL 1 (
        REM errorlevel 1, 2, 3 cases here...
        ETC ETC
    ) ELSE (
        REM IF ERRORLEVEL 0 matches everything, so the 0 case goes in ELSE
        ECHO %%I,22,LISTENING>> fooLISTENING.txt
    )
)
This is still a bit of a pain, as I'd rather see all targets in one place and then filter by listening status, etc.
Another issue is that failing to resolve the host doesn't seem to map to its own errorlevel number.
At this point I'm starting to look at PowerShell to accomplish this (assign variables at each line, then write them into a CSV), but it'd take a while to get that set up in my environment.
Any ideas while I have portqry, though? If you know of a PowerShell approach that would do the trick, that'd be great too.
I have a .exe file compiled from Fortran which converts one format to another. I have to run it many times, each time with different input. I was able to run the exe file with the script below:
command = '"C:\Program Files\Director2.exe" < O:\Free\1.dat';   % quote the path because of the space
system(command);
Up to this point everything is fine, but when the GUI comes up with the input data, I have to choose the format and a new directory for saving the conversion. Is there any way to automate that? I don't want to use the java.awt.Robot class, which did not work for me (GUI automation using a Robot).
I have also checked this post, which was not helpful:
(How to run .exe file from matlab)
Thanks in advance,
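If Director2.exe happens to accept command-line switches for the format and the output directory, the GUI could be skipped entirely. The flag names and output path below are purely hypothetical, for illustration; check the converter's documentation for the real ones:

```matlab
% -format and -outdir are invented flag names; the output path is a guess.
cmd = '"C:\Program Files\Director2.exe" -format new -outdir "O:\Free\converted" < O:\Free\1.dat';
[status, result] = system(cmd);
```

If no such switches exist, scripting the GUI (or asking the program's author for a batch mode) is the remaining option.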
I have a script that runs a series of for loops; within these loops a file is created and then passed to an external program via the system command. In summary it looks like this:
for i = 1:n1
    for j = 1:n2
        for k = 1:n3
            fid = fopen('file.txt', 'w');
            fprintf(fid, 'Some commands to pass to external program depending on i j k');
            fclose(fid);
            system('program file.txt');
        end
    end
end
The script has about 500k cases in total (n1 x n2 x n3). It runs fine for a small scenario (about 100 runs), but the full run works for a while and then fails for no apparent reason with this error:
fopen: invalid file identifier
I'm wondering if anyone could point out what is wrong?
Just a guess: an instance of your external program is still reading file.txt while the next iteration of your nested loop tries to open file.txt for writing. The more instances of your external program run at the same time, and the slower your machine, the more likely this scenario becomes (this is called a 'race condition').
Possible solution: use a separate text file per case, with a unique file name.
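The unique-file-name suggestion could look like this (a sketch; the file and program names follow the question). Checking the return value of fopen also makes the original failure visible as soon as it happens:

```matlab
for i = 1:n1
    for j = 1:n2
        for k = 1:n3
            fname = sprintf('file_%d_%d_%d.txt', i, j, k);   % unique per case
            fid = fopen(fname, 'w');
            if fid == -1
                error('Could not open %s for writing', fname);
            end
            fprintf(fid, 'commands depending on i, j, k');
            fclose(fid);
            system(['program ' fname]);
        end
    end
end
```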
You should also consider other ways to call your external program, because opening and closing files for 500k cases will be very slow.
Hope that helps,
Eli
Is it possible to execute a Perl script within a Stata .do file?
I have a Stata .do file in which I make some manipulations to the data set and arrange it in a certain way. Then I have a Perl script that takes one of the variables at that point, applies a Perl package to it, and transforms it. In particular, I use Perl's NYSIIS function, so the script is very short. After Perl produces this output, I'd like to continue with some additional work in Stata.
Two alternatives come to mind, but both are less desirable:
Write Stata code to do nysiis, but I'd prefer to use Perl's built-in function.
outsheet and save the output from the Stata .do file as a .txt for Perl; run the Perl script separately to get another .txt; then read that .txt into a new Stata .do file and resume.
Your approach number 2 is what I use to call other programs to operate on Stata data. As Nick says, Stata won't necessarily wait for your output, unless you ask it to. You first outsheet the text file, then call the Perl script from Stata using ! to run something on the command line. Finally, have Stata periodically check for the result file, using a while loop and the sleep command so Stata isn't constantly checking.
outsheet using "perl_input.txt"
!perl my_perl_script.pl
while (1) {
    capture insheet using "perl_output.txt", clear
    if _rc == 0 continue, break
    sleep 10000
}
!rm perl_output.txt
Here, your formatted data is saved from Stata as perl_input.txt. Next, your Perl script is run from the command line, and using a while loop, Stata checks for the output every 10 seconds (sleep takes arguments in milliseconds). When it finds the output file, it breaks out of the while loop. The last line is a good idea so that when you re-use the code you don't run the risk of using the Perl output from the last run.
I think the main issue is that although you can use the shell to call up something else, Stata is not going to wait for the results.
Start with help shell to see what is possible, but your #2 does sound easiest.