Is there a way to automatically test, using the standard Test:: modules, whether a Perl program reads input from e.g. STDIN properly? For example, testing a program that reads two integers from STDIN and prints their sum.
It's not 100% clear what you mean; I'll answer assuming you want to write a test script that tests your main program, which as part of the test needs to have test input data passed via STDIN.
You can easily do that if your program outputs what it reads. You don't need a special test module - simply:
Call the program you're testing via a system call.
Redirect both STDIN and STDOUT of the tested program to your test script, using the IPC::Open2 module to open both sides via pipes to filehandles; OR, build your command to redirect to/from files and read/write those files in the test script.
Check the STDOUT from the tested program that you collected in the last step to make sure the correct values are printed (a sketch follows below).
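A minimal sketch of that flow using IPC::Open2 and Test::More; the program under test, adder.pl, is a hypothetical name for a script that reads two integers from one line of STDIN and prints their sum:

use strict;
use warnings;
use Test::More tests => 1;
use IPC::Open2;

# "adder.pl" is a hypothetical name for the program under test
my $pid = open2( my $child_out, my $child_in, $^X, 'adder.pl' );

print $child_in "2 3\n";   # feed the test input to the child's STDIN
close $child_in;           # send EOF so the child can finish reading

my $sum = <$child_out>;    # collect what the child printed on STDOUT
close $child_out;
waitpid $pid, 0;

is( $sum, "5\n", 'adder.pl prints the sum of the two integers' );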
If you want to test if STDIN is connected to a terminal, use -t, as in:
if ( -t STDIN ) {
    print "Input from a terminal\n";
}
else {
    print "Input from a file or pipe\n";
}
See http://perldoc.perl.org/perlfunc.html#Alphabetical-Listing-of-Perl-Functions
Related
I am writing a large Perl script, which needs to utilize other existing Perl scripts. The problem is the main script needs to reference many different scripts from different folders. For example the main script would be contained in:
/perl/programs/io
It may need to run a script which is stored in:
/perl/programs/tools
Note that there are other orthogonal folders besides tools so I need to be able to access any of them on the fly.
Currently this is what I got:
my $mynumber = '../tools/convert.pl bin2dec 1011';
In theory it should move back from the io directory then enter the appropriate tool directory and call the convert.pl script while passing it the parameters.
All this does is store the single-quoted string in $mynumber.
I like to assign the output of a command to an array so I can loop through the array to find error or other messages. For example if I'm making a zip file to email to someone I want to check to see if the zip program had any errors before I continue to make and send the email.
@msgs = `zip -f myfile.zip *.pl`; # Use backticks
You can also assign the output to a scalar:
$msg = `ls -al *.pl`; # Use backticks
To run any system command or script, all you have to do is use `backticks`. When reading another programmer's Perl code, I had misread these strange quotes as single quotes.
Backticks are also nice because they return the text written to STDOUT to your Perl script, so the output can be assigned to a variable; that is something system("") cannot do, since system returns only the exit status.
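For example, a small sketch of the zip check described above; the error-matching pattern is just an assumption about what zip prints:

use strict;
use warnings;

# Capture zip's STDOUT, check the exit status in $?, then scan
# the captured lines for anything that looks like an error.
my @msgs = `zip -f myfile.zip *.pl`;
die "zip exited with status ", $? >> 8, "\n" if $? != 0;

for my $line (@msgs) {
    warn "zip reported: $line" if $line =~ /error/i;
}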
The similar question answer does not work with my version of perl. The line
use IPC::System::Simple qw(system capture);
throws some errors. However just using system works, like this:
my $mynumber = system($^X, "../tools/convert.pl", 'bin2dec', '1011');
I can use the above, without assigning it to anything, to execute scripts that return no value and are only sent arguments. (Note that system returns the exit status, not the script's output; to capture the output, use backticks.)
This seems to be the easiest way to do what I need, and the entire programs folder can be moved anywhere and it will still work, as no parent directories above programs are used.
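A related sketch: if you want the relative path to keep working no matter where the script is started from, FindBin gives you the directory of the running script (convert.pl's interface is taken from the question):

use strict;
use warnings;
use FindBin qw($Bin);

# $Bin is the directory containing the currently running script,
# so ../tools resolves correctly no matter where "programs" lives
my $convert = "$Bin/../tools/convert.pl";

# Backticks capture the script's STDOUT; system() would only
# return its exit status
my $mynumber = `$^X $convert bin2dec 1011`;
chomp $mynumber;
die "convert.pl failed: $?" if $?;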
I know this has been asked before.
But hear me out once..
I'm working on a Cisco router (IOS).
I have to write a script which executes some commands and redirects their output to a file. But instead of using tee for every command, I want to open a file, then run the commands, whose output will be redirected to the file, and then close the file.
Even the redirection operator > is not working, and neither is the approach from the following answer:
How can I redirect stdout into a file in tcl
The fact that the solutions in that question aren't working for you is informative: the real issue that you are experiencing is that Tcl commands do not normally write to stdout anyway (except for puts, which has that as its main job, and parray, which is a procedure that uses puts internally). The “write to stdout” that you are used to is a feature of the interactive Tcl shell only.
To capture the result of all commands while running a script, you need a wrapper like this:
trace add execution source leavestep log_result
proc log_result {cmd code result op} {
    puts stdout $result
}
source theRealScript.tcl
You'll find that it produces a lot of output. Cutting the output down is a rather useful thing, so this reduces it to just the immediately-executed commands (rather than everything they call):
trace add execution source enterstep log_enter
trace add execution source leavestep log_result
proc log_enter {args} {
    global log_depth
    incr log_depth
}
proc log_result {cmd code result op} {
    global log_depth
    if {[incr log_depth -1] < 1 && $result ne ""} {
        puts stdout $result
    }
}
source theRealScript.tcl
You'll probably still get far more output than you want…
I'm trying to get user input from a web page written in Perl and send it to a local program (blastp), then display the results.
This is what I have right now:
(input code)
print $q->p, "Your database: $bd",
$q->p, "Your protein is: $prot",
$q->p, "Executing...";
print $q->p, system("blastp","-db $bd","-query $prot","-out results.out");
Now, I've done a little research, but I can't quite grasp how you're supposed to do things like this in Perl. I've tried opening a file, writing to it, and sending it over to blastp as an input, but I was unsuccessful.
For reference, this line produces a successful output file:
kold#sazabi ~/BLAST/pataa $ blastp -db pataa -query ../teste.fs -out results.out
I may need to force the bd to load from an absolute path, but that shouldn't be difficult.
edit: Yeah, the DBs need an environment variable; that's fixed. OK, all I need is to get the input into a file, pass it to the command, and then print the output file to the CGI page.
edit2: for clarification:
I am receiving user input in $prot, I want to pass it over to blastp in -query, have the program blastp execute, and then print out to the user the results.out file (or just have a link to it, since blastp can output in HTML)
EDIT:
All right, fixed everything I needed to fix. The big problem was me not seeing what was going wrong: I had to install Capture::Tiny and print out STDERR, which was when I realized the environment variable wasn't getting set correctly, so BLAST wasn't finding my databases. Thanks for all the help!
Write $prot to the file. Assuming you need to do it as-is without processing the text to split it or something:
For a fixed file name (may be problematic):
use File::Slurp;
write_file("../teste.fs", $prot, "\n") or print_error_to_web();
# Implement the latter to print error in nice HTML format
For a temp file (better):
use File::Temp qw(tempfile);
# $template is a name template such as "queryXXXX"; UNLINK (not
# CLEANUP, which is a tempdir option) deletes the file on exit
my ($fh, $filename) = tempfile( $template, DIR => "..", UNLINK => 1 );
# You can also create a temp directory, which is even better, via tempdir()
print $fh "$prot\n";
close $fh;
Step 2: Run your command as you indicated:
my $rc = system( "$BLASTP_PATH/blastp", "-db", "pataa",
                 "-query", "../teste.fs", "-out", "results.out" );
# Process $rc for errors
# Use qx[] instead of system() if you want to capture
# standard output of the command
Step 3: Read the output file in:
use File::Slurp;
my $out_file_text = read_file("results.out");
Step 4: Send it back to the web client:
print $q->p, $out_file_text;
The above code has multiple issues (e.g. you need better file/directory paths, more error handling etc...) but should start you on the right track.
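Putting the steps together, a minimal sketch of the whole handler; the blastp path and the CGI parameter name are assumptions:

#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use File::Temp qw(tempfile);
use File::Slurp qw(read_file);

my $q    = CGI->new;
my $prot = $q->param('prot');   # parameter name is an assumption

# Step 1: write the query sequence to a temp file (removed on exit)
my ($fh, $query_file) = tempfile( "queryXXXX", DIR => "..", UNLINK => 1 );
print $fh "$prot\n";
close $fh;

# Step 2: run blastp; the path is an assumption, adjust for your system
my $rc = system( "/usr/local/bin/blastp", "-db", "pataa",
                 "-query", $query_file, "-out", "results.out" );

# Step 3: read the results and send them back to the client
if ( $rc != 0 ) {
    print $q->header, $q->p("blastp failed: $?");
    exit;
}
print $q->header, $q->p( read_file("results.out") );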
I basically want to reopen STDERR/STDOUT so they write to one logfile with both the stream and the timestamp included on every line. So print STDERR "Hello World" prints STDERR: 20130215123456: Hello World. I don't want to rewrite all my print statements into function calls, also some of the output will be coming from external processes via system() calls anyway which I won't be able to rewrite.
I also need for the output to be placed in the file "live", i.e. not only written when the process completes.
(p.s. I'm not asking particularly for details of how to generate timestamps, just how to redirect to a file and prepend a string)
I've worked out the following code, but it's messy:
my $mode = ">>";
my $file = "outerr.txt";
open(STDOUT, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDOUT: \$\_"; }'));
open(STDERR, "|-", qq(perl -e 'open(FILE, "$mode", "$file"); while (<>) { print FILE "STDERR: \$\_"; }'));
(The above doesn't add dates, but that should be trivial to add)
I'm looking for a cleaner solution, one that doesn't require quoting perl code and passing it on the command line, or at least module that hides some of the complexity. Looking at the code for Capture::Tiny it doesn't look like it can handle writing a part of output, though I'm not sure about that. annotate-output only works on an external command sadly, I need this to work on both external commands and ordinary perl printing.
A child launched via system writes directly to the real STDOUT file descriptor; it has no access to variables in your program. That means any mechanism that runs code when a Perl file handle is written to (e.g. tie) won't capture the child's output.
Write another script that runs your script with STDOUT and STDERR replaced with pipes; read from those pipes and print out the modified output. I suggest using IPC::Run to do this, because it will save you from using select. You can get away without it if you combine STDOUT and STDERR into one stream.
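A sketch of that wrapper, assuming the real program is myscript.pl and both streams go to outerr.txt; IPC::Run's new_chunker filter delivers the output line by line:

use strict;
use warnings;
use IO::Handle;
use IPC::Run qw(run new_chunker);
use POSIX qw(strftime);

open my $log, '>>', 'outerr.txt' or die "Can't open outerr.txt: $!";
$log->autoflush(1);   # write "live", not only when the process exits

my $stamp = sub { strftime '%Y%m%d%H%M%S', localtime };

# Each callback receives one line of the child's output in $_[0]
run [ $^X, 'myscript.pl' ],
    '>',  new_chunker, sub { print $log "STDOUT: ", $stamp->(), ": $_[0]" },
    '2>', new_chunker, sub { print $log "STDERR: ", $stamp->(), ": $_[0]" };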
I'm trying to handle the possibility that no arguments and no piped data is passed to a Perl script. I'm assuming that if there are no arguments then input is being piped via STDIN. However if the user provides no arguments and does not pipe anything to the script, it will try to get keyboard input. My objective is to provide an error message instead.
Unfortunately, select() is not portable to some non-POSIX systems.
Is there another way to do this with maximum portability?
Perl comes with the -t file-test operator, which tells you if a particular filehandle is open to a TTY. So, you should be able to do this:
if ( -t STDIN and not @ARGV ) {
    # We're talking to a terminal, but have no command line arguments.
    # Complain loudly.
}
else {
    # We're either reading from a file or pipe, or we have arguments in
    # @ARGV to process.
}
A quick test reveals this working fine on Windows with Perl 5.10.0, and Linux with Perl 5.8.8, so it should be portable across the most common Perl environments.
As others have mentioned, select would not be a reliable choice as there may be times when you're reading from a process, but that process hasn't started writing yet.
use POSIX 'isatty';

if ( !@ARGV && isatty(*STDIN) ) {
    die "usage: ...";
}
See: http://www.opengroup.org/onlinepubs/009695399/functions/isatty.html
Note that select wouldn't be much help anyway, since it can produce false results if the piped data isn't ready yet. Example:
seq 100000|grep 99999|perl -we'$rin="";vec($rin,fileno(STDIN),1)=1;print 0+select($rin,"","",.01)'