Why can't I get the output of a command with system() in Perl? - perl

When executing a command on the command-line from Perl, is there a way to store that result as a variable in Perl?
my $command = "cat $input_file | uniq -d | wc -l";
my $result = system($command);
$result always turns out to be 0.

Use "backticks":
my $command = "cat $input_file | uniq -d | wc -l";
my $result = `$command`;
And if interested in the exit code you can capture it with:
my $retcode = $?;
right after making the external call.

From perlfaq8:
Why can't I get the output of a command with system()?
You're confusing the purpose of system() and backticks (``). system() runs a command and returns exit status information (as a 16-bit value: the low 7 bits are the signal the process died from, if any, and the high 8 bits are the actual exit value). Backticks (``) run a command and return what it sent to STDOUT.
$exit_status = system("mail-users");
$output_string = `ls`;

You can use Perl backticks to run a shell command and save the result in an array:
my @results = `$command`;
To get just a single result from the shell command, you can store it in a scalar variable:
my $result = `$command`;
If you are expecting multiple lines of output, it's easier to use an array, but if you're only expecting one line, it's better to use a scalar.
(something like that, my perl is rusty)

You can use backticks, as others have suggested. That's fine if you trust whatever variables you're using to build your command.
For more flexibility, you can open the command as a pipe and read from it as you would a file. This is particularly useful when you want to pass variables as command-line arguments to the program and you don't trust their source to be free of shell metacharacters, since open in recent Perl (>= 5.8) can invoke a program from an argument list. So you can do the following:
open(FILEHANDLE, '-|', 'uniq', 'some-file.txt') or die "Cannot fork: $!\n";
while (<FILEHANDLE>) {
# process a line from $_
}
close FILEHANDLE or die "child error: $!\n";

IPC::System::Simple provides a capture function that is a safe, portable alternative to backticks. It, and the other functions in this module, come highly recommended.

I'm assuming this is a contrived example, because it is equivalent to 'uniq -d $input_file | wc -l'.
In almost all my experience, the only reason for putting the results into a Perl variable is to parse them later. In that case, I use the following pattern:
my $last;
my $lc = 0;
open(FN, '<', $input_file) or die "Can't open $input_file: $!";
while (<FN>) {
    # any other parsing of the current line
    $lc++ if defined $last && $last eq $_;
    $last = $_;
}
close(FN);
print "$lc\n";
This also has the added advantages:
- no fork for shell, cat, uniq, and wc
- faster
- parse and collect the desired input in one pass

Related

Perl backticks subprocess is causing EOF on STDIN

I'm having this issue with my perl program that is reading from a file (which I open on STDIN and read each line one at a time using $line = <>). After I execute a `backtick` command, and then I go to read the next line from STDIN, I get an undef, signaling EOF. I isolated it to the backtick command using debugging code as follows:
my $dir = dirname(__FILE__);
say STDERR "before: tell(STDIN)=" . tell(STDIN) . ", eof(STDIN)=" . eof(STDIN);
say STDERR "\@export_info = `echo nostdin | perl $dir/pythonizer_importer.pl $fullfile`;";
@export_info = `echo nostdin | perl $dir/pythonizer_importer.pl $fullfile`;
say STDERR "after: tell(STDIN)=" . tell(STDIN) . ", eof(STDIN)=" . eof(STDIN);
The output is:
before: tell(STDIN)=15146, eof(STDIN)=
@export_info = `echo nostdin | perl ../pythonizer_importer.pl ./Pscan.pm`;
after: tell(STDIN)=15146, eof(STDIN)=1
I recently added the echo nostdin | to the perl command which had no effect. How do I run this command and get the STDOUT without messing up my STDIN? BTW, this is all running on Windows. I fire off the main program from a git bash if that matters.
Try locally undefining STDIN before running the backticks command, like this example script does. Note that any subroutines called from the sub that calls local will see the new value. You can also do open STDIN, "<", "file for child process to read"; after the local *STDIN but before the backticks but remember to close() the file before restoring STDIN to its old value.
The child process is affecting your STDIN because "the STDIN filehandle used by the command is inherited from Perl's STDIN." – perlop manual
This is just an example; in your actual script, replace the sed command with your actual command to run.
use strict;
use warnings;

# Run a command and get its output
sub get_output {
    # Prevent passing our STDIN to the child process
    local *STDIN;
    print "Running sed\n";
    # Replace the sed command with the actual command you want to run
    return `sed 's/a/b/'`;
}

my $output = get_output();
print $output;

# We can still read STDIN even after running a child process
print "Waiting for input\n";
print "Readline is " . scalar readline;
Input:
a
b
c
^D
line
Output:
Running sed
b
b
c
Waiting for input
Readline is line

How to get STDERR in Perl from a command executed in pipe with su -c

I'm trying to capture the output of the command executed as a different user using:
my $command = qq(sudo su - <username> -c '/usr/bin/whatever');
my $pid = open $cmdOutput, "-|", $command;
How can I capture the STDERR of /usr/bin/whatever?
I tried
$pid = open $cmdOutput, "-|", $command || die " something went wrong: $!";
but it looks like this is capturing the possible errors of "open" itself.
I also tried
my $command = qq(sudo su - <username> -c '/usr/bin/whatever' 2>/tmp/error.message);
which will redirect the STDERR to the file, which I can parse later, but I wanted some more straightforward solution.
Also, I only want to use core modules.
This is covered thoroughly in perlfaq8. Since you are using a piped open, the relevant examples are those that go by open3 from the core IPC::Open3 module.
Another option is to use IPC::Run for managing your processes, and the pump function will do what you need. The IPC::Open3 documentation says for IPC::Run
This is a CPAN module that has better error handling and more facilities than Open3.
With either of these you can manipulate STDOUT and STDERR separately or together, as needed. For convenient and complete output capture also see Capture::Tiny.
Other than 2>output redirection, there are no more elementary methods for the piped open.
If you don't mind mixing the streams or losing STDOUT altogether, another option is
my $command = 'cmd 2>&1 1>/dev/null';  # Remove 1>/dev/null to have both
my $pid = open my $cmdOutput, "-|", $command;
while (<$cmdOutput>) { print } # STDERR only
The first redirection merges STDERR stream with STDOUT so you get them both, and mixed (with STDOUT subject to buffering, thus things may well come out of order). The second redirect sends the STDOUT away so with it in place you read only the command's STDERR from the handle.
The question is about running an external command using open but I'd like to mention that the canonical and simple qx (backticks) can be used in the same way. It returns the STDOUT so redirection just like above is needed to get STDERR. For completeness:
my $cmd = 'cmd_to_execute';
my $allout = qx($cmd 2>&1);             # Both STDOUT and STDERR in $allout, or
my $stderr = qx($cmd 2>&1 1>/dev/null); # Only STDERR
my $exit_status = $?;
The qx puts the child process exit code (status) in $?. This can then be inspected for failure modes; see a summary in the qx page or a very thorough discussion in I/O operators in perlop.
Note that the STDERR returned this way is from the command, if it ran. If the command itself couldn't be run (for a typo in command name, or fork failed for some reason) then $? will be -1 and the error will be in $!.
As suggested by zdim, I used the IPC::Open3 module for the matter and ended up with something like this doing the job for me:
use IPC::Open3;
use IO::Select;
use Symbol qw(gensym);

my $instanceCommand = qq(sudo su - <username> -c '<command>');
my ($infh, $outfh, $errfh, $pid);
$errfh = gensym();
$pid = open3($infh, $outfh, $errfh, $instanceCommand);
my $sel = IO::Select->new;
$sel->add($outfh, $errfh);
while (my @ready = $sel->can_read) {
    foreach my $fh (@ready) {
        my $line = <$fh>;
        if (not defined $line) {
            $sel->remove($fh);
            next;
        }
        if ($fh == $outfh) {
            chomp $line;
            # <----- command output processing ----->
        }
        elsif ($fh == $errfh) {
            chomp $line;
            # <----- command error processing ----->
        }
        else {
            die "Reading from something else\n";
        }
    }
}
waitpid($pid, 0);
Maybe not completely bulletproof, but it's working fine for me, even while executing fancy cascaded scripts as <command>.
The desired destination, opened for writing, could be dup()'ed to FD #2

Tail command used in perl backticks

I'm trying to run a tail command from within a perl script using the usual backticks.
The section in my perl script is as follows:
$nexusTime += nexusUploadTime(`tail $log -n 5`);
So I'm trying to get the last 5 lines of this file but I'm getting the following error when the perl script finishes:
sh: line 1: -n: command not found
Even though when I run the command on the command line it is indeed successful and I can see the 5 lines from that particular.
Not sure what is going on here. Why it works from command line but through perl it won't recognize the -n option.
Anybody have any suggestions?
$log has an extraneous trailing newline, so you are executing
tail file.log
-n 5 # Tries to execute a program named "-n"
Fix:
chomp($log);
Note that you will run into problems if $log contains shell metacharacters (such as spaces). Fix:
use String::ShellQuote qw( shell_quote );
my $tail_cmd = shell_quote('tail', '-n', '5', '--', $log);
$nexusTime += nexusUploadTime(`$tail_cmd`);
ikegami pointed out your error, but I would recommend avoiding external commands whenever possible. They aren't portable and debugging them can be a pain, among other things. You can simulate tail with pure Perl code like this:
use strict;
use warnings;
use File::ReadBackwards;
sub tail {
    my ($file, $num_lines) = @_;
    my $bw = File::ReadBackwards->new($file) or die "Can't read '$file': $!";
    my ($lines, $count) = ('', 0);
    while (defined(my $line = $bw->readline) && $num_lines > $count++) {
        $lines .= $line;
    }
    $bw->close;
    return $lines;
}
print tail('/usr/share/dict/words', 5);
Output
ZZZ
zZt
Zz
ZZ
zyzzyvas
Note that if you pass a file name containing a newline, this will fail with
Can't read 'foo
': No such file or directory at tail.pl line 10.
instead of the more cryptic
sh: line 1: -n: command not found
that you got from running the tail utility in backticks.
The answer to this question is to place the option -n 5 before the target file.

From Perl, spawn a shell, configure it, and fork the STDOUT

I use a Perl script to configure and spawn a compiled program, that needs a subshell configured a certain way, so I use $returncode = system("ulimit -s unlimited; sg ourgroup 'MyExecutable.exe'");
I want to capture and parse the STDOUT from that, but I need it forked, so that the output can be checked while the job is still running. This question comes close:
How can I send Perl output to both STDOUT and a variable? The highest-rated answer describes a function called backtick() that creates a child process, captures STDOUT, and runs a command in it with exec().
But the calls I have require multiple lines to configure the shell. One solution would be to create a disposable shell script:
#disposable.sh
#!/bin/sh
ulimit -s unlimited
sg ourgroup 'MyExecutable.exe'
I could then get what I need either with backtick(disposable.sh) or open(PROCESS,'disposable.sh|').
But I'd really rather not make a scratch file for this. system() happily accepts multi-line command strings. How can I get exec() or open() to do the same?
If you want to use shell's power (that includes loops, variables, but also multiple command execution), you have to invoke the shell (open(..., 'xxx|') doesn't do that).
You can pass your shell script to the shell with the -c option of the shell (another possibility would be to pipe the commands to the shell, but that's more difficult IMHO).
That means calling the backtick function from the other answer like this:
backtick("sh", "-c", "ulimit -s unlimited; sg ourgroup 'MyExecutable.exe'");
The system tee with backticks will do this, no?
my $output = `(ulimit -s unlimited; sg ourgroup 'MyExecutable.exe') | tee /dev/tty`;
or modify Alnitak's backticks (so it does use a subshell)?
my $cmd = "ulimit -s unlimited ; sg ourgroup 'MyExecutable.exe'";
my $pid = open(CMD, "($cmd) |");
my $output;
while (<CMD>) {
print STDOUT $_;
$output .= $_;
}
close CMD;
Expect should be used as you are interacting with your program: http://metacpan.org/pod/Expect
Assuming /bin/bash on your *nix matches something like bash-3.2$, the program below can be used to launch a number of commands using $exp->send on the bash console; the output from each command can then be parsed for further actions.
#!/usr/bin/perl
use Expect;

my $command = "/bin/bash";
my @parameters;
my $exp = Expect->new;
$exp->raw_pty(1);
$exp->spawn($command);
$exp->expect(5, '-re', 'bash.*$');
$exp->send("who \n");
$exp->expect(10, '-re', 'bash.*$');
my @output = $exp->before();
print "Output of who command is @output \n";
$exp->send("ls -lt \n");
$exp->expect(10, '-re', 'bash.*$');
@output = $exp->before();
print "Output of ls command is @output \n";

Why does system call affect subsequent print behaviour in perl?

Here's my code
#!/usr/bin/perl
use strict;
use warnings;
use diagnostics;
my $file = $ARGV[0];
system('wc -l $file');
print "\nprinting alpha \n";
sleep 1;
exit;
After I run (in a tcsh shell) perl script.pl /path/to/file, I don't see "printing alpha" until I press Ctrl+C. Even when I add the statement $|=1 either before or after the system call, the behaviour remains the same.
What is happening?
You are executing the shell command
wc -l $file
The shell has no variable $file defined, so that's the same as
wc -l
This causes the shell to execute wc with the lone arg -l. With no file name provided, wc in turn reads from STDIN until you kill it with SIGINT from Ctrl-C.
You were perhaps aiming for
system("wc -l $file"); # XXX
but that's wrong too. That doesn't pass the args -l and the value of $file to wc. Consider what would happen if a file name with a space in it was provided.
To build a shell literal that results in the correct file name, you could use
use String::ShellQuote qw( shell_quote );
system(shell_quote('wc', '-l', $file));
But a better option is to avoid the shell and execute wc directly, passing to it the values you want without having to build a shell command.
system('wc', '-l', $file);
Because the single quotes prevent interpolation of $file. Change to double quotes.
What is happening is that the string is being executed without substituting a value for $file. When the shell gets this it looks for a shell variable $file which does not exist, so it executes the command with no file. This causes wc to read from stdin, resulting in the behavior you see.