Perl script that invokes shell command doesn't work - perl

I am writing a simple Perl program to test a shell script that changes directory, but it doesn't work.
This is my code:
$result = `cd/`;
print $result;
It works fine when I use
$result = `dir`;

If you need to change the current working directory in your script, then you should use Perl's built-in chdir function.
perldoc -f chdir

cd (by default) doesn't output anything, so you're assigning an empty string to your $result variable.
If you want to output the (full) path of the directory you changed to, simply append && pwd inside the backticks:
$result = `cd / && pwd`;
Note that `...` creates a child process for running the shell with the specified command, so whatever environment changes you perform there - including changing the directory - do NOT affect the Perl script itself.
In other words: you're NOT changing the Perl script's current directory with your shell command.
If your intent is:
to simply test whether the shell command you enclose in `...` succeeds or not, use the system() function instead; e.g.:
system('cd /') == 0 || die "Command failed";
to capture the output from the shell command, presume it to be a directory path and change the Perl script's working directory to it:
$result = `cd / && pwd` or die "Command failed.";
chomp $result; # trim trailing newline
# Change the Perl script's working dir.
chdir $result or die "Could not change to $result: $!";

To affect the current working directory of the Perl process, use the chdir() function.

Related

Can I pass a string from perl back to the calling c-shell?

RHEL6
I have a c-shell script that runs a perl script. After dumping tons of stuff to stdout, it determines where (what dir) the parent shell should cd to when the perl script finishes. But that's a string, not an int which is all I can pass back with "exit()".
Storing the name of the dir in a file which the c-shell script can read is what I have now. It works, but is not elegant. Is there a better way to do this ? Maybe a little chunk of memory that I can share with the perl script ?
In short:
Redirect Perl's streams, restoring them at the end to print that one piece of info for the shell script to capture
Or, print that info last, so the shell script can pass the output to the console and take the last line
Or, use a named pipe (either shell) or a specific file descriptor (not csh) for that print
When the Perl script prints out that name you can assign it to a variable
in the shell script
#!/bin/csh
set DIR = `perl -e'print "dir_name"'`
while in bash
#!/bin/bash
DIR="$(perl -e'print "dir_name"')"
where $(...) is preferred for the command substitution.
But the other prints to the console from the Perl script then need to be handled.
One way is to redirect all output in the Perl script other than that one print, which can be controlled by a command-line option (a filename to which to redirect, which the shell script can print out).
Or, take all Perl's output and pass it to console, the last line being the needed "return." This puts the burden on the Perl script to print that last (perhaps in an END block). The program's output can be printed from the shell script after it completes or line by line as it is emitted.
Or, use a named pipe (both shells) or a specific file descriptor (bash only) to which the Perl script can print that information. In this case its streams go straight to the console.
The question explicitly mentions csh, so it is covered below. But I must repeat the old and worn fact that shell scripting is far better done in bash than in csh. I strongly recommend reconsidering.
bash
If you need the program's output on the console as it goes, take and print it line by line
#!/bin/bash
while IFS= read -r line; do
    echo "$line"
    DIR=$line
done < <(perl script.pl)
echo "$DIR"
Or, if you don't need output on the console before the script is finished
#!/bin/bash
mapfile -t lines < <(perl script.pl)
DIR="${lines[-1]}"
printf '%s\n' "${lines[@]}" # print script.pl's output
Or, use file descriptors for that particular print
F=$(mktemp) # safe filename
exec 3> "$F" # open fd 3 to write to it
exec 4< "$F" # open fd 4 to read from it
rm -f "$F" # remove file(name) for safety; opened fd's can still access
perl -E'$fd=shift; say "...normal prints to STDOUT...";
open(FH, ">&=$fd") or die $!;
say FH "dirname";
close FH
' 3
read dir_name <&4
exec 3>&- # close them
exec 4<&-
echo "$dir_name"
I couldn't get it to work with a single file descriptor for both reading and writing (exec 3<> ...), I think because the read can't rewind after the write, thus separate descriptors are used.
With a Perl script (and not the demo one-liner above) pass the fd number as a command-line option. The script can then do this only if it's invoked with that option.
Or, use a named pipe very similarly to how it's done for csh below. This is probably best here, if the manipulation of the program's STDOUT isn't to your liking.
csh
Iterate over the program's (completed) output line by line
#!/bin/csh
foreach line ( "`perl script.pl`" )
    echo "$line"
    set dir_name = "$line"
end
echo "Directory name: $dir_name"
or extract the last line first and then print the whole output
#!/bin/csh
set lines = ( "`perl script.pl`" )
set dir_name = $lines[$#lines]
# Print program's output
while ( $#lines )
    echo "$lines[1]"
    shift lines
end
or use a named pipe
set fifo_name = "/tmp/fifo$$" # or use mktemp
mkfifo "$fifo_name"
( perl script.pl --fifo $fifo_name [other args] & )
set dir_name = `cat "$fifo_name"`
rm -f $fifo_name
echo "dir name from FIFO: $dir_name"
The Perl command runs in the background since a FIFO blocks until it is both written and read: if the shell script waited for perl ... to complete, the Perl script would block writing to the FIFO (nothing is reading it yet), so the shell would never get to the read; we would deadlock. It is also run in a subshell, with ( ), to avoid the informational prints about the background job.
The --fifo NAME command-line option is needed so that the Perl script knows what special file to use (and knows not to do this if the option is not there).
For an in-line example replace ( perl script ...) with this one-liner, used above as well
( perl -E'$ff = shift; say qq(\t...normal prints to STDOUT...);
open FF, ">$ff" or die $!;
say FF "dir_name_$$";
close FF
' $fifo_name
& )
(broken over lines for readability)

Perl regular expression loop through all the directory and get specific file

I would like to translate the unix regular expression into Perl language to get some specific file associated with some condition.
Suppose I have a Perl script called totalResult.pl in the directory /nfs/cs/test_case/y2016. This directory also contains many directories, such as testWeek1, testWeek2, testWeek3, etc. Each of those contains sub-directories such as testCase1, testCase2, testCase3, etc., and each testCase directory contains a file called .test_result whose contents record the result, either success or fail.
So I can get the file information using unix command, for example:
wc /nfs/cs/test_case/y2016/testWeek1/testCase1/.test_result
If I would like to get the test results for each directory and sub-directory which fail, I can do it from the current path /nfs/cs/test_case/y2016 in Unix like:
grep -ri "fail" */*/.test_result
It will give me the output:
/nfs/cs/test_case/y2016/testWeek1/testCase1/.test_result:fail
/nfs/cs/test_case/y2016/testWeek3/testCase45/.test_result:fail
/nfs/cs/test_case/y2016/testWeek4/testCase12/.test_result:fail
.
.
...etc
How can I achieve the same output by writing a Perl script and just running the command perl testCase.pl? I'm new to Unix and Perl; can anyone help?
# Collect names of all test files
my @TestFiles = glob('/nfs/cs/test_case/y2016/*/*/.test_result');
# Check test files for "fail"
foreach my $TestFile ( @TestFiles ) {
    open(my $T,'<',$TestFile) or die "Can't open < $TestFile: $!";
    while (<$T>) {
        if ( /fail/ ) {
            chomp;
            print $TestFile,":",$_,"\n";
        }
    }
    close($T);
}
You can also execute the same Linux command within Perl using the backtick (`) operator.
my @result = `grep -ri "fail" */*/.test_result`;
print @result;

Need to pass variable to a batch script from a perl script

I'm running a Perl script, which in turn calls a batch script. I need to pass 3 parameters to the batch script. I'm passing the parameters since it is easier to read a file in the Perl script and capture the desired value. But my script is erroring out with the error 'The system cannot find the path specified.' I'm using the code below:
while (<FILE>)
{
    ($file, $rcc, $loc) = split(',');
    my @lines = qx/"D:\\SiebelAdmin\\Commands\\WinFile_Move.bat $file $rcc $loc" /;
}
Remove the double quotes. With them, the system interprets the whole line as a command, not as a command with parameters.
my @lines = qx/D:\\SiebelAdmin\\Commands\\WinFile_Move.bat $file $rcc $loc/;
Please check if this works for you.
I have created a sample batch script which takes two args and prints them at the prompt.
This script is located on the desktop.
Code of batch script
#echo off
set arg1=%1
set arg2=%2
shift
shift
echo %arg1%
echo %arg2%
Output of batch script
C:\Users\Administrator\Desktop>a.bat perl5.8 perl5.18
perl5.8
perl5.18
C:\Users\Administrator\Desktop>
Now I have created the Perl script which calls this batch script. This Perl script is on the C drive.
Code for perl script
my $bat_file = 'C:\Users\Administrator\Desktop\a.bat';
my $arg1 = 'perl5.8';
my $arg2 = 'perl5.18';
my @lines = `$bat_file $arg1 $arg2`;
print @lines;
Output of perl script
C:\>perl tmp.pl
perl5.8
perl5.18
You can do like this:
Perl file:
my $arg = "hey";
my $bat_file_loc = "C:\\abc.bat";
system($bat_file_loc,$arg);
Batch file:
set arg1=%1

How Perl can execute a command in the same shell with it?

I am not sure whether the title really makes sense for this problem. My problem is simple: I want to write a Perl script that changes my current directory, and I hope the change persists after the script exits. The script looks like this:
if ($#ARGV != 0) {
    print "usage: mycd <dir symbol>";
    exit -1;
}
my $dn = shift @ARGV;
if ($dn eq "kite") {
    my $cl = `cd ./private`;
    print $cl."\n";
}
else {
    print "unknown directory symbol";
    exit -1;
}
However, my current directory doesn't change after calling the script. What is the reason? How can I resolve it?
No, the Perl script will be run in a subprocess so it will not be able to affect the environment of the process that called it.
There are various tricks you can use such as sourcing shell scripts (in the context of the current shell rather than a sub-process), or using bash functions and aliases, but they won't work here.
How Perl can execute a command in the same shell with it?
Unless you have a very atypical shell, shells can only receive commands via STDIN, via their command line, or possibly via a command-evaluation builtin.
The first two are out unless the Perl script is the parent of the shell, but you could use the third one indirectly, as in the following example.
script.pl:
#!/usr/bin/perl
print "chdir 'private'\n";
bash script:
echo "$PWD" # /some/dir
eval "$( script.pl )"
echo "$PWD" # /some/dir/private
Of course, if you use bash, you could hide the details in a shell function.
mycd () {
    eval "$( mycd.pl "$@" )"
}
Allowing you to use
mycd
or even
mycd foo

Why does system call affect subsequent print behaviour in perl?

Here's my code
#!/usr/bin/perl
use strict;
use warnings;
use diagnostics;
my $file = $ARGV[0];
system('wc -l $file');
print "\nprinting alpha \n";
sleep 1;
exit;
After I run perl script.pl /path/to/file (in a tcsh shell), I don't see "printing alpha" until I press Ctrl+C. Even when I add the statement $|=1 before or after the system call, the behaviour remains the same.
What is happening?
You are executing the shell command
wc -l $file
The shell has no variable $file defined, so that's the same as
wc -l
This causes the shell to execute wc with the lone arg -l. With no file name provided, wc in turn reads from STDIN until you kill it with SIGINT from Ctrl-C.
You were perhaps aiming for
system("wc -l $file"); # XXX
but that's wrong too. That doesn't pass the args -l and the value of $file to wc. Consider what would happen if a file name with a space in it was provided.
To build a shell literal that results in the correct file name, you could use
use String::ShellQuote qw( shell_quote );
system(shell_quote('wc', '-l', $file));
But a better option is to avoid the shell and execute wc directly, passing to it the values you want without having to build a shell command.
system('wc', '-l', $file);
Because the single quotes prevent interpolation of $file. Change to double quotes.
What is happening is that the string is being executed without substituting a value for $file. When the shell gets this it looks for a shell variable $file which does not exist, so it executes the command with no file. This causes wc to read from stdin, resulting in the behavior you see.