Perl misbehaves when a file is given as an argument

Hi, below is my Perl code:
#!/usr/bin/perl -w
no warnings;
use Env;
my $p4user = $ENV{'P4USER'};
my $current_path = $ENV{'PWD'};
print("\t Hello $p4user. You are currently in $current_path path.\n");
open($user_in,"$ARGV[0]") || die("failed to open the argument file $ARGV[0]\n");
print("\t To create a new client enter '1' |");
print("\t To use an existing client enter '2' : ... ");
my $cl_op = <>;
chop($cl_op);
if (($cl_op == 1) || ($cl_op == 2))
{
# do something common for both condition
if ($cl_op == 1)
{
# do something
}
elsif ($cl_op == 2)
{
# do something
}
}
else
{
die("\n\t Sorry. Invalid option : $cl_op\n");
}
Then the script runs like this:
Hello biren. You are currently in /remote/vgvips18/biren/tcf_4nov path.
Sorry. Invalid option : ###################################################################
To create a new client enter '1' | To use an existing client enter '2' : ...
Any idea why the script is behaving like this? When I comment out the line
open($user_in,"$ARGV[0]") || die("failed to open the argument file $ARGV[0]\n");
the script runs fine.
Can anyone explain why the script behaves this way when I pass a file as an argument on the command line?

my $cl_op = <>;
That will read the first line of the file you passed as the arg.
To make it read the user response to your prompt, change it to this:
my $cl_op = <STDIN>;

It's wise to include sanity checks to verify that a file name was passed as an argument before attempting to open it.
my $user_in;
if ( defined $ARGV[0] ) {
die "$ARGV[0] does not exist.\n" unless -e $ARGV[0];
open($user_in, '<', $ARGV[0])
or die("failed to open the argument file $ARGV[0]: $!\n");
}
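Putting the two suggestions together, a minimal sketch of how the top of the script could look (use strict here is my addition; everything else is assembled from the question and the snippets above):
#!/usr/bin/perl
use strict;
use warnings;

my $user_in;
if ( defined $ARGV[0] ) {
    die "$ARGV[0] does not exist.\n" unless -e $ARGV[0];
    open($user_in, '<', $ARGV[0])
        or die("failed to open the argument file $ARGV[0]: $!\n");
}

print("\t To create a new client enter '1' |");
print("\t To use an existing client enter '2' : ... ");
my $cl_op = <STDIN>;    # read the menu choice from the keyboard, not from the file
chomp($cl_op);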

<> is shorthand for <ARGV>, where ARGV is a special (i.e., magic) filehandle that can either mean filenames from your command-line arguments, or STDIN if there are no command-line arguments. Perl decides how to treat ARGV based on the contents of @ARGV the first time it is used.
If you want invocations of your program like
perl my_script.pl userfile inputfile
perl my_script.pl userfile < inputfile
cat inputfile | perl my_script.pl userfile
to all work alike, you will want to consume the first element of @ARGV before you refer to ARGV. Like so:
my $userfile = shift @ARGV; # removes $ARGV[0] from front of @ARGV
open $user_in, '<', $userfile or ...
...
my $cl_op = <>;
Now $cl_op is being read from the second filename you provide, or from STDIN.
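A minimal sketch of that approach applied to the script from the question (the die after the open is my addition, since the answer elides it):
my $userfile = shift @ARGV;                 # consume the first argument
open my $user_in, '<', $userfile
    or die "failed to open the argument file $userfile: $!\n";

print "\t To create a new client enter '1' |";
print "\t To use an existing client enter '2' : ... ";
my $cl_op = <>;    # @ARGV is now empty (or holds further filenames), so this reads STDIN or those files
chomp $cl_op;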

perl script not looking for text file

I have a Perl script that is supposed to look for the text file I pass as a command-line argument but, for whatever reason, it doesn't even acknowledge that the file exists, even though it is in the same folder.
This is where the code starts going haywire
my $filename = $ARGV[0];
if($filename == "") {
print("[ERROR] Argument unavailable! use ./script.pl filename.txt\n");
end;
} elsif (open (FILE, "<", $filename)) {
print("[INFO] File $filename loaded successfully!\n\n");
menu();
close FILE;
} else{
die("An error occured while opening the file: $!\n\n");
end;
}
Always use
use strict;
use warnings;
when writing Perl code. It will tell you when you do something wrong, and give you information that might otherwise be hard to find.
What do I get when I run your program with these pragmas on?
$ foo.pl asdasd
Argument "" isn't numeric in numeric eq (==) at foo.pl line 9.
Argument "asdasd" isn't numeric in numeric eq (==) at foo.pl line 9.
[ERROR] Argument unavailable! use ./script.pl filename.txt
Those warnings come from use warnings. Good thing we were using that! Here, I am told that using == for string comparisons is causing some issues.
What happens is that both the filename and the empty string "" are being cast to numbers. Perl uses context for operators, and == forces Perl to use a numeric, scalar context. It assumes the arguments are supposed to be numbers, so it tries to coerce them into numbers. It will attempt to find a number at the beginning of the string, and if it doesn't find one, it casts the value to 0. So your comparison becomes:
if (0 == 0)
# equal to "foo.txt" == ""
Which is true. Hence the program never gets further than this.
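A quick way to see this coercion for yourself (a throwaway snippet, not from the original program):
use warnings;
# "foo.txt" and "" both coerce to 0 in numeric context, so the test is true,
# and each coercion triggers an "isn't numeric" warning.
print "they compare equal\n" if "foo.txt" == "";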
The proper way to fix this particular problem is to use eq, the string equality comparison:
if ($file eq "")
Then it will check whether the file name is the empty string. However, this is not the correct solution for you. Let's try it out with the test case where the user forgot the argument:
$ foo.pl
Use of uninitialized value $filename in string eq at foo.pl line 9.
[ERROR] Argument unavailable! use ./script.pl filename.txt
Why? Because in this case $ARGV[0] is not the empty string, it is undef, or uninitialized. It still sort of gets it right, since undef eq "" is true, but does give a warning that you are using the wrong method.
What you want to do here is just check if it exists. A new strategy:
if (@ARGV < 1) # check for number of arguments to program
You can also adopt a file test, and check if the file exists:
if ( ! -e $file)
However, the simpler way to handle those cases is to just use a proper open statement:
open my $fh, "<", $file or die "Cannot open '$file': $!";
This will then tell you if the file did not exist. It is also the idiomatic way to open files in Perl: a three-argument open with an explicit open mode to prevent code injection, a lexical file handle, and error handling that reports why the open failed.
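For illustration, here is the kind of surprise the three-argument form guards against (the filename is a made-up example, not from the question):
use strict;
use warnings;
my $file = "ls |";               # hypothetical, attacker-supplied "filename"
open my $fh,  $file;             # two-arg open: runs ls and reads its output
open my $fh2, "<", $file;        # three-arg open: tries to open a file literally named "ls |"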
If I were to write your program, I would write it as:
if (@ARGV < 1) {
die "Usage: $0 <filename>"; # $0 is your program's filename
}
my $file = shift; # by default shift uses @ARGV, or @_ inside a sub; this is a common Perl idiom
open my $fh, "<", $file or die "Cannot open '$file': $!";
menu(); # your menu subroutine, I assume...
close $fh;

How can I read from both STDIN and files passed as command line arguments without using `while(<>)`?

This post demonstrates how one can read from STDIN or from a file, without using the null filehandle (i.e., while(<>)). However, I'd like to know how one can address situations where input may come from files, STDIN, or both simultaneously.
For instance, the <> syntax can handle such a situation, as demonstrated by the following minimal example:
$ echo -e 'a\nb\nc' | \
while read x; do echo $x > ${x}".txt"; done; echo "d" | \
perl -e "while(<>) {print;}" {a,b,c}.txt -
a
b
c
d
How can I do this without using while(<>)?
I want to avoid using <> because I want to handle each file independently, rather than aggregating all input as a single stream of text. Moreover, I want to do this without testing for eof on every line of input.
If you want to handle each file independently of the others, you should loop over the arguments that have been given and open each file in turn:
for (@ARGV) {
open(my $fh, '<', $_) || die "cannot open $_";
while (<$fh>) {
... process the file here ...
}
}
# handle standard input
while (<STDIN>) {
...
}
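If the same per-line processing is needed for STDIN as well, one way to avoid duplicating it (my sketch, not part of the answer above) is to move the loop into a subroutine that takes a filehandle:
sub process_handle {
    my ($fh, $name) = @_;
    while (my $line = <$fh>) {
        # ... process $line from $name here ...
    }
}

for my $file (@ARGV) {
    open(my $fh, '<', $file) || die "cannot open $file";
    process_handle($fh, $file);
}
process_handle(\*STDIN, 'STDIN');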
Here is an idea based on Tim's that checks whether STDIN has something to read (non-blocking STDIN). This is useful if you don't really care about a user entering input manually from STDIN, yet still want to be able to pipe and redirect data to the script.
File: script.pl
#!/usr/bin/env perl
use IO::Select;
$s = IO::Select->new();
$s->add(\*STDIN);
if ($s->can_read(0)) { push @ARGV, "/dev/stdin"; }
for (@ARGV) {
open(IN, "<$_") || die "** Error opening \"$_\": $!\n";
while (<IN>) {
print $_
}
}
$> echo "hello world" | script.pl
hello world
$> script.pl < <(echo "hello world")
hello world
$> script.pl <(echo "hello world")
hello world
$> script.pl <<< "hello world"
hello world
$> script.pl
$>
This was already answered by the answer to which the question links.
@ARGV = '-' if !@ARGV;
for my $qfn (@ARGV) {
    # two-argument open so that '-' is treated as STDIN
    open(my $fh, $qfn)
        or die "cannot open $qfn: $!";
    while (<$fh>) {
        ...
    }
}

How to print STDOUT results to a new temporary file in the same directory in Perl?

I'm new to Perl, so this may be a very basic case that I still can't understand.
Case:
The program tells the user to type the file name.
The user types the file name (1 or more files).
The program reads the content of the input file(s).
If it's a single-file input, it just prints the entire content of that file.
If it's a multi-file input, it combines the contents of each file in sequence.
It then prints the result to a new temporary file, located in the same directory as program.pl.
file1.txt:
head
a
b
end
file2.txt:
head
c
d
e
f
end
SINGLE INPUT program ioSingle.pl:
#!/usr/bin/perl
print "File name: ";
$userinput = <STDIN>; chomp ($userinput);
#read content from input file
open ("FILEINPUT", $userinput) or die ("can't open file");
# print the content for as long as there is something left in the file
while (<FILEINPUT>) {
print ; }
close FILEINPUT;
SINGLE RESULT in cmd:
>perl ioSingle.pl
File name: file1.txt
head
a
b
end
I found tutorial code that combines content from multiple input files, but I cannot adapt its while loop to the code above:
while ($userinput = <>) {
print ($userinput);
}
I am stuck on making it work for multiple input files.
How am I supposed to rework the code so my program gives a result like this?
EXPECTED MULTIFILES RESULT in cmd:
>perl ioMulti.pl
File name: file1.txt file2.txt
head
a
b
end
head
c
d
e
f
end
I appreciate your response :)
A good way to start working on a problem like this is to break it down into smaller sections.
Your problem seems to break down to this:
get a list of filenames
for each file in the list
display the file contents
So think about writing subroutines that do each of these tasks. You already have something like a subroutine to display the contents of the file.
sub display_file_contents {
# filename is the first (and only argument) to the sub
my $filename = shift;
# Use a lexical filehandle and three-arg open
open my $filehandle, '<', $filename or die $!;
# Shorter version of your code
print while <$filehandle>;
}
The next task is to get our list of files. You already have some of that too.
sub get_list_of_files {
print 'File name(s): ';
my $files = <STDIN>;
chomp $files;
# We might have more than one filename. Need to split input.
# Assume filenames are separated by whitespace
# (Might need to revisit that assumption - filenames can contain spaces!)
my @filenames = split /\s+/, $files;
return @filenames;
}
We can then put all of that together in the main program.
#!/usr/bin/perl
use strict;
use warnings;
my @list_of_files = get_list_of_files();
foreach my $file (@list_of_files) {
display_file_contents($file);
}
By breaking the task down into smaller tasks, each one becomes easier to deal with. And you don't need to carry the complexity of the whole program in your head at one time.
p.s. But like JRFerguson says, taking the list of files as command line parameters would make this far simpler.
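For comparison, a minimal sketch of that command-line approach (the usage message is my addition):
#!/usr/bin/perl
use strict;
use warnings;

die "Usage: $0 file1 [file2 ...]\n" unless @ARGV;

foreach my $filename (@ARGV) {
    open my $filehandle, '<', $filename or die "$filename: $!";
    print while <$filehandle>;
}
Run it as: perl ioMulti.pl file1.txt file2.txt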
The easy way is to use the diamond operator <> to open and read the files specified on the command line. This would achieve your objective:
while (<>) {
chomp;
print "$_\n";
}
Thus: ioSingle.pl file1.txt file2.txt
If this is the sole objective, you can reduce this to a command line script using the -p or -n switch like:
perl -pe '1' file1.txt file2.txt
perl -ne 'print' file1.txt file2.txt
These switches create implicit loops around the -e commands. The -p switch prints $_ after every loop as if you had written:
LINE:
while (<>) {
# your code...
} continue {
print;
}
Using -n creates:
LINE:
while (<>) {
# your code...
}
Thus, -p adds an implicit print statement.
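As a small illustration of the difference, -n is handy when you only want some of the lines printed (the pattern here is just an example):
perl -ne 'print if /head/' file1.txt file2.txt
With -p the same code would print every line anyway (and matching lines twice), because the implicit print in the continue block always runs.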

Why does calling Perl's exec builtin cause this function not to return to its caller?

I am having problems with passing arguments to a function, and with the called function not returning control back to the function that initially called it. Basically, I am reading a text file which contains usernames and passwords into an array. Then, using a foreach loop, I am passing the username and password to another function, but this never returns and only executes for one set of arguments:
sub batch {
open(my $in, "<", "$ARGV[0]") or die "Can't open $ARGV[0]: $!";
#Read file contents into an array.
@listOfUsers = <$in>;
foreach $listOfUsers (@listOfUsers) {
#Regex to check if txt file conforms to correct syntax.
if ($listOfUsers !~ /([a-zA-Z]{1}[a-zA-Z0-9]{3,40})\s[a-zA-Z]{1}[a-zA-Z0-9]{3,40}/) {
print "Please ensure that line $listOfUsers in $ARGV[0] is of the following syntax:\n";
print "\n<USERNAME> <PASSWORD>\n";
exit(0);
} else {
#split string and call AddUser function
my ($username, $password) = split(" ",$listOfUsers);
AddUser($username, $password);
}
}
}
sub AddUser {
exec("infacmd.sh createUser -dn domain -un user -pd pass -hp domain:80 -nu " . $_[0] . " -np " . $_[1] . " -nf test");
}
Basically, don't worry about what the AddUser function does. It just runs a .sh file that does some stuff I'm not concerned about. Currently, I am only able to add one user from the file that I read in, i.e. this code only works once and does not return back to the 'batch' function.
I have tried adding 'return()' to the end of the 'AddUser' function, but this does not help.
Thanks
exec never returns. Like the equivalent family of calls in UNIX C, it replaces the current process with the one you specify. You want system.
From the documentation for those two functions:
exec: The exec function executes a system command and never returns; use system instead of exec if you want it to return. It fails and returns false only if the command does not exist and it is executed directly instead of via your system's command shell.
system: Does exactly the same thing as exec, except that a fork is done first, and the parent process waits for the child process to exit.
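So, a minimal sketch of AddUser using system instead of exec (the infacmd.sh command line is copied from the question; unpacking @_ into named variables and checking the exit status are my additions):
sub AddUser {
    my ($username, $password) = @_;
    my $status = system("infacmd.sh createUser -dn domain -un user -pd pass -hp domain:80"
                      . " -nu $username -np $password -nf test");
    warn "infacmd.sh failed for $username (exit status $?)\n" if $status != 0;
    # control now returns to batch(), which continues with the next user
}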
If you want to capture the standard output of a program, you can use the open-with-pipe variant:
open (HNDL, "myprogram |") || die "Cannot execute.";
while (<HNDL>) {
# Do something with each line.
}
close (HNDL);
The following transcript shows how you can search for lines containing a specific string (123 in this case) from a specific command (ls -al xx* in this case):
pax> ls -al xx*
-rw-r--r-- 1 pax None 10 2010-06-13 19:51 xx
-rw-r--r-- 1 pax None 123 2010-05-05 23:39 xx.py
pax> cat qq.pl
open (HNDL, "ls -al xx* |") || die "Cannot execute.";
while (<HNDL>) {
if (/123/) {
print;
}
}
close (HNDL);
pax> perl qq.pl
-rw-r--r-- 1 pax None 123 2010-05-05 23:39 xx.py

Perl script to run a C executable with an argument while giving standard input through a file?

I want to run an executable ./runnable with the argument input.afa. The standard input to this executable comes from a file finalfile. I was earlier trying to do the same using a bash script, but that did not work out, so I was wondering whether Perl provides such functionality. I know I can run the executable with its argument using backticks or a system() call. Any suggestions on how to give it standard input from a file?
UPDATE
As I said, I had written a bash script for the same task. I'm not sure how to go about doing it in Perl. The bash script I wrote was:
#!/bin/bash
OUTFILE=outfile
(
while read line
do
./runnable input.afa
echo $line
done<finalfile
) >$OUTFILE
The data in the standard input file is as follows, where each line corresponds to one input. So if there are 10 lines, the executable should run 10 times.
__DATA__
2,9,2,9,10,0,38
2,9,2,10,11,0,0
2,9,2,11,12,0,0
2,9,2,12,13,0,0
2,9,2,13,0,1,4
2,9,2,13,3,2,2
2,9,2,12,14,1,2
If I understood your question correctly, then you are perhaps looking for something like this:
# The command to run.
my $command = "./runnable input.afa";
# $command will be run for each line in $command_stdin
my $command_stdin = "finalfile";
# Open the file pointed to by $command_stdin
open my $inputfh, '<', $command_stdin or die "$command_stdin: $!";
# For each line
while (my $input = <$inputfh>) {
chomp($input); # optional, removes line separator
# Run the command that is pointed to by $command,
# and open $write_stdin as the write end of the command's
# stdin.
open my $write_stdin, '|-', $command or die "$command: $!";
# Write the arguments to the command's stdin.
print $write_stdin $input;
}
More info about opening commands can be found in the Perl documentation (perldoc -f open and perlipc).
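One detail worth adding inside the loop, after the print (my suggestion, not part of the answer above): close the pipe once you have written the line, so the command sees end-of-input, and check how it exited via $?.
    close $write_stdin
        or warn $! ? "error closing pipe to '$command': $!\n"
                   : "'$command' exited with status $?\n";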
Perl code:
$stdout_result = `exescript argument1 argument2 < stdinfile`;
Where stdinfile holds the data you want to be passed through stdin.
edit
The clever method would be to open stdinfile, tie it via select to stdin, and then execute repeatedly. The easy method would be to put the data you want to pass through in a temp file.
Example:
open $fh, "<", "datafile" or die($!);
@data = <$fh>; # sucks all the lines in datafile into the array @data
close $fh;
foreach $datum (@data) # for each singular datum in the array
{
#create a temp file
open $fh, ">", "tempfile" or die($!);
print $fh $datum;
close $fh;
$result = `exe arg1 arg2 arg3 < tempfile`; #run the command. Presumably you'd want to store it somewhere as well...
#store $result
}
unlink("tempfile"); #remove the tempfile