I'm running a Perl script to scan several hosts. When I put a single host in
$scanner->scan('-sS -p 1-1024 -sV -O --max-rtt-timeout 200ms 111.111.111.111');
it runs fine, but when I try to add a variable value by parsing a file with a list of hosts
$scanner->scan('-sS -p 1-1024 -sV -O --max-rtt-timeout 200ms $host');
the program just treats $host as literal characters. Is there any way to get around this? I'm using Nmap::Scanner as my module.
Thanks
Try replacing the single quotes with double quotes:
$scanner->scan("-sS -p 1-1024 -sV -O --max-rtt-timeout 200ms $host");
or place $host outside the quoted string and concatenate:
$scanner->scan('-sS -p 1-1024 -sV -O --max-rtt-timeout 200ms '.$host);
You are using the wrong type of quotes. A single-quoted string (') does not interpolate variables, so
$x='fish';
$b='deep fried $x';
sets $b to deep fried $x
whereas
$b="deep fried $x";
sets $b to deep fried fish
See perldoc perlop for more details
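Putting that together with the host list from the question, a minimal sketch might look like this (hosts.txt is a hypothetical file name, and $scanner is assumed to be the Nmap::Scanner object already constructed in the script):
open my $fh, '<', 'hosts.txt' or die "Cannot open hosts.txt: $!";
while (my $host = <$fh>) {
    chomp $host;
    next unless length $host;
    # double quotes interpolate $host into the option string
    $scanner->scan("-sS -p 1-1024 -sV -O --max-rtt-timeout 200ms $host");
}
close $fh;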
Related
I am trying to put the command below into Perl's system() function, but I am getting a lot of compilation (syntax) errors.
./istool export -domain serviceshost:9080 -u dsadm -p password -ar test.isx -pre -ds '-base="ENGINEHOST/Dev_Project" Jobs/Batch/\*.*'
I was using it like this in Perl:
system("./istool export -domain serviceshost:9080 -u dsadm -p password -ar test.isx -pre -ds '-base="ENGINEHOST/Dev_Project" Jobs/Batch/\*.*'");
Can someone guide me on exactly how to use it in the system function? I also tried escaping with a backslash (\) in front.
Replace system with print, and it's obvious you didn't build the string correctly.
If you want to include a " in a string quoted with ", you need to escape it.
If you want to include a \ in a double-quoted string, you need to escape it.
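One way to sidestep the clash, sketched here rather than tested against istool, is Perl's q{} operator: it behaves like single quotes but uses braces as delimiters, so neither the embedded single nor double quotes need escaping and the backslash before * passes through unchanged:
system(q{./istool export -domain serviceshost:9080 -u dsadm -p password -ar test.isx -pre -ds '-base="ENGINEHOST/Dev_Project" Jobs/Batch/\*.*'});
Printing the same q{} string first, as suggested above, is an easy way to confirm the shell receives exactly the line that works interactively.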
I am a beginner with Perl. I am using the Perl command below to search for and replace the "/$" sequence in my Tcl script. It works well when used directly on the Linux command line.
perl -p -i -e 's/\/\$/\/\\\$/g' sed_list.tcl
I am trying to call the above Perl one-liner from another Perl script, both with the system command and with backticks (`).
system(`perl -p -i -e 's/\/\$/\/\\\$/g' sed_list.tcl`);
`perl -p -i -e 's/\/\$/\/\\\$/g' sed_list.tcl`;
I am getting the error below. Please help with this issue.
Bareword found where operator expected at -e line 1, near "$/g"
(Missing operator before g?)
Final $ should be \$ or $name at -e line 1, within string
syntax error at -e line 1, near "s//$/"
Execution of -e aborted due to compilation errors.
I don't know whether I can use another separator such as % or #, as with the sed command; when I tried '%' as the separator I didn't see an error, but the job still isn't done.
`perl -p -i -e 's%\/\$%\/\\\$%g' sed_list.tcl`;
I couldn't find much on the web about this particular issue with the '$' variable. Any help is appreciated.
Someone here suggested that I should escape all backslashes when using the system command or when calling another command with backticks from inside a Perl script, but they later deleted their answer. It worked for me. I would like to thank everyone for taking the effort to help me solve my question.
Here is the correct working code.
`perl -p -i -e 's/\\\/\\\$/\\\/\\\\\\\$/g' sed_list_gen.tcl`;
or use the system function as shown below:
system("perl -p -i -e 's/\\\/\\\$/\\\/\\\\\\\$/g' sed_list_gen.tcl");
Thanks once again to the community for helping me out.
You can execute an external command by passing it to the system function or by using the backtick (``) operator. Pass the command to the system() function as a string:
system(q{perl -p -i -e 's/\/\$/\/\\\$/g' sed_list.tcl})
or use backticks as:
`perl -p -i -e 's/\/\$/\/\\\$/g' sed_list_gen.tcl`
Edit:
As suggested by Paul in the comments.
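As an aside to the answers above: the shell quoting can be avoided entirely by doing the same substitution inside the calling Perl script instead of spawning a second perl. A rough sketch, not the accepted approach, assuming sed_list.tcl is writable:
{
    local @ARGV = ('sed_list.tcl');
    local $^I   = '';              # enable in-place editing via <>, no backup file
    while (<>) {
        s{/\$}{/\\\$}g;            # the same edit as the one-liner: "/$" becomes "/\$"
        print;
    }
}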
Suppose you've got this C shell script:
setenv MYVAR "-l os="\""redhat4.*"\"" -p -100"
setenv MYVAR `perl -pe "<perl>"`
Replace <perl> with code that will either replace "-p -100" with "-p -200" in MYVAR or add it if it doesn't exist, using a one-liner if possible.
The content does not quite match the topic, but I think it may be useful if someone posts an answer to the topic question as well. So here is a Perl one-liner:
echo "my_string" | perl -pe 's/my/your/g'
What you want will look something like
perl -e' \
use Getopt::Long qw( :config posix_default ); \
use String::ShellQuote; \
GetOptions(\my %opts, "l=s", "p=s") or die; \
my @opts; \
push @opts, "-l", $opts{l} if defined($opts{l}); \
push @opts, "-p", "-100"; \
print(shell_quote(@opts)); \
' -- $MYVAR
First, you need to parse the command line. That requires knowing the format of the arguments of the application for which they are destined.
For example, -n is an option in the following:
perl -n -e ...
Yet -n isn't an option in the following:
perl -e -n ...
Above, I used Getopt::Long in POSIX mode. You may need to adjust the settings or use an entirely different parser.
Second, you need to produce csh literals.
I've had bad experiences trying to work around csh's defective quoting, so I'll leave those details to you. Above, I used String::ShellQuote's shell_quote which produces sh (rather than csh) literals. You'll need to adjust.
Of course, once you've gotten this far, you need to get the result back into the environment variable unmangled. I don't know if that's possible in csh. Again, I leave the csh quoting to you.
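As a quick illustration of what shell_quote produces (run under sh rather than csh; the exact output shown is an assumption):
perl -MString::ShellQuote -e 'print shell_quote("-l", "os=redhat4.*", "-p", "-100"), "\n"'
# prints something like: -l 'os=redhat4.*' -p -100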
I am using this command fine:
ssh user@ip 'bash -s' -- < /usr/local/nagios/libexec/check_ssh_mem.sh
I don't like the results of that script, so I want to use a Perl script instead, like this:
ssh user@ip 'perl -s' -- < /usr/local/nagios/libexec/check_mem.pl
check_mem.pl v1.0 - Nagios Plugin
usage:
check_mem.pl -<f|u> -w <warnlevel> -c <critlevel>
options:
-f Check FREE memory
-u Check USED memory
-C Count OS caches as FREE memory
-w PERCENT Percent free/used when to warn
-c PERCENT Percent free/used when critical
As you can see, I get the proper usage output from the script. I want to pass it the -f, -w and -c options, but I get errors when trying to do that.
man perlrun says:
Upon startup, Perl looks for your program in one of the following places
...
3. Passed in implicitly via standard input. This works only if there are
no filename arguments--to pass arguments to a STDIN-read program you must
explicitly specify a "-" for the program name.
So, you can use this:
ssh user@ip 'perl - -f -u -C' -- < /usr/local/nagios/libexec/check_mem.pl
The arguments after the - are passed to the script you are running.
You don't need the -s argument - I assume you copied that from your original bash implementation, but -s has a different meaning for perl.
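Following that pattern, passing the thresholds shown in the usage text would look something like this (the 90/95 values are just placeholders):
ssh user@ip 'perl - -f -w 90 -c 95' -- < /usr/local/nagios/libexec/check_mem.pl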
I have a bash script in which I have a few qsubs. Each of them waits for a previous qsub to finish before starting.
My first qsub consists of sending the files in a certain directory to a Perl program and printing the outfiles into a new directory. At the end, I echo the array with all my job names. This part works as intended.
mkdir -p /perl_files_dir
for ID_FILES in `ls Infiles_dir/*.txt`;
do
JOB_ID=`echo "perl perl_scirpt.pl $ID_FILES" | qsub -j oe `
JOB_ID_ARRAY="${JOB_ID_ARRAY}:$JOB_ID"
done
echo $JOB_ID_ARRAY
My second qsub is meant to sort all the files made by my Perl script into a new outfile, and to start only after all of those jobs (about 100) are done, using depend=afterany. Again, this part works fine.
SORT_JOB=`echo "sort -m -n perl_files_dir/*.txt >>sorted_file.txt" | qsub -j oe -W depend=afterany$JOB_ID_ARRAY`
SORT_ARRAY="${SORT_ARRAY}:$SORT_JOB"
My issue is that my sorted file has a few columns (2 to 6) I wish to remove, so I came up with this last line using awk piped to sed, with another depend=afterany:
SED=`echo "awk '{\$2="";\$3="";\$4="";\$5="";\$6=""; print \$0}' sorted_file.txt \
| sed 's/ //g' >final_file.txt" | qsub -j oe -W depend=afterany$SORT_ARRAY`
This last step creates final_file.txt but leaves it empty. I added SED= before my echo because it would otherwise give me a "Command not found" error.
I tried without the pipe so it would just print everything. Unfortunately it prints nothing.
I assume it is not opening my sorted file, and that is why my final file is empty after the sed. If that's the case, then why won't awk read it?
In my script, I use variables to define my directories and files (with the correct paths). I know my issue is not about finding my files or directories, since they are properly defined at the beginning and used throughout the script. I also tried writing the whole path instead of a variable and got the same results.
for ID_FILES in `ls Infiles_dir/*.txt`
Simplify this to
for ID_FILES in Infiles_dir/*.txt
ls lists the files you pass it (except when you pass it directories, then it lists their content). Rather than telling it to display a list of files and parse the output, use the list of files you already have! This is more reliable (parsing the output of ls will fail if the file names contain whitespace or wildcard characters), clearer and faster. Don't parse the output of ls.
SORT_JOB=`echo "sort -m -n perl_files_dir/*.txt >>sorted_file.txt" | qsub -j oe -W depend=afterany$JOB_ID_ARRAY`
You'd make your life simpler if you used the right form of quoting in the right place. Don't use backquotes, because it's difficult to know how to quote things inside. Use $(…) instead, it's exactly equivalent except that it is parsed in a sane way.
I recommend using a here document for the shell snippet that you're feeding to qsub. You have fewer quoting issues to worry about, and it's more readable.
While we're at it, always put double quotes around variable substitutions and command substitutions: "$some_variable", "$(some_command)". Annoyingly, $var in shell syntax doesn't mean “take the value of the variable var”, it means “take the value of the variable var, parse it as a list of wildcard patterns, and replace each pattern by the list of matching files if there are matching files”. This extra stuff is turned off if the substitution happens inside double quotes (or in a here document, by the way): "$var" means “take the value of the variable var”.
SORT_JOB=$(qsub -j oe -W depend="afterany$JOB_ID_ARRAY" <<'EOF'
sort -m -n perl_files_dir/*.txt >>sorted_file.txt
EOF
)
We now get to the snippet where the quoting was actually causing a problem.
SED=`echo "awk '{\$2="";\$3="";\$4="";\$5="";\$6=""; print \$0}' sorted_file.txt \
| sed 's/ //g' >final_file.txt" | qsub -j oe -W depend=afterany$SORT_ARRAY`
The string that becomes the argument to the echo command is:
awk '{$2=;$3=;$4=;$5=;$6=; print $0}' sorted_file.txt | sed 's/ //g' >final_file.txt
This is syntactically incorrect, and that's why you're not getting any output.
You didn't escape the double quotes inside what was meant to be the awk snippet. It's a lot clearer if you use a here document. Also, you don't need the SED= part. You added it because you had a command substitution (a command between backquotes), which substitutes the output of a command. But since you aren't interested in the output of the qsub command, don't take its output, just execute it.
qsub -j oe -W depend="afterany$SORT_ARRAY" <<'EOF'
awk '{$2="";$3="";$4="";$5="";$6=""; print $0}' sorted_file.txt |
sed 's/ //g' >final_file.txt
EOF
I'm not familiar with qsub, but presumably there's a way to get the error output and the return status of the commands it runs. Inspect that error output, you should have seen the errors from awk.
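Since the rest of the pipeline already uses Perl, the awk | sed step could also be collapsed into one Perl one-liner inside the same here document. A sketch that mimics the original pipeline (drop columns 2 to 6, then strip every space, so the remaining fields end up joined with no separator):
perl -lane 'print join("", @F[0, 6 .. $#F])' sorted_file.txt > final_file.txt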
The version of awk that I am using does not like the character escapes:
awk --version
GNU Awk 3.1.7
spuder@cent64$ awk '{\$2="";\$3="";\$4=""; print \$0}' foo.txt
awk: {\$2="";\$3="";\$4=""; print \$0}
awk: ^ backslash not last character on line
Try the following syntax
awk '{for(i=2;i<=7;i++) $i="";print}' foo.txt
As a side note, if you are using Torque 4.x you may not be able to use a comma-separated list of jobs with -W depend=; instead, you may need to create a new PBS declarative (-W) for each job, e.g.:
#Invalid syntax in newer versions of torque
qsub -W depend=foo,bar
Resources
backslash in gawk fields
Print all but the first three columns
http://docs.adaptivecomputing.com/torque/help.htm#topics/commands/qsub.htm#-W