Passing multiple quoted arguments to a command in Bourne shell - command-line

I have a utility (myutil) to which I need to pass multiple parameters. The parameters may (and will) contain backslashes and spaces, and therefore need to be enclosed in quotes. An example run of this utility is:
myutil setOptions "ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO"
I am working on a script which will read these parameters from a multi-line file and feed them to the utility. The input file myinput.txt looks like this:
# cat myinput.txt
ONE\APPLE
ONE\PEAR
TWO\RED GRAPE
TWO\TOMATO
I am using the following code to parse the file and execute myutil. The script reads each line as an argument, encloses it in double quotes (as myutil expects for values with spaces or special characters), and builds a single variable holding the entire argument string:
#!/bin/sh
MYCOMMAND=myutil
if [ -f myinput.txt ]; then
ARGSLIST=`cat myinput.txt| sed -e 's/^/"/g' -e 's/$/"/g' | awk '{ printf "%s ", $0 }'`
$MYCOMMAND setOptions "${ARGSLIST}"
printf "%s\n" "$MYCOMMAND setOptions ${ARGSLIST}"
fi
As I might expect, the screen output of this command looks right:
myutil setOptions "ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO"
However, judging by how myutil processes this command, that is not what is actually being executed. Instead, myutil behaves as if ALL the arguments were passed as one single-quoted string. Running this with debug tracing under sh doesn't reveal the problem:
+ MYCOMMAND=myutil
+ [ -f myinput.txt ]
+ awk { printf "%s ", $0 }
+ sed -e s/^/"/g -e s/$/"/g
+ cat myinput.txt
+ ARGSLIST="ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO"
+ myutil setOptions "ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO"
+ printf %s\n myutil setOptions "ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO"
myutil setOptions "ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO"
HOWEVER, running the same debug trace under bash does seem to show what is really being executed:
+ MYCOMMAND=myutil
+ '[' -f myinput.txt ']'
++ awk '{ printf "%s ", $0 }'
++ sed -e 's/^/"/g' -e 's/$/"/g'
++ cat myinput.txt
+ ARGSLIST='"ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO" '
+ myutil setOptions '"ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO" '
+ printf '%s\n' 'myutil setOptions "ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO" '
myutil setOptions "ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO"
You can see the debug line that shows:
+ myutil setOptions '"ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO" '
That line is indeed what is being executed, judging by the output I see from myutil, but what gets printed to the screen is the same command minus the single quotes:
myutil setOptions "ONE\APPLE" "ONE\PEAR" "TWO\RED GRAPE" "TWO\TOMATO"
I need what is displayed on the screen to be what is actually executed, not the single-quoted version, which passes one giant parameter to myutil and is useless.
I have two solutions, neither of which I'm sure is the best way to handle this.
The first is to simply use eval:
#!/bin/sh
MYCOMMAND=myutil
if [ -f myinput.txt ]; then
ARGSLIST=`cat myinput.txt| sed -e 's/^/"/g' -e 's/$/"/g' | awk '{ printf "%s ", $0 }'`
eval $MYCOMMAND setOptions "${ARGSLIST}"
printf "%s\n" "$MYCOMMAND setOptions ${ARGSLIST}"
fi
The second is to use xargs:
#!/bin/sh
MYCOMMAND=myutil
if [ -f myinput.txt ]; then
ARGSLIST=`cat myinput.txt| sed -e 's/^/"/g' -e 's/$/"/g' | awk '{ printf "%s ", $0 }'`
printf "%s\n" "$ARGSLIST" | xargs $MYCOMMAND setOptions
printf "%s\n" "$MYCOMMAND setOptions ${ARGSLIST}"
fi
I have a hard time believing that xargs is required, and I have repeatedly read that eval is probably not a recommended choice either.
I have seen similar topics with advice to use arrays, but I could not apply that advice to my example. Additionally, I need this solution to run under Bourne sh and be as portable as possible, as it will run on most standard flavors of Linux/UNIX. I don't believe sh supports arrays of the kind those solutions recommend.
What is the best method for me to execute my command as it actually displays to the screen without the extra quoting being supplied by the shell?

You could define a recursive shell function which reads one line at a time from its standard input to build up the command line. When no more input is available, the actual command is run with the accumulated arguments. The recursion adds some overhead, but I'm assuming the input file isn't very big.
run_from_file () {
    if IFS= read -r newarg; then
        run_from_file "$@" "$newarg"
    else
        "$MYCOMMAND" "$@"
    fi
}
run_from_file < myinput.txt
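If the recursion feels awkward, a non-recursive sketch in the same spirit is to accumulate the lines in the shell's positional parameters with set -- (this assumes the same myinput.txt and that setOptions should be the first argument, as in the question):
#!/bin/sh
MYCOMMAND=myutil
if [ -f myinput.txt ]; then
    set --                           # clear the positional parameters
    while IFS= read -r line; do
        set -- "$@" "$line"          # append each line as one argument
    done < myinput.txt               # redirection keeps the loop in the current shell
    "$MYCOMMAND" setOptions "$@"     # each line arrives as its own word
fi
Like the recursive version, this never feeds the data back through the shell's parser, so no eval, xargs, or hand-built quoting is needed, and it stays within POSIX sh.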

Related

I have some scripts that work when run manually but not from crontab

I have some scripts that run fine when I run them manually, but when they run from crontab the output format is incorrect.
This is for a new Linux server.
awk 'BEGIN{
FS="-"
print "<HTML>""<table border="1" border="3" cellpadding="4" cellspacing="4" bgcolor=lightblue><TH>Firma</TH><TH>Charged Party No</TH><TH>Pcom Status</TH>"
}
{
printf "<TR>"
for(i=1;i<=NF;i++)
printf "<TD>%s</TD>",$i
print "</TR>"
}
END{
print "</TABLE></BODY></HTML>"
}
' /app/ovocontrol/cp_not_found2.txt > file.html
sed -i "s/failure/<font color="red">failure<\/font>/g;s/success/<font color="green">success<\/font>/g" file.html
(
echo "To: **********"
echo "Subject: Son 10 Dakikaya ait Toplu SMS CUDB Hata Detayi"
echo "Content-Type: text/html"
echo
cat file.html
echo
) | /usr/sbin/sendmail -t
You don't have a shebang, nor do you have a complete crontab, so I'm guessing at what you're actually doing. I suspect you are trying to call those multiple commands directly from your crontab, which is a terrible idea. Instead, put your multiple calls into a single script and invoke it from cron, e.g. do something like:
$ cat > /path/to/script << 'EOF'
#!/bin/sh
: ${f:=/app/ovocontrol/cp_not_found2.txt}
{
echo "To: **********"
echo "Subject: Son 10 Dakikaya ait Toplu SMS CUDB Hata Detayi"
echo "Content-Type: text/html"
echo
printf '<HTML><table border="1" border="3" cellpadding="4" cellspacing="4"'
printf ' bgcolor=lightblue><TH>Firma</TH><TH>Charged Party No</TH><TH>Pcom Status</TH>\n'
awk -F - ' {
printf "<TR>"
for(i=1;i<=NF;i++) printf "<TD>%s</TD>",$i
print "</TR>"
}
' "$f"
printf '</TABLE></BODY></HTML>\n'
} \
| sed -e 's#failure#<font color="red">failure</font>#g' \
      -e 's#success#<font color="green">success</font>#g' \
| /usr/sbin/sendmail -t
EOF
$ chmod +x /path/to/script
$ printf 'i\n0 * * * * /path/to/script\n.\nw\nq\n' | EDITOR=ed crontab -e
Note that the last command above is not really a great idea, just an attempt to codify the directive to add /path/to/script to your crontab.
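As an aside, cron runs jobs with a minimal environment and a plain sh, which is a common source of "works by hand, fails from cron" surprises; if that turns out to be the cause, the crontab itself can set SHELL and PATH explicitly (the values below are only illustrative):
SHELL=/bin/sh
PATH=/usr/local/bin:/usr/bin:/bin:/usr/sbin
0 * * * * /path/to/script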
I solved the problem another way: an awk '!seen[$0]++' command was what broke my format. The code I posted was only a short portion of the full script. I think crontab has some special settings.

Execute a Perl substitution through ssh [Perl one-liner from shell script]

I'm trying to execute this script remotely:
perl -i -pe 's/nginx-cache\K(\d+)/ ++($n = $1) /e; s/MYSITE\K(\w+)/ ++($n = $1) /e;' $SITENAME
with the following solution:
ssh -T root@$IPSRV <<EOI
perl -i -pe 's/nginx-cache\K(\d+)/ ++($n = $1) /e; s/MYSITE\K(\w+)/ ++($n = $1) /e;' /etc/nginx/sites-available/$SITENAME"
exit
EOI
I also tried without the -T option of ssh:
ssh root#$IPSRV "
> perl -i -pe 's/nginx-cache\K(\d+)/ ++($n = $1) /e; s/MYSITE\K(\w+)/ ++($n = $1) /e;' /etc/nginx/sites-available/$SITENAME"
but unfortunately it does not work:
syntax error at -e line 1, near "( ="
syntax error at -e line 1, near "( ="
Execution of -e aborted due to compilation errors.
Could someone please suggest a solution for running this command remotely?
Thanks in advance!
Note that $SITENAME is a variable on the local machine.
[EDIT]
I have made some progress, based on @ikegami's answer.
I tried
ssh root@$IPSRV 'perl -i -pe'\''s/nginx-cache\K(\d+)/ ++($n = $1) /e; s/MYSITE\K(\w+)/ ++($n = $1) /e;'\'' /etc/nginx/sites-available/"$SITENAME"'
root@192.168.1.100's password:
Can't do in place edit: /etc/nginx/sites-available/ is not a regular file.
I think it's related to the missing substitution of the $SITENAME variable.
Another important thing to keep in mind is the use of single quotes after ssh root@$IPSRV: they would need to be replaced by double quotes, because I have other variables in the script, and with single quotes they are not expanded.
Example:
ssh root#$IPSRV "mkdir $TESTDIR
cp /tmp/file $TESTDIR"
This works, but if I try with single quote:
ssh root@$IPSRV 'mkdir $TESTDIR
cp /tmp/file $TESTDIR'
it does not. So I also have to consider this aspect if the only way to run the Perl substitution is 'perl -i -pe'\''s ...
Thanks!
Improper escaping.
ssh root#$IPSRV "...++($n = $1)..." passes ...++( = )... to the remote host. Same with the here-doc version. (The here-doc version also has a stray quote.)
Handling multiple levels of escaping is complicated, so let's do the escaping programmatically. This also allows us to pass values from variables, as they need to be converted into shell literals.
quote() {
    prefix=''
    for p in "$@" ; do
        printf "$prefix"\'
        printf %s "$p" | sed "s/'/'\\\\''/g"
        printf \'
        prefix=' '
    done
}
ssh root#$IPSRV "$( quote perl -i -pe'...' "$SITENAME" )"
or
quote() {
    perl -MString::ShellQuote=shell_quote -e'print(shell_quote(@ARGV))' "$@"
}
ssh root#$IPSRV "$( quote perl -i -pe'...' "$SITENAME" )"
In case it's of help to others, the following shows how to use the remote machine's $SITENAME var instead:
quote() {
    prefix=''
    for p in "$@" ; do
        printf "$prefix"\'
        printf %s "$p" | sed "s/'/'\\\\''/g"
        printf \'
        prefix=' '
    done
}
ssh root#$IPSRV "$( quote perl -i -pe'...' )"' "$SITENAME"'
or
quote() {
    perl -MString::ShellQuote=shell_quote -e'print(shell_quote(@ARGV))' "$@"
}
ssh root#$IPSRV "$( quote perl -i -pe'...' )"' "$SITENAME"'
or
ssh localhost sh <<'EOI' # Notice the quotes around the token.
perl -i -pe'...' "$SITENAME"
EOI
Or, since it doesn't need any local variables, you can do it manually rather easily. Take the remote command, replace every ' with '\'', then wrap the whole thing in single quotes.
Remote command:
perl -i -pe'...' "$SITENAME"
Local command:
ssh root@$IPSRV 'perl -i -pe'\''...'\'' "$SITENAME"'
Quoting by hand is hard and error prone. Let Perl do all the work for you:
use Net::OpenSSH;
my $one_liner = <<'EOOL';
s/nginx-cache\K(\d+)/ ++($n = $1) /e; s/MYSITE\K(\w+)/ ++($n = $1) /e
EOOL
my $ssh = Net::OpenSSH->new("root\@$ENV{IPSRV}");
$ssh->system('perl', '-i', '-pe', $one_liner, $ENV{SITENAME});
$ssh->error and die "remote command failed: " . $ssh->error;
Don't forget to export $IPSRV and $SITENAME.
After a lot of tries I got it working! ;-))
The problem was that the local shell tries to expand $n and $1 as variables before sending all of this over SSH, so on the remote side the script turns into:
perl -i -pe 's/nginx-cache\K(\d+)/ ++( = ) /e; s/MYSITE\K(\w+)/ ++( = ) /e;'
which yields errors at the "( = )" places.
Just escaping them as \$n and \$1 sends the strings through untouched:
perl -i -pe 's/nginx-cache\\K(\\d+)/ ++(\$n = \$1) /e; s/MYSITE\\K(\\w+)/ ++(\$n = \$1) /e;' $SITENAME
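A quick local sanity check of that escaping (just an illustration with a made-up site name, not part of the fix) is to print the double-quoted string and look at what the remote shell will receive:
SITENAME=example.com    # made-up example value
printf '%s\n' "perl -i -pe 's/nginx-cache\\K(\\d+)/ ++(\$n = \$1) /e; s/MYSITE\\K(\\w+)/ ++(\$n = \$1) /e;' $SITENAME"
# prints: perl -i -pe 's/nginx-cache\K(\d+)/ ++($n = $1) /e; s/MYSITE\K(\w+)/ ++($n = $1) /e;' example.com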
So, the complete answer:
ssh root#IPSRV "
first command
second command
perl -i -pe 's/nginx-cache\\K(\\d+)/ ++(\$n = \$1) /e; s/MYSITE\\K(\\w+)/ ++(\$n = \$1) /e;' $SITENAME
"
Thanks everyone for pointing me in the right direction!

sed - Separate quotes and arguments

So, I'm trying to get a script I'm working on to run another script in different directories with different arguments as defined in a text file.
Here's part of my code:
for bline in $(cat "$file"); do
lindir=$()
linarg=$()
echo "dir: ${lindir}"
echo "arg: ${linarg}"
done
Let's say I have a line in file that says this:
"./puppies" -c=1 -u=0 -b=1
How can I get an output of ./puppies for lindir and an output of -c=1 -u=0 -b=1 for linarg?
lindir="$( cut -d ' ' -f 1 <<<"$bline" )"
linarg="$( cut -d ' ' -f 2- <<<"$bline" )"
That is
while read -r bline; do
    lindir="$( cut -d ' ' -f 1 <<<"$bline" )"
    linarg="$( cut -d ' ' -f 2- <<<"$bline" )"
    printf "dir: %s\n" "$lindir"
    printf "arg: %s\n" "$linarg"
done <"$file"
If you're in a shell that doesn't understand "here-strings":
lindir="$( printf "%s" "$bline" | cut -d ' ' -f 1 )"
linarg="$( printf "%s" "$bline" | cut -d ' ' -f 2- )"

executing a bash script in perl

I want to run this command in Perl:
for dir in *; do
test -d "$dir" && ( find "$dir" -name '*test' | grep -q . || echo "$dir" );
done
I have tried:
system ("for dir in *; do
test -d "\$dir" && ( find "\$dir" -name '*test' | grep -q . || echo "\$dir" );
done");
but it does not work.
A pure Perl implementation using the File::Find module's find function:
#!/usr/bin/env perl
use strict;
use warnings;
use File::Find;
find \&find_directories, '.';
sub find_directories {
    if ( -d && $File::Find::name =~ /test$/ ) {
        print "$File::Find::name\n";
    }
}
Your quoting is off.
"for dir in *; do
test -d "\$dir" && ( find "\$dir" -name '*test' | grep -q . || echo "\$dir" );
done"
You have chosen to delimit your string with double quotes ("), but the string itself also contains double quotes.
Either escape the other quotes:
"for dir in *; do
test -d \"\$dir\" && ( find \"\$dir\" -name '*test' | grep -q . || echo \"\$dir\" );
done"
(error prone, ugly)
… or use another delimiter: Perl offers you a wide range of possibilities. These quoting syntaxes interpolate variables inside: "…" and qq{…}, where you can use any character in [^\s\w] as the delimiter; the non-interpolating syntaxes are '…' and q{…}, with the same delimiter flexibility as before:
qq{for dir in *; do
test -d "\$dir" && ( find "\$dir" -name '*test' | grep -q . || echo "\$dir" );
done}
The q and qq constructs can include the delimiter inside the string, if the occurrence is balanced: q( a ( b ) c ) works.
The third quoting mechanism is a here-doc:
system( <<END_OF_BASH_SCRIPT );
for dir in *; do
test -d "\$dir" && ( find "\$dir" -name '*test' | grep -q . || echo "\$dir" );
done
END_OF_BASH_SCRIPT
This is useful for including longer fragments without worrying about a delimiter. The string is ended by a predefined token that has to appear on a line of its own. If the delimiter declaration is placed in single quotes (<<'END_OF_SCRIPT'), no variables will be interpolated:
system( <<'END_OF_BASH_SCRIPT' );
for dir in *; do
test -d "$dir" && ( find "$dir" -name '*test' | grep -q . || echo "$dir" );
done
END_OF_BASH_SCRIPT
Note on the q{} and qq{} syntax: This is a feature never to be used outside of obfuscation, but it is possible to use a character in \w as the delimiter. You have to include a space between the quoting operator q or qq and the delimiter. This works: q xabcx and is equal to 'abc'.
Instead of starting the script, try starting a bash instance that runs the script. E.g.
system("bash -c 'for dir bla bla bla'");
system() uses your default system shell (/bin/sh), which may not be Bash. The solution is to call Bash explicitly with the system() command.

How can I convert the following bash script into a Perl script

#!/bin/bash
i="0"
echo ""
echo "##################"
echo "LAUNCHING REQUESTS"
echo " COUNT: $2 "
echo " DELAY: $3 "
echo " SESSID: $1"
echo "##################"
echo ""
while [ $2 -gt "$i" ]
do
i=$[$i+1]
php avtest.php $1 $4 &
echo "EXECUTING REQUEST $i"
sleep $3
done
Here is a better/modified script in bash:
#!/bin/bash
i="0"
#startTime=`date +%s`
startTime=$(date -u +%s)
startTime=$[$startTime+$1+5]
#startTime=$($startTime+$1+5)
dTime=`date -d @$startTime`
echo ""
echo "##################"
echo "LAUNCHING REQUESTS"
echo " COUNT: $1 "
echo " DELAY: 1 "
#echo " EXECUTION: $startTime "
echo " The scripts will fire at : $dTime "
echo "##################"
echo ""
while [ $1 -gt "$i" ]
do
i=$[$i+1]
php avtestTimed.php $1 $3 $startTime &
echo "QUEUEING REQUEST $i"
sleep 1
done
Here's a direct translation
#!/usr/bin/env perl
use strict;
use warnings;
print <<HERE;
##################
LAUNCHING REQUESTS
COUNT: $ARGV[1]
DELAY: $ARGV[2]
SESSID: $ARGV[0]
##################
HERE
my $i = 0;
while ($ARGV[1] > $i) {
    $i += 1;
    system("php avtest.php $ARGV[0] $ARGV[3] &");
    print "EXECUTING REQUEST $i\n";
    sleep $ARGV[2];
}
But it would make more sense to read the command line parameters into variables named after what they're for and not rely on remembering argument ordering.
Some brief notes on the conversion:
I use a here-doc to represent the multi-line text. I could also have used multiple print statements to more closely mimic the bash version.
In bash, arguments are accessed as numbered variables, starting with $1 and going up. In Perl the argument list is represented by the array @ARGV, which is numbered starting at zero (like arrays in most languages). In both bash and Perl the name of the script can be found in the variable $0.
In Perl, arrays are written as @arrayname when referring to the entire array, but you use $arrayname[index] when accessing array members. So the Perl $list[0] is like the bash ${list[0]}, and the Perl @list is like the bash ${list[@]}.
In Perl variables are declared with the my keyword; the equivalent in bash would be declare.
I've used the system function for spawning background processes. Its argument can be simply the command line as you might use it in bash.
Unlike echo, print needs to be told whether there should be a newline at the end of the line. Recent versions of Perl have the say function, which appends a newline for you.
The Perl sleep function is pretty self-explanatory.
EDIT: Due to a typo, $i in the print statement had been written as $ni, leading to runtime errors. This has been corrected.