Run a for loop using a comma-separated Bash variable - MongoDB

I have a list of collections as a comma-separated variable in Bash, like below:
list_collection=$collection_1,$collection_2,$collection_3,$collection_4
I want to connect to MongoDB and run some commands on these collections.
I have done it like below, but I cannot get the loop to work:
${Mongo_Home}/mongo ${mongo_host}/${mongo_db} -u ${mongo_user} -p ${mongo_password} <<EOF
use ${mongo_db};
for i in ${list_collection//,/ }
do
db.${i}.reIndex();
db.${i}.createIndex({
"recon_type":1.0,
"account_name":1.0,
"currency":1.0,
"funds":1.0,
"recon_status":1.0,
"transaction_date":1.0},
{name:"index_def"});
if [ $? -ne 0 ] ; then
echo "Mongo Query to reindex ${i} failed"
exit 200
fi
done
EOF
What am I doing wrong?
What is the correct way?

It's hard to guess what your desired behavior is from a bunch of code that doesn't exhibit that behavior, but to take a shot at it, the following will run mongo once per item in list_collection, with a different heredoc each time:
#!/usr/bin/env bash
# read your string into a single array
IFS=, read -r -a listItems <<<"$list_collection"
# iterate over items in that array
for i in "${listItems[@]}"; do
  { # this brace group lets the redirection apply to the whole compound command
    "${Mongo_Home}/mongo" "${mongo_host}/${mongo_db}" \
      -u "${mongo_user}" -p "${mongo_password}" ||
      { echo "Mongo query to reindex $i failed" >&2; exit 200; }
  } <<EOF
use ${mongo_db};
db.${i}.reIndex();
db.${i}.createIndex({
"recon_type":1.0,
"account_name":1.0,
"currency":1.0,
"funds":1.0,
"recon_status":1.0,
"transaction_date":1.0
}, {name:"index_def"});
EOF
done
Alternatively, running mongo just once (but losing the ability to tell which index a failure happened for) might look like:
#!/usr/bin/env bash
# read your string into a single array
IFS=, read -r -a listItems <<<"$list_collection"
buildMongoCommand() {
  printf '%s\n' "use $mongo_db;"
  for i in "${listItems[@]}"; do
    cat <<EOF
db.${i}.reIndex();
db.${i}.createIndex({
"recon_type":1.0,
"account_name":1.0,
"currency":1.0,
"funds":1.0,
"recon_status":1.0,
"transaction_date":1.0
}, {name:"index_def"});
EOF
  done
}
"${Mongo_Home}/mongo" "${mongo_host}/${mongo_db}" \
-u "${mongo_user}" -p "${mongo_password}" \
< <(buildMongoCommand) \
|| { echo "Mongo query failed" >&2; exit 200; }

Related

How to automate LSF job waiting based on job name in Perl

I have Perl code where I submit a few jobs in parallel via the LSF bsub command, and once all these jobs finish I want to submit a final job.
For example, I have these three bsub commands: the first two submit jobs t1 and t2, and the third waits on t1 and t2 via the -w argument.
system(" bsub -E "xyz" -n 1 -q queueType -J t1 sleep 30")
system("bsub -E "xyz" -n 1 -q queueType -J t2 sleep 30")
system("bsub -E "xyz" -n 1 -q queueType -J t3 -w "done(t1)&&done(t2)" sleep 30")
So, to automate the -w argument, I have this:
my $count=2;
my $i;
system(" bsub -E "xyz" -n 1 -q queueType -J t3 \"foreach my $i (0..$count) {print " done(t_$i)&&";}\" sleep 30 ")
I get this error:
sh: -c: line 0: syntax error near unexpected token `('
sh: -c: line 0: `bsub -E "/pkg/ice/sysadmin/bin/linux-pre-exec" -n 1 -q short -J t3 -w "foreach (0..7) {print \"done(t)&&\";}" sleep 30'
EDIT: Yes, I am using the system command to submit these jobs from Perl.
If you want to generate the done(...)&&done(...) string dynamically, you can use
my $count = 7;
my $done_all = join '&&', map "done(t$_)", 1 .. $count;
That is, for each number in the range 1 .. 7, produce a string "done(t$_)", which gives a list "done(t1)", "done(t2)", ... "done(t7)". The elements of this list are then join'd together with a separator of &&, producing "done(t1)&&done(t2)&&...&&done(t7)".
To run an external command, you can use system. Passing a list to system avoids going through the shell, which avoids all kinds of nasty quoting issues:
system('bsub', '-E', 'xyz', '-n', '1', '-q', 'queueType', '-J', 't3', '-w', $done_all, 'sleep', '30');
# or alternatively:
system(qw(bsub -E xyz -n 1 -q queueType -J t3 -w), $done_all, qw(sleep 30));
Your code tries to pass Perl code to bsub, but that won't work. You have to generate the command string beforehand and pass the result to bsub.
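Putting the two pieces together, a minimal sketch is shown below; the -E handler, queue name, and the sleep 30 payload are copied from the question and are effectively placeholders:
#!/usr/bin/perl
use strict;
use warnings;

my $count = 7;

# Submit the worker jobs t1 .. t7 (list form of system, so no shell quoting issues)
for my $n (1 .. $count) {
    system('bsub', '-E', 'xyz', '-n', '1', '-q', 'queueType', '-J', "t$n", 'sleep', '30') == 0
        or die "bsub for t$n failed";
}

# Build the dependency expression and submit the final job that waits on them all
my $done_all = join '&&', map "done(t$_)", 1 .. $count;
system('bsub', '-E', 'xyz', '-n', '1', '-q', 'queueType', '-J', 'final', '-w', $done_all, 'sleep', '30') == 0
    or die "bsub for final job failed";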

Perl: Syntax Error; Non-Printable Chars for Key. Error Executing CMD find

I'm trying to pass a scalar value to the system command but keep having trouble. I'm unsure as to why. Any help would be appreciated at this late hour.
The two errors that I'm getting are:
Syntax Error; Non-Printable Chars for Key.
Error Executing CMD find.
Essentially, I'm trying to pass a string of commands (dbcommand; f; echo; ...) within the command "command".
my $id_to_test = $ids[0];
my $cmd = q{command -c "dbcommand -a app -f fam -d db; f sub=a, device=};
$cmd .= q{$id_to_test};
$cmd .= q{, analog=A; echo -c on; echo -o A_value.txt; /DIS;"};
system $cmd;
#system('command -c "dbcommand -a app -f fam -d db; f sub=a, device=$id_to_test, analog=A; echo -c on; echo -o A_value.txt; /DIS;"');
So now I'm doing:
my $id_to_test = $ids[0];
my $cmd = 'command -c \"dbcommand -a app -f fam -d db; f sub=a, device=';
$cmd .= "$id_to_test";
$cmd .= ', analog=A; echo -c on; echo -o A_value.txt; /DIS;\"';
system $cmd;
and I'm getting the error:
Syntax error; cmd has invalid character(s) -- "dbcommand
sh: f: command not found
-c on
-o A_value.txt
sh: /DIS: No such file or directory
sh: ": command not found
If I do print $cmd; instead, I also get slashes in front of my double quotes, which is not what I want:
command -c \"dbcommand -a app -f fam -d db; find sub=a, device=1234567, analog=A; echo -c on; echo -o A_value.txt; /DIS;\"
Making sure array is populated:
One of the first things I did was to make sure I was adding values to the array correctly by declaring the array, opening the file, and then doing:
while (<$fh>) {
#Remove any newline characters from line using the chomp() command.
chomp;
push @ids, "$_";
# print($ids[$index]);
# $index = $index + 1;
# print "$row\n";
}
print join (", ", @ids);
my $array_size = @ids;
print("\n" . $array_size);
when I execute the Perl script and it prints locally, everything is as expected -- values are printed and the size of the array is 3:
123456789, 123456888, 123456789
3
However, when I print remotely, I only get the last element
, 123456789
3
even though the size of the array is also 3.
The problem is that when you use single quotes (no interpolation), your backslash is treated as a literal backslash. The only time you need a backslash as an escape character inside single quotes is when escaping a single quote itself or a backslash:
print 'I said \'hello.\'';
Try something like this:
my $cmd = 'command -c "dbcommand -a app -f fam -d db; f sub=a, device=' .
    $id_to_test . ', analog=A; echo -c on; echo -o A_value.txt; /DIS;"';
This should also work:
my $cmd = 'command -c "dbcommand -a app -f fam -d db; f sub=a, device=' .
    "$id_to_test, analog=A; echo -c on; echo -o A_value.txt; /DIS;\"";
(Or you could have wrapped it all in one big "" / qq{} with no concatenation operator.)
And, as others have hinted, qq{} is essentially "", with the added advantage that you can include double quote characters without having to escape them.
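For illustration, a minimal qq{} version of the command above (the id value is a hypothetical placeholder):
my $id_to_test = '1234567';   # hypothetical id, for illustration only
my $cmd = qq{command -c "dbcommand -a app -f fam -d db; f sub=a, device=$id_to_test, analog=A; echo -c on; echo -o A_value.txt; /DIS;"};
print "$cmd\n";   # the double quotes appear literally: no backslashes, no concatenation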

Perl backticks not capturing output

I have the following program snippet
my $nfdump_command = "nfdump -M /data/nfsen/profiles-data/live/upstream1 -T -R ${syear}/${smonth}/${sday}/nfcapd.${syear}${smonth}${sday}0000:${eyear}/${emonth}/${eday}/nfcapd.${eyear}${emonth}${eday}2355 -n 100 -s ip/bytes -N -o csv -q | awk 'BEGIN { FS = \",\" } ; { if (NR > 1) print \$5, \$10 }'";
syslog("info", $nfdump_command);
my %args;
Nfcomm::socket_send_ok ($socket, \%args);
my @nfdump_output = `$nfdump_command`;
my %domain_name_to_bytes;
my %domain_name_to_ip_addresses;
syslog("info", Dumper(\#nfdump_output));
foreach my $a_line (#nfdump_output) {
syslog("info", "LINE: " . $a_line);
}
Bug: @nfdump_output is empty.
The $nfdump_command is correct and prints output when run individually.
This program was working for some time and then it broke; I couldn't figure out why. After moving my development setup to another virtual machine, I found out that using an absolute path to nfdump fixes it.
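A hedged sketch of that fix, with an exit-status check so a silent failure at least shows up (the absolute path is an assumption; adjust it for your install):
my $nfdump = '/usr/local/bin/nfdump';                 # assumed location; adjust for your install
-x $nfdump or die "nfdump is not executable at $nfdump";
(my $cmd = $nfdump_command) =~ s/^nfdump\b/$nfdump/;  # swap the bare name for the absolute path
my @nfdump_output = `$cmd`;
die "nfdump pipeline exited with status " . ($? >> 8) if $? != 0;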

Different results with shell command in and out of a perl script

I have a perl script that needs to check for an empty directory on a remote machine. Using ksh I can get the following shell script to work:
ksh# ssh user@host '[ "$(ls -A /empty/dir/* 2>/dev/null)" ] && echo "1" || echo "0"'
This correctly returns a "0" if the directory is empty or does not exist. It returns a "1" only if the directory contains something.
When I place this line inside the Perl script, though, like so:
#!/usr/bin/perl
print `ssh user\@host '[ "$(ls -A /empty/dir/* 2>/dev/null)" ] && echo "1" || echo "0"'`
No matter what I put in there, it returns a "1", empty directory or not. I've compared the environment values between the normal shell and the Perl script and they are the same.
Does anyone have any ideas why this command would return different results only in the perl script?
Both machines are AIX 6.1 with KSH as the default shell.
Text inside backticks is interpolated as if it were inside double quotes before being passed to the OS. Run
print qq`ssh user\@host '[ "$(ls -A /empty/dir/* 2>/dev/null)" ] && echo "1" || echo "0"'`
to see exactly what string is getting passed to the OS. I'll bet you'll at least have to escape the $.
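For the record, a sketch of the escaped version; the key change is \$, so Perl passes the command substitution through to the remote shell instead of interpolating it (Perl's $( is the real-GID punctuation variable, which would explain why the unescaped version always prints 1):
print `ssh user\@host '[ "\$(ls -A /empty/dir/* 2>/dev/null)" ] && echo "1" || echo "0"'`;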
A safer and saner way is to build your command first and run it inside backticks later:
# q{...} does no interpolation
my $cmd = q{ssh user@host '[ "$(ls -A /empty/dir/* 2>/dev/null)" ] && echo "1" || echo "0"'};
print `$cmd`;
Alternatively, you can avoid the remote shell quoting entirely with Net::SFTP::Foreign:
use Net::SFTP::Foreign;
my $s = Net::SFTP::Foreign->new('user@host');
my $empty = 1;
if (my $d = $s->opendir('/empty/dir')) {
    if (defined $s->readdir($d)) {
        $empty = 0;
    }
}
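One caveat on the sketch above: Net::SFTP::Foreign reports connection problems via its error method rather than by dying, so it is worth checking after the constructor:
my $s = Net::SFTP::Foreign->new('user@host');
$s->error and die "SFTP connection failed: " . $s->error;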

perl backticks: use bash instead of sh

I noticed that when I use backticks in perl the commands are executed using sh, not bash, giving me some problems.
How can I change that behavior so perl will use bash?
PS. The command that I'm trying to run is:
paste filename <(cut -d \" \" -f 2 filename2 | grep -v mean) >> filename3
The "system shell" is not generally mutable. See perldoc -f exec:
If there is more than one argument in LIST, or if LIST is an array with more than one value, calls execvp(3) with the arguments in LIST. If there is only one scalar argument or an array with one element in it, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is "/bin/sh -c" on Unix platforms, but varies on other platforms).
If you really need bash to perform a particular task, consider calling it explicitly:
my $result = `/usr/bin/bash command arguments`;
or even:
open my $bash_handle, '| /usr/bin/bash' or die "Cannot open bash: $!";
print $bash_handle 'command arguments';
You could also put your bash commands into a .sh file and invoke that directly:
my $result = `/usr/bin/bash script.sh`;
Try:
`bash -c "your command with args"`
I am fairly sure the argument of -c is interpreted the way bash interprets its own command line. The trick is to protect it from sh - that's what the quotes are for.
This example works for me:
$ perl -e 'print `/bin/bash -c "echo <(pwd)"`'
/dev/fd/63
To deal with running bash and nested quotes, this article provides the best solution: How can I use bash syntax in Perl's system()?
my #args = ( "bash", "-c", "diff <(ls -l) <(ls -al)" );
system(#args);
I thought perl would honor the $SHELL variable, but then it occurred to me that its behavior might actually depend on your system's exec implementation. In mine, it seems that exec will execute the shell (/bin/sh) with the path of the file as its first argument.
You can always do qw/bash your-command/, no?
Create a perl subroutine:
sub bash { return `cat << 'EOF' | /bin/bash\n$_[0]\nEOF\n`; }
And use it like below:
my $bash_cmd = 'paste filename <(cut -d " " -f 2 filename2 | grep -v mean) >> filename3';
print &bash($bash_cmd);
Or use perl here-doc for multi-line commands:
$bash_cmd = <<'EOF';
for (( i = 0; i < 10; i++ )); do
echo "${i}"
done
EOF
print &bash($bash_cmd);
I like to define a couple of functions: btck (which integrates error checking) and bash_btck (which uses bash):
use Carp;
sub btck ($)
{
    # Like backticks, but the error check and chomp() are integrated
    my $cmd = shift;
    my $result = `$cmd`;
    $? == 0 or confess "backtick command '$cmd' returned non-zero";
    chomp($result);
    return $result;
}
sub bash_btck ($)
{
    # Like backticks, but use bash; the error check and chomp() are integrated
    my $cmd = shift;
    my $sqpc = $cmd;       # Single-Quote-Protected Command
    $sqpc =~ s/'/'"'"'/g;
    my $bc = "bash -c '$sqpc'";
    return btck($bc);
}
One of the reasons I like to use bash is for safe pipe behavior:
sub safe_btck ($)
{
    return bash_btck('set -o pipefail && ' . shift);
}