How can I use eval to execute the command stored in a variable and also background it on the same line? I'm trying the following, but it's not working. For example, with s xeyes I would expect the shell prompt to return immediately.
function s --description "Start a command in the background and remove from jobs list"
    echo (count $argv)
    if test (count $argv) -ne 1
        echo "illegal number of parameters"
        return 1
    end
    eval $argv[1] 2>&1 > /dev/null &
    disown
end
Fish does not support backgrounding functions. eval is a function, so backgrounding it is not supported.
You need to put the & into the eval'd code, so
eval "$argv[1] 2>&1 >/dev/null &"
might work.
Alternatively, as of fish 3.0, eval isn't needed here anymore, so you can just do
$argv[1] 2>&1 >/dev/null &
I have a very simple Perl script that fails with this error message:
sh: 1: Syntax error: Bad fd number
Here is the script (two lines)
#!/usr/bin/perl
system("xterm >& /dev/null &");
If I run the same xterm command from the command-line, it works. From the Perl script, it doesn't. What is wrong?
system(EXPR)
is short for[1]
system("/bin/sh", "-c", EXPR)
In other words, it takes a Bourne shell command.
xterm >& /dev/null &
isn't a valid Bourne shell command. You want
xterm >/dev/null 2>&1 &
Maybe you used a different shell when you tested it outside of Perl.
Technically, it's closer to
use Config qw( );
system($Config::Config{sh}, "-c", EXPR)
Except on Windows.
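To see the shell difference concretely, here is a small demonstration you can run yourself; it assumes /bin/sh on your system is a POSIX shell such as dash (as on Debian/Ubuntu), which is what produces the "Bad fd number" message:
# Under a POSIX sh, '>&' must be followed by a file descriptor number,
# so a filename after it is a syntax error:
sh -c 'echo hi >& /dev/null'        # sh: 1: Syntax error: Bad fd number
# The portable spelling works in every Bourne-style shell:
sh -c 'echo hi >/dev/null 2>&1'     # silent, no error
# bash (like csh) additionally accepts '>& file' as "stdout and stderr to file":
bash -c 'echo hi >& /dev/null'      # also silent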
Firstly, the preferred syntax for redirecting both stdout and stderr in Bash is &>, not >&, because the latter can be confused with other redirection forms.
Secondly, system uses /bin/sh which may behave differently than your default shell.
Try writing it out explicitly, as in
system("xterm >/dev/null 2>&1 &");
or skipping the shell altogether.
use POSIX ();

if (fork() == 0) {
    # child: silence stdout and stderr, then replace this process with xterm
    open STDOUT, '>', '/dev/null';
    open STDERR, '>&', *STDOUT;
    exec "xterm";
    POSIX::_exit(1);   # only reached if exec itself fails
}
I am trying to capture the output of a command. It works fine when the command succeeds. However, when there is an error, I am unable to capture what gets displayed on the command line.
E.g.:
$ out=`/opt/torque/bin/qsub submitscript`
qsub: Unauthorized Request MSG=group ACL is not satisfied: user abc@xyz.org, queue home
$ echo $out
$
I want $out to contain that error message.
Errors are on stderr, so you need to redirect them into stdout so the backticks will capture it:
out=`/opt/torque/bin/qsub submitscript 2>&1`
if [ $? -gt 0 ] ; then
    # By convention, this is sent to stderr, but if you need it on
    # stdout, just remove the >&2 redirection
    echo "Error: $out" >&2
else
    echo "Success: $out"
fi
You should test the exit status of the command to figure out what the output represents (one way is shown above). It is similar in Perl, with slightly different syntax of course.
Have you tried doing it like this?
$ out=`/opt/torque/bin/qsub submitscript 2>&1 > /dev/null`
$ echo $out
Note the order of the redirections: 2>&1 first duplicates stderr onto the stdout that the backticks capture, and > /dev/null then discards the command's normal output, so $out ends up containing only the error messages.
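If you want to convince yourself of which stream each form captures, here is a small self-contained sketch; noisy is just a made-up stand-in for qsub that writes one line to each stream:
#!/bin/bash
noisy() { echo "normal output"; echo "error output" >&2; }
both=`noisy 2>&1`                 # captures stdout and stderr together
only_err=`noisy 2>&1 >/dev/null`  # captures stderr only; stdout is discarded
echo "both:     $both"
echo "only_err: $only_err"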
I need to run a system command which goes into a directory and deletes its subdirectories while leaving any files in place. I wrote the command below to perform this operation:
system("cd /home/faizan/test/cache ; for i in *\; do if [ -d \"$i\" ]\; then echo \$i fi done");
The command above keeps throwing a syntax error. I have tried multiple combinations but am still not clear on how this should go. Please suggest.
Well, your command line does contain syntax errors. Try this:
system("cd /home/faizan/test/cache ; for i in *; do if [ -d \"\$i\" ]; then echo \$i; fi; done");
Or better yet, only loop over directories in the first place:
system("for i in /home/faizan/test/cache/*/.; do echo \$i; done");
Or better yet, do it without a loop:
system("echo /home/faizan/test/cache/*/.");
(I suppose you will want to rmdir instead of echo once it is properly debugged.)
Or better yet, do it all in Perl. There is nothing here which requires system().
You're still best off trying this as a bash command first. Formatting that properly makes it much clearer that you're missing statement terminators:
for i in *; do
    if [ -d "$i" ]; then
        echo $i
    fi
done
And condensing that by replacing new lines with semicolons (apart from after do/then):
for i in *; do if [ -d "$i" ]; then echo $i; fi; done
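If you do stay in the shell, a minimal sketch of the eventual delete step might look like the following; it only echoes the rm commands so you can inspect them first, and it assumes you really do want to remove whole subdirectory trees:
cd /home/faizan/test/cache || exit 1
for d in */; do           # a trailing slash in the glob matches directories only
    echo rm -r -- "$d"    # drop the leading echo once the list looks right
done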
Or as has been mentioned, just do it in Perl (I haven't tested this to the point of actually uncommenting remove_tree - be careful!):
use strict;
use warnings;
use File::Path 'remove_tree';
use feature 'say';
chdir '/tmp' or die "chdir: $!";   # point this at /home/faizan/test/cache for real use
opendir my $cache, '.' or die "opendir: $!";
while (my $item = readdir($cache)) {
    if ($item !~ /^\.\.?$/ && -d $item) {
        say "Deleting '$item'...";
        # remove_tree($item);
    }
}
closedir $cache;
Using system
my #args = ("cd /home/faizan/test/cache ; for i in *; do if [ -d \"\$i\" ]; then echo \$i; fi; done");
system(#args);
Using Subroutine
sub do_stuff {
    my @args = ( "bash", "-c", shift );
    system(@args);
}
do_stuff("cd /home/faizan/test/cache ; for i in *; do if [ -d \"\$i\" ]; then echo \$i; fi; done");
Since the question title asks about the system command, this answers that directly, but the sample command you run through bash only does things that would be simpler in pure Perl (take a look at the other answer using opendir and -d in Perl).
If you want to use system (instead of open $cmdHandle, "bash -c ... |"), the preferred way to run commands with system or exec is to let Perl handle the argument list itself rather than handing one big string to a shell.
Try this (as you've already done):
perl -e 'system("bash -c \"echo hello world\"")'
hello world
perl -e 'system "bash -c \"echo hello world\"";'
hello world
Now, better: the same thing, but with Perl passing the arguments directly. Try this:
perl -e 'system "bash","-c","echo hello world";'
hello world
There are clearly three arguments to the system command:
bash
-c
the script
or a little more:
perl -e 'system "bash","-c","echo hello world;date +\"Now it is %T\";";'
hello world
Now it is 11:43:44
As you can see in this last example, there are no doubled double-quotes enclosing the bash script part of the command line.
Note: on the command line, using perl -e '...' or perl -e "...", it is a little awkward to juggle single and double quotes. In a script, you can mix them:
system 'bash','-c','for ((i=10;i--;));do printf "Number: %2d\n" $i;done';
or even:
system 'bash','-c','for ((i=10;i--;));do'."\n".
'printf "Number: %2d\n" $i'."\n".
'done';
Using dots (.) to concatenate the parts of the script string, there are still only three arguments.
I am calling many Perl scripts in my Bash script (sometimes from csh also).
At the start of the Bash script I want to put a test which checks if all the Perl scripts are devoid of any compilation errors.
One way of doing this would be to actually call the Perl script from the Bash script and grep for "compilation error" in the piped log file, but this becomes messy as different Perl scripts are called at different points in the code, so I want to do this at the very start of the Bash script.
Is there a way to check if the Perl script has no compilation error?
Beware!!
Using the below command to check compilation errors in your Perl program can be dangerous.
$ perl -c yourperlprogram
Randal has written a very nice article on this topic, which you should check out:
Sanity-checking your Perl code (Linux Magazine Column 91, Mar 2007)
Quoting from his article:
Probably the simplest thing we can tell is "is it valid?". For this,
we invoke perl itself, passing the compile-only switch:
perl -c ourprogram
For this operation, perl compiles the program,
but stops just short of the execution phase. This means that every
part of the program text is translated into the internal data
structure that represents the working program, but we haven't actually
executed any code. If there are any syntax errors, we're informed, and
the compilation aborts.
Actually, that's a bit of a lie. Thanks to BEGIN blocks (including
their layered-on cousin, the use directive), some Perl code may have
been executed during this theoretically safe "syntax check". For
example, if your code contains:
BEGIN { warn "Hello, world!\n" }
then you will see that message,
even during perl -c! This is somewhat surprising to people who
consider "compile only" to mean "executes no code". Consider the
code that contains:
BEGIN { system "rm", "-rf", "/" }
and you'll see the problem with
that argument. Oops.
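A harmless way to see this behaviour for yourself (using the warn example, emphatically not the rm -rf one) is a one-liner along these lines:
perl -c -e 'BEGIN { warn "this runs even under -c\n" } print "this never runs\n"'
# prints the warning followed by "-e syntax OK"; the print statement is never executed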
Apart from perl -c program.pl, it is also worth enabling warnings while checking. Combining the switches compiles with warnings enabled but still does not run the program:
perl -wc program.pl
For details see: http://www.perl.com/pub/2004/08/09/commandline.html
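Applied to the original question, a minimal sketch for the top of the Bash script might be the following; the script names are placeholders for whatever your script actually calls:
#!/bin/bash
# Abort early if any of the Perl scripts fails to compile.
for p in script1.pl script2.pl script3.pl; do   # placeholder names
    if ! perl -wc "$p" >/dev/null 2>&1; then
        echo "compilation check failed for $p" >&2
        exit 1
    fi
done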
I use the following part of a bash function for larger Perl projects:
# foreach perl app in the src/perl dir
while read -r dir ; do
    echo -e "\n"
    echo "start compiling $dir ..."
    cd "$product_instance_dir/src/perl/$dir"
    # run the autoloader utility
    find . -name '*.pm' -exec perl -MAutoSplit -e 'autosplit($ARGV[0], $ARGV[1], 0, 1, 1)' {} \;
    # foreach perl file check the syntax by setting the correct INC dirs
    while read -r file ; do
        perl -MCarp::Always -I `pwd` -I `pwd`/lib -wc "$file"
        ret=$?   # capture the exit status of the syntax check
        # optionally run perltidy inline
        # perltidy -b "$file"
        # sleep 3
        test $ret -ne 0 && break 2
    done < <(find "." -type f \( -name "*.pl" -or -name "*.pm" \))
    test $ret -ne 0 && break
    echo "stop compiling $dir ..."
    echo -e "\n\n"
    cd "$product_instance_dir"
done < <(ls -1 "src/perl")
When you need to check for errors/warnings before running, but your file depends on multiple other files, you can add the -I option:
perl -I /path/to/dependency/lib -c /path/to/file/to/check
Edit: from man perlrun
Directories specified by -I are prepended to the search path for modules (@INC).
I'm writing a script that will fold, sort and count text in a file. I need to design the program so that, if it is given multiple filenames on the command line, it processes each one separately, one after the other. I think I could write a loop, but I don't know much about those yet, so if possible I would like to try other options. Are there other options I can add to this so that more than one filename can be entered on the command line?
if test $# -lt 1
then
    echo "usage: $0 Enter at least one DNA filename"
    exit
fi
if test -r $*
then
    fold -w3 $* | sort | uniq -c | sort -k1,1nr -k2
else
    echo "usage: $* must be readable"
    exit
fi
A for loop is appropriate here. The following form iterates over the positional arguments:
for f; do
    # do work here using "$f" as the current argument
done
This is equivalent to a more verbose version:
for f in "$#"; do
# do work here using "$f" as the current argument
done
You can use a while loop and shift to iterate through the command line arguments one by one as:
if test $# -lt 1   # insufficient arguments
then
    echo "usage: $0 Enter at least one DNA filename"
    exit
fi
# Loop through the arguments one by one,
# till their number ($#) becomes 0.
while test $# -gt 0
do
    if test -r "$1"   # use $1 here; $* would be all the arguments at once
    then
        fold -w3 "$1" | sort | uniq -c | sort -k1,1nr -k2
    else
        echo "usage: $1 must be readable"
        exit
    fi
    # shift so that the 2nd argument now comes into $1
    shift
done