tee stderr to file (csh/tcsh shell) - redirect

How do I redirect STDERR to a file while both STDOUT and STDERR still show on screen?
I found many methods on the web, but they don't work in the csh/tcsh shell.
However, my command cannot run in a bash shell.
I know something like the following:
(command > /dev/null) >& stderr.log
But this will mask the screen display.

tcsh's I/O redirection options are limited to redirecting stdout and stderr together, or just stdout.
One option is to redirect stdout to /dev/tty and then dup stderr into stdout and tee it.
% (command > /dev/tty) |& tee stderr.log
Note that this will always write to the console, even if used in a script which you then pipe somewhere else.
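For example, a minimal check of the /dev/tty approach (the failing ls here is just my illustration, not part of the original answer):
% (ls /nonexistent > /dev/tty) |& tee stderr.log
% cat stderr.log
Only stderr reaches tee, because stdout was already sent to the terminal inside the parentheses; the error message shows up both on screen and in stderr.log.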
You can call other shells from inside tcsh, so it's worth mentioning what they can do.
Bourne-compatible shells can only pipe stdout, but they do have syntax for manipulating file descriptors; by using a third file descriptor as temporary storage, we can swap stdout and stderr. The sequence 3>&2 2>&1 1>&3 3>&- accomplishes this and then closes file descriptor 3.
$ (command 3>&2 2>&1 1>&3 3>&- | tee stderr.log) 3>&2 2>&1 1>&3 3>&-
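Reading the inner sequence left to right (a commented sketch; mycmd is a placeholder, not from the original answer):
# 3>&2 : save stderr by copying it onto fd 3
# 2>&1 : point stderr into the pipe (where stdout was going)
# 1>&3 : point stdout at the saved stderr
# 3>&- : close the temporary fd 3
(mycmd 3>&2 2>&1 1>&3 3>&- | tee stderr.log) 3>&2 2>&1 1>&3 3>&-
The pipe now carries the original stderr into tee, and the identical sequence outside the parentheses swaps the two streams back so stdout is stdout again.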
Bash and some other shells like zsh support process substitution, which lets you use a command as a source <(...) or destination >(...).
$ command 2> >(tee stderr.log)
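Process substitution also lets you capture both streams into separate logs while both stay visible on screen; a sketch along the same lines (file names are mine):
$ command > >(tee stdout.log) 2> >(tee stderr.log >&2)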
If you don't mind rants, http://www.grymoire.com/unix/CshTop10.txt and http://www.perl.com/doc/FMTEYEWTK/versus/csh.whynot have some decent information about the rough edges of (t)csh quoting, redirection, conditionals, and aliases. Some of the information about specific bugs is long out of date, however.

Related

Msys: how to keep STDERR on the screen and at the same time copy both STDOUT and STDERR to files

I would like to be able to
Display STDERR on the screen
Copy STDOUT and STDERR to files (and if possible to the same file)
For information, I am using Msys to do that.
After doing some research on SO, I tried to use something like
<my command> > >(tee stdout.log) 2> >(tee stderr.log)
But I got the following error:
sh: syntax error near unexpected token `>'
Any idea on how to do that?
There might be no direct solution in Msys, but the >(tee ...) approach works fine on *nix, OS X, and probably Cygwin.
The workaround is to grep for the errors and warnings we want to keep on the screen.
I have successfully used the following command for a makefile to compile C code:
make 2>&1 | tee make.log | grep -E "(([Ee]rror|warning|make):|In function|undefined)"
I have a simple script (test.sh) that generates STDOUT and STDERR:
#!/bin/bash
echo hello      # writes to STDOUT
rm something    # error goes to STDERR (assuming the file doesn't exist)
exit
Then, to do what you want, execute the following:
./test.sh > stdout.log 2> >(tee stderr.log >&2)
You'll get STDERR on the screen, and two separate log files with STDERR and STDOUT. I used part of the answer given here
Note that I am assuming you don't have a file called something on the current directory :)
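For instance, a run might look like this (the exact rm message depends on your system):
$ ./test.sh > stdout.log 2> >(tee stderr.log >&2)
rm: cannot remove 'something': No such file or directory
$ cat stdout.log
hello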
If you want both STDOUT and STDERR to go to the same file, use the -a option on tee:
./test.sh > std.log 2> >(tee -a std.log >&2)

redirect stdout and stderr to separate files and to screen

I have been through quite a few variants of this question but none seems to do exactly what I want. I would like to start csh, have STDOUT go to one file and to the screen, and have STDERR go to a different file, and to the screen as well.
This command:
csh > stdout_file.txt 2> stderr_file.txt
gets STDOUT and STDERR to different files, but then nothing goes to the screen. How can I get what is written to the files to go to the screen as well? I assume tee should be a part of this, but whatever I try gives me syntax errors.
((csh | tee f1) >`tty`) 2>&1 | tee f2
or maybe
((csh | tee f1) >`tty`) |& tee f2
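A quick sketch of what the second form does (my annotations, not the answerer's):
# csh | tee f1    stdout is copied into f1 as it passes through tee
# > `tty`         tee's output is then forced back to the terminal
# |& tee f2       all that remains in the outer pipe is stderr, copied into f2
((csh | tee f1) > `tty`) |& tee f2
Both streams end up on screen, stdout lands in f1, and stderr lands in f2.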

Logging stdout and stderr separately and together simultaneously

I have a specific problem I'd like to solve, and I'm running system through Perl, which I think runs in bash:
Show stdout and stderr in both.log. Add "done" if it finished.
Append stderr to stderr.log
Don't print out to terminal, except the original command and "done" if the command finished.
So once we combine stdout and stderr, can we separate them again? OR can we first capture stderr and then combine it with stdout after?
bash: system("((" . $cmd . " 2>&1 1>&3 | tee -a stderr.log) 3>&1) > both.log; echo done | tee -a both.log");
Even though we use tee for stderr.log, it doesn't tee the error output to the terminal (which is what I want). I don't completely understand how it works, but I guess tee sends the output both to the log file and on through the remaining redirections.
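To unpack that pipeline, here is the same redirection written as plain bash with my own annotations ($cmd stands for the wrapped command):
# $cmd 2>&1 1>&3       stderr joins the pipe; stdout is parked on fd 3
# | tee -a stderr.log  stderr is appended to stderr.log and passed along
# ( ... ) 3>&1         fd 3 (the parked stdout) is merged back into stdout
# > both.log           the recombined streams land in both.log
(($cmd 2>&1 1>&3 | tee -a stderr.log) 3>&1) > both.log
Nothing reaches the terminal because the outermost stdout is redirected to both.log; only the final echo done | tee -a both.log prints to the screen.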
Bonus
I found this here in the comments: http://wiki.bash-hackers.org/howto/redirection_tutorial
This can be used to do the same thing, but all output is also teed to terminal (It doesn't print "done").
bash:
(($command 2>&1 1>&3 | tee stderr.log) 3>&1 ) | tee both.log
Using Perl 5.14.1.
tcsh alone would not suffice here. It does not support redirecting stderr without stdout.
Use another shell or a dedicated script.
More here.
UPDATE:
Can this be ported to tcsh?
No. tcsh does not support redirecting stderr without stdout.
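From tcsh you can still delegate the tricky redirection to bash for a single command (a sketch; mycmd is a placeholder):
% bash -c '((mycmd 2>&1 1>&3 | tee -a stderr.log) 3>&1) > both.log'
Keep the bash snippet in single quotes so tcsh doesn't try to parse the redirections itself.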

Redirecting the output of both stdout and stderr?

I have one program which writes its output to stderr, and it also runs an executable internally
which writes to stdout. I want to redirect the output of both to the same file using a redirection operator, something like "./a.out 2> output.txt", but this only redirects stderr. How do I also specify stdout here?
Under Linux:
./a.out > output.txt 2>&1
Or simply
./a.out &> output.txt
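And if you also want the combined stream on screen while it is logged (not asked here, but in the spirit of the surrounding questions), the usual pattern is:
./a.out 2>&1 | tee output.txt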

Perl's diamond operator: can it be done in bash?

Is there an idiomatic way to simulate Perl's diamond operator in bash? With the diamond operator,
script.sh | ...
reads stdin for its input and
script.sh file1 file2 | ...
reads file1 and file2 for its input.
One other constraint is that I want to use the stdin in script.sh for something else other than input to my own script. The below code does what I want for the file1 file2 ... case above, but not for data provided on stdin.
command - $@ <<EOF
some_code_for_first_argument_of_command_here
EOF
I'd prefer a Bash solution but any Unix shell is OK.
Edit: for clarification, here is the content of script.sh:
#!/bin/bash
command - $@ <<EOF
some_code_for_first_argument_of_command_here
EOF
I want this to work the way the diamond operator would work in Perl, but it only handles filenames-as-arguments right now.
Edit 2: I can't do anything that goes
cat XXX | command
because the stdin for command is not the user's data. The stdin for command is my data in the here-doc. I would like the user data to come in on the stdin of my script, but it can't be the stdin of the call to command inside my script.
Sure, this is totally doable:
#!/bin/bash
cat "$@" | some_command_goes_here
Users can then call your script with no arguments (or '-') to read from stdin, or multiple files, all of which will be read.
If you want to process the contents of those files (say, line-by-line), you could do something like this:
while IFS= read -r line; do
    echo "I read: $line"
done < <(cat "$@")
Edit: Changed $* to "$@" to handle spaces in filenames, thanks to a helpful comment.
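Hypothetical usage, assuming the script above is saved as script.sh:
$ ./script.sh file1 file2    # reads file1 and file2
$ some_cmd | ./script.sh     # no args: cat falls through to stdin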
Kind of cheesy, but how about
cat file1 file2 | script.sh
I am (like everyone else, it seems) a bit confused about exactly what the goal is here, so I'll give three possible answers that may cover what you actually want. First, the relatively simple goal of getting the script to read from either a list of files (supplied on the command line) or from its regular stdin:
if [ $# -gt 0 ]; then
exec < <(cat "$@")
fi
# From this point on, the script's stdin is redirected from the files
# (if any) supplied on the command line
Note: the double-quoted use of "$@" is the best way to avoid problems with funny characters (e.g. spaces) in filenames -- $* and unquoted $@ both mess this up. The <() trick I'm using here is a bash-only feature; it fires off cat in the background to feed data from files supplied on the command line, and then we use exec to replace the script's stdin with the output from cat.
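Putting that together as a runnable sketch (the loop body is mine, just to show that the stdin swap took effect):
#!/bin/bash
if [ $# -gt 0 ]; then
    exec < <(cat "$@")
fi
# Everything below reads from the files, or from the original stdin
while IFS= read -r line; do
    echo "got: $line"
done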
...but that doesn't seem to be what you actually want. What you seem to really want is to pass the supplied filenames or the script's stdin as arguments to a command inside the script. This requires sort of the opposite process: converting the script's stdin into a file (actually a named pipe) whose name can be passed to the command. Like this:
if [[ $# -gt 0 ]]; then
command "$@" <<EOF
here-doc goes here
EOF
else
command <(cat) <<EOF
here-doc goes here
EOF
fi
This uses <() to launder the script's stdin through cat to a named pipe, which is then passed to command as an argument. Meanwhile, command's stdin is taken from the here-doc.
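To make that concrete, here's the same trick with wc -l standing in for command (my choice of command, purely illustrative):
$ printf 'a\nb\nc\n' | wc -l <(cat)
wc reports 3 lines, read from a /dev/fd pipe that was fed by stdin.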
Now, I think that's what you want to do, but it's not quite what you've asked for, which is to both redirect the script's stdin from the supplied files and pass stdin to the command inside the script. This can be done by combining the above techniques:
if [ $# -gt 0 ]; then
exec < <(cat "$@")
fi
command <(cat) <<EOF
here-doc goes here
EOF
...although I can't think why you'd actually want to do this.
The Perl diamond operator essentially loops across all the command line arguments, treating each as a filename. It opens each file and reads them line-by-line. Here's some bash code that will do approximately the same.
for f in "$@"
do
    # Do something with "$f", such as...
    cat "$f" | command1 | command2
    # -or-
    command1 < "$f"
    # -or-
    # Read "$f" line-by-line
    while IFS= read -r line_from_f
    do
        # Do stuff with $line_from_f
        :
    done < "$f"
done
You want to take the first argument and do something with it, and then either read from any files specified or stdin if no files?
Personally, I'd suggest using getopt to indicate arguments using the "-a value" syntax to help disambiguate, but that's just me. Here's how I'd do it in bash without getopts:
firstarg=${1:?usage: $0 arg [file1 .. fileN]}
shift
typeset -a files
if [[ ${#@} -gt 0 ]]
then
files=( "$@" )
else
files=( "/dev/stdin" )
fi
for file in "${files[@]}"
do
whatever_you_want < "$file"
done
The :? operator will die if there are no args specified, since you seem to want at least one arg either way. After grabbing that, shift the args over by one, and then either use the remaining args as your file list, or the bash special filehandle "/dev/stdin" if there were no other args.
I think that the "if no files are specified, use /dev/stdin - otherwise use the files on the command line" piece is probably what you're looking for, but the rest of the code is at least useful for context.
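Hypothetical usage, assuming the snippet is saved as myscript.sh:
$ ./myscript.sh myarg file1 file2      # loops over file1 and file2
$ printf 'x\ny\n' | ./myscript.sh myarg    # falls back to /dev/stdin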
Also a little cheesy, but how about this:
if [[ $# -eq 0 ]]
then
# read from stdin
else
# read from $* (args)
fi
If you need to read and process line-by-line (which is likely) and don't want to copy/paste the same code twice (which is likely), define a function in your script and just pass the lines one-by-one to this function, and process them in said function.
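A minimal sketch of that suggestion (the function name and body are mine, not the poster's):
#!/bin/bash
process_line() {
    # whatever per-line work you need goes here
    echo "I read: $1"
}

[ $# -eq 0 ] && set -- /dev/stdin    # no args: read from stdin
for f in "$@"; do
    while IFS= read -r line; do
        process_line "$line"
    done < "$f"
done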
Why not use `cat $*` in the script? For example:
x=`cat $*`
echo "$x"    # quote to preserve line breaks