In the Perforce CLI, the command p4 login -s outputs Perforce password (P4PASSWD) invalid or unset. if no user is logged in.
When I pipe this command's output to the find command, I expect to get nothing back, but I still see the same message:
How can I pipe this command so the filter applies as I expect?
The Perforce password (P4PASSWD) invalid or unset. message is written to STDERR, and find (and findstr, for that matter) only operates on STDOUT. To solve this, use:
p4 login -s 2>&1 | find "gg"
This redirects STDERR (that is, stream 2) into STDOUT (stream 1), so find can see it.
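A quick way to confirm the message travels on STDERR: discard stream 2 and find receives nothing to match; merge it and the line comes through (here using "P4PASSWD" as the search string rather than the arbitrary "gg"):

p4 login -s 2>nul | find "P4PASSWD"
(no output)

p4 login -s 2>&1 | find "P4PASSWD"
Perforce password (P4PASSWD) invalid or unset.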
I'm using PostgreSQL on Windows 7 through the command line. I want to import the content of different CSV files into a newly created table.
Before executing the command, the prompt showed the database name like this:
database=#
After executing:
type directory/*.csv | psql -c 'COPY sch.trips(value1, value2) from stdin CSV HEADER';
it now appears like this:
database*#
What does *# mean?
This answer is for Linux and as such doesn't answer the OP's question for Windows. I'll leave it up anyway for anyone who comes across this in the future.
You accidentally started a block comment with your type directory/*.csv. type doesn't do what you think it does. From the Bash builtins documentation:
With no options, indicate how each name would be interpreted if used as a command name.
Try doing cat instead:
cat directory/*.csv | psql -c 'COPY sch.trips(value1, value2) from stdin CSV HEADER';
If this gives you issues because each CSV has its own header, you can also do:
for file in directory/*.csv; do cat "$file" | psql -c 'COPY sch.trips(value1, value2) from stdin CSV HEADER'; done
Type Command
The type built-in command in Bash is a way of viewing how the command interpreter will resolve a name. For example, using it with ssh:
$ type ssh
ssh is /usr/bin/ssh
This indicates how ssh would be interpreted when you run ssh as a command in the current Bash environment. This is useful for things like aliases. For example, ll is usually an alias for ls -l. Here's what my Bash environment has for ll:
$ type ll
ll is aliased to `ls -l --color=auto'
For you, when the result of this command is piped to psql, psql encounters the /* in the input and treats it as the start of a block comment, which is what the database*# prompt means (the * indicates psql is waiting for the comment-close pattern, */).
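You can reproduce the prompt change directly in psql; the * appears as soon as a comment is opened and disappears once it is closed:

database=# /* this opens a block comment
database*# still inside the comment
database*# */
database=#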
Cat Command
cat is for concatenating multiple files together. By default, it writes to standard output, so cat directory/*.csv will write each CSV file to standard output, one after another. However, this means each CSV's header will also arrive mid-stream of the COPY. This may not be desirable, so:
For Loop
We can use for to loop over each file and individually import it. The version I have above, for file in directory/*.csv, will properly handle files with spaces. Properly formatted:
for file in directory/*.csv; do
cat "$file" | psql -c 'COPY sch.trips(value1, value2) from stdin CSV HEADER'
done
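If you would rather not rely on the HEADER option at all, a variant of the same loop (a sketch, assuming every CSV has exactly one header row) strips the header with tail before piping:

for file in directory/*.csv; do
    # drop the header row, then stream the remainder into COPY
    tail -n +2 "$file" | psql -c 'COPY sch.trips(value1, value2) from stdin CSV'
done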
References
PostgreSQL 10 Comments Documentation (postgresql.org)
type built-in Manual page (mankier.com)
cat Manual page (mankier.com)
Bash looping tutorial (tldp.org)
How to redirect STDERR to a file while both STDOUT & STDERR still show on screen?
I found many methods on the web, but they don't work in the csh/tcsh shell.
However, my command cannot be run in the Bash shell.
I know of something like the following:
(command > /dev/null) >& stderr.log
But this masks the screen display.
tcsh's I/O redirection options are limited to redirecting stdout and stderr together, or just stdout.
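In other words, the only built-in forms are:

% command > stdout.log        (stdout only)
% command >& both.log         (stdout and stderr together)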
One option is to redirect stdout to /dev/tty and then dup stderr into stdout and tee it.
% (command > /dev/tty) |& tee stderr.log
Note that this will always write to the console, even if used in a script which you then pipe somewhere else.
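As a quick check, here is a stand-in command that writes one line to each stream; OUT goes straight to the terminal while ERR passes through tee:

% (sh -c 'echo OUT; echo ERR >&2' > /dev/tty) |& tee stderr.log
OUT
ERR
% cat stderr.log
ERR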
You can call other shells from inside tcsh, so it's worth mentioning what they can do.
Bourne-compatible shells can only pipe stdout, but they do have syntax for manipulating file descriptors; by using a third file descriptor as temporary storage, we can swap stdout and stderr. The sequence 3>&2 2>&1 1>&3 3>&- accomplishes this and then closes file descriptor 3.
$ (command 3>&2 2>&1 1>&3 3>&- | tee stderr.log) 3>&2 2>&1 1>&3 3>&-
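To see that the double swap restores each stream to its original destination, here is a sketch with the same stand-in command; ERR is logged, yet both lines still arrive on their original streams:

$ (sh -c 'echo OUT; echo ERR >&2' 3>&2 2>&1 1>&3 3>&- | tee stderr.log) 3>&2 2>&1 1>&3 3>&-
OUT
ERR
$ cat stderr.log
ERR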
Bash and some other shells like zsh support process substitution, which lets you use a command as a source <(...) or destination >(...).
$ command 2> >(tee stderr.log)
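With the same stand-in command, ERR is captured by tee and still printed, while OUT is untouched:

$ sh -c 'echo OUT; echo ERR >&2' 2> >(tee stderr.log)
OUT
ERR
$ cat stderr.log
ERR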
If you don't mind rants, http://www.grymoire.com/unix/CshTop10.txt and http://www.perl.com/doc/FMTEYEWTK/versus/csh.whynot have some decent information about some of the rough edges of (t)csh quoting, redirection, conditionals, and aliases. Some of the information about specific bugs is long out of date, however.
I would like to be able to
Display STDERR on the screen
Copy STDOUT and STDERR to files (and, if possible, to the same file)
For information, I am using Msys to do that.
After doing some research on SO, I tried to use something like
<my command> > >(tee stdout.log) 2> >(tee stderr.log)
But I got the following error:
sh: syntax error near unexpected token `>'
Any idea on how to do that?
There might be no direct solution in Msys, but the >(tee ...) solution works fine in *nix, OS X, and probably Cygwin.
The workaround is to grep for the errors and warnings we want to keep on the screen.
I have successfully used the following command for a makefile to compile C code:
make 2>&1 | tee make.log | grep -E "(([Ee]rror|warning|make):|In function|undefined)"
I have a simple script (test.sh) that generates STDOUT and STDERR:
#!/bin/bash
echo hello
rm something
exit
Then, to do what you want, execute the following:
./test.sh > stdout.log 2> >(tee stderr.log >&2)
You'll get STDERR on the screen, and two separate log files with STDERR and STDOUT. I used part of the answer given here.
Note that I am assuming you don't have a file called something on the current directory :)
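For example, a run might look like this (assuming GNU coreutils' error wording for rm):

$ ./test.sh > stdout.log 2> >(tee stderr.log >&2)
rm: cannot remove 'something': No such file or directory
$ cat stdout.log
hello
$ cat stderr.log
rm: cannot remove 'something': No such file or directory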
If you want both STDOUT and STDERR to go to the same file, use the -a option on tee:
./test.sh > std.log 2> >(tee -a std.log >&2)
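Keep in mind that the script's redirection and tee -a write to std.log independently, so the relative ordering of STDOUT and STDERR lines in the combined file is not guaranteed.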
I have a specific problem I'd like to solve, and I'm running system through Perl, which I think runs in bash:
Show stdout and stderr in both.log. Add "done" if it finished.
Append stderr in stderr.log
Don't print out to terminal, except the original command and "done" if the command finished.
So once we combine stdout and stderr, can we separate them again? OR can we first capture stderr and then combine it with stdout after?
bash: system("((" . $cmd . " 2>&1 1>&3 | tee -a stderr.log) 3>&1) > both.log; echo done | tee -a both.log");
Even though we use tee for stderr.log, it doesn't tee the error output to the terminal (which is what I want). I don't completely understand how it works, but I guess tee writes it to the log file and also passes it along to the next file descriptor redirection.
Bonus
I found this in the comments here: http://wiki.bash-hackers.org/howto/redirection_tutorial
This can be used to do the same thing, but all output is also teed to the terminal (it doesn't print "done").
bash:
(($command 2>&1 1>&3 | tee stderr.log) 3>&1 ) | tee both.log
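For example, with a stand-in command that writes one line to each stream (output order may vary, and the nested parens are spaced here to avoid bash's (( arithmetic syntax):

$ ( ( sh -c 'echo OUT; echo ERR >&2' 2>&1 1>&3 | tee stderr.log ) 3>&1 ) | tee both.log
OUT
ERR

Both lines reach the terminal and both.log, and ERR additionally lands in stderr.log.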
Using Perl 5.14.1.
tcsh alone would not suffice here. It does not support redirecting stderr without stdout.
Use another shell or a dedicated script.
UPDATE:
Can this be ported to tcsh?
No. tcsh does not support redirecting stderr without stdout.
When I run isql with a script file:
isql.exe -q -e -i %1 -o %~n1.log
Then in the output file I see the commands, but errors from those commands appear on the screen while it runs.
The errors aren't written to the output file. Which option should I use so that errors are also written to the output file?
You have to use the -m(erge) command line switch in order to send the error messages into the output file.
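For example, adding it to the original command line:

isql.exe -q -e -m -i %1 -o %~n1.log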