Why is fish 3.0 printing an error message on start?

I recently upgraded to fish 3.0.1 via Homebrew 2.0.1 on macOS Mojave 10.14.2. Since the upgrade, the following error message appears every time fish starts:
contains: Unknown option '-gx'
/usr/local/Cellar/fish/3.0.1/share/fish/config.fish (line 426):
if not contains $entry $result
^
in function '__fish_macos_set_env'
called on line 228 of file /usr/local/Cellar/fish/3.0.1/share/fish/config.fish
with parameter list 'PATH /etc/paths /etc/paths.d'
from sourcing file /usr/local/Cellar/fish/3.0.1/share/fish/config.fish
called during startup
contains - test if a word is present in a list
Synopsis
contains [OPTIONS] KEY [VALUES...]
contains: Type 'help contains' for related documentation
contains: Unknown option '-e'
/usr/local/Cellar/fish/3.0.1/share/fish/config.fish (line 426):
if not contains $entry $result
^
in function '__fish_macos_set_env'
called on line 228 of file /usr/local/Cellar/fish/3.0.1/share/fish/config.fish
with parameter list 'PATH /etc/paths /etc/paths.d'
from sourcing file /usr/local/Cellar/fish/3.0.1/share/fish/config.fish
called during startup
contains - test if a word is present in a list
Synopsis
contains [OPTIONS] KEY [VALUES...]
contains: Type 'help contains' for related documentation
My first reflex was to have a look in the offending file, namely /usr/local/Cellar/fish/3.0.1/share/fish/config.fish, but this file is only 306 lines long, and therefore does not seem to contain the infamous line 426. I am copying this file below, in case it proves useful:
# Main file for fish command completions. This file contains various
# common helper functions for the command completions. All actual
# completions are located in the completions subdirectory.
#
# Set default field separators
#
set -g IFS \n\ \t
set -qg __fish_added_user_paths
or set -g __fish_added_user_paths
#
# Create the default command_not_found handler
#
function __fish_default_command_not_found_handler
    printf "fish: Unknown command %s\n" (string escape -- $argv[1]) >&2
end
if status --is-interactive
    # The user has seemingly explicitly launched an old fish with too-new scripts installed.
    if not contains -- "string" (builtin -n)
        set -g __is_launched_without_string 1
        # XXX nostring - fix old fish binaries with no `string' builtin.
        # When executed on fish 2.2.0, the `else' block after this would
        # force on 24-bit mode due to changes to in test behavior.
        # These "XXX nostring" hacks were added for 2.3.1
        set_color --bold
        echo "You appear to be trying to launch an old fish binary with newer scripts "
        echo "installed into" (set_color --underline)"$__fish_data_dir"
        set_color normal
        echo -e "\nThis is an unsupported configuration.\n"
        set_color yellow
        echo "You may need to uninstall and reinstall fish!"
        set_color normal
        # Remove this code when we've made it safer to upgrade fish.
    else
        # Enable truecolor/24-bit support for select terminals
        # Ignore Screen and emacs' ansi-term as they swallow the sequences, rendering the text white.
        if not set -q STY
            and not string match -q -- 'eterm*' $TERM
            and begin
                set -q KONSOLE_PROFILE_NAME # KDE's konsole
                or string match -q -- "*:*" $ITERM_SESSION_ID # Supporting versions of iTerm2 will include a colon here
                or string match -q -- "st-*" $TERM # suckless' st
                or test -n "$VTE_VERSION" -a "$VTE_VERSION" -ge 3600 # Should be all gtk3-vte-based terms after version 3.6.0.0
                or test "$COLORTERM" = truecolor -o "$COLORTERM" = 24bit # slang expects this
            end
            # Only set it if it isn't to allow override by setting to 0
            set -q fish_term24bit
            or set -g fish_term24bit 1
        end
    end
else
    # Hook up the default as the principal command_not_found handler
    # in case we are not interactive
    function __fish_command_not_found_handler --on-event fish_command_not_found
        __fish_default_command_not_found_handler $argv
    end
end
#
# Set default search paths for completions and shellscript functions
# unless they already exist
#
set -g __fish_config_dir ~/.config/fish
if set -q XDG_CONFIG_HOME
    set __fish_config_dir $XDG_CONFIG_HOME/fish
end
set -l userdatadir ~/.local/share
if set -q XDG_DATA_HOME
    set userdatadir $XDG_DATA_HOME
end
# __fish_data_dir, __fish_sysconf_dir, __fish_help_dir, __fish_bin_dir
# are expected to have been set up by read_init from fish.cpp
# Grab extra directories (as specified by the build process, usually for
# third-party packages to ship completions &c.
set -l __extra_completionsdir
set -l __extra_functionsdir
set -l __extra_confdir
if test -f $__fish_data_dir/__fish_build_paths.fish
    source $__fish_data_dir/__fish_build_paths.fish
end
# Set up function and completion paths. Make sure that the fish
# default functions/completions are included in the respective path.
if not set -q fish_function_path
    set fish_function_path $__fish_config_dir/functions $__fish_sysconf_dir/functions $__extra_functionsdir $__fish_data_dir/functions
end
if not contains -- $__fish_data_dir/functions $fish_function_path
    set fish_function_path $fish_function_path $__fish_data_dir/functions
end
if not set -q fish_complete_path
    set fish_complete_path $__fish_config_dir/completions $__fish_sysconf_dir/completions $__extra_completionsdir $__fish_data_dir/completions $userdatadir/fish/generated_completions
end
if not contains -- $__fish_data_dir/completions $fish_complete_path
    set fish_complete_path $fish_complete_path $__fish_data_dir/completions
end
# This cannot be in an autoload-file because `:.fish` is an invalid filename on windows.
function :
    # no-op function for compatibility with sh, bash, and others.
    # Often used to insert a comment into a chain of commands without having
    # it eat up the remainder of the line, handy in Makefiles.
end
#
# This is a Solaris-specific test to modify the PATH so that
# Posix-conformant tools are used by default. It is separate from the
# other PATH code because this directory needs to be prepended, not
# appended, since it contains POSIX-compliant replacements for various
# system utilities.
#
if test -d /usr/xpg4/bin
    if not contains -- /usr/xpg4/bin $PATH
        set PATH /usr/xpg4/bin $PATH
    end
end
# Add a handler for when fish_user_path changes, so we can apply the same changes to PATH
function __fish_reconstruct_path -d "Update PATH when fish_user_paths changes" --on-variable fish_user_paths
    set -l local_path $PATH
    for x in $__fish_added_user_paths
        set -l idx (contains --index -- $x $local_path)
        and set -e local_path[$idx]
    end
    set -g __fish_added_user_paths
    if set -q fish_user_paths
        for x in $fish_user_paths[-1..1]
            if set -l idx (contains --index -- $x $local_path)
                set -e local_path[$idx]
            else
                set -g __fish_added_user_paths $__fish_added_user_paths $x
            end
            set local_path $x $local_path
        end
    end
    set -xg PATH $local_path
end
#
# Launch debugger on SIGTRAP
#
function fish_sigtrap_handler --on-signal TRAP --no-scope-shadowing --description "Signal handler for the TRAP signal. Launches a debug prompt."
    breakpoint
end
#
# Whenever a prompt is displayed, make sure that interactive
# mode-specific initializations have been performed.
# This handler removes itself after it is first called.
#
function __fish_on_interactive --on-event fish_prompt
    __fish_config_interactive
    functions -e __fish_on_interactive
end
# Set the locale if it isn't explicitly set. Allowing the lack of locale env vars to imply the
# C/POSIX locale causes too many problems. Do this before reading the snippets because they might be
# in UTF-8 (with non-ASCII characters).
__fish_set_locale
# "." command for compatibility with old fish versions.
function . --description 'Evaluate contents of file (deprecated, see "source")' --no-scope-shadowing
    if test (count $argv) -eq 0
            # Uses tty directly, as isatty depends on "."
            and tty 0>&0 >/dev/null
        echo "source: '.' command is deprecated, and doesn't work with STDIN anymore. Did you mean 'source' or './'?" >&2
        return 1
    else
        source $argv
    end
end
# Upgrade pre-existing abbreviations from the old "key=value" to the new "key value" syntax.
# This needs to be in share/config.fish because __fish_config_interactive is called after sourcing
# config.fish, which might contain abbr calls.
if not set -q __fish_init_2_3_0
    if set -q fish_user_abbreviations
        set -l fab
        for abbr in $fish_user_abbreviations
            set fab $fab (string replace -r '^([^ =]+)=(.*)$' '$1 $2' -- $abbr)
        end
        set fish_user_abbreviations $fab
    end
    set -U __fish_init_2_3_0
end
# macOS-ism: Emulate calling path_helper.
if command -sq /usr/libexec/path_helper
    # Adapt construct_path from the macOS /usr/libexec/path_helper
    # executable for fish; see
    # https://opensource.apple.com/source/shell_cmds/shell_cmds-203/path_helper/path_helper.c.auto.html .
    function __fish_macos_set_env -d "set an environment variable like path_helper does (macOS only)"
        set -l result
        for path_file in $argv[2] $argv[3]/*
            if test -f $path_file
                while read -l entry
                    if not contains $entry $result
                        set result $result $entry
                    end
                end <$path_file
            end
        end
        for entry in $$argv[1]
            if not contains $entry $result
                set result $result $entry
            end
        end
        set -xg $argv[1] $result
    end
    __fish_macos_set_env 'PATH' '/etc/paths' '/etc/paths.d'
    if [ -n "$MANPATH" ]
        __fish_macos_set_env 'MANPATH' '/etc/manpaths' '/etc/manpaths.d'
    end
    functions -e __fish_macos_set_env
end
#
# Some things should only be done for login terminals
# This used to be in etc/config.fish - keep it here to keep the semantics
#
if status --is-login
    #
    # Put linux consoles in unicode mode.
    #
    if test "$TERM" = linux
        if string match -qir '\.UTF' -- $LANG
            if command -sq unicode_start
                unicode_start
            end
        end
    end
end
# Invoke this here to apply the current value of fish_user_path after
# PATH is possibly set above.
__fish_reconstruct_path
# Allow %n job expansion to be used with fg/bg/wait
# `jobs` is the only one that natively supports job expansion
function __fish_expand_pid_args
    for arg in $argv
        if string match -qr '^%\d+$' -- $arg
            # set newargv $newargv (jobs -p $arg)
            jobs -p $arg
            if not test $status -eq 0
                return 1
            end
        else
            printf "%s\n" $arg
        end
    end
end
function bg
    builtin bg (__fish_expand_pid_args $argv)
end
function fg
    builtin fg (__fish_expand_pid_args $argv)
end
function kill
    command kill (__fish_expand_pid_args $argv)
end
function wait
    builtin wait (__fish_expand_pid_args $argv)
end
function disown
    builtin disown (__fish_expand_pid_args $argv)
end
# As last part of initialization, source the conf directories.
# Implement precedence (User > Admin > Extra (e.g. vendors) > Fish) by basically doing "basename".
set -l sourcelist
for file in $__fish_config_dir/conf.d/*.fish $__fish_sysconf_dir/conf.d/*.fish $__extra_confdir/*.fish
    set -l basename (string replace -r '^.*/' '' -- $file)
    contains -- $basename $sourcelist
    and continue
    set sourcelist $sourcelist $basename
    # Also skip non-files or unreadable files.
    # This allows one to use e.g. symlinks to /dev/null to "mask" something (like in systemd).
    [ -f $file -a -r $file ]
    and source $file
end
What's going on here? How can I fix this?

What happens here is that a component of your $PATH, $fish_user_paths, or a line in /etc/paths looks like an option.
Most likely this is a mistake, and you should remove the offending entry.
E.g. try printf '%s\n' $fish_user_paths. If that shows that one of the entries is "-gx", then you've set it incorrectly and need to use set -e fish_user_paths[number-of-that-entry] to correct it.
Since "-gx" and "-e" are common options to set, you've probably at some point done something like set fish_user_paths /something -gx, which adds a "-gx" component (set only reads options given before the variable name).
This has been reported upstream at https://github.com/fish-shell/fish-shell/issues/5662, and future fish versions won't print an error for it, but the existence of the offending component is most likely a mistake, so you still want to remove it.
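For example, a minimal diagnostic session (the index 2 below is hypothetical; use whatever index the commands report on your machine):

printf '%s\n' $fish_user_paths
# or ask fish directly for the index of the bad entry:
contains --index -- '-gx' $fish_user_paths
# suppose it reports 2; erase that element:
set -e fish_user_paths[2]

Because fish_user_paths is usually a universal variable, erasing the element persists across restarts.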

Related

Can I pass a string from perl back to the calling c-shell?

RHEL6
I have a c-shell script that runs a perl script. After dumping tons of stuff to stdout, it determines where (what dir) the parent shell should cd to when the perl script finishes. But that's a string, not an int, which is all I can pass back with exit().
Storing the name of the dir in a file which the c-shell script can read is what I have now. It works, but is not elegant. Is there a better way to do this ? Maybe a little chunk of memory that I can share with the perl script ?
Short:
Redirect Perl's streams and restore in the end to print that info, taken by the shell script
Or, print that last and the shell script can pass output to the console and take the last line
Or, use a named pipe (either shell) or specific file descriptors (not csh) for that print
When the Perl script prints out that name you can assign it to a variable in the shell script
#!/bin/csh
set DIR = `perl -e'print "dir_name"'`
while in bash
#!/bin/bash
DIR="$(perl -e'print "dir_name"')"
where $(...) is preferred for the command substitution.
But those other prints to the console from the Perl script then need to be handled.
One way is to redirect all output in the Perl script other than that one print; this can be controlled by a command-line option (a filename to redirect to, which the shell script can print out afterwards)
Or, take all of Perl's output and pass it to the console, the last line being the needed "return." This puts the burden on the Perl script to print that value last (perhaps in an END block). The program's output can be printed from the shell script after it completes, or line by line as it is emitted.
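For example, a sketch of that END-block approach (script.pl and the directory value are made up):

#!/usr/bin/perl
use strict;
use warnings;
use feature 'say';

my $dir = '/tmp/target_dir';   # hypothetical value the script computes
say "...normal output...";
say "...more output...";

END { say $dir }   # runs at exit, so the directory is the last line printed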
Or, use a named pipe (both shells) or a specific file descriptor (bash only) to which the Perl script can print that information. In this case its streams go straight to the console.
The question explicitly mentions csh, so it is covered below. But I must repeat the old and worn fact that shell scripting is far better done in bash than in csh. I strongly recommend reconsidering.
bash
If you need the program's output on the console as it goes, take and print it line by line
#!/bin/bash
while read line; do
echo "$line"
DIR=$line
done < <(perl script.pl)
echo "$DIR"
Or, if you don't need output on the console before the script is finished
#!/bin/bash
mapfile -t lines < <(perl script.pl)
DIR="${lines[-1]}"
printf '%s\n' "${lines[#]}" # print script.pl's output
Or, use file descriptors for that particular print
F=$(mktemp) # safe filename
exec 3> "$F" # open fd 3 to write to it
exec 4< "$F" # open fd 4 to read from it
rm -f "$F" # remove file(name) for safety; opened fd's can still access
perl -E'$fd = shift; say "...normal prints to STDOUT...";
    open(FH, ">&=$fd") or die $!;
    say FH "dirname";
    close FH
' 3
read dir_name <&4
exec 3>&- # close them
exec 4<&-
echo "$dir_name"
I couldn't get it to work with a single file descriptor for both reading and writing (exec 3<> ...), I think because the read can't rewind after the write, thus separate descriptors are used.
With a real Perl script (rather than the demo one-liner above), pass the fd number as a command-line option. The script can then do this only if it's invoked with that option, as shown in the sketch below.
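Something along these lines, where --fd is a made-up option name:

use strict;
use warnings;
use feature 'say';
use Getopt::Long;

GetOptions('fd=i' => \my $fd);            # invoked as: script.pl --fd 3

say "...normal prints to STDOUT...";

if (defined $fd) {                        # only use the channel when asked to
    open my $fh, '>&=', $fd or die "Can't dup fd $fd: $!";
    say $fh '/tmp/target_dir';            # hypothetical directory name
    close $fh;
}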
Or, use a named pipe very similarly to how it's done for csh below. This is probably best here, if the manipulation of the program's STDOUT isn't to your liking.
csh
Iterate over the program's (completed) output line by line
#!/bin/csh
foreach line ( "`perl script.pl`" )
echo "$line"
set dir_name = "$line"
end
echo "Directory name: $dir_name"
or extract the last line first and then print the whole output
#!/bin/csh
set lines = ( "`perl script.pl`" )
set dir_name = $lines[$#]
# Print program's output
while ( $#lines )
echo "$lines[1]"
shift lines
end
or use a named pipe
set fifo_name = "/tmp/fifo$$" # or use mktemp
mkfifo "$fifo_name"
( perl script.pl --fifo $fifo_name [other args] & )
set dir_name = `cat "$fifo_name"`
rm -f $fifo_name
echo "dir name from FIFO: $dir_name"
The Perl command is in the background since a FIFO blocks until it is both written to and read from. If the shell script waited for perl ... to complete, the Perl script would block writing to the FIFO (since nothing is reading it), so the shell would never get to read it; we would deadlock. It is also in a subshell, with ( ), to avoid the informational prints about the background job.
The --fifo NAME command-line option is needed so that the Perl script knows what special file to use (and so it doesn't attempt this when the option is absent).
For an in-line example replace ( perl script ...) with this one-liner, used above as well
( perl -E'$ff = shift; say qq(\t...normal prints to STDOUT...);
    open FF, ">$ff" or die $!;
    say FF "dir_name_$$";
    close FF
' $fifo_name
& )
(broken over lines for readability)

Inserting headers into multiple files

I found a Perl command line that inserts headers into my files without going through the tedious process of inserting them one by one. Can someone walk me through the Perl aspect of this command line? I'm new to this and can't seem to find the right explanation for how it works.
cat header.txt | perl -0 -i -pe 'BEGIN{$h = <STDIN>}; print $h' 1*
-e
rather than provide a script in a xxxx.pl file, provide it on the command line
-p
makes it iterate over filename arguments somewhat like sed but also prints the contents of $_ at the end of the script.
the two above are combined in -pe
-i
indicate you want to edit the file in place and write the output to the same file. In practice, Perl renames the input file and reads from this renamed version while writing to a new file with the original name
-0
redefines the end of record character (\n by default) so that you can read the entire input file as a single line
1*
is the command line argument to your script, so I guess you are modifying any file with a name that starts with 1 (you could have used *.c, or whatever depending on the type of files you are trying to modify)
print $h
prints the variable $h, which is the "main" part of your script. If it was initialized with the content of the header file (the intent of this one-liner), then it will print the header file
BEGIN{ some code here }
this is stuff you execute before the script starts. This is where I'm stumped: this doesn't seem like valid Perl code
so basically:
this will supposedly slurp the entire header file (because of -0) in the BEGIN block and store it in the variable $h
iterate over all the files specified by the wildcards at the end of the command line
for each file: print the header (print $h) then print the file itself (because of -pe)
so it's equivalent to spelling the script out:
$h = <STDIN>; # gets content of the entire header file (the BEGIN block)
while (<>) { # loop implied by -pe, iterates over all the 1* files
    # the main contents of the "-e" script are inserted below as part of executing -pe
    print $h; # print the header we saved
    print $_; # implied by -pe, and since we are using -0, this prints the entire content in one shot
    # end of the "-e" script. again it was a single print $h statement, the second print is implied by -pe
}
It's a bit hard to explain; take a look at the perlrun documentation for details (run man perlrun).
This is not a 100% complete explanation, because I don't think the BEGIN block is right. I tried it on my Ubuntu machine and it complained about its syntax too.
Here's something similar, with an explanation. The program in the question doesn't run on my Mac.
I needed to add the #nullable disable directive to the top of all my csharp files as part of migrating to nullable reference types.
perl -w -i -p -0777 -e 's/^/#nullable disable\n\n/' $(find . -iname '*.cs')
-w enable warnings
-i edit files in place
-p read each file block by block, printing each block after applying a perl expression. the default block size is one line
-0777 changes the default block size to the entire file
-e the perl expression to execute
The final argument uses shell command substitution to create a list of files. It passes that list of file paths to the perl command. The find command searches for files that end in .cs.
The perl program is a single substitution command. It matches the very beginning of the block and replaces it (prepends, really) with "#nullable disable" and a couple of newlines.
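As a side note, if the tree is large enough, the $(find ...) substitution can overflow the shell's argument-length limit; letting find invoke perl itself avoids that. A variant sketch of the same command:

find . -iname '*.cs' -exec perl -w -i -p -0777 -e 's/^/#nullable disable\n\n/' {} +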

tee to 2 blocks of code?

I am trying to use the tee command on Solaris to route the output of one command to two different streams, each of which comprises multiple statements. Here is the snippet of what I coded, but it does not work. This iteration throws errors about an unexpected end of file. If I change the > to | it throws an error: syntax error near unexpected token `do'.
todaydir=/some/path
baselen=${#todaydir}
grep sometext $todaydir/somefiles*
while read iline
tee
>(
# this is the first block
do ojob=${iline:$baselen+1:8}
echo 'some text here' $ojob
done > firstoutfile
)
>(
# this is the 2nd block
do ojob=${iline:$baselen+1:8}
echo 'ls -l '$todaydir'/'$ojob'*'
done > secondoutfile
)
Suggestions?
The "while" should begin (and end) inside each >( ... ) substitution, not outside. Thus, I believe what you want is:
todaydir=/some/path
baselen=${#todaydir}
grep sometext $todaydir/somefiles* | tee >(
    # this is the first block
    while read iline
    do ojob=${iline:$baselen+1:8}
        echo 'some text here' $ojob
    done > firstoutfile
) >(
    # this is the 2nd block
    while read iline
    do ojob=${iline:$baselen+1:8}
        echo 'ls -l '$todaydir'/'$ojob'*'
    done > secondoutfile
)
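One detail to keep in mind: tee also copies its input to stdout, so the grep output will still land on the console. If that's unwanted, a final redirect discards it, e.g. (sketch, with the two substitutions elided):

grep sometext $todaydir/somefiles* | tee >( ... ) >( ... ) > /dev/null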
I don't think the tee command will do that. The tee command writes stdin to one or more files as well as spitting it back out to stdout. Plus, I'm not sure the shell can fork off two sub-processes in the command pipeline like you are trying. You'd probably be better off using something like Perl to fork off a couple of sub-processes and write stdin to each.

Substituting environment variables in a file: awk or sed?

I have a file of environment variables that I source in shell scripts, for example:
# This is a comment
ONE=1
TWO=2
THREE=THREE
# End
In my scripts, I source this file (assume it's called './vars') into the current environment, and change (some of) the variables based on user input. For example:
#!/bin/sh
# Read variables
source ./vars
# Change a variable
THREE=3
# Write variables back to the file??
awk 'BEGIN{FS="="}{print $1=$$1}' <./vars >./vars
As you can see, I've been experimenting with awk for writing the variables back, and with sed too, without success. The last line of the script fails. Is there a way to do this with awk or sed (preferably preserving comments, even comments with the '=' character)? Or should I combine 'read' with string cutting in a while loop or some other magic? If possible, I'd like to avoid perl/python and just use the tools available in Busybox. Many thanks.
Edit: perhaps a use case might make clear what my problem is. I keep a configuration file consisting of shell environment variable declarations:
# File: network.config
NETWORK_TYPE=wired
NETWORK_ADDRESS_RESOLUTION=dhcp
NETWORK_ADDRESS=
NETWORK_ADDRESS_MASK=
I also have a script called 'setup-network.sh':
#!/bin/sh
# File: setup-network.sh
# Read configuration
source network.config
# Setup network
NETWORK_DEVICE=none
if [ "$NETWORK_TYPE" == "wired" ]; then
NETWORK_DEVICE=eth0
fi
if [ "$NETWORK_TYPE" == "wireless" ]; then
NETWORK_DEVICE=wlan0
fi
ifconfig -i $NETWORK_DEVICE ...etc
I also have a script called 'configure-network.sh':
#!/bin/sh
# File: configure-network.sh
# Read configuration
source network.config
echo "Enter the network connection type:"
echo " 1. Wired network"
echo " 2. Wireless network"
read -p "Type:" -n1 TYPE
if [ "$TYPE" == "1" ]; then
# Update environment variable
NETWORK_TYPE=wired
elif [ "$TYPE" == "2" ]; then
# Update environment variable
NETWORK_TYPE=wireless
fi
# Rewrite configuration file, substituting the updated value
# of NETWORK_TYPE (and any other updated variables already existing
# in the network.config file), so that later invocations of
# 'setup-network.sh' read the updated configuration.
# TODO
How do I rewrite the configuration file, updating only the variables already existing in the configuration file, preferably leaving comments and empty lines intact? Hope this clears things up a little. Thanks again.
You can't use awk to read from and write to the same file (that is part of your problem).
I prefer to rename the file before I rewrite it (but you can save to a tmp file and then rename too).
/bin/mv file file.tmp
awk '.... code ...' file.tmp > file
If your env file gets bigger, you'll see that it is getting truncated at the buffer size of your OS.
Also, don't forget that gawk (the standard on most Linux installations) has a built-in array ENVIRON. You can create what you want from that
awk 'END {
    for (key in ENVIRON) {
        print key "=" ENVIRON[key]
    }
}' /dev/null
Of course you get everything in your environment, so maybe more than you want. But probably a better place to start with what you are trying to accomplish.
Edit
Most specifically
awk -F"=" '{
if ($1 in ENVIRON) {
printf("%s=%s\n", $1, ENVIRON[$1])
}
# else line not printed or add code to meet your situation
}' file > file.tmp
/bin/mv file.tmp file
Edit 2
I think your var=values might need to be export-ed so they are visible to awk's ENVIRON array.
AND
echo PATH=xxx| awk -F= '{print ENVIRON[$1]}'
prints the existing value of PATH.
I hope this helps.
P.S. as you appear to be a new user, if you get an answer that helps you please remember to mark it as accepted, and/or give it a + (or -) as a useful answer.
I don't exactly know what you are trying to do, but if you are trying to change the value of variable THREE ,
awk -F"=" -vt="$THREE" '$1=="THREE" {$2=t}{print $0>FILENAME}' OFS="=" vars
You can do this with just bash:
rewrite_config() {
    local filename="$1"
    local tmp=$(mktemp)
    # if you want the header
    echo "# File: $filename" >> "$tmp"
    while IFS='=' read -r var value; do
        # skip blank lines and comments; declare -p would choke on them
        [[ -z "$var" || "$var" == \#* ]] && continue
        # emit NAME="value" from the shell's current definition of NAME
        declare -p "$var" | cut -d ' ' -f 3-
    done < "$filename" >> "$tmp"
    mv "$tmp" "$filename"
}
Use it like
source network.config
# manipulate the variables
rewrite_config network.config
I use a temp file to maintain the existence of the config file for as long as possible.

How can I pipe from terminal in Perl without losing color?

I'm trying to write a perl script which takes the output of colorgcc (or any other script that prints colored text to terminal), adds/removes parts of the string, and then prints the result in the same color as the input string.
The following code will print "Hello World" in front of each line produced by the color_producing_script. The output is all black, while the input is multicolored. How can I modify this script to preserve the original colors?
open(CMD, "color_producing_script |");
while(<CMD>) {
print 'Hello World' . $_;
}
I'm using bash terminal.
Edit
Per the excellent first comment, this is not a Perl issue per se. Just running color_producing_script | cat strips the color off. So the question can be rephrased to "How do you force color through the pipe?"
Edit 2
It looks like the latest version of colorgcc (1.3.2) includes the CGCC_FORCE_COLOR environment variable in the if clause, and if it's defined, colorgcc forces color:
export CGCC_FORCE_COLOR=true
Does color_producing_script change its behavior when it's used in a pipe? Try
color_producing_script | cat
at the command line. It may have an option to force color output even when it is writing to a pipe.
The Perl script, colorgcc, is specifically checking to see if output is to a non-tty and skipping the colorization if that's the case.
# Get the terminal type.
$terminal = $ENV{"TERM"} || "dumb";

# If it's in the list of terminal types not to color, or if
# we're writing to something that's not a tty, don't do color.
if (! -t STDOUT || $nocolor{$terminal})
{
    exec $compiler, @ARGV
        or die("Couldn't exec");
}
Edit:
You could modify the script in one or more of the following ways:
comment out the test and make it always produce color output
add functionality to support reading an environment variable that sets whether to colorize (see the sketch after this list)
add functionality to support a color-always option in the ~/.colorgccrc configuration file
add functionality to support a color-always command line option that you strip before passing the rest of the options to the compiler
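For instance, the environment-variable route could turn the test above into something like the following sketch (CGCC_FORCE_COLOR matches the variable mentioned in the question's second edit; older colorgcc versions won't have it):

# sketch: honor a CGCC_FORCE_COLOR override before skipping colorization
if ((! -t STDOUT && !$ENV{CGCC_FORCE_COLOR}) || $nocolor{$terminal})
{
    exec $compiler, @ARGV
        or die("Couldn't exec");
}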
You could also use the expect script unbuffer to create a pseudo-tty like this:
unbuffer gcc file.c | cat
(where cat is a stand-in recipient).
All of this is based on using colorgcc from the command line. There should be analogs for doing similar things within a Perl script.
Many programs that generate colored output detect if they're writing to a TTY, and switch off colors if they aren't. This is because color codes are annoying when you only want to capture the text, so they try to "do the right thing" automatically.
The simplest way to capture color output from a program like that is to tell it to write color even though it's not connected to a TTY. You'll have to read the program's documentation to find out if it has that option.
The other option is to use a Pty instead of a pipe. In Perl, you can do this with IO::Pty::HalfDuplex or IO::Pty::Easy, both of which are higher-level wrappers around the low-level module IO::Pty.
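A minimal sketch with IO::Pty::Easy, assuming it is installed and reusing color_producing_script from the question (read() returns chunks whose boundaries may split lines, so treat the prefixing as illustrative):

#!/usr/bin/perl
use strict;
use warnings;
use IO::Pty::Easy;

# Run the command on a pseudo-tty so it believes it's writing to a terminal
# and keeps emitting its color escape codes.
my $pty = IO::Pty::Easy->new;
$pty->spawn('color_producing_script');

while ($pty->is_active) {
    my $chunk = $pty->read();            # returns undef on timeout
    next unless defined $chunk;
    $chunk =~ s/^/Hello World /mg;       # prefix lines, escape codes intact
    print $chunk;
}
$pty->close;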
The source code of ColorGCC is quite clear about this topic!
#! /usr/bin/perl -w
# ...
# from: colorgcc-4.1.2/colorgcc-4.1.2
# downloaded from: http://www.console-colors.de/index.php?n=ConsColors.Downloads
#
# Note:
#
# colorgcc will only emit color codes if:
#
# (1) Its STDOUT is a tty and
# (2) the value of $TERM is not listed in the "nocolor" option.
#
# If colorgcc colorizes the output, the compiler's STDERR will be
# combined with STDOUT. Otherwise, colorgcc just passes the output from
# the compiler through without modification.
# .....
# If it's in the list of terminal types not to color, or if
# we're writing to something that's not a tty, don't do color.
if (! -t STDOUT || $nocolor{$terminal})
{
    exec $compiler, @ARGV
        or die("Couldn't exec");
}
In addition to using a pty instead of a pipe in Perl (as already pointed out by cjm), you can also trick an application into thinking its output is going to a terminal rather than a pipe from the command line:
For example:
# Linux
script -c "[executable string]" /dev/null
# FreeBSD, Mac OS X
script -q /dev/null "[executable string]"
For further solutions see: bash: force exec'd process to have unbuffered stdout