This question already has answers here:
I just assigned a variable, but echo $variable shows something else
The file myfile.sh looks like this:
echo "hello"
The file I run looks like this:
a=$(cat myfile.sh)
echo $a
When I run the file, I only get the output:
echo "hello"
And not what the actual file content is. What's going on here?
See the Bash manual on command substitution:
http://www.gnu.org/software/bash/manual/html_node/Command-Substitution.html
In a=$(cat myfile.sh), your variable gets assigned the standard output of the command,
"with any trailing newlines deleted"
That is where your extra lines went.
From the man page for dash (/bin/sh) on my system:
The shell expands the command substitution by executing command in a subshell environment and replacing the command substitution with the standard output of the command, removing sequences of one or more ⟨newline⟩s at the end of the substitution. (Embedded ⟨newline⟩s before the end of the output are not removed; however, during field splitting, they may be translated into ⟨space⟩s, depending on the value of IFS and quoting that is in effect.)
(Emphasis mine.)
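That last part also matters for your echo $a: unquoted, field splitting turns any embedded newlines that did survive into spaces, while quoting preserves them. A minimal demonstration (sketch, in bash or another POSIX shell):
a=$(printf 'line1\nline2\n\n\n')   # trailing newlines are stripped by the command substitution
echo $a                            # unquoted: the embedded newline becomes a space -> line1 line2
echo "$a"                          # quoted: the embedded newline is preserved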
You could use the following:
#!/bin/sh
LF='
'
a=
while read -r line; do
    a="$a$line$LF"    # append each line read from stdin, plus a literal newline
done
printf -- '--\n%s--' "$a"
Test:
$ printf 'a b c\nd\n\n\ne\n\n\n' | ./a.sh
--
a b c
d


e


--
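As an aside, if all you need is to keep the trailing newlines from a command substitution (rather than reading line by line as above), one common idiom, not used above, is to append a sentinel character and strip it afterwards. A minimal sketch:
a=$(cat myfile.sh; printf x)   # the trailing x stops the substitution from eating the newlines
a=${a%x}                       # drop the sentinel again
printf '%s' "$a"               # the file content, trailing newlines included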
Related
Is it possible to have the fish shell split variables that are in command-line arguments?
Assume I have a variable $args set like so:
$ set args "-a args"
Now, given this python program (test.py):
import sys
print(sys.argv)
If I run the above in fishshell I get this output:
$ python test.py $args
['test.py', '-a args']
Notice that the arguments are passed as one argument. When I do the equivalent in bash I get this output:
$ python test.py $args
['test.py', '-a', 'args']
Is there someway to make fish behave like bash?
You do not want fish to behave like bash (technically any POSIX compatible shell) with respect to variable expansion. The POSIX behavior is the source of endless problems and is why you need to put double-quotes around almost everything. In fact, most experienced people will tell you to add IFS=$'\n' at the top of your scripts to stop that auto-splitting from happening.
One answer is to use fish's "every var is a list" feature: set args "-a" "args" (the quotes are just for clarity and aren't needed in this example). Each element of the list becomes a separate argument to the command. This will do the right thing even if the args value contains whitespace. The other answer is to explicitly split the string on whitespace using command substitution: a_cmd (string split ' ' $args). This will not do the right thing (in fish or bash) if the args value contains whitespace.
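A minimal sketch of both approaches, assuming the test.py from the question (fish syntax):
# First approach: store the arguments as a fish list with two elements.
set args "-a" "args"
python test.py $args
# -> ['test.py', '-a', 'args']
# Second approach: keep the single string and split it when passing it on.
set args "-a args"
python test.py (string split ' ' $args)
# -> ['test.py', '-a', 'args']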
I found a little hack with fish commandline tokenization:
function posix_expand_str --description "Expand a string the POSIX way."
    set __posix_expand_str__oldline (commandline)    # save the current command line
    commandline $argv                                # replace it with the given string(s)
    commandline -o                                   # print the command line split into tokens, one per line
    commandline $__posix_expand_str__oldline         # restore the original command line
    set -e __posix_expand_str__oldline               # clean up the temporary variable
end
In my testing, all argument strings appear to be concatenated before tokenization. It only POSIXes when you ask it to, and does not break strings.
Test results:
> posix_expand_str "hello world"
hello
world
> posix_expand_str "hello 'posix haters' world"
hello
posix haters
world
> posix_expand_str "hello" 'high rep "stackoverflow staff"' "world"
hello
high
rep
stackoverflow staff
world
I often see this command in node.js programs: NODE_ENV=test node app.js, which sets the NODE_ENV variable to test and works. I also read here (https://en.wikipedia.org/wiki/Environment_variable) that this should work for any shell command, but running some tests of my own, here is what I see:
$ HELLO="WORLD"
$ HELLO="MARS" echo "$HELLO"
WORLD
$
I would expect this to print MARS. Is there something I am missing here?
The syntax VAR=value command means that command will be invoked with the environment variable VAR set to value, and this applies only for the scope of that command.
However, when you are using the command line:
HELLO="MARS" echo "$HELLO"
The shell first interprets the "$HELLO" parameter, determines that it is WORLD, and then what it actually does is run:
HELLO="MARS" echo "WORLD"
So echo does have the HELLO variable set in its environment, but that doesn't affect what it prints: its argument was already expanded beforehand.
Doing
HELLO="MARS"; echo "$HELLO"
does something else entirely. First it sets HELLO to MARS in the current shell, and then it goes on to interpret the echo command. By this time HELLO contains MARS, not WORLD. But this is an entirely different effect - the variable HELLO stays with the value MARS, which is not the case in the command without the ;.
Your problem is that echo is just a poor choice for a demonstration of this. You can do other demonstrations to prove that HELLO is changed properly:
HELLO="MARS" eval 'echo $HELLO'
In this case, the shell will not interpret the $HELLO because it is within a string in single quotes. It will first put MARS in HELLO, and then call eval 'echo $HELLO' with that variable set. The eval command will then run echo $HELLO, and you'll get the output you were expecting.
This syntax is best used for things that don't use the given variable as part of the command line, but rather use it internally.
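Another quick way to convince yourself that the variable really is in the command's environment (a sketch; printenv and sh -c are just convenient stand-ins):
$ HELLO="MARS" printenv HELLO
MARS
$ HELLO="MARS" sh -c 'echo "$HELLO"'
MARS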
The other answers are correct, but here is a refinement:
There are in fact two cases when writing a space-separated list of variable assignments in bash, depending on whether or not it ends with a command.
VAR1=value1 VAR2=value2 ... VARn=valuen command arg1 arg2 ... argn
and
VAR1=value1 VAR2=value2 ... VARn=valuen
do not make VAR1 ... VARn visible in the same way.
In the first case, VAR1 ... VARn are set only in the environment of command and do not affect the current shell.
In the second case, VAR1 ... VARn alter the current shell.
So (note that ';' is the same as using a new line):
HELLO=WORLD
HELLO=MARS echo "i don't export HELLO."
echo "HELLO=$HELLO"
will display
i don't export HELLO.
HELLO=WORLD
and
HELLO=WORLD
HELLO=MARS ; echo "i did export HELLO."
echo "HELLO=$HELLO"
will display
i did export HELLO.
HELLO=MARS
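A quick way to see the difference with a child process (a sketch, using sh -c as the child and assuming HELLO has not been exported earlier in the session):
HELLO=MARS ; sh -c 'echo "child sees: $HELLO"'   # prints "child sees: " - a plain assignment is not passed to children
HELLO=MARS sh -c 'echo "child sees: $HELLO"'     # prints "child sees: MARS" - the prefix form puts it in that command's environment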
I want to create a script to facilitate producing local text file extracts from Hive.
This is basically to execute commands like the one below:
hive -e "SET hive.cli.print.header=true;SELECT * FROM dropme"|perl -pe 's/(?:\t|^)\KNULL(?=\t|$)//g'>extract/outbound/dropme.txt
While the above works like a charm, I find it quite problematic to implement via the following parametrized script (much simplified):
#!/bin/sh
TNAME=dropme
SQL="SELECT * FROM $TNAME"
echo $SQL
echo "SQL: $SQL"
EXTRACMD="hive -e \"SET hive.cli.print.header=true;$SQL\"|perl -pe 'BEGIN{if(defined(\$_=<ARGV>)){s/\b\w+\.//g;print}}s/(?:\t|^)\KNULL(?=\t|$)//g'>extract/outbound/$TNAME.txt"
echo "CMD: $EXTRACMD";
${EXTRACMD}
When run I get: Exception in thread "main" java.lang.NumberFormatException: For input string: "e"
I know there are many ways you can print text or execute a command. For instance, the line echo $SQL prints a list of the files in the directory instead:
SELECT file1.txt file2.txt file3.txt file4.txt FROM dropme
while the next one: echo "SQL: $SQL" gives just what I want: SQL: SELECT * FROM dropme
echo "CMD: $EXTRACMD" prints the (almost) the command to be executed. Almost, as I see \t in perl code being expanded:
CMD: hive -e "SET hive.cli.print.header=true;SELECT * FROM dropme"|perl -pe 'BEGIN{if(defined($_=<ARGV>)){s\w+\.//g;print}}s/(?: |^)\KNULL(?= |$)//g'>extract/outbound/dropme.txt
Maybe that's still OK, but what I want is to be able to copy & paste this command into a(nother) terminal and execute it as the command I put at the top. Ideally I would like that command to be exactly the same (so with \t there).
The biggest problem comes when I try to execute it (the ${EXTRACMD} line). I get the error:
Exception in thread "main" java.lang.NumberFormatException: For input string: "e" …and so on; the rest is irrelevant, as bash treats every 'word' here as a separate token of a single command. I don't even know what it really tries to run (the prior print attempt obviously doesn't help).
I'm aware that I have multiple options, like:
escaping special characters in the command definition string (like I did with doublequotes)
experimenting with echo and $VAR, '$VAR' or "$VAR"
experimenting with "${EXTRACMD}" or evaluating through eval "${EXTRACMD}"
experimenting with shopt -s extglob or set -f
but as the number of combinations is quite large and my bash experience is limited, I feel it's better to ask about good practice here, so my question is:
Is there a way to print a (complex/compound shell) command first and subsequently execute it (exactly as per the printed output)? In this case that would mean printing the exact command from the top, then executing it the same way as by manually copying that output into the terminal prompt and pressing Enter.
Do not construct commands as strings. See http://mywiki.wooledge.org/BashFAQ/050 for details.
That page also talks about a built-in way of getting the shell to tell you what it is running (section 6).
If that doesn't do what you want, you can also, with bash, try using printf %q\\n "${arr[*]}".
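As a rough illustration of the array approach from that FAQ, applied to the question's hive pipeline (a sketch only; the BEGIN block from the script's perl filter is omitted for brevity, and the table name and paths are the question's placeholders):
#!/bin/bash
TNAME=dropme
SQL="SELECT * FROM $TNAME"
# Each command of the pipeline is an array: one element per argument, no re-parsing needed.
hive_cmd=(hive -e "SET hive.cli.print.header=true;$SQL")
perl_cmd=(perl -pe 's/(?:\t|^)\KNULL(?=\t|$)//g')
# Print a copy&paste-able version of what will run (%q re-quotes each word).
printf '%q ' "${hive_cmd[@]}"
printf '| '
printf '%q ' "${perl_cmd[@]}"
printf '> extract/outbound/%s.txt\n' "$TNAME"
# Run it: the pipe and redirection are real shell syntax here, not characters inside a string.
"${hive_cmd[@]}" | "${perl_cmd[@]}" > "extract/outbound/$TNAME.txt"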
Possible Duplicate:
Renaming lots of files in Linux according to a pattern
I have multiple files in this format:
file_1.pdf
file_2.pdf
...
file_100.pdf
My question is: how can I rename all the files so that they look like this:
file_001.pdf
file_002.pdf
...
file_100.pdf
I know you can rename multiple files with 'rename', but I don't know how to do this in this case.
You can do this using the Perl tool rename from the shell prompt. (There are other tools with the same name which may or may not be able to do this, so be careful.)
rename 's/(\d+)/sprintf("%03d", $1)/e' *.pdf
If you want to do a dry run to make sure you don't clobber any files, add the -n switch to the command.
Note
If you run the following command (on Linux)
$ file $(readlink -f $(type -p rename))
and you have a result like
.../rename: Perl script, ASCII text executable
then this seems to be the right tool =)
This seems to be the default rename command on Ubuntu.
To make it the default on Debian and derivatives like Ubuntu:
sudo update-alternatives --set rename /path/to/rename
Explanations
s/// is the basic substitution expression: s/to_replace/replacement/; check perldoc perlre
(\d+) captures, with the parentheses, one digit (\d) or more (+) into $1
sprintf("%03d", $1) sprintf is like printf, but instead of printing it formats a string, with the same syntax. %03d is for zero-padding to three digits, and $1 is the captured string. Check perldoc -f sprintf
the latter Perl function call is permitted because of the e modifier at the end of the expression
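To see what the expression does to a single name, you can run it through perl directly (a quick sketch, assuming a bash-like shell for the <<< here-string):
$ perl -pe 's/(\d+)/sprintf("%03d", $1)/e' <<< 'file_7.pdf'
file_007.pdf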
If you want to do it with pure bash:
for f in file_*.pdf; do x="${f##*_}"; echo mv "$f" "${f%_*}$(printf '_%03d.pdf' "${x%.pdf}")"; done
(note the debugging echo)
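The same loop spread out (with an extra variable n for clarity) and commented; it is still a dry run until you drop the echo:
for f in file_*.pdf; do
    x="${f##*_}"        # strip everything up to the last "_": file_1.pdf -> 1.pdf
    n="${x%.pdf}"       # strip the extension: 1.pdf -> 1
    echo mv "$f" "${f%_*}$(printf '_%03d.pdf' "$n")"    # file_1.pdf -> file_001.pdf
done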
I couldn't find an answer for this exact problem, so I'll ask it.
I'm working in Cygwin and want to reference previous commands using !n notation, e.g., if command 5 was which ls, then !5 runs the same command.
The problem is when trying to do substitution, so running:
!5:s/which \([a-z]\)/\1/
should just run ls, or whatever the argument was for which for command number 5.
I've tried several ways of doing this kind of substitution and get the same error:
bash: :s/which \([a-z]*\)/\1/: substitution failed
As far as I can tell the s/old/new/ history substitution syntax only does simple string substitution; it does not support full regexes. Here's what man bash has to say:
s/old/new/
Substitute new for the first occurrence of old in the event line. Any delimiter can be used in place of /. The final delimiter is optional if it is the last character of the event line. The delimiter may be quoted in old and new with a single backslash. If & appears in new, it is replaced by old. A single backslash will quote the &. If old is null, it is set to the last old substituted, or, if no previous history substitutions took place, the last string in a !?string[?] search.
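Plain string substitution does still work: if command 5 was which ls, you could (as a sketch) simply delete the literal text:
!5:s/which //     # replaces "which " with nothing, then runs: ls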
Never fear, though. There are in fact easier ways to accomplish what you are trying to do:
!$ evaluates to the last argument of the previous command:
# ls /etc/passwd
/etc/passwd
# vim !$
vim /etc/passwd
!5:$ evaluates to the last argument of command #5:
# history
...
5: which ls
...
# !5:$
ls
You can also use Alt+. to perform an immediate substitution equivalent to !$. Alt+. is one of the best bash tricks I know.
This worked for me using Bash in Cygwin (note that my which ls command was number 501 in my history list; not 5 like yours):
$(!501 | sed 's/which \([a-z]\)/\1/')
You could also do it this way (which is shorter/cleaner):
$(!501 | sed 's/which //')