How to create a variable in a shell script whose name is stored in another variable? - sh

Let's say I declare two variables, key and value, like this in a shell script:
$ key=key1
$ value=value1
Now I want to create a variable named key1 and assign it the value value1. What I tried is:
$ export ${key}
$ export ${value}
$ $($key=$value)
output: key1=value1: command not found
I don't know how to do this.

Use eval $key=$value instead of $($key=$value): the shell substitutes the variable values first, and then performs the $(...) command substitution. As there is no command with that name, the shell prints "command not found" on stderr. Tell the shell to evaluate the substitution result as a regular shell command again by using eval. Afterwards you can export the result with export $key. Running the shell with the -x flag gives a good insight into what happens:
$ key=key1
$ value=value1
$ set -x
$ $key=$value
+ key1=value1
sh: key1=value1: not found
$ $($key=$value)
+ key1=value1
sh: key1=value1: not found
$ eval $key=$value
+ eval key1=value1
+ key1=value1
$ echo $key1
+ echo value1
value1
$ export $key
+ export key1
Also, be careful when dealing with variables this way: whitespace in shell variables can have unexpected results in this kind of construction.
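If you would rather avoid eval entirely, there are quieter alternatives; a minimal sketch (the variable names are examples, and the printf -v / declare -x variants are bash-only, not plain sh):
key=key1
value="value with spaces"

# POSIX sh: export accepts name=value, and the quotes keep the whitespace intact
export "$key=$value"

# bash-only alternatives that also avoid eval:
#   printf -v "$key" '%s' "$value"    # assign without exporting
#   declare -x "$key=$value"          # assign and export

echo "$key1"    # prints: value with spaces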

Related

Parsing arguments in fish shell without argparse

I'm using fish shell and wrote my own little parser function because I found argparse confusing. Basically, if a flag is matched, it uses the information from the following argument. However, I'm assuming my method must introduce bugs as I haven't seen this method used online. Are there advantages to using argparse that I'm missing?
function check_args
    for current_arg in (seq 1 (count $argv))
        # grab next argument
        set next_arg $argv[(math $current_arg + 1)]
        switch $argv[$current_arg]
            case -h --help
                usage
                break
            case -t --theme
                echo "theme: " $next_arg
                set -g theme themes/$next_arg.css
            case -f --format
                echo "format: " $next_arg
                set -g format $next_arg
            case -o --output
                echo "output: " $next_arg
                set -g output $next_arg
        end
    end
end
check_args $argv # calls the function with the passed arguments
With argparse:
# the -- is required!
argparse h/help t/theme= f/format= o/output= -- $argv
or exit 1
# just to inspect the variables
set -S _flag_h _flag_help _flag_t _flag_theme _flag_f _flag_format _flag_o _flag_output
if set -q _flag_help
    usage
    exit
end
set theme themes/$_flag_theme.css
set format $_flag_format
set output $_flag_output

How to execute gpg command with Perl

I am trying to execute the following gpg command from within Perl:
`gpg --yes -e -r me@mydomain.com "$backupPath\\$backupname"`;
However I get the following error:
Global symbol "@mydomain" requires explicit package name (did you forget to declare "my @mydomain"?)
Obviously I need to escape the '@' symbol but don't know how. How do I execute this command in Perl?
When you do:
`gpg --yes -e -r me@mydomain.com "$backupPath\\$backupname"`;
perl sees the @mydomain part and assumes you want to interpolate the @mydomain array right into the string.
But since there was no @mydomain array declared, it gives you the error.
The fix is simple: to tell perl that you want to treat @mydomain as a string and not as an array, simply put a backslash (\) before the @, like this:
`gpg --yes -e -r me\@mydomain.com "$backupPath\\$backupname"`;
Backticks run the process in a subshell (slower, and it consumes more resources) and have some issues that you should investigate.
The following code demonstrates another approach, which does not spawn a subshell.
use strict;
use warnings;
use feature 'say';
my $backupPath = '/some/path/dir';
my $backupname = 'backup_name';
my $command = "gpg --yes -e -r me\@mydomain.com $backupPath/$backupname";
my @to_run = split ' ', $command;
system(@to_run);
Handling backticks in Perl
Problem with backticks in multi-threaded Perl script on Windows

How to map a command over piped lines in fish shell

I would like to be able to do this:
bat misc | rg -v -e 'TXT|txt' | map path_explode
where
function map
    while read line
        command $argv $line
    end
end
and
function path_explode --description 'Return filename, ext, and directory from the path'
    echo $argv[1] | sed 's/\(.*\)\/\(.*\)\.\(.*\)$/\2\n\3\n\1/'
end
does that make sense? I get this error:
fish: Unknown command: path_explode
fish:
command $argv $line
^
in function 'map' with arguments 'path_explode'
In fish 3 or later, variables can be used directly as commands, so you don't need eval or command in your map function.
ohh I just need to use eval instead of command
cool :)

Using a here document and a pipeline in sh at the same time

I'm using a here document in sh to run some commands. Now I want to parse the output of those commands using awk. However, every time I execute it, I get the output of the command appended with something like this: "% No such child process"
This is what my script looks like.
#!/bin/sh
com = "sudo -u username /path/of/file -l"
$com <<EOF | awk '{print $0}'
Commands.
.
.
.
EOF
How can I use a heredoc and a pipeline without that unwanted string being appended?
Thanks
Your variable assignment is wrong in a couple of ways. First, you aren't actually assigning a variable; you're trying to run a command named com whose arguments are = and a string "sudo ...". Spaces must not be used on either side of the =:
com="sudo ..."
Second, command lines should not be stored in a variable; the shell's parser can only make that work the way you intend for very simple commands. Type the command out in full, or use a shell function.
com () {
    sudo -u username /path/to/file -l
}
com <<EOF | awk '{print $0}'
...
EOF
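As a quick illustration of why a command stored in a variable misbehaves once quoting is involved (the echo command below is just an example):
$ msg="echo 'hello world'"
$ $msg
'hello world'
The single quotes are passed to echo as literal characters instead of being parsed as quoting, which is why anything beyond a very simple command needs a function.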
There's no problem with the heredoc and the pipe themselves; check:
$ cat <<EOF | awk '{print $1}'
a b c
1 2 3
EOF
a
1
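Putting both fixes together, a corrected version of the script from the question might look like this sketch (the sudo user, program path, and heredoc lines are placeholders):
#!/bin/sh
# define the command as a function instead of storing it in a variable
com () {
    sudo -u username /path/of/file -l
}

# feed the heredoc to the function and pipe its output through awk
com <<EOF | awk '{print $0}'
placeholder command 1
placeholder command 2
EOF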

Escaping whitespace within nested shell/perl scripts

I'm trying to run a perl script from within a bash script (I'll change this design later on, but for now, bear with me). The bash script receives the argument that it will run. The argument to the script is as follows:
test.sh "myscript.pl -g \"Some Example\" -n 1 -p 45"
Within the bash script, I simply run the argument that was passed:
#!/bin/sh
$1
However, in my perl script the -g argument only gets "Some (that is, with the quote character included), instead of Some Example. Even if I quote it, it gets cut off because of the whitespace.
I tried escaping the whitespace, but it doesn't work... any ideas?
To run it as posted test.sh "myscript.pl -g \"Some Example\" -n 1 -p 45" do this:
#!/bin/bash
eval "$1"
This causes the $1 argument to be parsed by the shell so the individual words will be broken up and the quotes removed.
Or if you want you could remove the quotes and run test.sh myscript.pl -g "Some Example" -n 1 -p 45 if you changed your script to:
#!/bin/bash
"$@"
The "$@" gets replaced by all the arguments $1, $2, etc., as many as were passed in on the command line.
Quoting is normally handled by the parser, which doesn't see the quotes when you substitute the value of $1 in your script.
You may have more luck with:
#!/bin/sh
eval "$1"
which gives:
$ sh test.sh 'perl -le "for (@ARGV) { print; }" "hello world" bye'
hello world
bye
Note that simply forcing the shell to interpret the quoting with "$1" won't work, because then it tries to treat the first argument (i.e., the entire command) as the name of the command to be executed. You need to pass it through eval to get proper quoting and then re-parsing of the command.
This approach is (obviously?) dangerous and fraught with security risks.
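To make the quoting behaviour concrete, here is a small sketch; the showargs.sh helper and its arguments are made up. It prints each argument it receives in brackets, which shows that with a plain $1 the embedded quotes stay literal and the words stay split apart, while eval "$1" re-parses them properly:
# showargs.sh: print each argument on its own line
printf '[%s]\n' "$@"

# with test.sh containing just:  $1
$ sh test.sh 'sh showargs.sh "hello world" bye'
["hello]
[world"]
[bye]

# with test.sh containing:  eval "$1"
$ sh test.sh 'sh showargs.sh "hello world" bye'
[hello world]
[bye]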
I would suggest you pass the perl script name as a separate word; then you can quote the parameters when referring to them, and still easily extract the script name without needing the shell to split the words, which is the fundamental problem you have.
test.sh myscript.pl "-g \"Some Example\" -n 1 -p 45"
and then
#!/bin/sh
$1 "$2"
If you really have to do this (for whatever reason), why not just do:
sh test.sh "'Some Example' -n 1 -p 45"
in:
test.sh
RUN=myscript.pl
echo `$RUN $1`