Running stored command with environment variables - sh

This works in sh:
$ export command="true"
$ "$command"
If I need to pass some variable to the command, however, I can't get it to work:
$ export command="env FOO=\"bar\" true"
$ "$command"
sh: 2: env FOO="bar" true: not found
What is missing?

Since you've put $command in quotes, the shell does not perform field splitting on its contents; it's treated as one single word. Thus the name of the command it tries to run is the full contents of $command, including its spaces and double-quote characters.
You could omit the quotes and this almost does what you want:
export command="env FOO=\"bar\" true"
$command
Except that this will set FOO to "bar" (with the double quotes) and not to bar (without the quotes), as you probably expected. It also won't work if you try to set FOO to something with a space in it:
$ command="env FOO=\"bar foo\" true"
$ $command
env: foo": No such file or directory
The problem here is that the double-quote characters in the variable command are treated like any other character after being substituted. They aren't removed, nor do they prevent field splitting.
What you can do instead is use the eval command:
command="env FOO=\"bar foo\" true"
eval "$command"
This causes the shell to completely re-evaluate everything after eval and execute it as a command. In other words, after it does substitution, splitting, expansion, and quote removal on eval "$command", it then does substitution, splitting, expansion, and quote removal on env FOO="bar foo" true as if that were the command entered into the shell.
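You can see the quote removal happening on that second pass (a quick check; printenv is available on most Linux systems):
$ command="env FOO=\"bar foo\" printenv FOO"
$ eval "$command"
bar foo
FOO really does contain the space, and the double quotes are gone.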
However, the eval command is very dangerous. You have to trust everything you pass to it. If you pass something that came from an untrusted user, then that user can craft a string that lets them run any command they want.
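A harmless illustration of the risk (the echo stands in for an arbitrary attacker-chosen command):
$ untrusted="true; echo gotcha"
$ eval "env FOO=bar $untrusted"
gotcha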
As an alternative to eval you can use either an alias or a function. The alias command is just as dangerous as eval, but if you're only going to be using $command from the command prompt and never from scripts, then an alias is the easiest and simplest solution:
$ alias mycommand="env FOO=\"bar\" true"
$ mycommand
If you're using it in shell scripts or both from the command line and in shell scripts I would use a function:
myfunc() {
env FOO="bar" true
}
myfunc
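A function can also forward its arguments to the wrapped command, which a plain sh alias cannot; a minimal sketch:
myfunc() {
env FOO="bar" "$@"   # run whatever command was passed as arguments
}
myfunc true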
As an aside, you don't actually need to use env to set an environment variable for a command when using one of the Bourne shells; you can just set the variable:
FOO="bar" true
When used like this, the variable assignment works as it does with the env command: it doesn't change the value of FOO in the shell's environment, only in the true command's environment.
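You can verify that the assignment is scoped to the one command (assuming FOO isn't already set in your shell):
$ FOO=bar sh -c 'echo "$FOO"'
bar
$ echo "FOO is ${FOO-unset}"
FOO is unset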

Related

How to run Bash commands with a PowerShell Core alias?

I am trying to run a Bash command where an alias exists in PowerShell Core.
I want to clear the bash history. Example code below:
# Launch PowerShell core on Linux
pwsh
# Attempt 1
history -c
Get-History: Missing an argument for parameter 'Count'. Specify a parameter of type 'System.Int32' and try again.
# Attempt 2
bash history -c
/usr/bin/bash: history: No such file or directory
# Attempt 3
& "history -c"
&: The term 'history -c' is not recognized as the name of a cmdlet, function, script file, or operable program.
It seems the issue is related to history being an alias for Get-History - is there a way to run Bash commands in PowerShell core with an alias?
history is a Bash builtin, i.e. an internal command that can only be invoked from inside a Bash session; thus, by definition you cannot invoke it directly from PowerShell.
In PowerShell history is an alias of PowerShell's own Get-History cmdlet, where -c references the -Count parameter, which requires an argument (the number of history entries to retrieve).
Unfortunately, Clear-History is not enough to clear PowerShell's session history as of PowerShell 7.2, because it only clears one history (PowerShell's own), not also the one provided by the PSReadLine module used for command-line editing by default - see this answer.
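A sketch that clears both (assuming PSReadLine 2.x; HistorySavePath is where PSReadLine persists its history):
Clear-History                                               # PowerShell's own session history
Remove-Item -Force (Get-PSReadLineOption).HistorySavePath   # PSReadLine's persisted history
[Microsoft.PowerShell.PSConsoleReadLine]::ClearHistory()    # PSReadLine's in-memory history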
Your attempt to call bash explicitly with your command - bash history -c - is syntactically flawed (see bottom section).
However, even fixing the syntax problem - bash -c 'history -c' - does not clear Bash's history - it seemingly has no effect (and adding the -i option doesn't help) - I don't know why.
The workaround is to remove the file that underlies Bash's (persisted) command history directly:
if (Test-Path $HOME\.bash_history) { Remove-Item -Force $HOME\.bash_history }
To answer the general question implied by the post's title:
To pass a command with arguments to bash for execution, pass it to bash -c, as a single string; e.g.:
bash -c 'date +%s'
Without -c, the first argument would be interpreted as the name or path of a script file.
Note that any additional arguments following the -c command string become the arguments to that string; that is, the command string acts as a mini-script that can receive arguments the way scripts usually do, via $1, ...:
# Note: the first argument after the command string, "self",
# becomes $0 in Bash terms, i.e. the name of the script
PS> bash -c 'echo $0; echo arg count: $#' self one two
self
arg count: 2

Pass in a command string to AutoHotkey

I want to run a single AutoHotkey command. A script seems kind of overkill.
In Bash and PowerShell, you can run a command by passing it in as a string to the shell:
pwsh -Command ls
bash -c ls
Is there a way to do this with AutoHotKey.exe? In the documentation, all I see is that you can pass the name of a script file to execute. If PowerShell supported process substitution <(ls), I could do
AutoHotKey.exe <(echo "ls")
But I don't think there's a way to do this in PowerShell.
Is there another way other than creating a complicated version of process substitution myself?
The linked docs state:
[v1.1.17+]: Specify an asterisk (*) for the filename to read the script text from standard input (stdin). For an example, see ExecScript().
For instance, from PowerShell:
'MsgBox % "Hello, world."' | AutoHotKey.exe *
I'm not sure if you're looking for what the other answer states, or for what I'm about to write:
You can pass arguments into an AHK script, and those arguments are then found in the built-in variable A_Args.
Example AHK script:
for each, arg in A_Args
output .= arg "`n"
MsgBox, % output
PowerShell command:
& "C:\Path\To\My\Script.ahk" arg1 arg2 "this is the 3rd argument" arg4
This works if you have AHK installed. If you have a portable AHK setup, you'd pass the example script to AutoHotkey.exe and then the desired arguments.
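For example (the installation path is hypothetical; adjust it to wherever your AutoHotkey.exe lives):
& "C:\Program Files\AutoHotkey\AutoHotkey.exe" "C:\Path\To\My\Script.ahk" arg1 arg2 "this is the 3rd argument" arg4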

Perl qx() command not working as expected

I have a Perl script as below, where I want to access a network path on a remote Windows machine from a Linux machine using rsh.
$cmd = "rsh -l $username $host \"pushd \\\\network\\path\\to\\the\\directory && dir\"";
print $cmd, "\n";
print qx($cmd);
When I run the script, the third line prints the output The system cannot find the path specified. However, if I run the command printed by the second line directly from the terminal, it works fine.
I'm not able to understand why the script is not working. If the command works from the terminal, it should work using qx() too.
While you escape your metacharacters against interpolation by the double quotes and by the remote shell, qx might itself interpolate the string again, in which case you might need to add another level of escaping. From the documentation of qx:
A string which is (possibly) interpolated and then executed as a system command with /bin/sh or its equivalent. ...
How that string gets evaluated is entirely subject to the command interpreter on your system. On most platforms, you will have to protect shell metacharacters if you want them treated literally. This is in practice difficult to do, as it's unclear how to escape which characters. See perlsec for a clean and safe example of a manual fork() and exec() to emulate backticks safely.
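One way to sidestep the local shell entirely is the list form of system(), so only the remote Windows shell ever parses the pushd line (a sketch under that assumption; output goes straight to stdout rather than being captured the way qx captures it):
# No local shell is involved: each list element reaches rsh as-is.
# Single quotes stop Perl from interpolating, so only one level of
# backslash doubling is needed for the UNC path.
my @cmd = ('rsh', '-l', $username, $host,
           'pushd \\\\network\\path\\to\\the\\directory && dir');
system(@cmd) == 0 or die "rsh failed: $?";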

Perl ENV hash not populated when calling script using env -i

I have a Perl script which needs to be called from a clean environment. The way I call the script is as follows.
env -i test.pl
After invoking env -i, certain environment variables are left intact, such as PATH. However, according to the ENV hash, the PATH variable is empty.
Here is the script (abridged):
#!/usr/bin/perl -w
system('echo "system: $PATH"');
print "perl hash: $ENV{'PATH'}\n";
This is the output when I run env -i test.pl:
system: /usr/local/bin:/bin:/usr/bin
Use of uninitialized value in concatenation (.) or string at test.pl line 4.
perl hash:
How can I get the ENV hash to be correct when I invoke the script under env -i?
The -i option to env does in fact clear $PATH out of the environment. The reason you see inconsistent results from the system call is because of how system works:
If there is only one scalar argument, the argument is checked for
shell metacharacters, and if there are any, the entire argument is
passed to the system's command shell for parsing (this is /bin/sh -c
on Unix platforms, but varies on other platforms). If there are no
shell metacharacters in the argument, it is split into words and
passed directly to execvp(), which is more efficient.
Because you are calling system with a single commandline that must be parsed by a shell, one is started, which initializes its own $PATH upon startup (because shells do that), and that's the one you're seeing.
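You can see both behaviors side by side (a sketch; the default PATH the shell chooses is system-dependent, and perl's location may differ):
$ env -i /usr/bin/perl -e 'print "perl sees: [", $ENV{PATH} // "", "]\n"; system(q{echo shell sees: [$PATH]})'
perl sees: []
shell sees: [/usr/local/bin:/bin:/usr/bin]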
After invoking env -i, certain environment variables are left intact, such as PATH. However, according to the ENV hash, the PATH variable is empty.
On my RHEL system, env -i does:
-i, --ignore-environment
start with an empty environment
So when you run the Perl program, the system() call is forking a shell, which is probably reading an rc file that sets PATH, but when you print $ENV{PATH} directly from the Perl program, it is empty, as expected.
You should either manually set PATH, or determine the full paths to the various executables that you intend to run.
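For example, at the top of test.pl (the value shown is only an assumption; use whatever PATH your commands actually need):
# Re-establish a known PATH inside the otherwise clean environment.
$ENV{PATH} = '/usr/local/bin:/usr/bin:/bin';
system('echo "system: $PATH"');   # child processes now see the PATH set above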
If you're going to spend a lot of time running system() commands, then maybe a shell script, with some Perl to process intermediate results, is a better processing model in this case.

Which shell does a Perl system() call use?

I am using a system call to do some tasks:
system('myframework mycode');
but it complains of missing environment variables.
Those environment variables are set in my bash shell (from where I run the Perl code).
What am I doing wrong?
Does the system call create a brand new shell (without environment variable settings)? How can I avoid that?
It's complicated. Perl does not necessarily invoke a shell. Perldoc says:
If there is only one scalar argument, the argument is checked for shell metacharacters, and if there are any, the entire argument is passed to the system's command shell for parsing (this is /bin/sh -c on Unix platforms, but varies on other platforms). If there are no shell metacharacters in the argument, it is split into words and passed directly to execvp(), which is more efficient.
So it actually looks like your arguments would be passed right to execvp. Furthermore, whether the shell loads your .bashrc, .profile, or .bash_profile depends on whether the shell is interactive. Likely it isn't, but you can check like this.
If you don't want to invoke a shell, call system with a list:
system 'mycommand', 'arg1', '...';
system qw{mycommand arg1 ...};
If you want a specific shell, call it explicitly:
system "/path/to/mysh -c 'mycommand arg1 ...'";
I don't think it's a question of shell choice, since environment variables are always inherited by subprocesses unless explicitly cleaned up.
Are you sure you have exported your variables?
This will work:
$ A=5 perl -e 'system(q{echo $A});'
5
$
This will work too:
$ export A=5
$ perl -e 'system(q{echo $A});'
5
$
This wouldn't:
$ A=5
$ perl -e 'system(q{echo $A});'
$
system() calls /bin/sh as the shell. If you are on a somewhat different box, like ARM, it would be good to read the man page for the exec family of calls for the default behavior. You can source your .profile if you need to, since system() takes a command:
system(" . myhome/me/.profile && /path/to/mycommand")
I've struggled for 2 days working on this. In my case, environment variables were correctly set under Linux but not Cygwin.
From mkb's answer I thought to check out man perlrun and it mentions a variable called PERL5SHELL (specific to the Win32 port). The following then solved the problem:
$ENV{PERL5SHELL} = "sh";
As is often the case - all I can really say is "it works for me", although the documentation does imply that this might be a sensible solution:
May be set to an alternative shell that perl must use internally for executing "backtick" commands or system().
If the shell used by perl does not implicitly inherit the environment variables then they will not be set for you.
I messed with environment variables being set for my script on this post, where I needed the env variable $DBUS_SESSION_BUS_ADDRESS to be set, but it wasn't when I called the script as root. You can read through that, but in the end you can check whether %ENV contains your needed variables and, if not, add them.
From perlvar
%ENV
$ENV{expr}
The hash %ENV contains your current environment. Setting a value in %ENV changes
the environment for any child processes you subsequently fork() off.
My problem was that I was running the script under sudo, and that didn't preserve all my user's env variables. Are you running the script under sudo, or as some other user, say www-data (Apache)?
Simple test:
user@host:~$ perl -e 'print $ENV{q/MY_ENV_VARIABLE/} . "\n"'
and if that doesn't work then you will need to add it to %ENV at the top of your script.
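For example (the variable and the bus address shown are purely illustrative; use whatever your script actually needs):
# Provide a fallback only when the variable is missing, e.g. under sudo.
$ENV{DBUS_SESSION_BUS_ADDRESS} = 'unix:path=/run/user/1000/bus'
    unless defined $ENV{DBUS_SESSION_BUS_ADDRESS};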
Try system("echo \$SHELL"); on your system.