How to export a shell variable within a Perl script?

I have a shell script that sets a number of shell variables and is executed before entering a programming environment.
I want to use a Perl script to enter the programming environment:
system("environment_defaults.sh");
system("obe");
But when I enter the environment the variables are not set.

When you call your second command, it's not done in the environment you modified in the first command. In fact, there is no environment remaining from the first command, because the shell used to invoke "environment_defaults.sh" has already exited.
To keep the context of the first command in the second, invoke them in the same shell:
system("source environment_defaults.sh && obe");
Note that you need to invoke the shell script with source in order to perform its actions in the current shell, rather than invoking a new shell to execute them.
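One portability caveat: system hands the command to /bin/sh, and on systems where that is a strict POSIX shell (dash, for example) the source builtin is spelled "."; in that case write system(". ./environment_defaults.sh && obe"); instead.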
Alternatively, modify your environment at the beginning of every shell (e.g. with .bash_profile, if using bash), or make your environment variable changes in perl itself:
$ENV{FOO} = "hello";
system('echo $FOO');
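The single quotes matter here: they keep Perl from interpolating $FOO, so the child shell expands it instead. Anything placed in %ENV before a system call is inherited by every child process, as in this minimal sketch (FOO and the extra PATH entry are purely illustrative):
#!/usr/bin/perl
use strict;
use warnings;

# Changes to %ENV in Perl are inherited by every child process it spawns.
$ENV{FOO}  = 'hello';
$ENV{PATH} = "/opt/mytools/bin:$ENV{PATH}";   # hypothetical extra tool directory

system('echo "FOO is $FOO"');     # the child shell prints: FOO is hello
system('echo "PATH is $PATH"');   # the child shell sees the extended PATH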

Each system call spawns its own sh -c process, and environment variables set there stay isolated within it.
Also, doesn't calling environment_defaults.sh itself spawn yet another sh process, so that the variables are set only in that isolated process?
Or start the Perl script with these environment variables already exported; they will then be set for all of its child processes.

Each process gets its own environment, and each time you call "system" it runs a new process. So, what you are doing won't work. You'll have to run both commands in a single process.
Be aware, however, that after your Perl script exits, any environment variables it set won't be available to you at the command line, because your Perl script is also a process with its own environment.
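To illustrate the single-process point, a minimal sketch (FOO is just an illustrative variable):
# Two shells: the variable dies with the first one.
system('FOO=hello');                    # FOO is set in a shell that exits immediately
system('echo "FOO=$FOO"');              # prints an empty FOO

# One shell: both commands share the same process, so FOO survives.
system('FOO=hello; echo "FOO=$FOO"');   # prints FOO=hello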

(UPDATE: Oh, this is not exactly what you asked for, but it might be useful for someone.)
If GDB is installed, you can set/modify parent shell variables with the following hack (non-strict style is used for clarity):
#!/usr/bin/perl
# export.pl
use File::Temp qw( tempfile );
%vars = (
    a => 3,
    b => 'pigs'
);
$ppid = getppid;
my @putvars = map { "call putenv (\"$_=$vars{$_}\")" } keys %vars;
$" = "\n";
$cmds = <<EOF;
attach $ppid
@putvars
detach
quit
EOF
($tmpfh, $tmpfn) = tempfile( UNLINK => 1 );
print $tmpfh $cmds;
`gdb -x $tmpfn`;
Test:
$ echo "$a $b"
$ ./export.pl
$ echo "$a $b"
3 pigs
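Note that this hack depends on gdb being allowed to attach to the parent shell; on systems that restrict ptrace attachment (for example Linux with YAMA's ptrace_scope enabled), the attach step may be refused and the variables will not be set.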

This can now be done with the Env::Modify module
use Env::Modify 'source'; # or use Env::Modify qw(source system);
source("environment_defaults.sh");
# environment from environment_defaults.sh is now available
# to Perl and to the following 'system' call
system("obe");

Related

How can Perl execute a command in the same shell as itself?

I am not sure whether the title really makes sense for this problem. My problem is simple: I want to write a perl script that changes my current directory, and I want the change to persist after the perl script finishes. The script looks like this:
if ($#ARGV != 0) {
    print "usage: mycd <dir symbol>";
    exit -1;
}
my $dn = shift @ARGV;
if ($dn eq "kite") {
    my $cl = `cd ./private`;
    print $cl."\n";
}
else {
    print "unknown directory symbol";
    exit -1;
}
However, my current directory doesn't change after calling the script. What is the reason? How can I resolve it?
No, the Perl script will be run in a subprocess so it will not be able to affect the environment of the process that called it.
There are various tricks you can use such as sourcing shell scripts (in the context of the current shell rather than a sub-process), or using bash functions and aliases, but they won't work here.
How can Perl execute a command in the same shell as itself?
Unless you have a very atypical shell, shells can only receive commands via STDIN, via their command line, and possibly via a command evaluation builtin.
The first two are out unless the Perl script is the parent of the shell, but you could use the third one indirectly as in the following example.
script.pl:
#!/usr/bin/perl
print "chdir 'private'\n";
bash script:
echo "$PWD" # /some/dir
eval "$( script.pl )"
echo "$PWD" # /some/dir/private
Of course, if you use bash, you could hide the details in a shell function.
mycd () {
    eval "$( mycd.pl "$@" )"
}
Allowing you to use
mycd
or even
mycd foo
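For completeness, a rough sketch of what mycd.pl could look like if it keeps the original symbol-to-directory mapping; it only prints a cd command for the calling shell to eval (the "kite" mapping comes from the question, the rest is illustrative):
#!/usr/bin/perl
use strict;
use warnings;

# Map directory symbols to paths; "kite" comes from the original script.
my %dirs = ( kite => './private' );

my $dn = shift @ARGV;
if ( defined $dn && exists $dirs{$dn} ) {
    # Emit a command for the parent shell to eval; this child process
    # cannot change the parent shell's directory directly.
    print "cd '$dirs{$dn}'\n";
}
else {
    # Errors go to STDERR so the eval in the shell function sees nothing.
    print STDERR "usage: mycd <dir symbol>\n";
    exit 1;
}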

Shell Programming inside Perl

I am writing code in perl with an embedded shell script in it:
#!/usr/bin/perl
use strict;
use warnings;
our sub main {
    my $n;
    my $n2 = 0;
    $n = chdir("/home/directory/");
    if ($n) {
        print "change directory successful $n \n";
        $n2 = system("cd", "section");
        system("ls");
        print "$n2 \n";
    }
    else {
        print "no success $n \n";
    }
    print "$n\n";
}
main();
But it doesn't work. When I do the ls, it doesn't show the new files. Does anyone know another way of doing it? I know I can use chdir(), but that is not the only problem, as I have other commands which I have created that are simply shell commands put together. So does anyone know how to use shell commands in perl so that the interpreter keeps the shell script attached to the same process rather than making a new process for each system call? I really don't know what to do.
The question has been edited to make it clearer; please don't mind the edits if the question is already clear.
Edit: good point made by mob that each system call is its own process, so it dies every time. But what I am trying to do is create a perl script which follows an algorithm which decides the flow of control of the shell script. So how do I make all these shell commands run in the same process?
system spawns a new process, and any changes made to the environment of the new process are lost when the process exits. So calling system("cd foo") will change the directory to foo inside of a very short-lived process, but won't have any effect on the current process or any future subprocesses.
To do what you want to do (*), combine your commands into a single system call.
$n2 = system("cd section; ls");
You can use Perl's rich quoting features to pass longer sequences of commands, if necessary.
$n2 = system q{cd section
    if ls foo ; then
        echo we found foo in section
        ./process foo
    else
        echo we did not find foo\!
        ./create_foo > foo
    fi};
$n2 = system << "EOSH";
cd section
./process bar > /tmp/log
cd ../sekshun
./process foo >> /tmp/log
if grep -q Error /tmp/log ; then
echo there were errors ...
fi
EOSH
(*) of course there are easy ways to do this entirely in Perl, but let's assume that the OP eventually will need some function only available in an external program
system("cd", "section"); attempts to execute the program cd, but there is no such program on your machine.
There is no such program because each process has its own current working directory, and one process cannot change another process's current working directory. (Programs would malfunction left and right if that were possible.)
It looks like you are attempting to have a Perl program execute a shell script. That requires recreating the shell in Perl. (More specific goals might have simpler solutions.)
What I am trying to do is create a perl script which follows an algorithm which decides the flow of control of the shell script.
Minimal change:
Create a shell script that prompts for instructions/commands. Have your Perl script launch the shell script using Expect and feed it answers/commands.
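A rough sketch of that idea using the CPAN Expect module; the shell and the commands fed to it are made up for illustration, so treat this as a starting point rather than a drop-in solution:
#!/usr/bin/perl
use strict;
use warnings;
use Expect;

# Spawn one long-lived shell; everything sent to it shares the same
# environment and working directory.
my $exp = Expect->spawn('/bin/sh') or die "Cannot spawn shell: $!";

$exp->send("cd section\n");   # state persists across the commands below
$exp->send("ls\n");
$exp->send("exit\n");

$exp->soft_close();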

execute shell commands using perl while keeping shell environment variables

I need to run sequential shell commands using perl, but I need the shell environment to keep its variables.
for example:
$result = `cd /`;
$result = `touch test.txt`;
In this example, I need test.txt to be created in /.
Also, I don't want to run them as one line of code like $result = `touch /test.txt`; I need separate calls to the shell while the environment variables remain the same.
chdir '/';
$result = `touch test.txt`;
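The same idea extends to environment variables: make the change in Perl itself (chdir, %ENV), and every subsequent backtick or system call inherits it. A minimal sketch (FOO is illustrative):
use strict;
use warnings;

chdir '/' or die "cannot chdir to /: $!";
$ENV{FOO} = 'hello';

# Each backtick call runs in its own child shell, but every child
# inherits the current directory and %ENV set above.
my $result  = `touch test.txt`;
my $listing = `ls -l /test.txt`;
print $listing;
print `echo "FOO is \$FOO"`;   # \$ stops Perl interpolating; the shell expands it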

Eval for multiple command execution in ksh93, Solaris

I would like to execute two or more commands back to back . But these commands are stored in a variable in my script. For example,
var="/usr/bin/ls ; pwd ; pooladm -d; pooladm -e"
The problem arises when I execute this variable via my script.
Suppose I go:
#!/bin/ksh -p
..
..
var="/usr/bin/ls ; pwd;pooladm -d; pooladm -e"
..
..
$var # DOES NOT WORK ..BUT WORKS WITH EVAL
It doesn't work ..
But the moment I use eval :
eval $var
It works brilliantly.
I was just wondering if there is any other way to execute a bunch of commands stored in a variable without using eval.
Also, is using eval considered bad programming practice? My coding standards appear to shun its usage rather than embrace it. Please do let me know.
Remember that the shell only parses the line once. So when you expand your $var, the semicolons are not re-parsed as command separators; the expansion is treated as a single command (/usr/bin/ls) with the rest, including the literal ';', passed as arguments, so it can't run what you meant.
On the other hand, eval takes its arguments and re-scans them as shell input; now you get '/usr/bin/ls', 'pwd', and so on as separate commands. It works.
eval is a little chancy because it leaves a possible security hole -- consider if someone managed to get 'rm -rf /' into the string. But it's a useful tool.
Use backticks and echo. In your case
`echo $var`
You could invoke another copy of the shell to run the command:
sh -c "$var"
This isn't necessarily better than using eval. The main practical difference is that eval will run the commands in the context of the current shell, while "sh -c" runs the commands in a separate shell instance. If var contains commands to set environment variables or change the current directory, you may or may not want those commands to affect the current shell.

Have perl execute shellscript & take over env vars

I have a shell script that does nothing but set a bunch of environment variables:
export MYROOTDIR=/home/myuser/mytools
export PATH=$MYROOTDIR/bin:$PATH
export MYVERSION=0.4a
I have a perl script, and I want it to somehow pick up the env vars listed in the shell script and operate with them. I need this to happen from within the perl script though; I do not want the caller of the perl script to have to manually source the shell script first.
When trying to run
system("sh myshell.sh")
the env vars do not "propagate up" to the process running the perl script.
Is there a way to do this?
To answer this question properly, I need to know a bit more.
Is it okay to actually run the shell script from within the perl script?
Are the variable assignments all of the form export VAR=value (i.e. with fixed assignments, no variable substitutions or command substitutions)?
Does the shell script do anything else but assign variables?
Depending on answers to these, options of different complexity exist.
Thanks for the clarification. Okay, here's how to do it. Other than assigning variables, your script has no side effects, which allows us to run the script from within perl. How do we know which variables are exported in the script? We could try to parse the shell script, but that's not the Unix way of using tools that do one thing well and chaining them together. Instead we use the shell's export -p command to have it announce all exported variables and their values. In order to find only the variables actually set by the script, and not all the other noise, the script is started with a clean environment using env -i, another underestimated POSIX gem.
Putting it all together:
#!/usr/bin/env perl
use strict;
use warnings;
my @cmd = (
    "env", "-i", "PATH=$ENV{PATH}", "sh", "-c", ". ./myshell.sh; export -p"
);
open (my $SCRIPT, '-|', @cmd) or die;
while (<$SCRIPT>) {
    next unless /^export ([^=]*)=(.*)/;
    print "\$ENV{$1} = '$2'\n";
    $ENV{$1} = $2;
}
close $SCRIPT;
Notes:
You need to pass to env -i all the environment variables your myshell.sh needs, e.g. PATH.
Shells will usually export the PWD variable; if you don't want this in your perl ENV hash, add next if $1 eq 'PWD'; after the first next.
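Depending on the shell, export -p may print its values quoted (for example export FOO='value'); if so, strip the surrounding quotes before assigning to %ENV.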
This should do the trick. Let me know if it works.
See also:
http://pubs.opengroup.org/onlinepubs/009695399/utilities/export.html
http://pubs.opengroup.org/onlinepubs/009695399/utilities/env.html
Try Shell::Source.
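Roughly, and going from its documented synopsis (check the module's docs for the exact interface), Shell::Source runs the script in its own shell and then copies the resulting environment into %ENV; a sketch using the file from the question:
use strict;
use warnings;
use Shell::Source;

# Run myshell.sh in a child shell, then pull the environment it
# produced back into this process's %ENV.
my $env = Shell::Source->new(shell => "sh", file => "myshell.sh");
$env->inherit;

print "$ENV{MYROOTDIR}\n";   # /home/myuser/mytools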
You can set the environment variables inside the BEGIN block.
A BEGIN block is executed before the rest of the code is compiled and run, so setting the environment variables in this block makes them visible to the rest of the code.
If you need to 'use' any perl modules based on the environment settings, a BEGIN block makes that possible.
Perl uses a special hash %ENV to maintain the environment variables.
You can modify the contents of this hash to set the env variables.
EXAMPLE :
BEGIN {
    $ENV{MYROOTDIR} = '/home/myuser/mytools';
    $ENV{PATH}      = "$ENV{MYROOTDIR}/bin:$ENV{PATH}";
}
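If modules have to be loaded from a directory named in those variables, the BEGIN block must run before the corresponding use lib line is compiled, as in this minimal sketch (My::Tool and the lib path are hypothetical):
#!/usr/bin/perl
use strict;
use warnings;

BEGIN {
    # Runs at compile time, before the 'use lib' below is processed.
    $ENV{MYROOTDIR} = '/home/myuser/mytools';
    $ENV{PATH}      = "$ENV{MYROOTDIR}/bin:$ENV{PATH}";
}

use lib "$ENV{MYROOTDIR}/lib";   # hypothetical module directory
# use My::Tool;                  # hypothetical module found via the path above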
Wouldn't it be easier for a shell script to set the variables, and then call the perl program?
i.e.:
run.sh:
#!/bin/sh
export MYROOTDIR=/home/myuser/mytools
export PATH=$MYROOTDIR/bin:$PATH
export MYVERSION=0.4a
./program.pl
This can now be done with Env::Modify with few changes to your existing code.
use Env::Modify 'system';
...
system("sh myshell.sh");
print $ENV{MYROOTDIR}; # "/home/myuser/mytools"
or if all your shell script does is modify the environment, you can use the source function
use Env::Modify 'source';
source("myshell.sh");
print $ENV{MYVERSION}; # "0.4a"