Quotes and slashes surviving multiple layers - perl

Goal
I need to effectively run a copy (cp) command but have explicit quote symbols preserved. This is needed so that the z/OS Unix System Services Korn shell properly recognizes the target of the copy as a traditional MVS dataset.
The complexity is that this step is part of an automated process. The command is generated by Perl, which runs in a separate Docker container and sends it to the mainframe via ssh. This adds another layer of escaping that needs to be addressed, on top of the escaping Perl itself needs.
Basically, Docker is doing something like
perl myprogram.perl
which generates the necessary SSH commands, sending them to the mainframe, which tries to run them. When I run the Perl script, it generates the command
sshpass -p passwd ssh woodsmn@bldbmsb.boulder.mycompany.com export _UNIX03=NO;cp -P "RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=\(TRACK,\(1,1\)\)" /u/woodsmn/SSC.D051721.T200335.S90.CP037 "//'WOODSMN.SSC.D051721.T200335.S90.CP037'"
and the mainframe returns an error:
cp: target "//'WOODSMN.SSC.D051721.T200335.S90.CP037'" is not a directory
The sshpass is needed because my sysadmin refuses to turn on authorized users, so my only option is to run sshpass and shove a password in. The password exposure is contained and we're not worried about this.
The first command
export _UNIX03=NO
tells z/OS to treat the -P option as an indicator for MVS dataset control blocks. That is, this is where we tell the system, hey this is a fixed length of 287 characters, allocate in tracks, etc. The dataset will be assumed to be new.
For the copy command, I want z/OS to copy the HFS file (basically a normal UNIX file)
/u/woodsmn/SSC.D051721.T200335.S90.CP037
into the fully qualified MVS dataset
WOODSMN.SSC.D051721.T200335.S90.CP037
Sometimes MVS commands assume a high-level qualifier of basically the user's userid and allow the user to omit it. In this case, I've specified it explicitly.
To get z/OS to treat the target as a dataset, one needs to prefix it with two slashes (//).
To use a fully qualified dataset, the name needs to be surrounded by apostrophes (').
But, to avoid confusion within the Korn shell, the target needs to be surrounded by double quotes (").
So, somehow between Perl, the shell running my SSH command inside the Docker container (likely bash) and the receiving Korn shell on z/OS, it's not being properly interpreted.
My scaled down Perl looks like:
use strict;
use warnings;
sub putMvsFileByHfs;
use IO::Socket::IP;
use IO::Socket::SSL;
use IPC::Run3;
use Net::SCP;
my $SSCJCL_SOURCE_DIRECTORY = "/home/bluecost/";
my $SSCJCL_STORAGE_UNIT = "TRACK";
my $SSCJCL_PRIMARY_EXTENTS = "1";
my $SSCJCL_SECONDARY_EXTENTS = "1";
my $SSCJCL_HFS_LOCATION="/u/woodsmn";
my $SSCJCL_STAGING_HLQ = "WOODSMN";
my $COST_FILE="SSC.D051721.T200335.S90.CP037";
my $SSCJCL_USER_PW="mypass";
my $SSCJCL_USER_ID="woodsmn";
my $SSCJCL_HOST_NAME="bldbmsb.boulder.mycompany.com";
my $MVS_FORMAT_OPTIONS="-P ".qq(")."RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=\\("
.${SSCJCL_STORAGE_UNIT}
.",\\("
.${SSCJCL_PRIMARY_EXTENTS}
.","
.${SSCJCL_SECONDARY_EXTENTS}
."\\)\\)".qq(");
putMvsFileByHfs(${MVS_FORMAT_OPTIONS}." ",
$SSCJCL_SOURCE_DIRECTORY.'/'.$COST_FILE,
${SSCJCL_HFS_LOCATION}.'/'.$COST_FILE,
${SSCJCL_STAGING_HLQ}.'.'.$COST_FILE);
# This function copies the file first from my local volume mounted to the Docker container
# to my mainframe ZFS volume. Then it attempts to copy it from ZFS to a traditional MVS
# dataset. This second part is the failing part.
sub putMvsFileByHfs
{
#
# First copy the file from the local file system to the mainframe in HFS form (copy to USS)
# This part works.
#
my $OPTIONS = shift;
my $FULLY_QUALIFIED_LOCAL_FILE = shift;
my $FULLY_QUALIFIED_HFS_FILE = shift;
my $FULLY_QUALIFIED_MVS_FILE = shift;
RunScpCommand($FULLY_QUALIFIED_LOCAL_FILE,$FULLY_QUALIFIED_HFS_FILE);
#
# I am doing something wrong here
# Attempt to build the target dataset name.
#
my $dsnPrefix = qq(\"//');
my $dsnSuffix = qq('\");
my $FULLY_QUALIFIED_MVS_ARGUMENT = ${dsnPrefix}.${FULLY_QUALIFIED_MVS_FILE}.${dsnSuffix};
RunSshCommand("export _UNIX03=NO;cp ${OPTIONS}".${FULLY_QUALIFIED_HFS_FILE}." ".${FULLY_QUALIFIED_MVS_ARGUMENT});
}
# This function marshals whatever command I want to run and mostly does it. I'm not having
# any connectivity issues. My command at least reaches the server and SSH will try to run it.
sub RunScpCommand
{
my $ssh_source= $_[0];
my $ssh_target= $_[1];
my ($out,$err);
my $in = "${SSCJCL_USER_PW}\n";
my $full_command = "sshpass -p ".${SSCJCL_USER_PW}." scp ".${ssh_source}." ".${SSCJCL_USER_ID}."#".${SSCJCL_HOST_NAME}.":".${ssh_target};
print ($full_command."\n");
run3 $full_command,\$in,\$out,\$err;
print ($out."\n");
print ($err."\n");
return ($out,$err);
}
# This function marshals whatever command I want to run and mostly does it. I'm not having
# any connectivity issues. My command at least reaches the server and SSH will try to run it.
sub RunSshCommand
{
my $ssh_command = $_[0];
my $in = "${SSCJCL_USER_PW}\n";
my ($out,$err);
my $full_command = "sshpass -p ".${SSCJCL_USER_PW}." ssh ".${SSCJCL_USER_ID}."#".${SSCJCL_HOST_NAME}." ".${ssh_command};
print ($full_command."\n");
run3 $full_command,\$in,\$out,\$err;
print ($out."\n");
print ($err."\n");
return ($out,$err);
}
Please forgive any Perl malpractices above as I'm new to Perl, though kind constructive pointers are appreciated.

First, let's build the values we want to pass to the program. We'll worry about building shell commands later.
my @OPTIONS = (
-P => join(',',
"RECFM=FB",
"LRECL=287",
"BLKSIZE=6027",
"SPACE=($SSCJCL_STORAGE_UNIT,($SSCJCL_PRIMARY_EXTENTS,$SSCJCL_SECONDARY_EXTENTS))",
),
);
my $FULLY_QUALIFIED_LOCAL_FILE = "$SSCJCL_SOURCE_DIRECTORY/$COST_FILE";
my $FULLY_QUALIFIED_HFS_FILE = "$SSCJCL_HFS_LOCATION/$COST_FILE";
my $FULLY_QUALIFIED_MVS_FILE = "$SSCJCL_STAGING_HLQ.$COST_FILE";
my $FULLY_QUALIFIED_MVS_ARGUMENT = "//'$FULLY_QUALIFIED_MVS_FILE'";
Easy peasy.
Now it's time to build the commands to execute. The key is to avoid trying to do multiple levels of escaping at once. First build the remote command, and then build the local command.
use String::ShellQuote qw( shell_quote );
my $scp_cmd = shell_quote(
"sshpass",
-p => $SSCJCL_USER_PW,
"scp",
$FULLY_QUALIFIED_LOCAL_FILE,
"$SSCJCL_USER_ID\#$SSCJCL_HOST_NAME:$FULLY_QUALIFIED_HFS_FILE",
);
run3 $scp_cmd, ...;
my $remote_cmd =
'_UNIX03=NO ' .
shell_quote(
"cp",
@OPTIONS,
$FULLY_QUALIFIED_HFS_FILE,
$FULLY_QUALIFIED_MVS_ARGUMENT,
);
my $ssh_cmd = shell_quote(
"sshpass",
-p => $SSCJCL_USER_PW,
"ssh", $remote_cmd,
);
run3 $ssh_cmd, ...;
But there's a much better solution since you're using run3. You can entirely avoid creating a shell on the local host, and thus entirely avoid having to create a command for it! This is done by passing a reference to an array containing the program and its args instead of passing a shell command.
use String::ShellQuote qw( shell_quote );
my @scp_cmd = (
"sshpass",
-p => $SSCJCL_USER_PW,
"scp",
$FULLY_QUALIFIED_LOCAL_FILE,
"$SSCJCL_USER_ID\#$SSCJCL_HOST_NAME:$FULLY_QUALIFIED_HFS_FILE",
);
run3 \@scp_cmd, ...;
my $remote_cmd =
'_UNIX03=NO ' .
shell_quote(
"cp",
@OPTIONS,
$FULLY_QUALIFIED_HFS_FILE,
$FULLY_QUALIFIED_MVS_ARGUMENT,
);
my @ssh_cmd = (
"sshpass",
-p => $SSCJCL_USER_PW,
"ssh", $remote_cmd,
);
run3 \@ssh_cmd, ...;
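To make the elided arguments concrete, here is a minimal sketch of one complete run3 call, reusing the in/out/err plumbing from the question's RunSshCommand:
my ($in, $out, $err) = ("$SSCJCL_USER_PW\n", '', '');
run3 \@ssh_cmd, \$in, \$out, \$err;   # feed the password on stdin, capture both streams
print "$out\n$err\n";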
By the way, it's insecure to pass passwords on the command line; other users on the machine can see them.
By the way, VAR=VAL cmd (as a single command) sets the env var for cmd. I used that shorthand above.

The parameter specifying the disk units is "TRK", not "TRACK", so this has to be
-P "RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=\(TRK,\(1,1\)\)"
Also, I never had to escape the parentheses when running such a command interactively from an SSH session. So this works for me:
-P "RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=(TRK,(1,1))"
Then, the error
cp: target "//'WOODSMN.SSC.D051721.T200335.S90.CP037'" is not a directory
indicates that cp understood it has to copy more than one source file, thus it requires the final pathname to be a directory. This seems to confirm that cp did not run on the remote mainframe but in your local shell (as someone pointed out, caused by the unescaped semicolon). And your local UNIX does not understand the z/OS-specific MVS data set notation //'your.mvs.data.set'.
Instead of exporting _UNIX03=NO, you could replace
-P "RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=(TRK,(1,1))"
with
-W "seqparms='RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=(TRK,(1,1))'"
Then, only one command is to be run.
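If you go that route, a hedged sketch of building the whole remote command with shell_quote, reusing the variable names from the earlier answer:
use String::ShellQuote qw( shell_quote );
my $remote_cmd = shell_quote(
    "cp",
    -W => "seqparms='RECFM=FB,LRECL=287,BLKSIZE=6027,SPACE=(TRK,(1,1))'",
    $FULLY_QUALIFIED_HFS_FILE,
    $FULLY_QUALIFIED_MVS_ARGUMENT,
);
# No export _UNIX03=NO prefix is needed, since -W does not depend on it.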


scp with special characters programmatically

I have been searching for this for a while, and can't find a satisfactory answer.
I have a Perl script that needs to copy a file from one host to another, essentially:
sub copy_file{
my($from_server, $from_path, $to_server, $to_path, $filename) = @_;
my $from_location = "$from_server:\"\\\"${from_path}${filename}\\\"\"";
my $to_location = $to_path . $filename;
$to_location =~ s/\s/\\\\ /g;
$to_location = "${to_server}:\"\\\"${to_location}\\\"\"";
return system("scp -p $from_location $to_location >/dev/null 2>&1"");
}
The problem is, some of my filenames look like this:
BLAH;BLAH;BLAH.TXT
Some really nicely named file( With spaces, parentheses, &, etc...).xlx
I am already handling whitespace, and the code for that is quite ugly, since on each side the files could be local or remote, and the escaping is different for the from and to parts of the scp call.
What I am really looking for is either a way to escape all possible special characters, or a way to bypass shell expansion entirely by using POSIX system calls. I am OK with writing an XS module if need be.
I have the correct keys set up in the .ssh directory
Also, I am honestly not sure which special characters do and don't cause problems. I would like to support all legal filename characters.
Say you want to copy a file named foo(s) using scp.
As shown below, scp treats the source and target as shell literals, so you pass the following arguments to scp:
scp
-p
--
host1.com:foo\(s\) or host1.com:'foo(s)'
host2.com:foo\(s\) or host2.com:'foo(s)'
You can do that using the multi-argument syntax of system plus an escaping function.
use String::ShellQuote qw( shell_quote );
my $source = $from_server . ":" . shell_quote("$from_path/$filename");
my $target = $to_server . ":" . shell_quote("$to_path/$filename");
system('scp', '-p', '--', $source, $target);
If you really wanted to build a shell command, use shell_quote as usual.
my $cmd = shell_quote('scp', '-p', '--', $source, $target);
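Then hand it to the shell as a single string; a minimal sketch of running it:
system($cmd) == 0
    or die "scp failed with status $?";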
$ ssh ikegami@host.com 'mkdir foo ; touch foo/a foo/b foo/"*" ; ls -1 foo'
*
a
b
$ mkdir foo ; ls -1 foo
$ scp 'ikegami@host.com:foo/*' foo
* 100% 0 0.0KB/s 00:00
a 100% 0 0.0KB/s 00:00
b 100% 0 0.0KB/s 00:00
$ ls -1 foo
*
a
b
$ rm foo/* ; ls -1 foo
$ scp 'ikegami@host.com:foo/\*' foo
* 100% 0 0.0KB/s 00:00
$ ls -1 foo
*
There are three ways to handle this:
Use the multi-argument form of system which will completely avoid the shell:
system 'scp', '-p', $from_location, $to_location;
Disadvantage: you can't use shell features like redirection.
Use String::ShellQuote to escape the characters.
$from_location = shell_quote $from_location;
$to_location = shell_quote $to_location;
Disadvantage: certain strings can exist which can't be quoted safely. Furthermore, this solution is not portable as it assumes Bourne shell syntax.
Use IPC::Run, which is essentially a supercharged system command that allows redirections.
use IPC::Run qw( run );
run ['scp', '-p', $from_location, $to_location],
'>', '/dev/null', # yes, I know /dev/null isn't portable
'2>', '/dev/null'; # we could probably use IO::Null instead
Disadvantage: a complex module like this has certain limitations (e.g. Windows support is experimental), but I doubt you'll run into any issues here.
I strongly suggest you use IPC::Run.
A few suggestions:
It's not part of the standard Perl distribution, but Net::SSH2 and Net::SSH2::SFTP are two highly ranked CPAN modules that will do what you want. ssh, scp, and sftp all use the same protocol and the sshd daemon. If you can use scp, you can use sftp. This gives you a very nice Perlish way to copy files from one system to another.
The quotemeta function will take a string and quote all non-word ASCII characters for you. It's better than attempting to do the escaping yourself with s/../../ substitutions.
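For instance, a small sketch (the filename is made up):
my $raw  = q{Some really nicely named file (spaces & parens).xlx};
my $safe = quotemeta $raw; # backslash-escapes every non-word character
system("ls -l $safe");     # the shell now sees a single literal argument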
You can use qq(..) and q(...) instead of quotation marks. This will allow you to use quotation marks in your string without having to quote them over and over again. This is especially useful in the system command.
For example:
my $error = system qq(scp $user\@host:"$from_location" "$to_location");
One more little trick: if the system command is passed a single parameter, and that parameter has shell metacharacters in it, the command will be passed to the default system shell. However, if you pass the system command a list of items, those items are passed to execvp directly without going through the shell.
Passing a list of arguments to system via an array is a great way to avoid problems with file names. Spaces, shell metacharacters, and other shell issues are avoided.
my @command;
push @command, 'scp';
push @command, "$from_user\@$from_host:$from_location";
push @command, "$to_user\@$to_host:$to_location";
my $error = system @command;
Use Net::OpenSSH:
my $ssh = Net::OpenSSH->new($host);
$ssh->scp_get($remote_path, $local_path);
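Net::OpenSSH reports failures through its error method rather than dying, so a slightly fuller sketch (host and paths are placeholders) might be:
use Net::OpenSSH;
my $ssh = Net::OpenSSH->new($host);
$ssh->error and die "Connection failed: " . $ssh->error;
$ssh->scp_get($remote_path, $local_path)
    or die "scp_get failed: " . $ssh->error;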
The way arguments have to be quoted varies depending on the shell running on the remote side. The stable version of the module has support for Bourne-compatible shells. The development version available from GitHub also has support for csh and several Windows flavors (Windows quoting is, err, interesting).
For instance:
my $ssh = Net::OpenSSH->new($host, remote_shell => 'MSWin', ...);
Note that on Windows, there are strings that just cannot be properly quoted!

Regarding running multiple ssh commands using perl

I am a bit stuck here. I want to ssh into a machine, run about 3 commands (which are basically setup commands), and then return back to my machine with the env variables of that machine,
like
setup1
setup2
setup3
env > envtext.txt
return back
All this I have to do in Perl.
I tried commands like
system("ssh @machine command1 && command2"), which doesn't work.
Is there something like:
system("ssh @machine command1 -cmd command2 -cmd command3")
If not, what is the best way to do it? Making a shell script and then calling it, or can I do it in Perl itself without any shell scripts?
code
#!/usr/bin/perl -w
use Net::SSH::Perl;
my $host = "address";
my $user = "name";
my $password = "password";
# set up a new connection
my $ssh = Net::SSH::Perl->new($host,
debug=>0,
identity_files => ['path to key'],
options=> ["StrictHostKeyChecking no"]
#interactive => yes,
);
# authenticate
$ssh->login($user,$password);
# execute the command
my($stdout, $stderr, $exit) = $ssh->cmd("env");
print $stdout;
The error it gives is: Permission denied at ssh.pl line 25.
Thank you
I think your question is about SSHing to a single remote host and running multiple commands there. If that's true, then you need to pack your multiple commands up into a single command line that the remote shell can execute. The easiest way to do this is to use the list form of system, and pass the command line as a single parameter:
system "ssh", "machine", "setup1; setup2; setup3";
On to the second part of your question: You want to get data back from the remote side. For that, you'll want your program to read SSH's output rather than using system. For this, you can use:
open my $FH, "-|", "ssh", "machine", "setup1; setup2; setup3; env";
my #lines_from_ssh = <$FH>;
close $FH;
If you also need to send input to the remote side, look into IPC::Open2. If you need to capture both stdout and stderr, see IPC::Open3.
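For completeness, a hedged sketch of the IPC::Open3 route, separating stdout from stderr (same placeholder commands as above; for large outputs you would read both handles with IO::Select to avoid pipe deadlocks):
use IPC::Open3;
use Symbol qw( gensym );
my $err_fh = gensym; # open3 needs a pre-created handle for stderr
my $pid = open3(my $in_fh, my $out_fh, $err_fh,
                "ssh", "machine", "setup1; setup2; setup3; env");
close $in_fh;        # nothing to send to the remote commands
my @stdout = <$out_fh>;
my @stderr = <$err_fh>;
waitpid $pid, 0;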
What you can do:
system("ssh $_ command1 -cmd command2 -cmd command3") for @machines;
Another pure-Perl solution is to use Net::OpenSSH.

Have perl execute shellscript & take over env vars

I have a shell script that does nothing but set a bunch of environment variables:
export MYROOTDIR=/home/myuser/mytools
export PATH=$MYROOTDIR/bin:$PATH
export MYVERSION=0.4a
I have a Perl script, and I want it to operate with the env vars listed in the shell script. I need this to happen from within the Perl script though; I do not want the caller of the Perl script to have to manually source the shell script first.
When trying to run
system("sh myshell.sh")
the env vars do not "propagate up" to the process running the Perl script.
Is there a way to do this?
To answer this question properly, I need to know a bit more.
Is it okay to actually run the shell script from within the perl script?
Are the variable assignments all of the form export VAR=value (i.e. with fixed assignments, no variable substitutions or command substitutions)?
Does the shell script do anything else but assign variables?
Depending on answers to these, options of different complexity exist.
Thanks for the clarification. Okay, here's how to do it. Other than assigning variables, your script has no side effects. This allows us to run the script from within Perl. How do we know what variables are exported in the script? We could try to parse the shell script, but that's not the Unix way, which is to use tools that do one thing well and chain them together. Instead we use the shell's export -p command to have it announce all exported variables and their values. In order to find only the variables actually set by the script, and not all the other noise, the script is started with a clean environment using env -i, another underestimated POSIX gem.
Putting it all together:
#!/usr/bin/env perl
use strict;
use warnings;
my @cmd = (
"env", "-i", "PATH=$ENV{PATH}", "sh", "-c", ". ./myshell.sh; export -p"
);
open (my $SCRIPT, '-|', @cmd) or die;
while (<$SCRIPT>) {
next unless /^export ([^=]*)=(.*)/;
print "\$ENV{$1} = '$2'\n";
$ENV{$1} = $2;
}
close $SCRIPT;
Notes:
You need to pass to env -i all environment your myshell.sh needs, e.g. PATH.
Shells will usually export the PWD variable; if you don't want this in your perl ENV hash, add next if $1 eq 'PWD'; after the first next.
This should do the trick. Let me know if it works.
See also:
http://pubs.opengroup.org/onlinepubs/009695399/utilities/export.html
http://pubs.opengroup.org/onlinepubs/009695399/utilities/env.html
Try Shell::Source.
You can set the environment variables inside the BEGIN block.
The BEGIN block is executed before the rest of the code; setting the environment variables in this block makes them visible to the rest of the code before it is compiled and run.
If you have any Perl modules to use based on the environment settings, the BEGIN block makes that possible.
Perl uses a special hash %ENV to maintain the environment variables.
You can modify the contents of this hash to set the env variables.
EXAMPLE:
BEGIN
{
    $ENV{'MYROOTDIR'} = '/home/myuser/mytools';
    $ENV{'PATH'} = "$ENV{'MYROOTDIR'}/bin:$ENV{'PATH'}";
}
Wouldn't it be easier for a shell script to set the variables and then call the Perl program?
i.e.:
run.sh:
#!/bin/sh
export MYROOTDIR=/home/myuser/mytools
export PATH=$MYROOTDIR/bin:$PATH
export MYVERSION=0.4a
./program.pl
This can now be done with Env::Modify with few changes to your existing code.
use Env::Modify 'system';
...
system("sh myshell.sh");
print $ENV{MYROOTDIR}; # "/home/myuser/mytools"
Or, if all your shell script does is modify the environment, you can use the source function:
use Env::Modify 'source';
source("myshell.sh");
print $ENV{MYVERSION}; # "0.4a"

How can I call a shell command in my Perl script?

What would be an example of how I can call a shell command, say 'ls -a', in a Perl script, and how can I retrieve the output of the command as well?
How to run a shell script from a Perl program
1. Using system
system($command, @arguments);
For example:
system("sh", "script.sh", "--help" );
system("sh script.sh --help");
system will execute the $command with @arguments and return to your script when finished. You may check $! for certain errors passed to the OS by the external application. Read the documentation for system for the nuances of how various invocations are slightly different.
2. Using exec
This is very similar to the use of system, but it will
terminate your script upon execution. Again, read the documentation
for exec for more.
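For example, a minimal sketch (if exec succeeds, nothing after this line runs):
exec("sh", "script.sh", "--help")
    or die "exec failed: $!"; # reached only if the program could not be started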
3. Using backticks or qx//
my $output = `script.sh --option`;
my $output = qx/script.sh --option/;
The backtick operator and its equivalent qx// execute the command and options inside the operator and return that command's STDOUT output when it finishes.
There are also ways to run external applications through creative use of open, but this is advanced use; read the documentation for more.
From Perl HowTo, the most common ways to execute external commands from Perl are:
my $files = `ls -la` — captures the output of the command in $files
system "touch ~/foo" — if you don't want to capture the command's output
exec "vim ~/foo" — if you don't want to return to the script after executing the command
open(my $file, '|-', "grep foo"); print $file "foo\nbar" — if you want to pipe input into the command
Examples
`ls -l`;
system("ls -l");
exec("ls -l");
Look at the open function in Perl - especially the variants using a '|' (pipe) in the arguments. Done correctly, you'll get a file handle that you can use to read the output of the command. The backtick operators also do this.
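A minimal sketch of that pipe open, using the list form so the command bypasses the shell:
open(my $fh, '-|', 'ls', '-a') or die "Cannot run ls: $!";
while (my $line = <$fh>) {
    print $line; # one line of the command's output at a time
}
close($fh) or warn "ls exited with status $?";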
You might also want to review whether Perl has access to the C functions that the command itself uses. For example, for ls -a, you could use the opendir function, and then read the file names with the readdir function, and finally close the directory with (surprise) the closedir function. This has a number of benefits - precision probably being more important than speed. Using these functions, you can get the correct data even if the file names contain odd characters like newline.
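For example, a sketch of the opendir/readdir equivalent of ls -a:
opendir(my $dh, '.') or die "Cannot open directory: $!";
my @names = readdir($dh); # includes '.' and '..', just like ls -a
closedir($dh);
print "$_\n" for sort @names;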
As you become more experienced with using Perl, you'll find that there are fewer and fewer occasions when you need to run shell commands. For example, one way to get a list of files is to use Perl's built-in glob function. If you want the list in sorted order you could combine it with the built-in sort function. If you want details about each file, you can use the stat function. Here's an example:
#!/usr/bin/perl
use strict;
use warnings;
foreach my $file ( sort glob('/home/grant/*') ) {
my($dev,$ino,$mode,$nlink,$uid,$gid,$rdev,$size,$atime,$mtime,$ctime,$blksize,$blocks)
= stat($file);
printf("%-40s %8u bytes\n", $file, $size);
}
There are a lot of ways you can call a shell command from a Perl script, such as:
backticks
`ls` captures the output and gives it back to you.
system
system('ls');
open
Refer to #17 here: Perl programming tips
You might want to look into open2 and open3 in case you need bidirectional communication.
I have been using system and qq() to run Linux programs inside Perl, and it has worked well.
#!/usr/bin/perl # A hashbang line in perl
use strict; # It can save you a lot of time and headache
use warnings; # It helps you find typing mistakes
# my keyword in Perl declares the listed variable
my $adduser = '/usr/sbin/adduser';
my $edquota = '/usr/sbin/edquota';
my $chage = '/usr/bin/chage';
my $quota = '/usr/bin/quota';
my $fullname;
my $username;
my $home;
# (these would be assigned real values before use)
# system() function executes a system shell command
# qq() can be used in place of double quotes
system qq($adduser --home $home --gecos "$fullname" $username);
system qq($edquota -p john $username);
system qq($chage -E \$(date -d +180days +%Y-%m-%d) $username);
system qq($chage -l $username);
system qq($quota -s $username);

How to export a shell variable within a Perl script?

I have a shell script, with a list of shell variables, which is executed before entering a programming environment.
I want to use a Perl script to enter the programming environment:
system("environment_defaults.sh");
system("obe");
But when I enter the environment the variables are not set.
When you call your second command, it's not done in the environment you modified in the first command. In fact, there is no environment remaining from the first command, because the shell used to invoke "environment_defaults.sh" has already exited.
To keep the context of the first command in the second, invoke them in the same shell:
system("source environment_defaults.sh && obe");
Note that you need to invoke the shell script with source in order to perform its actions in the current shell, rather than invoking a new shell to execute them.
Alternatively, modify your environment at the beginning of every shell (e.g. with .bash_profile, if using bash), or make your environment variable changes in perl itself:
$ENV{FOO} = "hello";
system('echo $FOO');
Different sh -c processes will be called, and environment variables are isolated within each of them.
Also, doesn't calling environment_defaults.sh spawn yet another sh process, within which those variables are set in isolation?
Or start the Perl script with these environment variables already exported; they will then be set for all its child processes.
Each process gets its own environment, and each time you call system it runs a new process. So, what you are doing won't work. You'll have to run both commands in a single process.
Be aware, however, that after your Perl script exits, any environment variables it sets won't be available to you at the command line, because your Perl script is also a process with its own environment.
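A quick illustration of that isolation (FOO is an arbitrary demo variable):
system('sh', '-c', 'export FOO=bar');  # FOO exists only in that child shell
print defined $ENV{FOO} ? "FOO=$ENV{FOO}\n"
                        : "FOO is not set\n"; # prints "FOO is not set"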
(UPDATE: Oh, this is not exactly what you asked for, but it might be useful for someone.)
If GDB is installed, you can set/modify parent shell variables with the following hack (non-strict style is used for clarity):
#!/usr/bin/perl
# export.pl
use File::Temp qw( tempfile );
%vars = (
a => 3,
b => 'pigs'
);
$ppid = getppid;
my @putvars = map { "call putenv (\"$_=$vars{$_}\")" } keys %vars;
$" = "\n";
$cmds = <<EOF;
attach $ppid
@putvars
detach
quit
EOF
($tmpfh, $tmpfn) = tempfile( UNLINK => 1 );
print $tmpfh $cmds;
`gdb -x $tmpfn`;
Test:
$ echo "$a $b"
$ ./export.pl
$ echo "$a $b"
3 pigs
This can now be done with the Env::Modify module
use Env::Modify 'source'; # or use Env::Modify qw(source system);
source("environment_defaults.sh");
# environment from environment_defaults.sh is now available
# to Perl and to the following 'system' call
system("obe");