Do special variables maintain their existence in function calls in other required modules? - perl

I'm not a perl expert and I don't quite get how all of perl's scoping rules work.
I'm setting an $ENV{'whatever'} environment variable, then calling a function in another source .pl file and trying to read that %ENV entry, and I'm getting nothing back. The docs all say that %ENV persists for the current process and any forked children, but is the %ENV variable accessible from other source files?
The source file was included via a 'require' command. Is that the right way to do it, or is there something static (first time in) about how variables are made available when a source file is required?

%ENV is a global, so it is accessible from everywhere in every source file loaded into a process.
%ENV is inherited when a new process is created with fork, but the new process gets its own copy, so any changes made in one will not be visible in the other.
If you're loading the other source file with do, require, or use, then it's loaded into the same process and will see the same %ENV.
However, if you're running the other script with system or exec, then it runs in a new process and gets its own copy of %ENV.
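A minimal two-file sketch of the require case (the file and sub names here are made up, just to illustrate):
# main.pl
$ENV{'whatever'} = '/usr/bin/some_dir/';
require './other.pl';   # loaded into the same process
print_whatever();       # prints "/usr/bin/some_dir/"

# other.pl
sub print_whatever {
    print $ENV{'whatever'}, "\n";   # %ENV is a global, so this sees the value set in main.pl
}
1;   # a required file must end with a true value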

From perldoc perlvar:
%ENV
The hash %ENV contains your current environment. Setting a value in
%ENV changes the environment for any child processes you subsequently
fork() off.
require-ing a .pl file is not the same as forking a command.
It would be simpler to just set the necessary environment variables through a Bash wrapper:
$ cat wrapper.sh
#!/bin/bash
export whatever="/usr/bin/some_dir/"; # set the environment variable
perl script.pl;                       # invoke the script
$ cat script.pl
#!/usr/bin/perl
print $ENV{whatever}; # via wrapper.sh : "/usr/bin/some_dir/"
                      # run directly   : ""

Related

Finding relative path in Perl

I have the following code in a perl module,
package Foo;
our $pathToScript = "/home/Lucas/project841/python_script.py";
It is frequently called by other modules in the same directory through
$output = `$Foo::pathToScript`;
# etc
I would like to remove the hard-coded absolute path and use a relative path, e.g. ./python_script.py, to call the script from other modules.
What would be the ideal way?
You said your Perl script is "frequently" called from the same directory where the Python script resides, not "always". If you remove the absolute path, you'll need to change that "frequently" to "always", and just change $pathToScript to the Python script name (no path).
You could also consider setting the environment PATH (in the Perl script) so that the Python script (without the full path in $pathToScript) is always found, regardless of where the user is running from or where the Perl script is located.
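If you go that route, here is one way to do it (a sketch, assuming python_script.py sits in the same directory as Foo.pm; adjust the relative location if your layout differs):
package Foo;
use strict;
use warnings;
use File::Basename qw(dirname);
use File::Spec;

# Directory containing Foo.pm itself
our $scriptDir    = dirname( File::Spec->rel2abs(__FILE__) );
our $pathToScript = File::Spec->catfile( $scriptDir, 'python_script.py' );

# Alternatively, put that directory on PATH and call the script by name:
# $ENV{PATH} = "$scriptDir:$ENV{PATH}";
# my $output = `python_script.py`;

1;
Callers can keep using $Foo::pathToScript exactly as before.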

How to setenv to the output of a perl script

I wrote a perl script that prints a path to a very specific file. I want to define a personal environment variable (by using setenv in my .aliases file) whose value is the output of this script.
For example, let's say that the file "myscript.pl" prints the path "/home/files/reports/file". Let's call the variable (that I want to define in .aliases file) 'myoutput'. I want that when I type "most $myoutput" in Unix, this file will be opened by most, and when I type "echo $myoutput", Unix will print the path.
How can I define a personal variable whose value is determined by a script?
If you use bash, you can put the following to your .bashrc:
export myoutput=$(perl /path/to/myscript.pl)
For tcsh, use .cshrc instead, and modify the line to
setenv myoutput `perl /path/to/myscript.pl`
You need to start a new session (or source the file) for the variable to take effect.
myoutput=$(perl myscript.pl)
If your script prints more than one line, select the correct one:
myoutput=$(perl myscript.pl | grep /home)

Log4Perl: How do I change the logger file used from running code? (After a fork)

I have an ETL process set up in perl to process a number of files, and load them to a database.
Recently, for performance reasons, I set the code up to run in parallel through a fork() call followed by system("perl someOtherPerlProcess.pl $arg1 $arg2").
I end up with about 12 instances of someOtherPerlProcess.pl running with different arguments, and each of these processes works through one directory's worth of files (corresponding to a single table in our database).
The application's main functions work, but I am having trouble figuring out how to configure my logging.
Ideally, I would like to have all the someOtherPerlProcess.pl share the same $log_config value to initialize their loggers, but have each of those create a log file in the directory they are working on.
I haven't been able to figure out how to do that. I also noticed that in the directory I am calling these perl scripts from, I see several files (named ARRAY(0x260eec), ARRAY(0x313f8), etc.) that contain all my logging messages!
Is there a simple way to change the log4perl.appender.A1.filename value from running code?
Or to otherwise dynamically configure the file name we use, but use all other values from a config file?
I came up with a less than ideal solution for this, which is to configure my logger from someOtherPerlProcess.pl directly.
my $FORKED_LOG_CONF = "log4perl.appender.A1.filename=$directory_to_load/log.txt
log4perl.rootLogger=WARN, A1
log4perl.appender.A1=Log::Log4perl::Appender::File
log4perl.appender.A1.mode=append
log4perl.appender.A1.autoflush=1
log4perl.appender.A1.layout=PatternLayout
log4perl.appender.A1.layout.ConversionPattern=[%p] %d{yyyy-MM-dd HH:mm:ss}: %m%n";
#Logger start up
Log::Log4perl::init( \$FORKED_LOG_CONF);
my $logger = get_logger();
The $directory_to_load is the process-specific part of the configuration; the interpolation works because the running perl process has a (local) value for that variable, but it would fail if the same line were placed in an external config file.
I would be happy to hear of any alternative solutions.
In your config file:
log4perl.appender.A1.filename=__LOGFILE__
In your script:
use File::Slurp;
my $log_cfg = read_file( $log_cfgfile );
my $logfile = "$directory_to_load/log.txt";
$log_cfg =~ s/__LOGFILE__/$logfile/;
Log::Log4perl::init( \$log_cfg );
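Putting that together with the fork/system setup from the question, each someOtherPerlProcess.pl instance can take its directory from @ARGV, splice it into the shared template, and initialize its own appender. A rough sketch (the template file name log4perl.conf is an assumption):
# someOtherPerlProcess.pl
use strict;
use warnings;
use File::Slurp;
use Log::Log4perl qw(get_logger);

my $directory_to_load = shift @ARGV
    or die "usage: $0 <directory_to_load> ...\n";

my $log_cfg = read_file('log4perl.conf');    # shared config containing __LOGFILE__
my $logfile = "$directory_to_load/log.txt";  # per-process log file
$log_cfg =~ s/__LOGFILE__/$logfile/;

Log::Log4perl::init( \$log_cfg );
my $logger = get_logger();
$logger->warn("processing files in $directory_to_load");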

How do we replace a path in all the files with an env variable

I have around 230 files (*.pl, *.txt, and some *.conf files) which have a default path hard-coded for the current environment, say /home/AD/USR/perl/5.8.0/bin/perl. I need to replace "/home/AD/USR" with an environment variable, ${USR_PATH}. The files I want to modify are in subdirectories, which means my script should first find all the files (e.g. find . | xargs grep -l "/home/AD/USR") and then replace the string.
OLD: /home/AD/USR/perl/5.8.0/bin/perl
New : ${USR_PATH}/perl/5.8.0/bin/perl
Can some one give me a clue how do I do that?
Shell : /bin/bash
Env : Linux x86_64
If you replace part of a string with ${USR_PATH}, you will be referring to the Perl variable $USR_PATH, not the environment variable, which in Perl is written $ENV{USR_PATH}.
perl -pi.bak -we 's#/home/AD/USR(?=/perl/5.8.0/bin/perl)#\$ENV{USR_PATH}#g' *.pl *.txt *.conf
Using the lookahead will save you the trouble of replacing the rest of the path afterwards.
I assume you want to replace it with the literal value. If you want to replace it with the actual value in the environment variable, just remove the backslash in front of $ENV.
While using an environment variable seems handy and all, it will reduce your script's portability. Why not use a configuration file? If you had done that from the start, you wouldn't be having this trouble. Search CPAN for a nice module.
perl -i -pe 's|/home/AD/USR/perl/5.8.0/bin/perl|\${USR_PATH}/perl/5.8.0/bin/perl|' <your files>
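Since the files sit in subdirectories, a plain shell glob won't reach them. Following the find | xargs approach mentioned in the question, you can feed only the matching files to either of the one-liners above (a sketch; -print0, -Z and -0 assume GNU find/grep/xargs, and the .bak backups are your safety net):
find . \( -name '*.pl' -o -name '*.txt' -o -name '*.conf' \) -print0 \
  | xargs -0 grep -lZ '/home/AD/USR' \
  | xargs -0 perl -pi.bak -we 's#/home/AD/USR(?=/perl/5.8.0/bin/perl)#\$ENV{USR_PATH}#g'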

Prevent vim :make from changing the current working directory?

Synopsis:
When vim's :make command is called, it changes the current working directory (cwd) to the directory of the current file and then runs the makeprg from there. I want to prevent :make from changing the cwd and instead have it run the makeprg from the directory of the parent vim instance.
Example:
I have the following standard perl project hierarchy
project/
    lib/
        My/
            Module/
                Foo.pm
My PERL5LIB is set to
PERL5LIB=':lib'
In my .vimrc I have
compiler perl
set makeprg=perl\ -c\ %
I edit my module using vim from the root project level:
/path/to/project$ vim lib/My/Module/Foo.pm
In vim :pwd works as expected:
:pwd
"/path/to/project"
Also calling !perl -c works as expected, finds my project lib, and displays the result in a shell window:
:!perl -c %
OUTPUT:
perl -c lib/My/Module/Foo.pm
lib/My/Module/Foo.pm Syntax ok
However :make returns an error
:make
"Can't open perl script lib/My/Module/Foo.pm : No such file or directory"
Setting makeprg to pwd shows the problem
:set makeprg=pwd
:make
"/path/to/project/lib/My/Module"
So before make runs makeprg it is changing to the directory of the current file, which is why perl can't find 'lib/.../Foo.pm' there.
Is there any way to prevent make from doing this?
If Vim's :make command is changing the current working directory and autochdir is not set, a plugin may have added an autocommand for the QuickFixCmdPre event. One plugin that does this is eclim, which calls the QuickFixLocalChangeDirectory() function if g:EclimMakeLCD is set to 1.
Use :au to find all the autocommands in your current configuration, paying particular attention to entries for QuickFixCmdPre and make.
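For example, the following should show which script registered such an autocommand and, if eclim turns out to be the culprit, disable its directory change (g:EclimMakeLCD is eclim-specific):
" list autocommands registered for QuickFixCmdPre and where they were set
:verbose autocmd QuickFixCmdPre
" tell eclim not to change directory before running :make
:let g:EclimMakeLCD = 0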