Perl configuration file which contains variables?

I want to create a perl configuration file. I want a file format which has variables. So something like this:
DefaultDirectory = /var/myProgram/
OutputDirectory = $DefaultDirectory/output
InputDirectory = $DefaultDirectory/input
This seems simple, but I'm not sure what is available in Perl. The Perl INI modules I've looked at don't appear to support it. I looked into YAML, but it seems like overkill.
Can anyone suggest a good file format, and a CPAN module that supports it, which can handle simple variables like this? I'm stuck with Perl 5.5, so hopefully an older module.

Try Config::General.
test.cfg
# Simple variables
DefaultDirectory = /var/myProgram
OutputDirectory = $DefaultDirectory/output
InputDirectory = $DefaultDirectory/input
# Blocks of related variables
<host_dev>
    host     = devsite.example.com
    user     = devuser
    password = ComeOnIn
</host_dev>
<host_prod>
    host     = prodsite.example.com
    user     = produser
    password = LockedDown
</host_prod>
test.pl
#!/usr/bin/perl
use strict;
use warnings;
use Config::General;
my $conf = Config::General->new(
    -ConfigFile      => 'test.cfg',
    -InterPolateVars => 1
);
my %config = $conf->getall;
print <<HERE;
Default directory: $config{'DefaultDirectory'}
Output directory: $config{'OutputDirectory'}
Input directory: $config{'InputDirectory'}
Development host: $config{'host_dev'}{'host'}
Development password: $config{'host_dev'}{'password'}
Production host: $config{'host_prod'}{'host'}
Production password: $config{'host_prod'}{'password'}
HERE
Output:
Default directory: /var/myProgram
Output directory: /var/myProgram/output
Input directory: /var/myProgram/input
Development host: devsite.example.com
Development password: ComeOnIn
Production host: prodsite.example.com
Production password: LockedDown

Have you considered just writing your own Perl module that contains the config?
Something like this (MyConfig.pm):
package MyConfig;
our $DefaultDirectory = '/path/to/somewhere';
our $setting_for_something = 5;
1;
You can then import it with use MyConfig;. You may need to set use lib or use FindBin so the module can be found first, though (it depends on where you invoke the script - use will search the cwd).
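For example, a driver script alongside MyConfig.pm could then read those settings like this (a minimal sketch; FindBin is only used here to locate the module next to the script):
#!/usr/bin/perl
use strict;
use warnings;

use FindBin;
use lib $FindBin::Bin;   # look for MyConfig.pm next to this script
use MyConfig;

# Variables declared with 'our' in MyConfig.pm are reachable by their
# fully qualified names.
print "Default directory: $MyConfig::DefaultDirectory\n";
print "Setting: $MyConfig::setting_for_something\n";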
But really - Perl 5.5? That's... well worth updating, given that's the version from 2004. I do hope you don't run too much stuff on 10-year-old software - the world has changed a lot in the intervening time. (And so has Perl.)


Server does not have Digest::SHA 'hmac_sha256_base64' installed, can I use Digest::HMAC_SHA1 'hmac_sha1' instead?

I am preparing a CGI script that needs to confirm that the body of an incoming POST request, when converted into an HMAC-SHA256 hash, exactly matches the value that arrives in a header of the same message.
Using Python I have been able to confirm that the procedure is as described above, but when I implement the same functionality in the CGI script the values don't match, probably because I am not using the correct hashing library.
My server provider does not have the Digest::SHA module, so I cannot use the hmac_sha256_base64 function. I cannot ask them to install it; I can only use what is already available.
I have checked the available modules and there is a Digest::HMAC_SHA1 module with an hmac_sha1 function, so I am doing the following:
my $q = CGI->new;
my %headers = map { $_ => $q->http($_) } $q->http();
# Below is the secret key; this is an example, but I am using the real one
my $channel_secret = "abcdabcdabcdabcdabcdabcdabcdabcd";
# Incoming request body string
my $httpRequestBody = $q->param('POSTDATA');
# I want to use Digest::SHA's hmac_sha256_base64, but this server does not
# have it, so I am using the following one instead...
# because I thought it was the equivalent function to do the same,
# but probably it is not...
use Digest::HMAC_SHA1 'hmac_sha1';
use MIME::Base64 'encode_base64';
my $digest = hmac_sha1($httpRequestBody, $channel_secret);
my $signature = encode_base64($digest);
So basically I expect that these two variables contain the same string:
$headers{'A_EXISTING_TAG_OF_THE_HEADER'}
$signature
But they are totally different. I suspect that I am not using the correct algorithm.
So my question is:
If my server provider does not include Digest::SHA's hmac_sha256_base64 among the available modules, what alternatives do I have to achieve the same result? Does Digest::HMAC_SHA1's hmac_sha1 provide the same functionality or not?
No - hmac_sha1 is not equivalent: HMAC-SHA1 and HMAC-SHA256 are different algorithms and will never produce the same digest. Instead, download the tarball for Digest::SHA::PurePerl, a pure-Perl implementation that needs no compiled code (you'll find the download link on this page: https://metacpan.org/pod/Digest::SHA::PurePerl ).
Create a library folder, something like this:
.
|-- library
|   `-- Digest
|       `-- SHA
|           `-- PurePerl.pm
`-- your_script.pl
your_script.pl then looks like this; you can implement yours similarly:
#!/usr/bin/perl
use lib '.';
use lib '/tmp/iadvd/library/';
use Digest::SHA::PurePerl qw(sha1 sha1_hex);
print sha1_hex('Pradeep'),"\n";
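For the original HMAC-SHA256 requirement, Digest::SHA::PurePerl provides the same functional interface as Digest::SHA, including hmac_sha256_base64. A minimal sketch, assuming the module is unpacked under a ./library folder as in the tree above (the body and secret are placeholders):
#!/usr/bin/perl
use strict;
use warnings;
use lib './library';
use Digest::SHA::PurePerl qw(hmac_sha256_base64);

# Placeholders standing in for the real request body and channel secret
my $httpRequestBody = '{"example":"payload"}';
my $channel_secret  = 'abcdabcdabcdabcdabcdabcdabcdabcd';

# Same argument order as hmac_sha1: data first, then key
my $signature = hmac_sha256_base64($httpRequestBody, $channel_secret);

# Note: Digest::SHA-style base64 output is unpadded; append '=' characters
# if the header value you compare against is padded.
print "$signature\n";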

Using root logger across multiple modules

ActiveState Perl 5.12 on WinXP
I've recently become a convert to Log4perl and used it successfully where I defined a specific logger for each module that makes up my application. Example:
Upload.pm:
my $logger = get_logger("Upload");   # specific logger
my $layout = Log::Log4perl::Layout::PatternLayout->new("%d %p> %F{1}:%L %M - %m%n");
my $appender = Log::Log4perl::Appender->new( "Log::Dispatch::File",
    filename => "Upload.log",
    mode     => "append"
);
$appender->layout($layout);
$logger->level($OFF);   # or $INFO
$logger->add_appender($appender);
....
This works, but it is hard to track program flow across numerous log files, i.e. Upload.log, Parse.log, FileRead.log, etc. A quick-and-dirty solution is to use the same filename in all the loggers.
That works much better, but only as long as the program is used serially and in sequence.
Now for the wrinkle: suppose module Upload.pm is used by several programs, i.e. readMyFiles.pl, readHerFiles.pl and dumpAllFiles.pl. When running readMyFiles.pl I want the logger in Upload.pm to write to readMyFiles.pl.log, and when running dumpAllFiles.pl I want the logger in Upload.pm to write to dumpAllFiles.pl.log.
One method might be to declare an our variable $logFileName in my .pl files and use it in all my modules like so:
filename => $logFileName,
Another might be to remove all the loggers from my .pm's and define them only in the .pl's - but then how would I reference $logger in the .pm's?
All thoughts and suggestions are appreciated.
Still-learning Steve
Configure your logging settings in the caller, not in the module. If somebody else uses your module, they might not want to log things to the same place that you do. They might also want to format log messages differently, or use a different type of appender, or...the list goes on.
Your module should only get a logger and write messages to it:
MyModule.pm
#!/usr/bin/perl
package MyModule;
use strict;
use Log::Log4perl qw(get_logger);
sub foo {
    my $logger = get_logger("Foo");
    $logger->debug("Hello from MyModule");
}
1;
Your main program(s) should configure and initialize logging:
logtest
#!/usr/bin/perl
use strict;
use warnings;
use Log::Log4perl qw(get_logger);
use MyModule;
Log::Log4perl->init("log4perl.cfg");
my $logger = get_logger("Foo");
$logger->debug("Hello from main");
MyModule::foo();
I prefer to use a separate config file for Log4perl settings:
log4perl.cfg
log4perl.logger.Foo=DEBUG, Screen
log4perl.appender.Screen=Log::Dispatch::Screen
log4perl.appender.Screen.Threshold=DEBUG
log4perl.appender.Screen.layout=Log::Log4perl::Layout::PatternLayout
log4perl.appender.Screen.layout.ConversionPattern=[%r] %F %L %c - %m%n
Output:
[0] logtest 12 Foo - Hello from main
[0] MyModule.pm 11 Foo - Hello from MyModule
Note that you must initialize logging with Log::Log4perl->init in the caller before trying to get_logger in the module. If you don't, log messages from the module will be ignored and you'll get the following warning:
Log4perl: Seems like no initialization happened.
Forgot to call init()?
See the documentation for details.
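To get the behaviour asked about - Upload.pm writing to readMyFiles.pl.log when run from readMyFiles.pl, and so on - each driver script can derive its own appender filename from $0 when it initializes logging. A minimal sketch using the MyModule/Foo names from above, with the config passed as a string instead of a file:
#!/usr/bin/perl
use strict;
use warnings;
use File::Basename qw(basename);
use Log::Log4perl qw(get_logger);
use MyModule;

# Derive the log file name from the running script, e.g. readMyFiles.pl.log
my $logfile = basename($0) . '.log';

my $conf = qq{
    log4perl.logger.Foo = DEBUG, File
    log4perl.appender.File = Log::Log4perl::Appender::File
    log4perl.appender.File.filename = $logfile
    log4perl.appender.File.mode = append
    log4perl.appender.File.layout = Log::Log4perl::Layout::PatternLayout
    log4perl.appender.File.layout.ConversionPattern = %d %p> %F{1}:%L %M - %m%n
};
Log::Log4perl->init(\$conf);

my $logger = get_logger("Foo");
$logger->debug("Hello from main");
MyModule::foo();   # both messages end up in readMyFiles.pl.log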

Change Environment Variables During Perl Unit Testing

I am trying to write some unit tests for this perl module function, but am having some issues with environment variables. I'll list the files first, then explain the issue in greater detail.
processBuildSubs.pm
package processBuildSubs;
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Request::Common;
use HTTP::Status;
# Declare the environment variables that this package needs
use constant URI_BASE     => $ENV{"URI_BASE"};
use constant URI_RESOURCE => $ENV{"URI_RESOURCE"};
# Make the shell-environment-related constants visible as package variables
our $URI_BASE     = URI_BASE;
our $URI_RESOURCE = URI_RESOURCE;
sub populatePartitions
{
    # Define locals
    my $url;
    my $ua = LWP::UserAgent->new;
    $url = "$URI_BASE" . "$URI_RESOURCE" . "/some/path";
    # Make a request to the $url
    my $res = $ua->request(GET $url);
    if ($res->code() != HTTP::Status->RC_OK())
    {
        # The request didn't return 200 OK, so it's in here now.
    }
    else
    {
        # The request returned 200 OK, so now it's here.
    }
}
I want to be able to unit test both the if path and the else path; however, it would be best if I don't need to change the processBuildSubs.pm code at all. It's an external file that I don't currently have control over; I am just tasked with unit testing it (although I do understand it could be tested more efficiently if we could also change the source code).
So in order to test both paths, we need the environment variables URI_BASE and URI_RESOURCE to be set so that the request fails in one case and succeeds in the other. (I am interested in learning how to stub out this call at a future time, but that's reserved for another question.)
Here's my test file:
processBuildSubs.t
use strict;
use Test::More qw(no_plan);
BEGIN { use_ok('processBuildSubs') };
# Test 1 of the populatePartitions() function
my $populatePartitionsCall = processBuildSubs::populatePartitions();
is( $populatePartitionsCall, 0, "populatePartitions() Test for 0 Val Passed" );
# Test 2 of the populatePartitions() function
# I need to change some environment variables that processBuildSubs depends on here.
$populatePartitionsCall = processBuildSubs::populatePartitions();
is( $populatePartitionsCall, 0, "populatePartitions() Test for 0 Val Passed" );
The best attempt we have right now at changing the environment variables is an external shell script like the one below (but it would be ideal to change them between the calls in the test file above):
run_tests.sh
#!/bin/bash
# Run the tests once
perl ./BuildProcess.pl
perl ./Build testcover # Ultimately calls the processBuildSubs.t test file
# Now export some variables so the other test passes.
export URI_BASE="https://some-alias/"
export URI_RESOURCE="some-resource"
# Then run the test suite again with the env set so the else condition passes.
perl ./BuildProcess.pl
perl ./Build testcover
As you can see, this is an awkward way of doing things, as we run the entire test suite with a different environment each time. Ideally we'd like to set up our environment in the processBuildSubs.t file itself, between tests, if possible.
Please let me know if I can provide any further information.
Are you averse to having separate scripts for separate test environments?
# processBuildSubs.t
BEGIN {
    @ENV{"URI_BASE","URI_RESOURCE"} = ("https://some-alias/","some-resource");
}
use Test::More;
... tests go here ...
# processBuildSubs-env2.t
BEGIN {
    @ENV{"URI_BASE","URI_RESOURCE"} = ("https://another-alias/","another-resource");
}
use Test::More;
... tests go here ...
By setting %ENV in a BEGIN block, before any other modules are loaded, you make the different environment variables available to your other modules at compile-time.
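Fleshing that second file out a little, it might look like this (a sketch only; the URLs and the expected return value of 0 are illustrative, carried over from the question):
# processBuildSubs-env2.t
BEGIN {
    # Must run before processBuildSubs is compiled, because the module
    # captures URI_BASE and URI_RESOURCE into constants at compile time.
    @ENV{"URI_BASE","URI_RESOURCE"} = ("https://another-alias/","another-resource");
}

use strict;
use warnings;
use Test::More tests => 2;

use_ok('processBuildSubs');

my $populatePartitionsCall = processBuildSubs::populatePartitions();
is( $populatePartitionsCall, 0, "populatePartitions() with env2 settings" );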

'No such file or directory' even though I own the file and it has read permissions for everyone

I have a perl script on CentOS and am trying to read a file using File::Slurp:
my $local_filelist = '~/filelist.log';
use File::Slurp;
my @files = read_file($local_filelist);
But I get the following error:
Carp::croak('read_file \'~/filelist.log\' - sysopen: No such file or directory') called at /usr/local/share/perl5/File/Slurp.pm line 802
This is despite the fact that I am running the script as myuser and:
(2013-07-26 06:55:16 [myuser@mybox ~]$ ls -l ~/filelist.log
-rw-r--r--. 1 myuser myuser 63629044 Jul 24 22:18 /home/myuser/filelist.log
This is on perl 5.10.1 x86_64 on CentOS 6.4.
What could be causing this?
I've not used File::Slurp, but I'll hazard a guess that it doesn't understand ~ as the home directory. Does it work if you specify the full path? For example:
my $local_filelist = "$ENV{HOME}/filelist.log";
Using double quotes means that Perl will interpolate $ENV{HOME}.
Just use the glob function. That's what it is for.
my $local_filelist = glob '~/filelist.log';
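Putting that together with the snippet from the question (unchanged apart from the glob call):
use strict;
use warnings;
use File::Slurp;

my $local_filelist = glob '~/filelist.log';   # glob expands ~ to the home directory
my @files = read_file($local_filelist);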
I think that when you are running a script, ~ may not be set or may point somewhere else - try replacing:
'~/filelist.log'
with:
'/home/myuser/filelist.log'
in your script.

Perl Rover v3: pass environment variables in the rulesets

I am using the Perl Rover module, version 3, to log in to Linux/Unix servers and run a script. In the ruleset, if I use the full path name it copies the script to the remote server, but it is not able to substitute an environment variable.
For example, this works:
copy:{
put_file "/home/u1/find.sh" "/tmp/"
};
This didn't work:
copy:{
put_file "$HOME/find.sh" "/tmp/"
};
I also tried $ENV{'HOME'}; this didn't work either.
How can I pass the environment variable?
Rover module documentation:
http://rover.sourceforge.net/QuickStart/Rover-QuickStart-3.html#ss3.2
http://rover.sourceforge.net/
After reviewing the source code for Rover, which I had never used, I determined it was not possible with the existing code.
I created a new extension for you that adds that functionality; it supports the ~ and ${HOME} syntax (which are shell extensions rather than part of the OS directly, which is why Perl does not expand them itself).
The code is here:
https://github.com/h4ck3rm1k3/perl-rover/commit/2c78aefb97e819956bb665b04056763f8df1b242
I have had a hard time testing it because I had never used Rover before, and Rover does not seem to support scp (I read that it is supported, but I could not test it yet). Anyway, let me know if you like it; I will put more work into it if reasonably requested.
Update
Here is my example ruleset :
example ruleset
[rulesets]
test:
{
put_file_from_home put_file "~/find2.sh" "/tmp/"
put_file_from_home put_file "${HOME}/find3.sh" "/tmp/"
}, ;
example output
Here is the example output; I cannot get Rover itself to work - see the test case below.
Test output
perl -I lib t/example2.t
Local was ~/find2.sh and home was /home/mdupont at lib/Rover/CoreExtension.pm line 19.
Local now /home/mdupont/find2.sh at lib/Rover/CoreExtension.pm line 22.
Local was ${HOME}/find3.sh and home was /home/mdupont at lib/Rover/CoreExtension.pm line 19.
Local now /home/mdupont/find3.sh at lib//Rover/CoreExtension.pm line 22.
New config entry for the new sshport option:
[hosts]
someexample:{
os linux
username myusername
description 'myhost'
sshport 12345
ftp_method_used sftp
};
Update 2
Don't use quotes around the name; use a comma between the args.
To git@github.com:h4ck3rm1k3/perl-rover.git
2207417..7637741 CoreExtension -> CoreExtension
[rulesets]
test: { put_file_from_home ~/find2.sh,/tmp/ }, ;
[hosts]
localhost:{
os linux
username mdupont
description 'localhost'
ftp_methods sftp
ftp_method_used sftp };
mike
Old question but new answer: since you're using Rover v3 you can just extend the Rover::Core module by overriding it.
Add this to your ~/.rover/contrib directory:
CoreVariables.pm:
package CoreVariables;
use strict;
use Exporter;
our @ISA    = qw( Exporter );
our @EXPORT = qw( put_file );

sub put_file {
    my ($self, $host, $command) = @_;
    # Expand anything that looks like a variable reference (e.g. $ENV{HOME})
    # by evaluating the matched text as Perl code (the /ee modifier).
    $command =~ s/(\$[\w{}]+)/$1/eeg;
    return Rover::Core::put_file($self, $host, $command);
}

1;
And add the following to your ~/.rover/config [modules] section (it must come after Rover::Core):
CoreVariables:{
};
And then you can use environment variables in your Rover config when calling put_file. Add other routines if you wish; this only extends put_file.
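To see what that substitution does in isolation, here is a tiny standalone sketch (not Rover-specific) showing the /ee trick expanding a $ENV{HOME}-style reference:
#!/usr/bin/perl
use strict;
use warnings;

my $command = '$ENV{HOME}/find.sh /tmp/';
# With /ee the matched text (e.g. "$ENV{HOME}") is itself evaluated as Perl
# code, so the environment value is substituted in.
$command =~ s/(\$[\w{}]+)/$1/eeg;
print "$command\n";   # e.g. /home/youruser/find.sh /tmp/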
And since this is such an easy task I will add it to the requested feature list and include it in the next release (I am the Rover author).
The better place to ask Rover questions is on the sourceforge website of course: http://sourceforge.net/projects/rover/