How to handle external dependencies in Perl's ExtUtils::MakeMaker

I have a series of perl scripts for which I'm writing a Makefile.PL script, but I'm rather inexperienced with ExtUtils::MakeMaker.
One of the scripts I wrote makes a system call to a command-line utility that must be installed in order for the script to run properly. My script can gracefully detect that the utility is missing and issue an error about installing it and putting it in the user's path, but is there some standard way to handle this in the Makefile.PL script? Could it even (gasp) attempt to install the third-party utility if I enter the download link in the Makefile.PL script?
At the very least, I'd like the script to warn the user if the external dependency was not found. I know I can write a test case that uses it. Is this as simple as copying and pasting the subroutine I wrote in the script itself, which checks for the third-party utility and prints an error if it's not found, or would that be the "wrong way to do it"?

Let's call this external dependency foobar, for sake of argument.
As per @KeepCalmAndCarryOn's comment, firstly consider whether foobar could be replaced by something from CPAN (maybe Foo::Bar), or a few lines of Perl.
Otherwise, the best course of action is:
Create a new CPAN distribution called Alien::Foobar. The job of Alien::Foobar is to download, perhaps compile, and then install foobar, as part of Alien::Foobar's Makefile.PL or Build.PL.
(There exists a module called Alien::Base which aims to make doing this sort of thing easier. It's mostly aimed at installing libraries rather than binaries, though I've had some success using it for the latter.)
Now the Makefile.PL you were originally working on can declare a dependency on Alien::Foobar.
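For illustration, a minimal Makefile.PL sketch for that last step (the distribution name, script name, and version below are hypothetical; the point is only the PREREQ_PM entry for Alien::Foobar):

use ExtUtils::MakeMaker;

WriteMakefile(
    NAME      => 'My::Scripts',          # hypothetical distribution name
    VERSION   => '0.01',
    EXE_FILES => [ 'bin/myscript' ],     # hypothetical script
    PREREQ_PM => {
        'Alien::Foobar' => 0,            # pulls in, builds, and installs foobar itself
    },
);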

If you have an external dependency on a command-line utility (i.e. there's no perl module that does what the utility does), ExtUtils::MakeMaker is not designed to handle such a dependency. What you need to do is write an install script or edit the make file to handle the dependency. Here are the considerations in doing so:
Check whether the dependency exists and whether its version is sufficient.
Download the dependent package.
Configure, compile, and install the dependent package.
Test to make sure it works.
Update the user's environment setup if necessary.
Run your Perl package's installation steps (e.g. perl Makefile.PL; make; sudo make install).
Note that you may need to know whether your script is running as root, which you can verify with id -u (root's user ID is 0).
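A minimal sketch of such checks in Perl, assuming the utility is called foobar and using the core IPC::Cmd module ($> is Perl's effective user ID, the equivalent of id -u; the version-parsing regex is only an illustration):

use strict;
use warnings;
use IPC::Cmd qw(can_run);

# bail out early if we are not root, since later install steps will need it
die "This installer must be run as root (try sudo)\n" if $> != 0;

# locate foobar in the PATH; can_run returns the full path or undef
my $foobar = can_run('foobar')
    or die "foobar not found in PATH; please install it first\n";

# crude version check: assumes "foobar --version" prints something like "foobar 1.2.3"
my ($version) = qx($foobar --version) =~ /(\d+(?:\.\d+)+)/;
die "Could not determine the foobar version\n" unless defined $version;
print "Found foobar $version at $foobar\n";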

Related

Installing a Perl-based web app in an extremely restricted environment

Because I had a long series of comments with @ikegami, I am cleaning up the question in the hope it will be more understandable. Unfortunately, English isn't my "main" language. :(
Let's say we have an environment where:
no development tools are installed (no make, gcc, or the like)
perl is installed with its core packages, nothing more
no outgoing network access is allowed - e.g. the user cannot use curl or cpan to download/install Perl dependencies
the user doesn't even have admin (root) rights
but wants to install and evaluate some Perl-based web app; let's call it MyApp
The MyApp:
doesn't use any XS-based modules (at least, I hope - during development I use plenv and cpanm, so I never checked the installed dependencies in depth)
is a pure PSGI app; a simple plackup app.psgi works OK
uses some data files which should be included in the "deployment".
The main question is: how do I prepare MyApp, and all the CPAN modules it uses, so they can be easily installed in such a restricted environment?
The goal is:
I don't need to save my own effort and time,
but I want to save the user's time and minimize the actions needed on their side, so the installation (deployment) should be as simple as possible.
E.g. how to get a running web app onto the user's machine with the minimum possible steps on their part.
- the simplest thing could be something like:
- copy one file (zip or tarball)
- unpack it
- from the terminal, execute some run.pl in the unpacked directory.
To get the above simple installation, my idea was the following:
1.) Create a tarball which, after unpacking, will contain 3 folders and 1 Perl script, say:
myapp_repo/
myapp_repo/distlib #will contain all MyApp's Perl modules and also ALL used CPAN modules and their dependencies
myapp_repo/datafiles #will contain app-specific data files and such
myapp_repo/install.pl
myapp_repo/lib #will contain modules directly used by the `install.pl`
2.) I will develop an install.pl script which will be used as the installer tool, like
perl install.pl new /path/to/app_root
and it will (should):
create all the needed directories under /path/to/app_root (especially lib, where it will install the Perl modules)
call a "local" cpanm internally (from myapp_repo/lib) to install the app's Perl modules and their CPAN dependencies, using only distribution files from the distlib
generate and install the needed runtime script and the app.psgi into /path/to/app_root/bin
install the needed data files for the app.
3.) So, after this the user should be able to simply run:
/path/to/app_root/bin/plackup /path/to/app_root/bin/app.psgi
In short: the user should use the system-wide perl and its core modules, while everything else (runtime Perl scripts such as plackup, plus the required CPAN modules) should be installed into a self-contained directory tree using only local files (no network access).
E.g. install.pl should internally call cpanm to achieve the equivalent of the following cpanm command:
cpanm --mirror file://path/to/myapp_repo/distlib --mirror-only My::App
which should install My::App and all its dependencies without network access, using only the files from myapp_repo/distlib.
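A minimal sketch of how install.pl might do that internally, assuming a self-contained (fatpacked) cpanm script is shipped next to install.pl; the paths and the single app_root argument are simplifications:

use strict;
use warnings;
use FindBin;

my $app_root = shift @ARGV or die "usage: perl install.pl /path/to/app_root\n";

# run the bundled cpanm against the local distlib mirror only; modules end up
# under $app_root/lib/perl5 (local::lib layout), with no network access needed
my @cmd = (
    $^X, "$FindBin::Bin/cpanm",
    '--mirror',     "file://$FindBin::Bin/distlib",
    '--mirror-only',
    '--local-lib',  $app_root,
    'My::App',
);
system(@cmd) == 0 or die "cpanm failed (exit $?)\n";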
Some questions:
Is it possible to use cpanm (called as a locally installed module) without make?
For creating the myapp_repo/distlib, I'm thinking about using Pinto. Is it the right tool to achieve the above?
Have I forgotten something? Or, in other words:
Is the above a viable (read: working) way?
Are there any other tools which I could/should use to simplify the creation of such a distribution tarball?
@ikegami suggested a method:
- "install everything" into one fresh directory on my machine
- transfer this self-contained directory to the target machine
It sounds very good, because this directory could contain all the needed app-specific data files too; unfortunately, I don't understand the details of how his solution should be done.
The FatPacked solution looks interesting too - I need to learn about it.
Don't write your own make or installer. Just copy make from a different machine (which is basically what apt/yum/etc do anyway, and which you'd have to do even if you wrote your own). You'd be able to use cpan in 5 minutes!
Also, that should allow you to install gcc if you need it (e.g. to install an XS module), although it doesn't sound like you do. If you do install gcc, I'd install my own perl to avoid having to deal with PERL5LIB.
Tools such as minicpan will allow you to install any module from CPAN without internet access. Of course, you can keep using the command you are already using if it mirrors the packages you need.
The above explains how to simply and quickly setup a machine so it can use cpan and thus install any module easily.
If you just want to install a specific module and its dependencies, you can completely avoid using cpan on the target machine. First, you need a fresh install of Perl (preferably of the same version as the one on the target system). Then, simply install the module to a fresh dir on your machine, and transfer that dir to the target machine. That's it; nothing else needs to be done. This even works for XS modules if the two machines are similar enough.
This is what ppm (ActiveState's Perl package manager) does.
Unfortunately, while this solution is almost as simple as the one above, it's not nearly as flexible, it doesn't run the test suite of the modules being installed, etc. It does have the advantage of not requiring the transfer of any binary (if you're not installing any XS modules).
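For example, a rough sketch of the fresh-directory approach using cpanm and a tarball (the directory name and paths are hypothetical):

# on your machine (with network access):
% cpanm --local-lib ~/myapp-deps My::App
% tar czf myapp-deps.tar.gz -C ~ myapp-deps

# on the target machine, after copying and unpacking the tarball:
% PERL5LIB=/path/to/myapp-deps/lib/perl5 perl -MMy::App -e 'print "ok\n"'

Any scripts installed along with the modules (plackup, for instance) should end up in the bin/ directory of that same tree.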

Is it possible to install perl prerequisites before distribution testing and how?

I'm trying to build a Perl distribution for a home-made module, starting from the Module::Starter base. Every test passes on my machine, but when I upload it to CPAN to get some more universal tests from cpantesters.org, some tests fail on other architectures or OSes, and I can't understand why. I can see in the test reports that some of my prerequisites are not installed before testing, but I would like them to be.
I've tried listing these dependencies in the Makefile.PL PREREQ_PM hash and then in the TEST_REQUIRES hash, but it didn't change the results much.
Then, when I removed the dependencies from my local machine and tried to install my module using cpanm, it downloaded the dependencies first, the tests passed, and the install succeeded.
This is my first try at a module, so I think I am missing something; maybe I am too used to the cpanm magic. Thanks for any help.
The problem is something different. Andreas' smoker very probably built the dependency App::Ack (which looks absent in the fail reports) successfully. But at least two problems come into play here:
When a distribution gets tested, then its dependencies may or may not be installed already. However, it's guaranteed that all dependent modules are made available through the PERL5LIB environment variable, so make test usually works (To be more specific, if the install Module command is used in the CPAN shell, then all dependencies are installed immediately. If the test Module command is used, then dependencies are only built, but not installed. The CPAN user can do the installation later using install_tested). So it may be that App::Ack is not installed here, just built. Especially this means that the ack script is not installed in the final location.
Even if it is installed, many smoke testers or users who have multiple perls installed in parallel use a non-standard directory for this perl. So ack wouldn't be installed in /usr/bin or /usr/local/bin, but in the bin directory belonging to this perl. This directory may or may not be in the user's PATH at all. So you cannot assume that can_run("ack") works here. A workaround here is to add $Config{scriptdir} temporarily to $ENV{PATH}. Another solution would be to use the App module instead of the script, if it's possible. Unfortunately it looks like ack can only be called as a script.
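A minimal sketch of that workaround inside a .t file, which skips rather than fails when ack really is unavailable (IPC::Cmd and Config are both core):

use strict;
use warnings;
use Test::More;
use Config;
use IPC::Cmd qw(can_run);

# also look in the current perl's script directory, where a freshly built
# "ack" ends up when the smoker uses a non-standard prefix
$ENV{PATH} = join $Config{path_sep}, $ENV{PATH}, $Config{scriptdir};

plan skip_all => 'ack not found' unless can_run('ack');

# ... the real tests that shell out to ack would follow here ...
pass('found ack');
done_testing();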
If you look at a sample fail report, then you can see that App::Ack was installed (it appears in the PREREQUISITES section both under requires and build_requires, you can also see which App::Ack version is installed in the "HAVE" column). You can also see the user's PATH (in the ENVIRONMENT section). And you may guess about the scriptdir for this perl, it's usually the same directory where the perl binary itself is installed, and the path to current perl is visible in $^X (under "Perl special variables").
If you want to reproduce the behavior, then you need to uninstall ack from your machine, build a custom perl using ./configure.gnu --prefix=/path/to/custom/perl-5.X.Y, and use this perl for tests.

Do I have to run make/make install to test each change to a Perl distribution file?

Do I have to run make and make install each time I change a .pm file for Perl? I'm doing a ton of testing and this is becoming cumbersome.
You don't have to install the module to test it.
If I'm testing inside my distribution directory, I just use the test target:
% make test
Or, if I'm using Module::Build:
% ./Build test
Since make is a dependency management tool, it also takes care of any other steps it needs to perform so it can run the test target. You don't need to run each target separately. Module::Build does the same thing.
If I want to test a single file, I combine the make command with a call to perl that also uses the blib module to set the right @INC:
% make; perl -Mblib t/single_test.t
Some people like using prove for the same thing. No matter which method I use, I'm probably using the arrow keys to move back to a previous command line to re-run it. I do very little typing in any of this.
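For example, a rough equivalent of the one-liner above, using prove's -b switch to add blib/lib and blib/arch to @INC:

% make && prove -b t/single_test.t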
It depends on module setup, but under the standard MakeMaker I use, "make test" runs a "make" if any files have been modified, so when doing intra-module development "make test" is the only command you need until you've finished.
Evan Carroll got it basically right. To expand on his answer: use the testing tools that come with Perl to tighten the workflow.
Let's say you are in your project directory and you hack on the files in its lib/ subdirectory. Execute prove -l to run all tests. That's easier than messing with absolute paths in the PERL5LIB environment variable.
Presumably you're editing a lib module in a non-lib location, rather than clobbering a global library for each modification - do the sensible thing and change the library path perl uses with PERL5LIB, which will append internally to @INC (the use search path):
PERL5LIB=/home/user/code/perl/project/lib perl myapp.pl
If your program isn't pure-perl and requires a make system, there is no way to do this short of rebuilding, but pure-perl (PP) doesn't really require make under normal circumstances. If you do it this way, running perl under a normal environment will yield the predictable and tested results, running it with your PERL5LIB will allow you to test the program.

How do I start a new Perl module distribution?

I'm trying to set up a large-ish project, written in Perl. The IBM MakeMaker tutorial has been very helpful so far, but I don't understand how to link all the modules into the main program. In my project root, I have MANIFEST, Makefile.PL, README, a bin directory, and a lib directory. In my bin directory, I have my main script (Main.pl). In the lib directory, I have each of my modules, divided up into their own respective directories (i.e. Utils::Util1 and Utils::Util2 in the Utils directory, etc.). In each module directory, there is also a t directory, containing tests.
My MANIFEST file has the following:
bin/Main.pl
lib/Utils/Util1.pm
lib/Utils/Util2.pm
lib/Utils/t/Utils1.t
lib/Utils/t/Utils2.t
Makefile.PL
MANIFEST
README
Makefile.PL is the following:
use ExtUtils::MakeMaker;
WriteMakefile(
    'NAME'         => 'Foo',
    'VERSION_FROM' => 'bin/Main.pl',
    'PREREQ_PM'    => {
        "XML::Simple" => 2.18,    # the libraries that we need and their
                                  # minimum version numbers
    },
    'EXE_FILES'    => [ "bin/Main.pl" ],
);
After I make and run, the program crashes, complaining that it cannot find Utils::Util1, and when I run make test, it says no tests are defined. Can anyone make any suggestions? I have never done a large-scale project like this in Perl, and I will need to add many more modules.
If you are just starting to create Perl modules (which is also Perl's equivalent of a project), don't use Makemaker. Module::Build is the way to go, and it's now part of the standard library. Makemaker is for us old salts who haven't converted to Module::Build yet. :) I'll strike that now that Module::Build is unmaintained and out of favor; I still use MakeMaker.
You should never start off a Perl project by trying to create the structure yourself. It's too much work and you'll always forget something.
There's h2xs, a program that comes with perl and was supposed to be a tool to convert .h files into Perl's glue language XS. It works fine, but its advantage is that it comes with perl:
% h2xs -AXn Module::Name
Something like Module::Starter is a bit more sophisticated, although you have to get it from CPAN. It's the tool we use in Intermediate Perl because it's simple. It fills in some templates with your information:
% module-starter --author=... --email=... --module=...
If you are going to do this quite a bit, you might then convert that to Distribution::Cooker so you can customize your files and contents. It's a dinky utility I wrote for myself so I could use my own templates.
% dist_cooker Module::Name
If you're really hard core, you might want Dist::Zilla, but that's more for people who already know what they are doing.
Might I also suggest module-starter? It'll automatically create a skeleton project which "Just Works". I learned what little I know about Perl module organization by reading the generated skeleton files. It's all well-documented, and quite easy to use as a base for growing a larger project in. You can check out the getting-started docs to see what it gives you.
Running module-starter will give you a Perl distribution, consisting of a number of modules (use the command line option --module, such as:
module-starter --distro=Project --module=Project::Module::A,Project::Module::B [...]
to create multiple modules in a single distribution). It's then up to you whether you'd prefer to organize your project as a single distribution consisting of a number of modules working together, or as a number of distributions which can be released separately but which depend on each other (as configured in your Build.PL or Makefile.PL file) to provide a complete system.
Try this structure:
bin/Main.pl
lib/Utils/Util1.pm
lib/Utils/Util2.pm
Makefile.PL
MANIFEST
README
t/Utils1.t
t/Utils2.t
As ysth said, make does not install your modules, it just builds them in a blib directory. (In your case it just copies them there, but if you had XS code, it would be compiled with a C compiler.) Use make install to install your modules for regular scripts to use.
If you want to run your script between make and make install, you can do:
perl -Mblib bin/Main.pl
The -Mblib instructs perl to temporarily add the appropriate directories to the search path, so you can try out an uninstalled module. (make test does that automatically.)
By default, tests are looked for in a top-level t directory (or a test.pl file, but that has some limitations, so should be avoided).
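If you would rather keep the tests where they are (under lib/Utils/t/) instead of moving them, ExtUtils::MakeMaker lets you point the test target at other locations; a minimal sketch, with glob patterns matching the layout from the question:

WriteMakefile(
    # ... the other arguments as before ...
    test => { TESTS => 't/*.t lib/Utils/t/*.t' },
);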
You say "After I make and run"...make puts things into a blib directory structure ready to be installed, but doesn't do anything special to make running a script access them. (make test is special; it does add appropriate paths from blib to perl's #INC to be able to run the tests.) You will need to do a "make install" to install the modules where your script will find them (or use a tool like PAR to package them together with your script).

What is a quick way to run a single script to automatically install missing modules using only Perl core?

I inherited a project which is supposed to be able to be deployed to other servers. This project has a number of simple module dependencies which however might not be present on all target machines.
As such I'd like to be able to run a single command line script that checks which Perl modules are installed and tries to automatically install missing ones via CPAN.
Since this should be very basic (i.e. needing to install stuff to run the module installer would defeat the point), said script should only use Perl 5.8.8 core modules.
Does something like that exist already, or would I need to write it myself?
Creating a Bundle package is one possible answer.
You can then look at something like CPAN::Shell (see CPAN module) to automate the process.
/I3az/
Update re: brian's comment about Task:: - Here are some pertinent links:
Writing a CPAN Task (using Module::Install)
"Task:: or Bundle::"? (Perlmonks)
Use Module::Install; it will be bundled with your module/program. You can use the "auto_install" command to automatically install dependencies.
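With Module::Install, the Makefile.PL would look roughly like this (the distribution name and prerequisites are placeholders):

use inc::Module::Install;

name     'My-Project';
all_from 'lib/My/Project.pm';

requires 'XML::Simple' => '2.18';

auto_install;    # offers to install missing prerequisites via CPAN
WriteAll;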