How to bundle modules for an offline server with cpanm - perl

I would like to do cpanm SomeModule to install SomeModule together with about 10 dependencies, but the target server has no internet access. I do have a very similar development machine (same Perl environment, same Perl version) where cpanm is able to download its source modules.
After studying the man page of cpanm, I have the feeling that I can create a tarball on the development machine, transfer that to the server, and use it to install the modules in one go.
Unfortunately, I cannot figure out the exact combination of options. In particular, since the modules are already installed on the dev machine, I need to force cpanm to still add all the dependencies to the tarball (excluding core modules, of course).
Can someone give the commands for the dev machine and the target machine?
EDIT: this is specifically about cpanm. Of course, if you can say with authority that it is definitely not possible with cpanm, that would be a valid answer as well...
EDIT: The comments and answers so far suggest using pinto or minicpan to create a bundle of CPAN module sources. This works well (pinto in particular is quite trivial to use for this), and I used pinto to solve my current problem. Still, Pinto itself has a lot of prerequisite modules (more than 100 beyond the Perl core). My hope with this question was that cpanm, which is a standalone script requiring no installation, could do it by itself (it has extensive options that sound like they could go in that direction). That would be nice for bootstrapping Perl installations without large overheads.

You can download the tarballs from CPAN or MetaCPAN for all your dependencies manually, then copy them over and install them one by one in the right order. That's a bit of work for ten modules, but it's not too bad, and you can write a script to do it.
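A minimal sketch of such a script (the distribution filenames are placeholders for your actual downloads; you have to list them so that each distribution's prerequisites come before it):

```shell
#!/bin/sh
# Install pre-downloaded CPAN distribution tarballs in dependency order.
# The filenames below are hypothetical examples, not real downloads.
for dist in \
    Some-Prereq-2.00.tar.gz \
    SomeModule-1.00.tar.gz
do
    cpanm --notest "./$dist"
done
```

Working out the right order by hand is the tedious part, which is why the mirror-based approaches below are usually nicer.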
But you can also use minicpan to create a small local CPAN that contains only what you need. Such mirrors are great for keeping a local copy of some or all of CPAN, e.g. on a USB drive for when you need to install a module while hacking code on a flight. A mirror is essentially a directory full of more directories and tarballs. You can pick just the things you need, zip it up, move it to your production server, unpack it there, and tell cpanm to install from that local CPAN mirror.
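Sketched as commands (paths and the module name are examples; minicpan also reads defaults from ~/.minicpanrc, and by default it mirrors the latest version of every distribution, so see CPAN::Mini's filtering options if you want to trim it down):

```shell
# On the dev machine: build a local CPAN mirror
minicpan -l "$HOME/minicpan" -r http://www.cpan.org/

# Copy ~/minicpan to the offline server (USB stick, scp, ...), then there:
mirror="file://$HOME/minicpan"
cpanm --mirror "$mirror" --mirror-only SomeModule
```

The --mirror-only flag matters: without it, cpanm will still try to fall back to the network.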

You can use Carton to bundle the dependencies locally (on your machine with internet access) and then use either Carton itself to install the bundled distributions, or use cpanm itself and specify the bundle location.
You'll need carton 1.0.32 (to generate the package index) and cpanm 1.7016 (for the --from option) for this to work.
In the root of your distribution, you can do
$ carton install # will install the dependencies in `local`
$ carton bundle # will cache the dependencies in `vendor`
$ tree vendor/
vendor/
└── cache
    ├── authors
    │   └── id
    │       └── F
    │           └── FO
    │               └── FOOBAR
    │                   ├── Some-Dist-1.337.tar.gz
    │                   └── Another-Dist-0.001001.tar.gz
    └── modules
        └── 02packages.details.txt.gz
Later, after transferring this to your air-gapped machine, you can either use carton:
$ carton install --cached
$ carton exec scripts/your-script.pl
or install with cpanm directly
# To emulate carton:
$ cpanm -L local --from "$PWD/vendor/cache" --installdeps --notest --quiet .
# Or to install globally:
$ cpanm --from "$PWD/vendor/cache" .

Related

Detecting python modules from different directories in my project structure

I have a project structure in VSCode like this:
Project/
    .venv/          (virtual environment containing pip packages like numpy)
    config/
        __init__.py
        (useful scripts)
    src/
        program.py
I want program.py to be able to import packages from the virtual environment as well as the config package I made.
The issue is that it's not detecting the "config" module. I'm afraid that if I do something like append the system path, it will just make the virtual environment modules undetected instead. Do I have to change my project structure or is there an easier way?

perlbrew custom folder for libs

When I use
perlbrew lib create testlibs
to have a folder to store test libraries from CPAN, it creates a folder under the following path:
$HOME/.perlbrew/libs/perl-5.32.0#testlibs/lib/perl5
How can I tell perlbrew to store the libs folder (perl-5.32.0#testlibs) under a different directory, such as /opt/perl5/libs?
I have very little disk space on the home mountpoint of my server, so I need the libs folders to be created on a different mountpoint. I already have PERLBREW_ROOT set to a different mountpoint, but that only makes perlbrew install new perl installation folders under it. It doesn't do the same for the libs folders, which was surprising.
I have already read the documentation and googled the hell out of this, but I can't find an env variable to tell perlbrew where to install the libs folders.
I know I can manipulate local::lib env variables and even use lib in my scripts to point to some other folder, but it would be nice to have this all handled within perlbrew.
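Not an authoritative answer, but one thing worth testing (this is my assumption, not from the thread): perlbrew distinguishes PERLBREW_ROOT (where perls are installed) from PERLBREW_HOME (per-user data, defaulting to $HOME/.perlbrew, which is where the libs folders shown above live). If that holds, pointing PERLBREW_HOME elsewhere may be enough:

```shell
# Assumption to verify: lib folders are created under $PERLBREW_HOME,
# not $PERLBREW_ROOT, so relocating it moves them off the home mountpoint
export PERLBREW_HOME=/opt/perl5/perlbrew-home
perlbrew lib create testlibs
```

You would need to export PERLBREW_HOME in your shell profile as well, so later perlbrew invocations keep finding the libs.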

Bazel build rules for BLAS & LAPACK

I have not found any project/repo providing Bazel build rules for BLAS or LAPACK.
This is quite unfortunate, as these are often the primary libraries for projects oriented towards numerical computation.
Does such a thing already exist somewhere?
No BUILD files yet...
However, a quick fix if you want to use already-installed BLAS or LAPACK libs is to add a line like this one (adjusted to the libs installed on your computer):
build --linkopt="-llapacke -llapack -lblas"
in your bazel.rc file (in the tools/ directory):
YourBazelProject/.
├── ...
├── WORKSPACE
└── tools
    └── bazel.rc
I am not aware of BUILD files for these libraries either.
If you do create them, sharing them publicly or trying to submit them upstream (in the library code) will be a great way for others to benefit from your efforts.
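For reference, a minimal sketch of what such a wrapper target might look like as a BUILD file (the package path, target name, and the assumption that the libraries are installed system-wide are mine, not from any published ruleset):

```
# third_party/lapack/BUILD -- hypothetical wrapper around system-installed libs
cc_library(
    name = "lapack",
    linkopts = ["-llapacke", "-llapack", "-lblas"],
    visibility = ["//visibility:public"],
)
```

Targets could then depend on //third_party/lapack instead of every user adding the linkopts globally in bazel.rc.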

how do I import a CPAN module using XS into my project?

My company has policies against using Perl modules that are not in Debian/Ubuntu's repositories. To "import" a non-XS module into my project/repo, it's usually just a matter of copying the .pm files into the appropriate directory in lib/. Then I can use it as if I had installed it from CPAN.
But what do I do for an XS module? How do I "pre-compile" it, and where should I copy the .so and other XS-related files? If you look, for example, at Ubuntu's DBD-SQLite package contents here, it seems like it should definitely be possible.
You should look into dh-make-perl for making .deb packages of your perl modules. That way you can install them like a regular shipped module.
You should also read Building Debian packages of Perl modules
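A sketch of the dh-make-perl workflow, using DBD::SQLite from the question as the example (the generated package name and version in the second command are assumptions; check what the build actually produces):

```shell
# Fetch DBD::SQLite from CPAN, debianize it, and build a .deb in one step
dh-make-perl --build --cpan DBD::SQLite

# Install the generated package; the compiled .so ends up under the
# vendor arch directory, just like Ubuntu's own packaged XS modules
sudo dpkg -i libdbd-sqlite-perl_*.deb
```

This keeps you within the "only Debian packages" policy, since the module is now a regular .deb that dpkg tracks.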

Use purescript-halogen (with pulp)

Following PureScript by Example, I'm using pulp for installing packages.
Halogen requires virtual-dom as an extra dependency. From the documentation and the example packages, it seems to me that adding it involves a bunch of build tools that I haven't used before (gulp, webpack, bower, etc.). I downloaded the examples and tried to run them with npm install && npm run example, but I got unknown module errors.
So, I'd like to know a minimal viable way to install halogen into a new pulp project (which hopefully doesn't require me to delve into the slew of build tools, or at least not for small projects).
I think you should be able to build it with pulp browserify --to some-file.js - the Browserify option is there for situations like this, where you want to produce a single JS file from a collection of CommonJS modules that may include npm dependencies.
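Sketched as commands, assuming a bower-managed pulp project (which was the standard setup for pulp at the time; the output filename is arbitrary):

```shell
# In a fresh pulp project (pulp init creates bower.json for you)
bower install --save purescript-halogen
npm install --save virtual-dom      # the npm dependency halogen needs

pulp build
out="app.js"
pulp browserify --to "$out"         # one self-contained JS bundle
```

No gulp or webpack needed: browserify (which pulp drives for you) resolves the virtual-dom require calls and inlines them into the single output file.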