Perl automatic module loading like in PHP

I am creating a large application in Perl, and I am wondering whether something like class autoloading, similar to PHP's, can be done in Perl.
I looked at catching exceptions, redefining $SIG{__DIE__}, and so on, but that doesn't seem to be a solution: execution stops at the first "Can't locate object method \"new\" via package" error, so to load each missing module I would have to rerun the whole program.
One solution might be to scan all the files in my lib directory and add them to @INC at runtime, but I don't know whether that is a good solution; probably not.
Does anybody have a suggestion?

Well, you probably want to read up on the following:
AUTOLOAD and AutoLoader
autouse
Class::Autouse
later
etc.
None of those is quite the same as PHP's approach, though a global AUTOLOAD comes closest; see the sketch below.
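For illustration, here is a minimal, untested sketch of faking PHP-style autoloading with a global AUTOLOAD (MyApp::Foo is a hypothetical class; treat this as a starting point, not a recommendation):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Any method call that cannot be resolved lands here; we map the
# package name to a file, require it, and re-dispatch the call.
sub UNIVERSAL::AUTOLOAD {
    my ($package, $method) = $UNIVERSAL::AUTOLOAD =~ /^(.*)::(\w+)$/;
    return if $method eq 'DESTROY';   # never try to autoload destructors

    (my $file = $package) =~ s{::}{/}g;
    require "$file.pm";               # dies if no such file is on @INC

    my $code = $package->can($method)
        or die "Undefined method $method in $package";
    goto &$code;                      # re-dispatch the original call
}

# MyApp::Foo->new now works even though MyApp/Foo.pm was never use'd,
# as long as the file exists somewhere on @INC.
my $obj = MyApp::Foo->new;
```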
However, it really is best practice to list all "normal" dependencies. This makes it easier to build installers / deploy to CPAN etc. There are a bunch of other modules that deal with loading plugins, where you really don't know what to load until runtime.
Is there some difficulty with figuring out your dependencies, or do you just want to avoid a bunch of "use" statements at the top of each file?

Related

Managing shared code amongst PowerShell modules

I've been diving into some of the more advanced features of PowerShell modules and manifests recently, with a view to handling scenarios more advanced than a basic export of a few functions. It sounds like it should be obvious, but I'm struggling to find a nice solution for sharing common 'helper'-type functions across several large, non-trivial modules. In particular, I'm looking for a solution that:
Allows sharing of 'helper'-type functions without them necessarily being exported by any module
Allows installation via PsGet from a local repo path
Let me go into some of the challenges I see.
First of all, as far as I can tell, PsGet does not handle module dependencies well, which implies that sharing between modules is going to be a struggle. Maybe a solution is to avoid PsGet and use a custom script to 'install' modules to the local module path, one that is more tolerant of dependencies and load order.
My point about not using module exports to share helper functions also seems to be an issue. The reason is that I want aliases, helpers, etc. for common internal actions (needed inside the useful functions) that are either useless or unsafe to expose: for example, a nice brief alias for getting the local script path (commonly used, and noisier than it should be), or a simple wrapper I recently wrote around PromptForChoice with fewer options. Maybe this whole thing isn't a real issue, but I can't help feeling that shipping a 'utils' module which exports low-level functions that are useful inside real modules, but not to an end user, is the wrong way to go.
What I've been playing with is a small build structure that tests and then packs modules, and I want to make some code sharing possible in it. I've been looking for an alternative using ScriptsToProcess in the manifest, but those paths seem to be absolute, not relative.
Imagine a folder structure:
modules
    utils
        console_helpers.ps1
    moduleA
        moduleA.psm1
        moduleA.psd1
    moduleB
        moduleB.psm1
        moduleB.psd1
packed_modules
    moduleA.zip
    moduleB.zip
What I was considering is that you could list relative paths in each ScriptsToProcess, and my pack phase would then go and drag those relative paths into each zip.
Is this a horrible, crazy idea? Am I right that PowerShell modules and PsGet really don't have decent dependency support? I would love to hear feedback from anyone who has looked into this space. The answers I'm hoping for, in rough priority order, might be:
Here's an example of sharing code without exposing it (probably a build/pack level solution)
Here's how to make module dependencies work nicely, using PsGet
Here's how to make module dependencies work nicely, but you can't use PsGet
Just expose everything from modules
This is a terrible idea and you're terrible
Thanks!
UPDATE as suggested by CalebB
Here's another example to illustrate what I'm trying to resolve. I find it useful to wrap up '&'-style execution of commands with a wrapper function, to deal with things like checking exit codes. If I'm building half a dozen modules, many of them will want to make use of that helper (obviously).
My options today seem to be to put it in a module and export it, but maybe I don't want it exported; I want more of a dot-source style of access. And if I've got a family of modules all trying to use this stuff, the options for module dependency management are limited (the PsGet limitation, etc.).
If I'm 'building' all the modules at once (with some decent psake and pester infrastructure), maybe I can use a hack at this point to embed scripts into my zipped modules to 'solve' all these problems?
Allows sharing of 'helper'-type functions without them necessarily being exported by any module
Mhm... what is wrong with dot-sourcing the scripts you need within a particular module? You could do any of the following (see the sketch after this list):
Keep your folder structure and symlink the desired functions into the module folder.
Try using an absolute path in ScriptsToProcess that has a 'relative part' in it, for example $PSScriptRoot\..\utils (not tried in that context, but it generally works). If that fails, you can always add a preprocessor that fixes the paths for you.
Delete undesired imported elements manually via the function:, alias:, and variable: providers.
Import extra utilities only when you use them, then remove them at the end. If the desire is that the user can't see them, you can encrypt them.
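As a minimal sketch of the dot-sourcing option, assuming the folder structure from the question (the helper function name is hypothetical):

```powershell
# moduleA.psm1 - dot-source a shared helper into the module's scope.
. (Join-Path $PSScriptRoot '..\utils\console_helpers.ps1')

function Get-Something {
    # Uses a helper defined in console_helpers.ps1; since the helper
    # is dot-sourced here and not exported below, callers never see it.
    Write-HelperMessage 'hello'
}

Export-ModuleMember -Function Get-Something
```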
Here's how to make module dependencies work nicely, but you can't use PsGet
Chocolatey uses NuGet, so it handles dependencies and can load from a local store. As a bonus, OneGet supports it, which is something everybody will use eventually.
I've posted the solution I've come up with on GitHub. I've rolled in a few other features I want when building modules, but the key part of the solution for this question is reading and updating the psd1 of each module.
You include the scripts that you want to embed in the NestedModules property of your manifest. My build phase finds each script and copies it into the module folder for packing and zipping. The manifest that ships in the package has the script paths converted to the now-local file names.
I'm still not sure if this is ideal, but it seems to be a nice compromise for dealing with the issues here.
A key issue I encountered along the way is that the ScriptsToProcess list is executed literally at module import time, so it is only useful for bootstrapping the import of your functionality. The NestedModules property is actually the list of additional scripts you want to be dot-sourced and available when your module is used.
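To make that concrete, here is a rough sketch of what the shipped manifest might look like (all names hypothetical):

```powershell
# moduleA.psd1 - sketch of the manifest after the pack phase has
# copied the helper script into the module folder.
@{
    RootModule        = 'moduleA.psm1'
    ModuleVersion     = '1.0.0'

    # Scripts listed here are dot-sourced into the module's session
    # state at import time, so their functions are usable inside the
    # module without being exported to the caller.
    NestedModules     = @('console_helpers.ps1')

    # Only the real public surface is exported.
    FunctionsToExport = @('Get-Something')
}
```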

Is there a way to load Python scripts from a zip file in IronPython?

I want to have extensions to my application written in IronPython. Part of those extensions will use decorators, and so I wish to include the decorator module in the package.
The issue is that the decorator module depends on several modules in the IronPython distribution, and those modules depend on other modules, and so on.
The easiest solution would be to include the entire Lib folder in the application, but that would increase the footprint by 500 files and 12 MB.
To avoid that I'm trying to zip the modules and load them from the zip file instead of directly from the filesystem, but I haven't found a straightforward way to do so.
I've spotted the importer mechanism for loading modules via a "path_hooks" global, which seems to give me access to something similar to Python's imp mechanism, but I'm not sure how to use it.
Is there a hook for the import mechanism in IronPython that I'm missing?
How should I go about implementing this?
What you want is zipimport support, which isn't implemented yet. If you'd like to help out with that I can put you in touch with the guy who's working on it.
Otherwise, it looks like you might just need to stub out the bits of inspect.py that decorator.py needs.
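For comparison, this is the behaviour being asked for: on CPython, zipimport is built in, so putting an archive on sys.path is all it takes (the archive name below is hypothetical):

```python
import sys

# CPython's built-in zipimport machinery satisfies imports from any
# zip archive that appears on sys.path.
sys.path.append('Lib.zip')   # hypothetical archive of the needed modules

import decorator             # resolved from Lib.zip if it holds decorator.py
```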

Perl CPAN-style Packaging with no lib/*.pm

I've got a collection of Perl scripts and a couple XML data files that they depend on which I'd like to distribute. Currently, I've got a shell script which copies bin/* and share/* to a target installation tree. It seems a little clunky, so I'd like to go with something like the standard CPAN way of packaging Perl.
Does it make sense to bundle what I've got in a CPAN-style package? I suspect there is nothing wrong with it, but every tutorial I've looked at treats lib/Blah.pm as an essential file in any package, and I don't even have a lib/ directory, let alone any .pm files.
Is there a standard solution for packaging a collection of Perl scripts, along with some data in a share/ directory?
Distributions don't care about modules. Most of the tools are set up to handle modules by default because that's the common case, but you can really distribute anything, as long as you provide the logic to tell the build files what to do with whatever files you provide.
ExtUtils::MakeMaker is difficult to use for this sort of thing, but Module::Build (despite the word "Module") makes it much easier. However, you have to know a bit about custom Module::Build subclasses so you can override the default behavior that you don't want.
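As a rough sketch of what such a Build.PL could look like (the distribution name and details are hypothetical, and the share_dir option needs a reasonably recent Module::Build plus File::ShareDir):

```perl
# Build.PL - sketch for a distribution shipping only scripts and data.
use strict;
use warnings;
use Module::Build;

Module::Build->new(
    dist_name    => 'My-Scripts',
    dist_version => '0.01',
    dist_author  => 'A. Author <author@example.com>',
    license      => 'perl',

    # Install everything under bin/ as executable scripts.
    script_files => [ glob 'bin/*' ],

    # Install share/ for File::ShareDir; the scripts can then locate
    # the XML data via File::ShareDir::dist_dir('My-Scripts').
    share_dir    => { dist => 'share' },
)->create_build_script;
```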
If you are talking about standalone scripts, you can look at my scriptdist distribution, or the Dr. Dobb's article I wrote about it. It won't handle the share/ portion for you, but that's not too hard to add.

What is the modern way of creating an XS module from scratch?

I need to write an XS module for Perl. It is my understanding that h2xs is pretty much deprecated today; what is the preferred method for starting an XS module now? I looked at Module::Starter, but it only handles pure-Perl modules.
No, h2xs is not deprecated. Module::Starter is certainly more convenient if you create many pure-Perl modules, but there's no reason to avoid h2xs. I would recommend reading all the way through its documentation before using it, though, so that you know everything you might want it to do or not do.
Personally, I just use Module::Starter and add the .xs file myself. It depends on what your aim is: if you're making a one-to-one mapping to a C API, then h2xs can do a lot of boilerplate for you, but if you're making a completely new interface, or you're only doing things with perl itself (and not some external library), it doesn't add much but trouble, IMHO.
Personally, whenever I start a new module I just cp the files from another module of mine that's similar, and edit as appropriate. Of course, nothing in that approach says it has to be one of mine; there's plenty of code on CPAN you can take copies of and be inspired by...
You should also look at using Inline::C; a minimal sketch follows.
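Here is what the Inline::C route looks like in practice, as a small self-contained sketch:

```perl
# The C code below is compiled on first run, cached, and bound as an
# ordinary Perl sub by Inline::C.
use strict;
use warnings;
use Inline C => <<'END_C';
int add_ints(int a, int b) {
    return a + b;
}
END_C

print add_ints(2, 3), "\n";   # prints 5
```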

Should I use Module::Install or Module::Build?

I'm writing a programmer's text editor (yes, another one) in Perl called Kephra, which is of course also a CPAN module, and it is bundled with Module::Install. Recently I saw that Module::Build has gone into the core, so if I switched I could reduce dependencies. Is there any other reason to switch?
We use Module::Build in our group.
The main reason is Easy Extensibility.
Module::Build allows you to do more with your build process in pure Perl, through subclassing (see the sketch below). If you want to do more using Module::Install, you have to know how Makefiles work, AFAIK. Since you presumably already know Perl, this can be an advantage.
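A minimal sketch of that subclassing mechanism, with hypothetical class and action names:

```perl
# Build.PL - extend the build process in pure Perl by subclassing.
use strict;
use warnings;
use Module::Build;

my $class = Module::Build->subclass(
    class => 'My::Builder',
    code  => <<'END_CODE',
sub ACTION_datafiles {
    my $self = shift;
    # A custom "./Build datafiles" action; real work would go here.
    print "Regenerating bundled data files...\n";
}
END_CODE
);

$class->new(
    module_name => 'My::Module',
    license     => 'perl',
)->create_build_script;
```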
As you said, using Module::Build removes the dependency on an external make program, which can be viewed as a good thing.
However, the main cons that I can think of are:
Although Module::Build has hit core, not everyone will be using an up-to-date version of Perl. For users with older versions of the core, you will be creating a new dependency.
Lots of veterans (not necessarily Perl people) are used to the perl Makefile.PL; make; make install paradigm, and can be thrown off by having a Build.PL instead. Hopefully this isn't a big deal.
Module::Build has occasionally broken our builds when its functionality has changed because the documentation didn't cover an edge case which we were using. The edge case was then changed and documented, but we had to re-code our subclass to get our build to work again (this happened for us at the recent upgrade from 0.2808 to 0.3).
All that said, though, I still recommend Module::Build simply for the extensibility. If that's not an issue for you, you may be better off sticking with Module::Install.
The cud has already been chewed a bit on this before, in "Which framework should I use to write modules?"
After spitting out the cud, I decided to go with Module::Build, but clearly different answers are possible! (Though I've been happy with M::B so far.)
Well, Module::Build is a pretty good module. It's supposed to be a drop-in replacement for ExtUtils::MakeMaker: you replace the Makefile.PL with a Build.PL, which generates a Build script instead of a Makefile. It was also meant as "simple things should stay simple, hard things should be possible".
Module::Install takes a different approach and generates a Makefile.
Also, don't forget that not everyone runs the latest version of everything :-)
I don't remember any comparison of those modules, but I think you could find a few things on Module::Build's and Module::Install's respective CPAN Ratings pages.