How can someone distribute native (non-"compiled", non-perl2exe) Perl scripts without forcing users to be aware of the custom (non-CPAN) modules that the scripts need in order to run?
The problem is that users will inevitably copy a script somewhere else on the system, taking it out of its native environment, and then it can no longer find the modules it needs to run.
I've sometimes settled for just copying the module into the actual script, but I'd prefer a cleaner solution.
Update: I'd better clarify. I distribute a bunch of scripts that happen to use similar modules in the backend. The users understand how to run Perl scripts, but rather than relying on telling them "no, don't move the script", I'd prefer to simply allow them to move the files. The path of least resistance.
The right way is to tell them "Don't do that!" I would hope that they wouldn't expect to move an exe file and have the program continue to work. This is no different.
That said, there are a couple of alternatives. One is replacing the script with a wrapper (e.g. pl2bat) that knows the full path to the real script. Another is to use PAR, but that would require PAR and/or parl (from PAR::Packer) to be installed.
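On Unix-like systems, the wrapper can be a tiny Perl script. A minimal sketch, assuming an install prefix of /opt/myapp (both the path and script name are placeholders):

#!/usr/bin/perl
# Relocatable wrapper: users may copy this file anywhere, but it always
# executes the real script from its known install location.
exec '/opt/myapp/bin/real-script.pl', @ARGV
    or die "Cannot exec /opt/myapp/bin/real-script.pl: $!";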
If a script that you prepared for a client needs "custom" modules, simply pack your modules as if you were going to upload them to CPAN. Then give the package to the client, and they can use the cpan utility to install the script and the modules.
Distribute an installer along with the script. The installer will need to be run with root privileges and it will put the custom modules into the standard system location (/usr/local/lib/perl5/site_perl or whatever).
I've not tried this, but Module::Install looks helpful in this regard. It's described as a:
Standalone, extensible Perl module installer
As a variant of the "put your modules all in one place and make your applications aware of it" approach, one that will even work across multiple computers and networks, you might check out PAR::Repository and its client counterpart, PAR::Repository::Client. You'd provide a single binary client executable that connects to the repository (via the file system or https) and executes whichever of the repository's arbitrarily many programs (using an arbitrary set of modules) the user asks for.
If there are many users, this also has a maintenance benefit: simply update the software in the repository, and users will start running the updated code the next time they start the programs.
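To give a flavor of the client side, PAR itself can pull modules from a repository at runtime. A minimal sketch (the URL and module name are placeholders):

# Fetch modules from a PAR repository when they aren't installed locally
use PAR { repository => 'http://example.com/par-repo/' };
use My::Custom::Module;   # loaded from the repository on demand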
It would be really nice if you could just use a NeXTSTEP style application bundle. Since you probably aren't developing for a platform that uses bundles, you'll have to make do.
Put all support files in a known location, and point your executable at those files for access to settings and libraries. The easiest way to do this is with a simple installer.
For example, with an app called foo, put all your required files in /opt/xlyd_apps/foo: libraries in /opt/xlyd_apps/foo/lib, configuration in /opt/xlyd_apps/foo/etc, and so on. Put the executable in /opt/xlyd_apps/foo/bin.
The important thing is to make sure the executable knows to look in /opt/xlyd_apps/foo for all its goodies, so that if the customer/client moves the foo script to a new location, the install still works.
So, while you can't make the whole thing self contained and relocatable, you have made the actual calling script relocatable.
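Inside the calling script, that can be as simple as hard-coding the library path. A minimal sketch following the /opt/xlyd_apps/foo example (the helper module is hypothetical):

#!/usr/bin/perl
# The install prefix is fixed, so copies of this script keep working
use lib '/opt/xlyd_apps/foo/lib';
use Foo::Helpers;   # hypothetical module installed under .../foo/lib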
I've actually come up with my own solution, and I'm kind of curious what kind of reception it will have.
I've written a script that reads a Perl script and looks for use/require statements. Upon finding one, it checks whether the module is part of the standard library (by looking for /perl5/\d+.\d+[\d.]+/ in the module's path) and, if it isn't, rewrites the use/require line in one of two ways depending on usage.
If require is found:
{
... inline the entire module here...
}
If use is found:
BEGIN {
... inline the entire module here...
}
If use has imports, immediately follow above with:
BEGIN { import Module ...imports seen... }
I understand this doesn't work with modules that use XS, but I was fine with that; mostly I need to support only pure-Perl modules.
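For illustration, a minimal sketch of the scanning step; the core-library heuristic and the output are simplifications, not the real tool:

#!/usr/bin/perl
# Scan a script for use/require lines and report non-core modules to inline
use strict;
use warnings;

while (my $line = <>) {
    next unless $line =~ /^\s*(use|require)\s+([A-Za-z_][\w:]*)/;
    my ($kind, $module) = ($1, $2);
    (my $rel = "$module.pm") =~ s{::}{/}g;
    my ($file) = grep { -f $_ } map { "$_/$rel" } @INC;
    next unless defined $file;                    # not installed; skip
    next if $file =~ m{/perl5/\d+\.\d+[\d.]+/};   # standard library; skip
    print "would inline $module ($kind) from $file\n";
}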
I have a Perl application which is used in two contexts: it can be used as a diagnostic tool that displays information about a system, or as a testing tool that sends Modbus commands to that system. The problem is that allowing the user to send commands to the system in a diagnostic context is a potential safety risk, so I want to create two executables: a testing version that includes the Modbus module, and a diagnostic one that does not.
My current solution is to include the Modbus module like this:
BEGIN { eval { require Modbus; }; Modbus->import; }
This only includes the Modbus module if I pass the option -M Modbus while building the .exe with PAR Packager (pp). The issue with this approach is that it fails unless this is the only place where Modbus is imported. So if another developer who isn't aware of this risk comes along, it only takes one require statement to bypass this fix.
Is there a way for me to prevent a specific module from being included in the executable unless I explicitly want it to be (either with the -M option or some other method)? I've been trying to figure something out with Devel::Hide, but haven't had much luck. All the solutions I've found so far fail the "other developer who doesn't know about this" test.
I'm using Strawberry Perl 5.20.3.3, but I can upgrade if necessary.
When I've done this sort of thing, I've created a small module that is included if it is present and not included if it is not:
eval { require MaybeItsThere }
In my Makefile (or whatever build system you use), I have targets for development and production builds. One of the subtasks for the dev build creates that MaybeItsThere file. It can also set whatever it needs in PERL5LIB and so on, so that only the dev build can load it.
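A minimal sketch of the pattern (the module and flag names are illustrative):

package MyApp::Features;
use strict;
use warnings;

# Dev builds generate a MaybeItsThere.pm somewhere on PERL5LIB;
# production builds don't, so the flag stays false for them.
our $HAS_MODBUS = eval { require MaybeItsThere; 1 } ? 1 : 0;

sub modbus_enabled { return $HAS_MODBUS }

1;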
However, as you say, the enterprising developer can quickly find out what they need to do to get the features they want.
Having read through your comments, it seems that you're not going to be able to enforce correct usage. (You gave examples of how the module has already been misused by people sidestepping your build process.) I would suggest an alternative approach.
Document the problem at the place where someone would look for a workaround.
In other words, if some other developer is going to look at the code or at the build to find the name of the missing module, let them see a warning about the dangers right there. Put a comment block explaining why the Modbus module should only be used when diagnostics are disabled or filtered. Make the build's failure put a warning in the same place as the name of the missing module; use a die "Modbus should not be used when user interaction is enabled."; or something similar to convey the message exactly where the other developer will be looking for a solution.
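A sketch of that last idea, assuming a build flag of your own choosing (the environment variable here is a placeholder):

# The single place Modbus is pulled in; fail loudly in the wrong build
BEGIN {
    die "Modbus should not be used when user interaction is enabled.\n"
        unless $ENV{MYAPP_TESTING_BUILD};
    require Modbus;
    Modbus->import;
}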
Ultimately, you can't force someone to use your build tools, but you can try to educate them while they're trying to work around you, instead of documenting the problem somewhere else that they might not see.
I've been diving into some of the more advanced features of PowerShell modules and manifests recently, with a view to handling scenarios more advanced than just a basic export of a few functions. It sounds like it should be obvious, but I'm struggling to find a nice solution for sharing common 'helper'-type functions across several large, non-trivial modules. In particular, I'm looking for a solution that:
Allows sharing of 'helper' type functions without necessarily being exported by anyone
Allows installation via PsGet from a local repo path
Let me go into some of the challenges I see.
First of all, as far as I can tell, PsGet does not handle module dependencies well, which implies that sharing between modules is going to be a struggle. Maybe a solution to this is to avoid PsGet, and use a custom script to 'install' modules to the local module path, which might be more tolerant of dependencies and load order.
My point about not using module exports to share helper functions also seems to be an issue. The reason is that I want aliases, helpers, etc. for common internal actions (needed inside the useful functions) that are either useless or unsafe to expose: for example, a nice brief alias for getting the local script path (commonly used, and noisier than it should be), or the simple wrapper I recently wrote around PromptForChoice with fewer options. Maybe this whole thing isn't a real issue, but I can't help feeling that shipping a 'utils' module that exports low-level functions, useful inside real modules but not to an end user, is the wrong way to go.
What I've been playing with is a small build structure that tests and then packs modules, and I want to make some code sharing possible. I've been looking at ScriptsToProcess in the manifest as an alternative, but those paths seem to be absolute, not relative.
Imagine a folder structure:
modules
    utils
        console_helpers.ps1
    moduleA
        moduleA.psm1
        moduleA.psd1
    moduleB
        moduleB.psm1
        moduleB.psd1
packed_modules
    moduleA.zip
    moduleB.zip
What I was considering was that you could list relative paths in each ScriptsToProcess, and then my pack phase would go and drag those relative paths into each zip.
Is this a horrible crazy idea? Am I right that ps modules and PsGet really don't have decent dependency support? I would love to hear feedback from anyone who has looked into this space. I think the answer I'm hoping to get in rough priority might be:
Here's an example of sharing code without exposing it (probably a build/pack level solution)
Here's how to make module dependencies work nicely, using PsGet
Here's how to make module dependencies work nicely, but you can't use PsGet
Just expose everything from modules
This is a terrible idea and you're terrible
Thanks!
UPDATE as suggested by CalebB
Here's another example to illustrate what I'm trying to resolve. I find it useful to wrap up '&'-style execution of commands with a wrapper function, to deal with stuff like checking exit codes. If I'm building half a dozen modules, many of them will want to make use of that helper (obviously).
My options today seem to be to put it in a module and export it, but maybe I don't want it exported; I want more of a dot-source style of access. And if I've got a family of modules all trying to use this stuff, the options for module dependency management are limited (the PsGet limitation, etc.).
If I'm 'building' all the modules at once (with some decent psake and pester infrastructure), maybe I can use a hack at this point to embed scripts into my zipped modules to 'solve' all these problems?
Allows sharing of 'helper' type functions without necessarily being exported by anyone
Mhm... what is wrong with dot-sourcing the scripts you need within a particular module? You could (a sketch follows this list):
Keep your folder structure and symlink the desired functions into the module folder.
Try using an absolute path in ScriptsToProcess that has a "relative part" in it, for example $PSScriptRoot\..\utils (not tried in that context, but this generally works). If not, you can always add a preprocessor to fix the paths for you.
Delete undesired imported elements manually via the function:, alias:, and variable: providers.
Import extra utilities only when you use them, then remove them at the end? If the desire is that the user can't see them, you can encrypt them.
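For the dot-sourcing option, a minimal sketch of a .psm1 using the question's folder layout (the exported function name is hypothetical):

# moduleA.psm1: pull in shared helpers at import time without exporting them
. "$PSScriptRoot\..\utils\console_helpers.ps1"

# Only the module's public surface is exported; the helpers stay private
Export-ModuleMember -Function Get-ModuleAThing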
Here's how to make module dependencies work nicely, but you can't use PsGet
Chocolatey uses NuGet, so it handles dependencies and can load from a local store. As a bonus, OneGet supports it, which is something everybody will use eventually.
I've posted the solution I've come up with on github. I've rolled in a few other features I want when building modules, but the key solution for this question here uses reading and updating the psd1 of each module.
You include the scripts that you want to embed in the NestedModules property of your manifest. My build phase finds each script and copies it into the module folder for packing and zipping. The manifest that ships in the package has the script paths converted to the now-local file names.
I'm still not sure if this is ideal, but it seems to be a nice compromise to deal with the issues here.
A key issue I encountered along the way was that the ScriptsToProcess list is executed at module import time, so it is only useful for bootstrapping the import of your functionality. The NestedModules property is actually the list of additional scripts you want to be dot-sourced and available when your module is used.
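A sketch of the manifest side, as it sits in the source tree before packing (the version and relative path are placeholders; the build rewrites the path to the local copy):

# moduleA.psd1
@{
    RootModule    = 'moduleA.psm1'
    ModuleVersion = '1.0.0'
    NestedModules = @('..\utils\console_helpers.ps1')
}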
I am writing a Perl program that I want to share with others, eventually via CPAN. It's getting to the point where I should start thinking about this on a bigger scale.
A decade ago, I used the h2xs package maker once. Is this still the most recommended way to get started? There used to be a couple of alternatives. Because I am starting from scratch with very little recollection, anything simple will do at this point.
I need to read a few long text files (not Perl modules) for configuration. Where do I put them, and how do I access them, no matter where the module is installed? (FindBin?) __DATA__ is inconvenient.
I need to provide an executable (Linux and OS X). Can putting an executable into the user's path be part of the module installation? (How?)
I would like to be able to continue developing it, run it for test purposes, have a new version, repack it, and re-upload it easily.
Before uploading to CPAN, can I share a CPAN bundle for easy local installation by downloaders and testers?
# cpan < mybundle.cpanbundle
Advice appreciated.
Regards,
/iaw
If anything I say conflicts with Andy Lester, listen to him instead. He knows more than I ever will.
Module::Starter is a good, simple way to generate module scaffolding. My take is it's been the default for this sort of thing for a few years now.
For configuration/support files, I think you probably want File::ShareDir. Might be worth considering Data::Section if it's just a matter of needing multiple __DATA__ sections though.
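For instance, a minimal File::ShareDir sketch (the distribution and file names are placeholders):

use File::ShareDir 'dist_file';

# Locate a data file installed under the distribution's share/ directory
my $config_path = dist_file('My-Dist', 'config.txt');
open my $fh, '<', $config_path or die "Cannot open $config_path: $!";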
You can certainly put scripts in the bin subdirectory of your distribution; the build tool will put them in the right place at install time.
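With ExtUtils::MakeMaker, for example, that's the EXE_FILES parameter (the names below are placeholders):

use ExtUtils::MakeMaker;

WriteMakefile(
    NAME      => 'My::App',
    VERSION   => '0.01',
    EXE_FILES => [ 'bin/myapp' ],   # installed into the user's bin path
);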
A build tool will take care of the work-flow you describe.
Bundles are something different. You make a distribution and share the tarball/archive.
If you set up PERL5LIB appropriately, you can then repeat make test, make install, and make dist to your heart's content. For development/sharing purposes, a lot of projects do their work on GitHub or similar, which makes it easy to share. They have private accounts for business purposes too. Very useful if you want to rewind and see where/when a problem was introduced.
If you get a copy of cpanm (simple to install, fairly lightweight), it can install from a tar.gz file or even directly from a git repository. You can also tell it to install to a local directory (local::lib-compatible; another utility that's very useful).
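For example (the file names and repository URL are placeholders):

# cpanm My-App-0.01.tar.gz                  # install from a tarball
# cpanm git://github.com/user/my-app.git    # install straight from git
# cpanm -l ~/perl5 My-App-0.01.tar.gz       # install into a local::lib tree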
Hopefully that's reasonably up-to-date as of 2014. You may see Dist::Zilla mentioned for module development; my understanding is that it's most useful for those with a large family of CPAN distributions to manage. Oh, and if you (or other readers) aren't aware of them, do check out autodie and Try::Tiny for errors and exceptions, Moose (for a full-featured object-oriented framework), and Moo (for a smaller, lightweight version).
I think that advice is all reasonably non-controversial. I find cpanm to be much more pleasant than the "full" cpan client, and Moo seems pretty popular nowadays too.
Take a look at Module::Starter and its much more capable (and complex) successor Dist::Zilla.
Whatever you do, don't use h2xs. Module::Starter was created specifically because h2xs was such an inappropriate tool for creating distributions.
I want to have extensions to my application written in IronPython. Part of those extensions will use decorators, and so I wish to include the decorator module in the package.
The issue is that the decorator module depends on several modules in the IronPython distribution, and those modules depend on other modules, and so on.
The easiest solution would be to include the entire Lib folder in the application, but that would increase the footprint by 500 files and 12 MB.
To avoid that I'm trying to zip the modules and load them from the zip file instead of directly from the filesystem, but I haven't found a straightforward way to do so.
I've spotted the importer mechanism for loading modules via a "path_hooks" global, which seems to give me access to something similar to Python's imp mechanism, but I'm not sure how to use it.
Is there a hook for the import mechanism in IronPython that I'm missing?
How should I go about implementing this?
What you want is zipimport support, which isn't implemented yet. If you'd like to help out with that I can put you in touch with the guy who's working on it.
Otherwise, it looks like you might just need to stub out the bits of inspect.py that decorator.py needs.
I've got a collection of Perl scripts and a couple of XML data files that they depend on which I'd like to distribute. Currently, I've got a shell script which copies bin/* and share/* to a target installation tree. It seems a little clunky, so I'd like to go with something like the standard CPAN way of packaging Perl.
Does it make sense to bundle what I've got in a CPAN-style package? I suspect there is nothing wrong with it, but every tutorial I've looked at treats lib/Blah.pm as an essential file in any package, and I don't even have a lib/ directory, let alone any .pm files.
Is there a standard solution for packaging a collection of Perl scripts, along with some data in a share/ directory?
Distributions don't care about modules. Most of the tools are set up to handle modules by default because that's the common case, but you can really distribute anything, as long as you provide the logic to tell the build files what to do with whatever files you provide.
ExtUtils::MakeMaker is difficult to use for this sort of thing, but Module::Build (despite the word "Module") makes it much easier. However, you have to know a bit about custom Module::Build subclasses so you can override the default behavior that you don't want.
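A rough sketch of such a subclass; the build-element name, file pattern, and dist metadata are all assumptions:

use Module::Build;

# Teach Module::Build about a 'share' build element so share/* files
# are copied into blib and installed along with the scripts.
my $class = Module::Build->subclass(
    class => 'My::Builder',
    code  => <<'SUBCLASS' );
sub process_share_files {
    my $self = shift;
    for my $file ( @{ $self->rscan_dir( 'share', qr/\.xml$/ ) } ) {
        $self->copy_if_modified( from => $file, to => "blib/lib/$file" );
    }
}
SUBCLASS

my $build = $class->new(
    dist_name    => 'My-Scripts',
    dist_version => '0.01',
    license      => 'perl',
    script_files => [ glob 'bin/*' ],
);
$build->add_build_element('share');
$build->create_build_script;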
If you are talking about standalone scripts, you can look at my scriptdist distribution, or the Dr. Dobbs article I wrote about it. It won't handle the share/ portion for you, but it's not too hard to add.