Is there a way to load Python scripts from a zip file in IronPython?

I want to have extensions to my application written in IronPython. Part of those extensions will use decorators, and so I wish to include the decorator module in the package.
The issue is that the decorator module depends on several modules in the IronPython distribution, and those modules depend on other modules, and so on.
The easiest solution would be to include the entire Lib folder in the application, but that would increase the footprint by 500 files and 12 MB.
To avoid that I'm trying to zip the modules and load them from the zip file instead of directly from the filesystem, but I haven't found a straightforward way to do so.
I've spotted the importer mechanism for loading modules via the "path_hooks" global, which seems to give me access to something similar to CPython's imp mechanism, but I'm not sure how to use it.
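To show what I'm groping toward, here is a rough sketch of what I imagine (assuming IronPython honors sys.path_hooks the way CPython does; I don't know whether this actually works there, and it also assumes the zipfile module itself is available outside the zip):

    import imp
    import sys
    import zipfile

    class ZipImporter(object):
        # PEP 302: entries in sys.path_hooks are callables that receive a
        # sys.path entry and either return an importer or raise ImportError.
        def __init__(self, path):
            if not path.endswith('.zip'):
                raise ImportError(path)
            self.archive = zipfile.ZipFile(path)

        def find_module(self, fullname, path=None):
            name = fullname.replace('.', '/') + '.py'
            return self if name in self.archive.namelist() else None

        def load_module(self, fullname):
            source = self.archive.read(fullname.replace('.', '/') + '.py')
            mod = sys.modules.setdefault(fullname, imp.new_module(fullname))
            mod.__loader__ = self
            exec source in mod.__dict__   # Python 2 syntax, as in IronPython 2.x
            return mod

    sys.path_hooks.append(ZipImporter)
    sys.path.append('Lib.zip')   # modules inside Lib.zip become importable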
Is there a hook for the import mechanism in IronPython that I'm missing?
How should I go about implementing this?

What you want is zipimport support, which isn't implemented yet. If you'd like to help out with that I can put you in touch with the guy who's working on it.
Otherwise, it looks like you might just need to stub out the bits of inspect.py that decorator.py needs.
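If you go the stubbing route, the idea would be something along these lines (a guess at the minimal surface, written for the Python 2 dialect IronPython uses; check which names decorator.py actually pulls from inspect before relying on this):

    # inspect.py stub: only what decorator.py is assumed to need
    def getargspec(func):
        code = func.func_code
        args = list(code.co_varnames[:code.co_argcount])
        # CO_VARARGS flag (0x04) marks a *args parameter
        varargs = code.co_varnames[code.co_argcount] if code.co_flags & 0x04 else None
        # a real stub would also handle **kwargs (CO_VARKEYWORDS, 0x08)
        return (args, varargs, None, func.func_defaults)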

Related

Managing shared code amongst PowerShell modules

I've been diving into some of the more advanced features of PowerShell modules and manifests recently, with a view to handling scenarios more advanced than just a basic export of a few functions. It sounds like it should be obvious, but I'm struggling to find a nice solution for sharing common 'helper'-type functions across several large, non-trivial modules. In particular, I'm looking for a solution that:
Allows sharing of 'helper' type functions without necessarily being exported by anyone
Allows installation via PsGet from a local repo path
Let me go into some of the challenges I see.
First of all, as far as I can tell, PsGet does not handle module dependencies well. This implies sharing between modules is going to be a struggle. Maybe a solution to this is to avoid PsGet, and use a custom script to 'install' modules to the local module path, which might be more tolerant of dependencies and load order.
My point about not using module exports to share helper functions also seems to be an issue. The reason I can see for this is wanting aliases, helpers, etc. for common internal actions (needed inside the useful functions) that are either useless or unsafe to expose. For example, a nice brief alias for getting the local script path (commonly used, and noisier than it should be), or the simple wrapper I recently wrote around PromptForChoice with fewer options. Maybe this whole thing isn't a real issue, but I can't help feeling that shipping a 'utils' module that exports low-level functions, useful inside real modules but not to an end user, is the wrong way to go.
What I've been playing with is a small build structure that tests and then packs modules, and I want to make some code sharing possible through it. I've been looking for an alternative using ScriptsToProcess in the manifest, but those paths seem to be absolute, not relative.
Imagine a folder structure:
modules
    utils
        console_helpers.ps1
    moduleA
        moduleA.psm1
        moduleA.psd1
    moduleB
        moduleB.psm1
        moduleB.psd1
packed_modules
    moduleA.zip
    moduleB.zip
What I was considering is that you could list relative paths in each ScriptsToProcess, and my pack phase would then go and drag those relative paths into each zip.
Is this a horrible, crazy idea? Am I right that PowerShell modules and PsGet really don't have decent dependency support? I would love to hear feedback from anyone who has looked into this space. The answers I'm hoping for, in rough priority order, might be:
Here's an example of sharing code without exposing it (probably a build/pack level solution)
Here's how to make module dependencies work nicely, using PsGet
Here's how to make module dependencies work nicely, but you can't use PsGet
Just expose everything from modules
This is a terrible idea and you're terrible
Thanks!
UPDATE (as suggested by CalebB)
Here's another example to illustrate what I'm trying to resolve. I find it useful to wrap '&'-style execution of commands in a helper function, to deal with things like checking exit codes. If I'm building half a dozen modules, many of them will want to make use of that helper (obviously).
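Something like this is what I mean (the function name and exact shape are just illustrative; $LASTEXITCODE only applies to native commands):

    function Invoke-Checked {
        param(
            [Parameter(Mandatory=$true)][string]$Command,
            [string[]]$Arguments = @()
        )
        & $Command @Arguments
        if ($LASTEXITCODE -ne 0) {
            throw "'$Command' failed with exit code $LASTEXITCODE"
        }
    }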
My options today seem to be to put it in a module and export it, but maybe I don't want it exported; I want more of a dot-source style of access. And if I've got a family of modules all trying to use this stuff, the options for module dependency management are limited (the PsGet limitation, etc.).
If I'm 'building' all the modules at once (with some decent psake and Pester infrastructure), maybe I can use a hack at this point to embed scripts into my zipped modules and 'solve' all these problems?
Allows sharing of 'helper' type functions without necessarily being exported by anyone
Mhm... what is wrong with dot-sourcing the scripts you need within a particular module (see the sketch after this list)? You could:
Keep your folder structure and symlink the desired functions into the module folder.
Try to use an absolute path in ScriptsToProcess that has a "relative part" in it, for example $PSScriptRoot\..\utils (not tried in that context, but it generally works). If not, you can always add a preprocessing step that fixes the paths for you.
Delete undesired imported elements manually via the function:, alias:, and variable: providers.
Import extra utilities only when you use them, then remove them at the end? If the desire is that the user can't see them, you can encrypt them.
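To sketch the dot-sourcing option (file names taken from the folder structure in the question; Get-Something stands in for whatever the module really exports):

    # Inside moduleA.psm1
    # During development, reach across to the shared folder...
    . "$PSScriptRoot\..\utils\console_helpers.ps1"
    # ...or, if a pack phase copies the helpers in next to the psm1:
    # . "$PSScriptRoot\console_helpers.ps1"

    # Helpers stay private because only listed functions are exported
    Export-ModuleMember -Function Get-Something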
Here's how to make module dependencies work nicely, but you can't use PsGet
Chocolatey uses NuGet under the hood, so it handles dependencies and can install from a local store. As a bonus, OneGet supports it, which is something everybody will use eventually.
I've posted the solution I came up with on GitHub. I've rolled in a few other features I want when building modules, but the key part of the solution for this question is reading and updating the psd1 of each module.
You include the scripts that you want to embed in the NestedModules property of your manifest. My build phase finds each script and copies it into the module folder for packing and zipping. The manifest that ships in the package has the script paths converted to the now-local file names.
I'm still not sure if this is ideal, but it seems to be a nice compromise for the issues here.
A key issue I encountered along the way is that the ScriptsToProcess list is executed literally at module import time, so it is only useful for bootstrapping the import of your functionality. The NestedModules property is actually the list of additional scripts you want dot-sourced and available when your module is used.
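To make that concrete, here is a sketch of the relevant manifest entry (paths match the example layout; my build phase rewrites the relative path to the now-local file name before zipping):

    # moduleA.psd1 (abridged)
    @{
        RootModule    = 'moduleA.psm1'
        ModuleVersion = '1.0'
        NestedModules = @('..\utils\console_helpers.ps1')
        # after packing this becomes: NestedModules = @('console_helpers.ps1')
    }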

Perl automatic module loading like in PHP

I am creating a larger application in Perl, and I am wondering whether something like class autoloading, similar to PHP's, can be done in Perl.
I looked at catching exceptions, redefining $SIG{__DIE__}, and so on, but that doesn't seem to be a solution: execution stops at the first "cannot find method new via package" error, and after loading each missing module I would have to rerun the whole program.
One solution could be to scan all the files in my lib and feed them to @INC at runtime, but I don't know whether that is a good solution; probably not.
Does anybody have a suggestion?
Well, you probably want to read up on the following:
AUTOLOAD and AutoLoader
autouse
Class::Autouse
later
etc.
None of those are quite the same as PHP's approach though.
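Of those, Class::Autouse is probably the closest in spirit to PHP's autoloading; a minimal sketch (the class name is illustrative):

    use Class::Autouse qw(My::Heavy::Class);   # nothing loaded at compile time

    # The real load happens lazily, on the first method call:
    my $obj = My::Heavy::Class->new;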
However, it really is best practice to list all your "normal" dependencies: it makes it easier to build installers, deploy to CPAN, and so on. There are also a bunch of other modules that deal with loading plugins, where you really don't know what to load until runtime.
Is there some difficulty with figuring out your dependencies, or do you just want to avoid a bunch of "use" statements at the top of each file?

How to determine dependent libraries in Zend Framework?

I completed my very first project in Zend Framework! Thanks to the Stack Overflow community!
While uploading files, I didn't know how to include the Zend library, so I uploaded the whole library into the /library folder of my project base.
Is there a way to determine which libraries are used and which are not (like a compilation step that automatically copies dependent files to the library folder, in case the web host does not provide the Zend library)? It would be awfully tedious to manually add each file and test whether that particular library is included or not.
This answer essentially says don't worry about including the whole library. I usually put the whole library in the project library folder, just like you did.
But if it is truly problematic to include the whole library, you could take a look at Jani Hartikainen's Packageizer which, at least in a previous form that I played with, allowed you to specify the components you needed and it would chase the dependencies and wrap them in a neat little package.
Disk space is cheap. Just copy the entire ./Zend library directory (and maybe ./ZendX, if you are using that) into your own library directory where it will be used. With autoloading, nothing that isn't being used will take up any significant memory. Taking even five minutes trying to figure this out is time (and therefore money) that would be more usefully spent writing code.
I wonder if it would be worthwhile/reliable to subclass the autoloader for this and have it optionally log each class it loads during site operation (sort of how Zend_Translate can log untranslated strings).
You could have it turned off normally but in your testing environment you would turn it on (via your application.ini), and have it build your dependency list while your unit tests are running.
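A rough sketch of that idea (assuming Zend Framework 1.x; this registers a plain callback rather than truly subclassing, and delegates to the stock Zend_Loader):

    // Enable in the testing environment only
    function logging_autoloader($class) {
        error_log($class . "\n", 3, '/tmp/loaded-classes.log'); // append to log file
        Zend_Loader::loadClass($class);                         // do the real load
        return class_exists($class, false);
    }
    spl_autoload_register('logging_autoloader');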

Perl CPAN-style Packaging with no lib/*.pm

I've got a collection of Perl scripts and a couple XML data files that they depend on which I'd like to distribute. Currently, I've got a shell script which copies bin/* and share/* to a target installation tree. It seems a little clunky, so I'd like to go with something like the standard CPAN way of packaging Perl.
Does it make sense to bundle what I've got in a CPAN-style package? I suspect there is nothing wrong with it, but every tutorial I've looked at treats lib/Blah.pm as an essential file in any package; I don't even have a lib/ directory, let alone any .pm files.
Is there a standard solution for packaging a collection of Perl scripts, along with some data in a share/ directory?
Distributions don't care about modules. Most of the tools are set up to handle modules by default because that's the common case, but you can really distribute anything as long as you provide the logic to tell the build files what to do with whatever files that you provide.
ExtUtils::MakeMaker is difficult to use for this sort of thing, but Module::Build (despite the word "Module") makes it much easier. However, you have to know a bit about custom Module::Build classes so you can override the default behavior that you don't want.
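For instance, a Build.PL along these lines might be a starting point (an untested sketch; share_dir needs a reasonably recent Module::Build, and the distribution name and files are illustrative):

    use strict;
    use warnings;
    use Module::Build;

    my $build = Module::Build->new(
        dist_name     => 'My-Scripts',
        dist_version  => '0.01',
        dist_abstract => 'A collection of scripts plus XML data',
        dist_author   => 'Your Name <you@example.com>',
        license       => 'perl',
        script_files  => [ glob 'bin/*' ],  # install everything under bin/
        share_dir     => 'share',           # the XML data files
    );
    $build->create_build_script;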
If you are talking about standalone scripts, you can look at my scriptdist distribution, or the Dr. Dobb's article I wrote about it. It won't handle the share/ portion for you, but it's not too hard to add.

How to distribute native Perl scripts without custom module overhead?

How can someone distribute native (non-"compiled/perl2exe/...") Perl scripts without forcing users to be aware of the custom (non-CPAN) modules that the scripts needs in order to run?
The problem is users will inevitably copy the script somewhere else on the system and take the script out of its native environment and then it can no longer find the modules it needs to run.
I've sometimes settled for just copying the module into the actual script, but I'd prefer a cleaner solution.
Update: I'd better clarify. I distribute a bunch of scripts which happen to use similar modules in the back end. The users understand how to run Perl scripts, but rather than relying on telling them "no, don't move the script", I'd prefer to simply allow them to move the files. The path of least resistance.
The right way is to tell them "Don't do that!" I would hope that they wouldn't expect to move an exe file and have the program continue to work. This is no different.
That said, there are a couple of alternatives. One is replacing the script with a wrapper (e.g. pl2bat) that knows the full path to the real script. Another is to use PAR, but that would require PAR and/or parl (from PAR::Packer) to be installed.
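For example, with PAR::Packer installed, pp can produce a self-contained program that bundles the script together with the modules it uses:

    pp -o foo foo.pl    # scans foo.pl's dependencies and packs them into ./foo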
If a script that you prepared for a client needs "custom" modules, simply pack your modules as if you were trying to upload them to CPAN. Then give the package to the client, and they can use the cpan utility to install the script and the modules.
Distribute an installer along with the script. The installer will need to be run with root privileges and it will put the custom modules into the standard system location (/usr/local/lib/perl5/site_perl or whatever).
I've not tried this, but Module::Install looks helpful in this regard. It's described as a:
Standalone, extensible Perl module installer
As a variant of "put your modules all in one place and make your applications aware of it" that will even work across multiple computers and networks, maybe you should check out PAR::Repository and its client-side counterpart, PAR::Repository::Client. You'd just provide a single binary client executable that connects to the repository (via file system or https) and executes any of the arbitrary number of programs (using an arbitrary set of modules) provided by the repository that the user asks for.
If there are many users, this also has a benefit for maintenance: Simply update the software provided by the repository and the users will start using the updated code for their system when they next start the programs.
It would be really nice if you could just use a NeXTSTEP style application bundle. Since you probably aren't developing for a platform that uses bundles, you'll have to make do.
Put all support files in a known location, and point your executable at those files for access to settings and libraries. The easiest way to do this is with a simple installer.
For example, with an app called foo, put all your required files in /opt/xlyd_apps/foo: libraries in /opt/xlyd_apps/foo/lib, configuration in /opt/xlyd_apps/foo/etc, and so on. Put the executable in /opt/xlyd_apps/foo/bin.
The important thing is to make sure the executable knows to look in /opt/xlyd_apps/foo for all its goodies, so if the customer/client moves the foo script to a new location, the install still works.
So, while you can't make the whole thing self contained and relocatable, you have made the actual calling script relocatable.
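A sketch of what the calling script might look like under that layout (Foo::Main and its run function are placeholders for the real entry point):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hard-coded install root: the script itself can be copied anywhere
    use lib '/opt/xlyd_apps/foo/lib';
    use Foo::Main;

    Foo::Main::run(config => '/opt/xlyd_apps/foo/etc/foo.conf');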
I've actually come up with my own solution, and I'm kind of curious what kind of reception it will have.
I've written a script that reads a Perl script and looks for use/require statements. Upon finding one, it checks whether the module is part of the standard library (by looking for /perl5/\d+.\d+[\d.]+/ in the module's path) and then rewrites the use/require line in one of two ways, depending on which was used.
If require is found:

    {
        ... inline the entire module here ...
    }

If use is found:

    BEGIN {
        ... inline the entire module here ...
    }

If the use has imports, immediately follow the above with:

    BEGIN { import Module ...imports seen... }
I understand this doesn't work with modules that use XS, but I was fine with that; mostly I need to support only pure-Perl modules.
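For the curious, the detection side of such a script amounts to something like this (a simplified sketch; a real version has to cope with multi-line, conditional, and string-eval'd use statements):

    use strict;
    use warnings;

    while (my $line = <>) {
        next unless $line =~ /^\s*(use|require)\s+([\w:]+)/;
        my ($keyword, $module) = ($1, $2);

        # Resolve the module to its source file via %INC
        (my $relpath = "$module.pm") =~ s{::}{/}g;
        eval { require $relpath } or next;
        my $file = $INC{$relpath};

        # Standard-library modules are recognized by their install path
        next if $file =~ m{/perl5/\d+\.\d+[\d.]+/};

        print "$keyword $module would be inlined from $file\n";
    }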