Distribute shell scripts using setuptools and pyproject.toml

I'm trying to distribute a shell script along with a Python package. Ideally, the shell script is installed when I run pip install my_package. I read in this SO answer that my expected behavior is exactly what the scripts keyword of setuptools.setup provides. E.g. the script my_script will be installed with the following setup.py script:
setup(
...
scripts=['my_script'],
...
)
However, I cannot use the above method for two reasons:
the official docs do not mention this behavior, so I don't know whether I can keep doing it this way.
my whole project is built on pyproject.toml, without a setup.py. Although pyproject.toml provides a [project.scripts] table, as explained in the official setuptools docs, those scripts can only be Python functions, not shell scripts.
For completeness: in my case, the shell script reads git status and sets environment variables, which are then read from within my Python project. The shell script and my Python project are so tightly coupled that I would rather not split them into two projects.
I have also tried using a Python function to execute the shell script, e.g.
[project.scripts]
my_script = 'my_project:my_func'
import subprocess

def my_func():
    subprocess.run(...)
The problem with this solution is that every time I run my_script, my_project is imported, and that import is really slow.

Maybe a link in the comments leads to this information already. Anyway, I think it is worth posting that scripts = [...] in setup.py can be written in pyproject.toml as:
[tool.setuptools]
script-files = ["scripts/myscript1", "scripts/myscript2"]
However, this feature is deprecated. I hope the authors of the packaging tools will recognize the problem with shell scripts and deal with it.
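For context, a minimal pyproject.toml using this key might look like the following sketch (the project metadata is illustrative):
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[project]
name = "my_package"
version = "0.1.0"
[tool.setuptools]
script-files = ["scripts/myscript1", "scripts/myscript2"]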
Link: setuptools docs

I'm not exactly sure it will work for your case, but I solved this by creating a "shim" setup.py file (it has the added benefit of letting you install your project in editable mode).
It usually just calls setup(), but it was possible to pass the scripts argument:
"""Shim setup file to allow for editable install."""
from setuptools import setup
if __name__ == "__main__":
setup(scripts=["myscript"])
Everything else was loaded from pyproject.toml.
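Usage is then the standard install command, e.g. from the project root:
$ pip install -e .   # editable install; the scripts listed in the shim are installed too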

Related

Installing a Perl-based web app in an extremely restricted environment

Because I have a long series of comments with @ikegami, I am cleaning up the question in the hope that it will be more understandable. Unfortunately, English isn't my main language. :(
Let's say we have an environment where:
no development tools are installed (no make, nor gcc, or the like)
Perl is installed with its core modules, nothing more
no outgoing network access is allowed - e.g. the user can't use curl nor cpan to download/install Perl dependencies
the user doesn't even have admin (root) rights
but wants to install and evaluate some Perl-based web app; let's call it MyApp
MyApp:
doesn't use any XS-based modules (at least, I hope; in development I'm using plenv and cpanm, so I never checked the installed dependencies in depth)
is a pure PSGI app; a simple plackup app.psgi works OK
uses some data files which should be included in the "deployment"
The main question is: how to prepare MyApp, and all the CPAN modules it uses, to be easily installed in such a restricted environment?
The goal is:
I don't need to save my own effort and time,
but I want to save the user's time and minimize the actions needed on his side, so the installation (deployment) should be as simple as possible.
E.g. how to get a running web app onto the user's machine with the minimum possible steps on his part.
The simplest thing could be something like:
- copy one file (zip or tarball)
- unpack it
- from the terminal, execute some run.pl in the unpacked directory.
To get the above simple installation, my idea was the following:
1.) Create a tarball which, after unpacking, will contain 3 folders and 1 Perl script, let's say:
myapp_repo/
myapp_repo/distlib #will contain all MyApp's Perl modules and also ALL used CPAN modules and their dependencies
myapp_repo/datafiles #will contain app-specific data files and such
myapp_repo/install.pl
myapp_repo/lib #will contain modules directly used by the `install.pl`
2.) I will develop an install.pl script to be used as the installer tool, like:
perl install.pl new /path/to/app_root
and it will (should):
create all the needed directories under /path/to/app_root (especially the lib where it will install the Perl modules)
call a "local" cpanm internally (from myapp_repo/lib) to install the app's Perl modules and their CPAN dependencies using only the distribution files from the distlib
generate and install the needed runtime script and the app.psgi into /path/to/app_root/bin
install the needed data files for the app.
3.) So, after this the user should be able to simply run:
/path/to/app_root/bin/plackup /path/to/app_root/bin/app.psgi
In short, the user should use:
the system-wide perl and the system-wide Perl core modules,
and everything else, i.e. the runtime Perl scripts (like plackup) and the required CPAN modules, should be installed into a self-contained directory tree using only local files (no net access).
E.g. install.pl should internally call cpanm to achieve the equivalent of the following cpanm command:
cpanm --mirror file://path/to/myapp_repo/distlib --mirror-only My::App
which should install My::App and all its dependencies without network access, using only the files from myapp_repo/distlib.
Some questions:
Is it possible to use cpanm (called as a locally installed module) without make?
For creating the myapp_repo/distlib, I'm thinking about using Pinto. Is it the right tool to achieve the above?
Have I forgotten something? Or, in other words:
Is the above a viable (read: working) way?
Are there any other tools which I could/should use to simplify the creation of such a distribution tarball?
@ikegami suggests a method:
- "install everything" into one fresh directory on my machine
- transfer this self-contained directory to the target machine
It sounds very good, because this directory could contain all the needed app-specific data files too; unfortunately, I don't understand the details of how his solution should be done.
The FatPacker solution looks interesting too; I need to learn about it.
Don't write your own make or installer. Just copy make from a different machine (which is basically what apt/yum/etc. do anyway, and which you'd have to do even if you wrote your own). You'd be able to use cpan in 5 minutes!
Also, that should allow you to install gcc if you need it (e.g. to install an XS module), although it doesn't sound like you do. If you do install gcc, I'd install my own perl to avoid having to deal with PERL5LIB.
Tools such as minicpan will allow you to install any module from CPAN without internet access. Of course, you can keep using the command you are already using, as long as the mirror contains the packages you need.
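For instance, a sketch of building such a local mirror with minicpan (from CPAN::Mini); the paths are illustrative:
minicpan -l /path/to/myapp_repo/distlib -r http://www.cpan.org/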
The above explains how to simply and quickly set up a machine so it can use cpan and thus install any module easily.
If you just want to install a specific module and its dependencies, you can completely avoid using cpan on the target machine. First, you need a fresh install of Perl (preferably of the same version as the one on the target system). Then, simply install the module into a fresh dir on your machine, and transfer that dir to the target machine. That's it; nothing else needs to be done. This even works for XS modules if the two machines are similar enough.
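For concreteness, a sketch of that install-and-transfer workflow using cpanm; the paths are illustrative and the question's distlib mirror is reused:
# on the build machine: install My::App plus all dependencies into one self-contained tree
cpanm --local-lib-contained /tmp/myapp_deploy --mirror file:///path/to/myapp_repo/distlib --mirror-only My::App
tar czf myapp_deploy.tar.gz -C /tmp myapp_deploy
# on the target machine: unpack, then point perl at the bundled libs
tar xzf myapp_deploy.tar.gz
PERL5LIB=/path/to/myapp_deploy/lib/perl5 /path/to/myapp_deploy/bin/plackup app.psgi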
This is what ppm (ActiveState's Perl package manager) does.
Unfortunately, while this solution is almost as simple as the one above, it's not nearly as flexible, it doesn't run the test suite of the modules being installed, etc. It does have the advantage of not requiring the transfer of any binary (if you're not installing any XS modules).

How to handle external dependencies in Perl's ExtUtils::MakeMaker

I have a series of perl scripts for which I'm writing a Makefile.PL script, but I'm rather inexperienced with ExtUtils::MakeMaker.
One of the scripts I wrote makes a system call to a command-line utility that must be installed for the script to run properly. My script can gracefully detect that the utility is missing and issue an error about installing it and putting it in the user's path, but is there some standard way to handle this in the Makefile.PL script? Could it even *gasp* attempt to install the third-party utility if I enter the download link in the Makefile.PL script?
At the very least, I'd like the script to warn the user if the external dependency isn't found. I know I can write a test case that uses it. Is it as simple as copying and pasting the subroutine I wrote in the script itself (the one that checks for the third-party utility and prints an error if it's not found), or would that be the "wrong way to do it"?
Let's call this external dependency foobar, for sake of argument.
As per @KeepCalmAndCarryOn's comment, first consider whether foobar could be replaced by something from CPAN (maybe Foo::Bar), or by a few lines of Perl.
Otherwise, the best course of action is:
Create a new CPAN distribution called Alien::Foobar. The job of Alien::Foobar is to download, perhaps compile, and then install foobar, as part of Alien::Foobar's Makefile.PL or Build.PL.
(There exists a module called Alien::Base which aims to make doing this sort of thing easier. It's mostly aimed at installing libraries rather than binaries, though I've had some success using it for the latter.)
Now the Makefile.PL you were originally working on can declare a dependency on Alien::Foobar.
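For illustration, a minimal sketch of that declaration in the original Makefile.PL (the distribution name and version are hypothetical):
use ExtUtils::MakeMaker;
WriteMakefile(
    NAME      => 'My::Scripts',   # hypothetical distribution name
    VERSION   => '0.01',
    PREREQ_PM => {
        'Alien::Foobar' => 0,     # installing this builds and installs foobar itself
    },
);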
If you have an external dependency on a command-line utility (i.e. there's no perl module that does what the utility does), ExtUtils::MakeMaker is not designed to handle such a dependency. What you need to do is write an install script or edit the make file to handle the dependency. Here are the considerations in doing so:
Check if the dependency exists and if the version is sufficient.
Download the dependent package
Configure, compile, & install the dependent package
Test to make sure it works
Update the user's environment setup if necessary
Run your Perl package's installation steps (e.g. perl Makefile.PL; make; sudo make install)
Note: you may need to know whether your script is running as root, which you can check with id -u (root's user ID is 0).
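As a sketch of the first consideration, assuming the utility is the foobar from above, the existence check could use IPC::Cmd (in core since Perl 5.10):
use strict;
use warnings;
use IPC::Cmd qw(can_run);

# look for the external utility in the user's PATH at install time
if (my $path = can_run('foobar')) {
    print "Found foobar at $path\n";
} else {
    warn "foobar not found in PATH; please install it before using this package\n";
}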

Does use of 'virtualenv' leave my "real" Python installation alone?

I've had many questions about Python for which a suggested answer is often "use virtualenv", but I have a (lovingly maintained and perfectly functioning) Python installation that I'm loath to disturb.
I want to be absolutely sure, so I'll ask twice: Does use of virtualenv in any way disturb my "real" Python installation? Using virtualenv does not in any way modify the files or paths in my "real" installation, right?
Virtualenv creates a separate Python environment. The Python interpreter is linked from whichever system-installed interpreter you choose when creating the virtualenv (the --python command-line switch), and you can optionally decide whether or not to use the system site-packages (--system-site-packages).
All packages that you install using the virtualenv stay in the virtualenv's own site-packages directory and do not mess with the system packages.
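A short sketch of the workflow (the paths are illustrative):
$ virtualenv --python=python3 ~/envs/myenv   # creates ~/envs/myenv; the system install is untouched
$ source ~/envs/myenv/bin/activate
(myenv) $ pip install requests               # lands only in the virtualenv's site-packages
(myenv) $ deactivate                         # back to the system Python, unchanged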

How to make sphinx look for modules in virtualenv while building html?

I want to build html docs using a virtualenv instead of the native environment on my machine.
I've entered the virtualenv but when I run make html I get errors saying the module can't be imported - I know the errors are due to the module being unavailable in my native environment.
How can I specify which environment Sphinx should use when importing the modules (e.g. the virtualenv)?
The problem is correctly spotted by Mathijs.
$ which sphinx-build
/usr/local/bin/sphinx-build
I solved this issue installing sphinx itself in the virtual environment.
With the environment activated:
$ source /home/migonzalvar/envs/myenvironment/bin/activate
$ pip install sphinx
$ which sphinx-build
/home/migonzalvar/envs/myenvironment/bin/sphinx-build
It seems neat enough.
The problem here is that make html invokes the sphinx-build command as a normal shell command, and that script explicitly specifies which Python interpreter to use in its first line (i.e. #!/usr/bin/python). If Python gets invoked this way, it will not use your virtual environment.
A quick and dirty way around this is by explicitly calling the sphinx-build Python script from an interpreter. In the Makefile, this can be achieved by changing SPHINXBUILD to the following:
SPHINXBUILD = python <absolute_path_to_sphinx-build-file>/sphinx-build
If you do not want to modify your Makefile you can also pass this parameter from the command line, as follows:
make html SPHINXBUILD='python <path_to_sphinx>/sphinx-build'
Now if you execute make html from within your virtualenv environment, it should use the Python interpreter from within your environment, and you should see Sphinx finding all the goodies it requires.
I am well aware that this is not a neat solution, as a Makefile like this should not assume any specific location for the sphinx-build file, so any suggestions for a more suitable solution are warmly welcomed.
I had the same problem, but I couldn't use the accepted solution because I didn't use the Makefile; I was calling sphinx-build from within a custom Python build file. What I really wanted was to call sphinx-build with the exact same environment as my Python build script. Fiddling with paths was too complicated and error-prone, so I ended up with what seems to me like an elegant solution: "manually" load the console-script entry point and call it:
from pkg_resources import load_entry_point
cmd = load_entry_point('Sphinx', 'console_scripts', 'sphinx-build')
cmd(['sphinx-build', basepath, destpath])
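On newer Sphinx versions (1.7+), where pkg_resources is deprecated, the same idea can be written without the entry-point lookup; a sketch, reusing basepath and destpath from above:
from sphinx.cmd.build import main as sphinx_build

# unlike the entry-point call above, argv here excludes the program name
sphinx_build([basepath, destpath])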

What is a quick way to run a single script to automatically install missing modules using only Perl core?

I inherited a project which is supposed to be able to be deployed to other servers. This project has a number of simple module dependencies which however might not be present on all target machines.
As such I'd like to be able to run a single command line script that checks which Perl modules are installed and tries to automatically install missing ones via CPAN.
Since this should be very basic (i.e. needing to install stuff to run the module installer would defeat the point) said script should only use Perl 5.8.8 core modules.
Does something like that exist already, or would I need to write it myself?
Creating a Bundle package is one possible answer.
You can then look at something like CPAN::Shell (see CPAN module) to automate the process.
Update re: brian's comment about Task:: - Here are some pertinent links:
Writing a CPAN Task (using Module::Install)
"Task:: or Bundle::"? (Perlmonks)
Use Module::Install; it will be bundled with your module/program. You can use the "auto_install" command to automatically install dependencies.
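A minimal sketch of such a Makefile.PL (the distribution name and prerequisites are stand-ins):
use inc::Module::Install;

name     'My-App';
version  '0.01';
requires 'Some::Module'  => 0;   # hypothetical prerequisites
requires 'Other::Module' => 0;
auto_install;   # offers to install any missing prerequisites from CPAN
WriteAll;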