Suppose I have the following directory structure:
src/
└── python/
    └── generated/
        ├── __init__.py
        ├── a.py
        └── lib/
            ├── __init__.py
            └── b.py
What does my setup.py need to look like in order to create a dist with a directory layout like:
src/
└── python/
    ├── __init__.py
    ├── a.py
    └── lib/
        ├── __init__.py
        └── b.py
The goal is to simply eliminate the generated folder. I've tried endless variations with package_dir and can't get anything produced other than the original directory structure.
Your setup.py should be placed in your src directory and should look like this:
#!/usr/bin/env python3
import setuptools

setuptools.setup(
    name='Thing',
    version='1.2.3',
    packages=[
        'python',
        'python.lib',
    ],
    package_dir={
        'python': 'python/generated',
    },
)
Note the package_dir setting. It instructs setuptools to get the code for the python package from the directory python/generated. In the built distributions you will then find the right directory structure.
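To check the result, you can inspect the built wheel; a minimal sketch, assuming the wheel was produced with python setup.py bdist_wheel and sits in dist/:

# A sketch (not from the original answer): list the files inside the built
# wheel to confirm the generated/ level is gone.
import glob
import zipfile

wheel_path = glob.glob('dist/*.whl')[0]  # assumes a single wheel in dist/
with zipfile.ZipFile(wheel_path) as wheel:
    for name in wheel.namelist():
        print(name)  # expect python/__init__.py, python/a.py, python/lib/...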
First, here is my solution:
#!/usr/bin/env python
import os
import shutil

from setuptools import setup
from setuptools.command.build_py import build_py


class BuildPyCommand(build_py):
    """Custom build command: copy the sources into a valid temporary layout."""

    def run(self):
        # Rebuild src.tmp from scratch so stale files never leak in.
        shutil.rmtree('src.tmp', ignore_errors=True)
        os.mkdir('src.tmp')
        shutil.copytree('src/python/generated', 'src.tmp/python')
        build_py.run(self)
setup(cmdclass={'build_py': BuildPyCommand},
      name='Blabla',
      version='1.0',
      description='best desc ever',
      author='Me',
      packages=['python', 'python.lib'],
      package_dir={'': 'src.tmp'},
      setup_requires=['wheel'])
And you can generate your distribution with:
python setup.py build bdist_wheel
The idea is to perform a two-step build:

1. generate a valid source structure;
2. build this temporary structure.
And I deliver it as a wheel, because a wheel doesn't require future users to understand the trick. If you try this with a source distribution, you will notice that you need to publish the generated files as data (not difficult, but troublesome, and I guess you will want to hide this trick from your users).
But I think there is a design flaw in your process. The file src/python/generated/__init__.py, which looks like a module <something>.generated, eventually becomes your <something>.python, which is troublesome. It would be much simpler and more robust to generate a valid Python structure in the first place: src/generated/python/__init__.py. The setup.py would then be trivial, and your generator wouldn't be any more complex.
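For illustration, a minimal sketch of the setup.py that this restructured layout would allow (name and version are placeholders):

#!/usr/bin/env python
from setuptools import setup

# With the generated code living in src/generated/python/, a plain
# package_dir mapping is enough; no temporary copy step is needed.
setup(name='Blabla',                     # placeholder
      version='1.0',                     # placeholder
      packages=['python', 'python.lib'],
      package_dir={'': 'src/generated'})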
Problem statement
When building a Python package I want the build tool to automatically execute the steps to generate the necessary Python files and include them in the package.
Here are some details about the project:
the project repository contains only the hand-written Python and YAML files;
to have a fully functional package, the YAML files must be compiled into Python scripts;
once the Python files are generated from the YAML files, the program needed to compile them is no longer necessary (it is purely a build dependency);
the hand-written and generated Python files are then packaged together.
The package would then be uploaded to PyPI.
I want to achieve the following:
When the user installs the package from PyPI, all files required for the package to function are included, and no compile steps are necessary.
When the user checks out the repository and builds the package with python -m build . --wheel, the YAML files are automatically compiled into Python and included in the package. The compiler is required.
When the user checks out the repository and installs the package from source, the YAML files are automatically compiled into Python and installed. The compiler is required.
(nice to have) When the user checks out the repository and installs in editable mode, the YAML files are compiled into Python, and the user is free to modify both generated and hand-written Python files. The compiler is required.
I have a repository with the following layout:
├── <project>
│   └── <project>
│       ├── __init__.py
│       ├── hand_written.py
│       └── specs
│           └── file.ksc (YAML file)
└── pyproject.toml
And the functional package should look something like this:
├── <project>
│   └── <project>
│       ├── __init__.py
│       ├── hand_written.py
│       └── generated
│           └── file.py
├── pyproject.toml
└── <other package metadata>
How can I achieve those goals?
What I have so far
As I am very new to Python packaging, I have been struggling to understand the relationships between pyproject.toml, setup.cfg, and setup.py, and how I can use them to achieve the goals outlined above. So far I have a pyproject.toml with the following content:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "<package>"
version = "xyz"
description = "<description>"
authors = [ <authors> ]
dependencies = [
    "kaitaistruct",
]
From reading the setuptools documentation, I understand that there are build commands such as:
build_py -- simply copies Python files into the package (no compiling; works differently in editable mode)
build_ext -- builds C/C++ modules (not relevant here?)
I suppose adding the compile steps for the YAML files will involve writing a setup.py file and overriding a command, but I don't know if this is the right approach, whether it will even work, or if there are better methods, such as using a different build backend.
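For what it's worth, a minimal sketch of that approach, assuming the Kaitai Struct compiler is invoked as kaitai-struct-compiler and the paths match the layout above (the directory names and compiler flags are assumptions, not something from the original question):

# setup.py -- a sketch of overriding build_py to compile the specs first.
# Compiler name, flags, and paths are assumptions based on the question.
import subprocess
from pathlib import Path

from setuptools import setup
from setuptools.command.build_py import build_py


class BuildPyWithKsc(build_py):
    def run(self):
        specs_dir = Path('<project>') / 'specs'      # hypothetical location
        out_dir = Path('<project>') / 'generated'
        out_dir.mkdir(exist_ok=True)
        for spec in specs_dir.glob('*.ksc'):
            # Compile each spec into a Python module before the normal
            # copy step performed by build_py.
            subprocess.check_call([
                'kaitai-struct-compiler', '-t', 'python',
                '--outdir', str(out_dir), str(spec),
            ])
        super().run()


setup(cmdclass={'build_py': BuildPyWithKsc})

With this in place, python -m build --wheel should trigger the compile step automatically; editable installs may still need extra care, since build_py behaves differently there (as noted above).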
Alternative approaches
A possible alternative approach would be to manually compile the YAML files prior to starting the installation or build of the package.
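That reduces the build to two manual commands; a sketch, again assuming the Kaitai Struct compiler and the paths from the layout above:

kaitai-struct-compiler -t python --outdir <project>/generated <project>/specs/file.ksc
python -m build . --wheel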
This is a frequently asked question, but none of the solutions mentioned on SO have worked for me so far.
The folder structure is as follows:
project/
├── tests/
│   ├── conftest.py
│   ├── __init__.py
│   └── int_tests/
│       └── test_device.py
└── project_core/
    └── tests/
        ├── conftest.py
        ├── __init__.py
        └── int_tests/
            └── test_device.py
import file mismatch:
imported module 'test_device' has this __file__ attribute:
  /home/.../project/project_core/tests/int_tests/test_device.py
which is not the same as the test file we want to collect:
  /home/.../project/tests/int_tests/test_device.py
HINT: remove __pycache__ / .pyc files and/or use a unique basename for your test file modules
Steps tried so far:

1. Removing __pycache__ directories and .pyc files.
2. Adding __init__.py to each folder (as stated in the pytest Good Integration Practices).
3. Removing __init__.py from each folder.

Do I need __init__.py files in each tests/ subfolder?
The same error occurs with conftest.py as well. The error is not limited to the vscode-pytest plugin; it also occurs in the terminal.
PS: For CI purposes, the system is configured with Docker & tox. Development is done in a venv.
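One option not listed among the steps above (an assumption on my part, not something the poster reports trying): pytest's importlib import mode lifts the unique-basename requirement, so two test_device.py files in different directories can be collected side by side:

pytest --import-mode=importlib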
I have trouble creating a package with setuptools. I have a repository which I'm cleaning up to turn into a package. The directory structure looks something like this:
my-proj
├── setup.py
├── MANIFEST.in
├── MakeFile
├── README.rst
└── myproj
    ├── __init__.py
    ├── my_code.py
    └── templates
        ├── template1.yaml
        └── template2.yaml
The initial version of my_code.py had a code snippet which directly referenced the files within the templates folder to do some processing. If I package this using setuptools, I provide the following information in these files:
MANIFEST.in:
include README.rst
include requirements.txt
include LICENSE.txt
recursive-include myproj/templates *
setup.py:
setup(
    name='myproj',
    package_dir={'testbed_init': 'testbed_init'},
    package_data={'templates': ['templates/*'], 'configs': ['configs/*']},
    include_package_data=True,
)
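As an aside, package_data keys are package names rather than directory names, so for the layout above the mapping would presumably need to be keyed on myproj (an illustration, not from the original post):

package_data={'myproj': ['templates/*', 'configs/*']},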
My question is as follows. In my_code.py I used to reference the templates directly without any problem, as I would run the script from the myproj folder. If I package this, how can I make sure the templates are included as part of the package, and how can the script open the templates relative to where the package is installed?
Code snippet from my_code.py:
if _type == "a":
    temp_file = "templates/template1.yaml"
else:
    temp_file = "templates/template2.yaml"
build_config(temp_file, output_file, data)
Code snippet of what happens in build_config:
def build_config(template_file, output_file, inputs):
    templateLoader = jinja2.FileSystemLoader(searchpath="./")
    templateEnv = jinja2.Environment(loader=templateLoader)
    template = templateEnv.get_template(template_file)
    outputText = template.render(inputs)
    with open(output_file, 'w') as h:
        h.write(outputText)
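One common way to do this (a sketch, not from the original post) is to resolve the template directory relative to the installed module via __file__ instead of the current working directory:

# A sketch: load templates relative to the installed package, not the CWD.
# Assumes this code lives in myproj/my_code.py next to the templates/ folder.
from pathlib import Path

import jinja2

TEMPLATE_DIR = Path(__file__).resolve().parent / "templates"

def build_config(template_name, output_file, inputs):
    template_loader = jinja2.FileSystemLoader(searchpath=str(TEMPLATE_DIR))
    template_env = jinja2.Environment(loader=template_loader)
    output_text = template_env.get_template(template_name).render(inputs)
    with open(output_file, "w") as h:
        h.write(output_text)

Callers then pass only the template name (for example template1.yaml) instead of a templates/-prefixed path; importlib.resources would be a more modern alternative, but the __file__-based variant keeps the original Jinja2 setup intact.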
I have the following project structure:
A/
|- B.pm
|- B/
   |- one.pm
   |- two.pm
   |- three.pm
in B.pm I have:
package A::B;
use A::B::one;
use A::B::two;
use A::B::three;
Now, I'm trying to install this module locally using cpanp. When in the A directory I simply run:
cpanp i .
It says Module 'A' installed successfully; however, when I list the contents of my $PERL5LIB directory, all I can see is B.pm instead of A/.
What am I doing wrong?
This is probably not the recommended way to do it, but for those looking for a quick-and-dirty solution: just move everything to a lib directory.
For me it looks like:
A-B
└── lib
    └── A
        ├── B
        │   ├── one.pm
        │   ├── three.pm
        │   └── two.pm
        └── B.pm
When in the A-B directory, I simply run:
cpan .
As I just want to install my module locally, this approach worked for me, but let me know if you think there are good reasons to use tools like module-starter (as suggested by @HåkonHægland) or at least to write my own Makefile.PL (which is actually the approach I ended up with, as I wanted to list dependencies).
According to the docs at https://packaging.python.org/en/latest/distributing/#data-files, setuptools will honor data_files configured in setup.py. But I can't make it work. This is my setup.py:
setup(
    name='booking_order',
    version=version,
    packages=find_packages(),
    package_data={
        'booking_order': ['fake_backends/static/*',
                          'scripts/*',
                          '*.sample'],
    },
    data_files=[
        ('/etc/booking', ['etc/booking.conf']),
    ],
)
This is the project's file tree:
.
├── booking_order
│ ├── __init__.py
│ ├── tests
│ │ ├── __init__.py
├── etc
│ ├── booking.conf
├── README.md
├── setup.py
The behavior is: if I run python setup.py install, the file etc/booking.conf gets installed to /etc/booking. But if I first run python setup.py sdist upload and then pip install booking_order, I get the error "error: can't copy 'etc/booking.conf': doesn't exist or not a regular file".
I checked: python setup.py sdist doesn't include the files in etc at all.
EDIT:
It seems this is the answer: https://github.com/pypa/setuptools/issues/521
Answering it myself.
According to PyPA (see the discussion of non-package data files): "Setuptools doesn't support installing data files to some arbitrary location on a user's machine; this is a feature, not a bug."
If you need to install files to locations like /etc or /usr/share, you may use the data_files flag from distutils, a feature that has not been totally cleaned out of setuptools. "Not totally cleaned up" means that you need to add those files to MANIFEST.in manually, which differs from the distutils behavior.
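For this project that means adding a line like the following to MANIFEST.in (an illustration based on the tree above):

include etc/booking.conf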
Of course, it would be better to manage such configuration files with an rpm or deb packaging system; for me, using pip here is just a temporary solution.