How to configure setup.py to work with a main method? - setuptools

I made my tool executable and added a main method (run under if __name__ == '__main__') which I want to wire up in my setup.py. This is what it looks like:
from setuptools import setup

setup(
    name='client',
    version='0.0.1',
    entry_points={
        'console_scripts': [
            'client = client.main:main'
        ]
    }
)
I do not have an __init__.py file, because I want to run my tool as a script, not as a package. I am struggling with the following error: No module named 'client.main:main'. How am I supposed to solve this?
The directory layout of my tool is:
requests\
    client.py
    setup.py
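For reference, a hedged sketch of what the setup.py could look like for this layout. The `client:main` spec and `py_modules` line are my assumptions here: since client.py is a top-level module rather than a client/ package containing a main.py, the entry point spec is module:function with no package prefix.

```python
# Hypothetical setup.py sketch for a single top-level client.py module.
# "client.main:main" would require a client/ package with a main.py in it;
# for a bare client.py module, the spec is "client:main" (module:function).
from setuptools import setup

setup(
    name='client',
    version='0.0.1',
    py_modules=['client'],          # ship the single client.py module
    entry_points={
        'console_scripts': [
            'client = client:main'  # import module client, call main()
        ]
    },
)
```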


pytest.ini doesn't take effect when calling pytest vs. pytest <test_name>

I am working on creating some testing infrastructure and struggling to handle all the dependencies correctly.
The directory structure I have looks like:
conftest.py
kernels/
|-kernel_1/
|---<kernel_src1>
|---__init__.py
|---options.json
|---test/
|-----test_func1.py
|-kernel_2/
|---<kernel_src2>
|---__init__.py
|---pytest.ini
|---options.json
|---scripts/
|-----__init__.py
|-----some_module.py
|---test/
|-----test_func2.py
When I call pytest on any of these tests, the test first compiles and simulates the kernel source code (C++), then compares the output against a golden reference generated in Python. Since all the kernels are compiled individually, I create an output directory inside the kernel directory to store compile/simulation logs along with some generated header files.
For example, pytest kernel_1/test/test_func1.py will create a directory in kernel_1/build_test_func1/<compile/sim logs>.
I use the conftest.py which updates cwd to the test directory based on the accepted answer here:
Change pytest working directory to test case directory
I also added pytest.ini to add kernel_2 to the pythonpath when running test_func2 so we can find modules in scripts folder:
[pytest]
pythonpath=.
Tests run correctly when calling pytest from:
cd kernel_2/; pytest
cd kernel_2/test; pytest
cd kernel_2; pytest test/test_func2.py
cd kernel_2/test; pytest test_func2.py
The test also runs correctly when calling it like this: pytest kernel_2/test/test_func2.py
But I start seeing an ImportError when calling pytest from the top level without specifying the test:
pytest
ImportError while importing test module '<FULL_PATH>/kernel_2/test/test_func2.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
<FULL_PATH>miniconda3/envs/pytest/lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
kernel_2/test/test_func2.py:8: in <module>
from scripts.some_module import some_func
E ModuleNotFoundError: No module named 'scripts'
The issue appears to be that the pytest.ini inside a specific kernel doesn't take effect when calling pytest from the top level, but I haven't been able to find a way to fix this. Any comments or suggestions are appreciated!
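One workaround (a hypothetical sketch, not from the original post) is to drop a conftest.py into kernel_2/ that puts the kernel directory on sys.path explicitly. Unlike the per-kernel pytest.ini, which only controls configuration when it is the one pytest discovers as its config file, conftest.py files are imported during collection for every directory pytest descends into, so the scripts package becomes importable no matter where pytest is invoked from:

```python
# kernel_2/conftest.py -- hypothetical sketch: make kernel_2/ importable
# regardless of the directory pytest is invoked from, so that
# `from scripts.some_module import some_func` resolves during collection.
import sys
from pathlib import Path

kernel_root = Path(__file__).resolve().parent
if str(kernel_root) not in sys.path:
    sys.path.insert(0, str(kernel_root))
```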

Pytest can't find files/modules

I have had a look at several different topics on this matter but can't work out how to apply it to my situation. I don't have an __init__.py in my test folder and I have tried to use conftest.py. I have a directory structure like this:
app/
    app.py
    src/
        __init__.py
        module1.py
        module2.py
        module3.py
    configs/
        config.json
        non-default-config.json
    tests/
        test1.py
    conftest.py
where app.py imports module1, which then imports modules 2 & 3 (using import src.module2). I load up config.json in all the module files (and app.py) using:
with open('configs/config.json') as f:
    CFG = json.load(f)
This works when I run app.py from the app directory. However, when I run pytest (which I believe should also resolve paths from the app directory, since conftest.py is in the app directory) and it imports module1 (using import src.module1), it cannot find configs/config.json, but it will find app/configs/config.json. I cannot use that path, as it would break my app when I run app.py. Curiously, pytest can find the imports within the src folder, even though it is at the same level as the configs folder.
If I move the conftest.py outside of the app directory and import module1 using import app.src.module1 then this import succeeds, but the import of module2 inside module1 then fails.
How can I resolve this issue? And is there a better way of structuring my project?
Solved this by running pytest from inside the app folder instead of from the base directory.
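An alternative that avoids depending on the working directory at all (a hedged sketch; load_config and CONFIG_PATH are illustrative names, not from the question) is to resolve the config path relative to the module's own file, so the same code works whether app.py or pytest starts the process:

```python
# src/module1.py -- hypothetical sketch: locate configs/config.json
# relative to this file instead of the current working directory.
import json
from pathlib import Path

# __file__ is .../app/src/module1.py, so parents[1] is the app/ directory.
CONFIG_PATH = Path(__file__).resolve().parents[1] / "configs" / "config.json"

def load_config(path=CONFIG_PATH):
    """Read a JSON config file and return it as a dict."""
    with open(path) as f:
        return json.load(f)
```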

How to run a module inside a package, using relative imports?

This question is specific to PyDev. The package structure looks like this:
app
├── __init__.py
├── sub1
│   ├── __init__.py
│   └── mod1.py
└── sub2
    ├── __init__.py
    └── mod2.py
The mod1.py module:
from __future__ import print_function

def f():
    print('It works!')
The mod2.py module:
from __future__ import absolute_import
from ..sub1 import mod1

if __name__ == '__main__':
    mod1.f()
Everything works beautifully from the shell, the python -m app.sub2.mod2 command prints:
It works!
as expected, all is fine. (The from __future__ import absolute_import line seems to have no effect: I can comment it out and everything still works just fine.)
If I click on mod2 in the PyDev IDE and try to Run As > Python Run, I get
ValueError: Attempted relative import in non-package
which is not surprising, as the -m switch is not turned on by default. If I edit the Run/Debug settings for mod2 (Arguments > VM Arguments) and add -m there, the -m is most likely passed to the Python interpreter, but now I get:
/usr/bin/python: Import by filename is not supported.
The from __future__ import absolute_import line seems to have no effect; it does not matter whether I comment it out or not; I am using Python 2.7.
I am out of ideas at this point.
In PyDev, how can I run a module inside a package that uses relative imports?
How should I change the settings once (globally) such that whenever I try to run a module inside a package, PyDev does the right thing? (That is, I don't have to individually specify the settings for each module that I wish to run.)
The developer in person confirmed that it is not possible in PyDev yet; I have opened a ticket for it.
Running a module inside a package, using relative imports
UPDATE: As of Dec 2, 2016, the issue is resolved, see the accepted answer.
Edit:
In PyDev 5.4.0, there's now an option to run using the -m flag (which will import the module through its regular name and not as it was __main__ so that relative imports will work there).
You can enable it at: Preferences > PyDev > Run (i.e.: this will enable it for all runs -- maybe in the future there'll be an option to make it per run, but for now it's set globally for all launches).
Original answer:
The problem is that you have relative imports in your main module and PyDev executes the file with python path/to/file_to_execute.py instead of python -m my.module.
A simple fix is to create a separate main module which imports a main() function from your module and runs it. Note that this launcher module itself can't use relative imports: the module executed as __main__ is named __main__, so it wasn't imported under a real package name that could be used to resolve a relative import.
Another fix would be changing the launch configuration to add the '-m my.module' in the VM arguments (go to run > run configurations to do that -- but you have to do that for each main module you want to run, including unit-tests).
And the last fix would be changing PyDev itself (so, please create a ticket for that in the PyDev tracker: https://www.brainwy.com/tracker/PyDev/ -- or submit a pull request, which would make adding that feature much faster ;) )
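To see why -m-style execution makes the relative import work, here is a self-contained sketch (the package is built from hypothetical file contents in a temp directory) using runpy.run_module, the programmatic equivalent of python -m. Unlike running the file by path, run_module sets __package__ to the real parent package, so the relative import resolves even though the module runs under the name __main__:

```python
# Demonstration: build the app/sub1/sub2 package in a temp directory and
# run mod2 the way `python -m app.sub2.mod2` would. runpy.run_module sets
# __package__ to "app.sub2", so `from ..sub1 import mod1` resolves even
# though the module's __name__ is "__main__".
import runpy
import sys
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())
for pkg in ("app", "app/sub1", "app/sub2"):
    (root / pkg).mkdir()
    (root / pkg / "__init__.py").write_text("")

(root / "app/sub1/mod1.py").write_text("def f():\n    return 'It works!'\n")
(root / "app/sub2/mod2.py").write_text(
    "from ..sub1 import mod1\n"   # relative import, as in the question
    "result = mod1.f()\n"
)

sys.path.insert(0, str(root))
namespace = runpy.run_module("app.sub2.mod2", run_name="__main__")
print(namespace["result"])
```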

Setuptools how to build shared library before package installation

My package has *.py files and *.c files; the *.py files use ctypes to import a shared library built from the C source.
Now I have a problem: how do I write my setup.py?
The setup script needs to build my_c_file.c into my_c_file.so and then copy it to the Python lib path.
What is the recommended way to do this?
You should probably have a look at Building C and C++ Extensions with distutils.
If you build a setup.py file around the example below, setuptools should compile your c file into my_c_lib.so and automatically add it to your installed package (untested).
from distutils.core import setup, Extension

c_module = Extension('my_c_lib',
                     sources=['my_c_file.c'])

setup(name='my_package',
      version='1.0',
      description='This is a package in which I compile a C library.',
      ext_modules=[c_module])

Force py.test to use installed version of module

I have a mixed Python/C++ library with test files mixed in amongst source files in the same directories. The layout looks like
/home/irving/geode
    geode
        __init__.py
        vector
            __init__.py
            test_vector.py
            ...
        ...
Unfortunately, the library is unusable in-place since it lacks .so extension modules. Question: Can I make py.test always use an installed version, even when run from /home/irving/geode or a subdirectory?
The test files have from __future__ import absolute_import, and run fine if executed directly as scripts. For example, if I do
cd geode/vector
./test_vector.py
which does import geode, it finds the installed version. However, if I run py.test in geode/vector, it finds the local copy of geode, and then dies.
I think you have two options:
run py.test --pyargs geode.vector.test_vector to make pytest interpret the argument as an import path, deriving the file system path from it. This should run the test against the installed version.
move the tests out into a tests directory without an __init__.py file. This way, you need pip install -e . to work in-place, or you can do python setup.py install and run py.test to test against the installed version.