I have had a look at several different topics on this matter but can't work out how to apply it to my situation. I don't have an __init__.py in my test folder, and I have tried to use a conftest.py. I have a directory structure like this:
app
├── app.py
├── src
│   ├── __init__.py
│   ├── module1.py
│   ├── module2.py
│   └── module3.py
├── configs
│   ├── config.json
│   └── non-default-config.json
├── tests
│   └── test1.py
└── conftest.py
where app.py imports module1, which then imports modules 2 and 3 (using import src.module2). I load config.json in all the module files (and app.py) using:
with open('configs/config.json') as f:
    CFG = json.load(f)
This works when I run app.py from the app directory. However, when I run pytest (which I believe should also resolve paths relative to the app directory, since conftest.py is in the app directory) and it imports module1 (using import src.module1), it cannot find configs/config.json, but it will find app/configs/config.json. I cannot use that path, as it would break the app when I run app.py. However, pytest can find the imports from within the src folder, even though it is on the same level as the configs folder.
If I move the conftest.py outside of the app directory and import module1 using import app.src.module1 then this import succeeds, but the import of module2 inside module1 then fails.
How can I resolve this issue? And is there a better way of structuring my project?
Solved this by running pytest from inside the app folder instead of from the base directory.
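An alternative that makes the load independent of the working directory (a minimal sketch, not part of the fix above) is to resolve the config path relative to the module file itself:

# e.g. in app/src/module1.py -- the number of .parent hops depends on
# where the file sits relative to configs/
import json
from pathlib import Path

CONFIG_PATH = Path(__file__).resolve().parent.parent / "configs" / "config.json"

with CONFIG_PATH.open() as f:
    CFG = json.load(f)

With this, both python app.py and pytest find the same file no matter which directory they are launched from.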
I am working on creating some testing infrastructure and struggling with handling all the dependencies correctly.
The directory structure I have looks like:
conftest.py
kernels/
|-kernel_1/
|---<kernel_src1>
|---__init__.py
|---options.json
|---test/
|-----test_func1.py
|-kernel_2/
|---<kernel_src2>
|---__init__.py
|---pytest.ini
|---options.json
|---scripts/
|-----__init__.py
|-----some_module.py
|---test/
|-----test_func2.py
When I call pytest on any of these tests, the test first compiles and simulates the kernel source code (C++) and compares the output against a golden reference generated in Python. Since all the kernels are compiled individually, I create an output directory inside the kernel directory (e.g. kernel_1) to store compile/simulation logs along with some generated header files.
For example, pytest kernel_1/test/test_func1.py will create a directory in kernel_1/build_test_func1/<compile/sim logs>.
I use a conftest.py that updates the cwd to the test directory, based on the accepted answer here:
Change pytest working directory to test case directory
I also added a pytest.ini to add kernel_2 to the pythonpath when running test_func2, so modules in the scripts folder can be found:
[pytest]
pythonpath=.
Tests run correctly when calling pytest from:
cd kernel_2/; pytest
cd kernel_2/test; pytest
cd kernel_2; pytest test/test_func1.py
cd kernel_2/test; pytest test_func1.py
The test also runs correctly when calling it like this: pytest kernel_2/test/test_func2.py
But I start seeing a ModuleNotFoundError when calling pytest from the top level without specifying the test:
pytest
ImportError while importing test module '<FULL_PATH>/kernel_2/test/test_func2.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
<FULL_PATH>miniconda3/envs/pytest/lib/python3.7/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
kernel_2/test/test_func2.py:8: in <module>
from scripts.some_module import some_func
E ModuleNotFoundError: No module named 'scripts'
The issue seems to be that the pytest.ini inside a specific kernel directory doesn't take effect when pytest is called from the top level, but I haven't been able to find a way to fix this. Any comments or suggestions are appreciated!
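One possible workaround (a sketch, assuming the goal is simply to get scripts/ onto sys.path: per-directory conftest.py files are loaded during collection even when pytest is started from the top level, whereas pytest honours only a single ini file, the rootdir's): put a conftest.py next to each kernel that adds its own directory to the path:

# kernel_2/conftest.py (hypothetical) -- make scripts/ importable regardless
# of the directory pytest is invoked from
import os
import sys

sys.path.insert(0, os.path.dirname(__file__))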
I have two modules, both named connection.py in two separate environments listed below. Both of the folders containing connection.py are in my PYTHONPATH system environment variable.
However, if the spec path is not placed above the bvbot path, spec's test_connection.py attempts to import from bvbot's connection.py.
In cmd I can resolve this by moving the spec path above the bvbot path. But in Visual Studio Code, spec's test_connection.py still imports from bvbot's connection.py.
The two environments of interest are:
C:\Users\You_A\Desktop\2016Coding\VirtualEnviroments\spec\spec_trading
C:\Users\You_A\Desktop\2016Coding\VirtualEnviroments\bvbot\Legacy_bvbot
Structure of the spec path above:
src/
    spec_trading/
        __init__.py
        connection.py
    tests/
        __init__.py
        test_connection.py
spec's test_connection.py:
import pytest
from connection import Connection, OandaConnection

class TestConnection:
    def test_poll_timeout(self):
        connection = Connection()
        timeout = 10.0
        connection.set_poll_timeout(timeout)
        assert connection.poll_timeout == timeout
What am I doing wrong here? How can I resolve this without manually faffing with my system's environment variables, and how do I resolve the VSC issue?
The easiest solution is to not use implicit relative imports (I assume this is Python 2.7). Basically, use explicit imports and make sure they resolve within the package that contains them, instead of Python having to search sys.path for the module.
And if you are using Python 2.7, put from __future__ import absolute_import at the top of the file.
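For instance, a minimal sketch of the test with an explicit, package-qualified import (assuming src/ is on sys.path so that spec_trading is importable as a package, which removes the ambiguity between the two connection.py files):

from __future__ import absolute_import

import pytest
# package-qualified import: always resolves to spec_trading's connection.py,
# never bvbot's, regardless of PYTHONPATH ordering
from spec_trading.connection import Connection, OandaConnection

class TestConnection:
    def test_poll_timeout(self):
        connection = Connection()
        timeout = 10.0
        connection.set_poll_timeout(timeout)
        assert connection.poll_timeout == timeout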
This question is specific to PyDev. The package structure looks like this:
app
├── __init__.py
├── sub1
│ ├── __init__.py
│ └── mod1.py
└── sub2
├── __init__.py
└── mod2.py
The mod1.py module:
from __future__ import print_function

def f():
    print('It works!')
The mod2.py module:
from __future__ import absolute_import

from ..sub1 import mod1

if __name__ == '__main__':
    mod1.f()
Everything works beautifully from the shell, the python -m app.sub2.mod2 command prints:
It works!
as expected, all is fine. (The from __future__ import absolute_import line seems to have no effect: I can comment it out and everything still works just fine.)
If I click on mod2 in the PyDev IDE and try to Run As > Python Run, I get
ValueError: Attempted relative import in non-package
which is not surprising, as the -m switch is not turned on by default. If I edit the Run/Debug settings for mod2 (Arguments > VM Arguments) and add -m there, the -m is most likely passed to the Python interpreter, but now I get:
/usr/bin/python: Import by filename is not supported.
The from __future__ import absolute_import line seems to have no effect; it does not matter whether I comment it out or not; I am using Python 2.7.
I am out of ideas at this point.
In PyDev, how can I run a module inside a package that uses relative imports?
How should I change the settings once (globally) such that whenever I try to run a module inside a package, PyDev does the right thing? (That is, I don't have to individually specify the settings for each module that I wish to run.)
The developer in person confirmed that it is not possible in PyDev yet; I have opened a ticket for it.
Running a module inside a package, using relative imports
UPDATE: As of Dec 2, 2016, the issue is resolved, see the accepted answer.
Edit:
In PyDev 5.4.0, there's now an option to run using the -m flag (which imports the module through its regular name rather than as __main__, so that relative imports will work there).
You can enable it at: Preferences > PyDev > Run (i.e.: this will enable it for all runs -- maybe in the future there'll be an option to make it per run, but for now it's set globally for all launches).
Original answer:
The problem is that you have relative imports in your main module and PyDev executes the file with python path/to/file_to_execute.py instead of python -m my.module.
A simple fix is creating a separate main module which in turn imports a main() function from that module and runs it. (Again: the module executed as __main__ can't have relative imports, because it is named __main__ and was never imported under a name that could be used to resolve them.)
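A minimal sketch of that workaround, assuming mod2's logic is moved into a main() function (the runner filename is made up for illustration):

# app/sub2/mod2.py -- expose the logic as a function instead of running it
from __future__ import absolute_import
from ..sub1 import mod1

def main():
    mod1.f()

# run_mod2.py -- lives outside the app package, so it needs no relative
# imports; PyDev can run this file directly
from app.sub2.mod2 import main

if __name__ == '__main__':
    main()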
Another fix would be changing the launch configuration to add -m my.module to the VM arguments (go to Run > Run Configurations to do that), but you have to do that for each main module you want to run, including unit tests.
And the last fix would be changing PyDev itself (so, please create a ticket for that in the PyDev tracker: https://www.brainwy.com/tracker/PyDev/ -- or submit a pull request, which would make adding that feature much faster ;) )
I have a mixed Python/C++ library with test files mixed in amongst source files in the same directories. The layout looks like
/home/irving/geode
    geode
        __init__.py
        vector
            __init__.py
            test_vector.py
            ...
        ...
Unfortunately, the library is unusable in-place since it lacks .so extension modules. Question: Can I make py.test always use an installed version, even when run from /home/irving/geode or a subdirectory?
The test files have from __future__ import absolute_import, and run fine if executed directly as scripts. For example, if I do
cd geode/vector
./test_vector.py
which does import geode, it finds the installed version. However, if I run py.test in geode/vector, it finds the local copy of geode, and then dies.
I think you have two options:
run py.test --pyargs geode.vector.test_vector to make pytest interpret the argument as an import path, deriving the file system path from it. This should run the test against the installed version.
move the tests out into a tests directory without an __init__.py file. This way you need to pip install -e . to work in-place, or you can do python setup.py install and run py.test against the installed version.
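For the second option, the layout would look roughly like this (a sketch; the exact names are illustrative):

/home/irving/geode
    geode                (the installed package)
        __init__.py
        vector
            __init__.py
            ...
    tests                (no __init__.py, so py.test won't map it onto the local package)
        test_vector.py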
Background
I have a Python project with this directory structure:
py/:
db/ __init__.py run.py
py/db:
handle.py __init__.py util.py
The files are simple enough that I'm not sure I need to post them; nevertheless:
py/run.py
from db.handle import Handle
py/db/handle.py:
import util

class Handle:
    def __init__(self, x):
        self.x = util.addtwo(x)
py/db/util.py:
def addtwo(x):
    return x + 2
If I run handle.py from within the db subdirectory, it imports util without error. However, when I run run.py, handle.py fails with an import error. I can guess that handle.py is being run in the py directory (instead of py/db), and putting a call to os.getcwd() in handle.py confirms this. I can fix this problem using sys.path like so (in run.py):
import sys
sys.path.append("db")
from db.handle import Handle
Question
When importing, from a subdirectory, a module that contains imports of other local modules in that directory, why doesn't Python check the directory of the module making the import? In my example, why doesn't Python check db first when handle.py contains import statements? Is there a PEP that describes this, or is it behavior with an obvious rationale that I missed?
I thought it might be related to PEP 328:
all import statements be absolute by default (searching sys.path only) with special syntax (leading dots) for accessing package-relative imports.
but I'm not sure.
Your import is "absolute", so the module name is looked up on the PYTHONPATH (sys.path), which typically includes the current directory.
If you want to import a module from the same folder that your module is in, you use a relative import:
from . import util
or
from .util import addtwo
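Applied to the example above, handle.py becomes (a minimal sketch; on Python 2 you would also add from __future__ import absolute_import at the top):

# py/db/handle.py -- the leading dot means "look in this module's package (db)"
from .util import addtwo

class Handle:
    def __init__(self, x):
        self.x = addtwo(x)

Note that with the relative import, handle.py can no longer be run directly as a script from within db; it must be imported as part of the db package (e.g. via run.py).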