This question is specific to PyDev. The package structure looks like this:
app
├── __init__.py
├── sub1
│   ├── __init__.py
│   └── mod1.py
└── sub2
    ├── __init__.py
    └── mod2.py
The mod1.py module:
from __future__ import print_function

def f():
    print('It works!')
The mod2.py module:
from __future__ import absolute_import

from ..sub1 import mod1

if __name__ == '__main__':
    mod1.f()
Everything works beautifully from the shell: the python -m app.sub2.mod2 command prints
It works!
as expected, all is fine. (The from __future__ import absolute_import line seems to have no effect: I can comment it out and everything still works just fine.)
If I click on mod2 in the PyDev IDE and try to Run As > Python Run, I get
ValueError: Attempted relative import in non-package
which is not surprising, as the -m switch is not turned on by default. If I edit the Run/Debug settings for mod2 (Arguments > VM Arguments) and add -m there, the flag is presumably passed to the Python interpreter, but now I get:
/usr/bin/python: Import by filename is not supported.
The from __future__ import absolute_import line seems to have no effect; it does not matter whether I comment it out or not; I am using Python 2.7.
I am out of ideas at this point.
In PyDev, how can I run a module inside a package that uses relative imports?
How should I change the settings once (globally) such that whenever I try to run a module inside a package, PyDev does the right thing? (That is, I don't have to individually specify the settings for each module that I wish to run.)
The developer in person confirmed that it is not possible in PyDev yet; I have opened a ticket for it.
Running a module inside a package, using relative imports
UPDATE: As of Dec 2, 2016, the issue is resolved, see the accepted answer.
Edit:
In PyDev 5.4.0, there's now an option to run using the -m flag (which will import the module through its regular name and not as it was __main__ so that relative imports will work there).
You can enable it at: Preferences > PyDev > Run (i.e.: this will enable it for all runs -- maybe in the future there'll be an option to make it per run, but for now it's set globally for all launches).
Original answer:
The problem is that you have relative imports in your main module and PyDev executes the file with python path/to/file_to_execute.py instead of python -m my.module.
A simple fix is to create a separate main module which in turn imports a main() function from that module and runs it (although, again, the module executed as __main__ itself can't have relative imports: because it is named __main__, it was never imported under a name that could be used to resolve a relative import).
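For the layout in the question, such a runner could look roughly like this (a sketch only: it assumes the if __name__ == '__main__' body of mod2 is moved into a main() function, and that the file sits next to the app package rather than inside it):

# runner.py -- hypothetical top-level script. Because it is not part of a
# package, PyDev can run it directly; app.sub2.mod2 is then imported under its
# real package name, so its relative import of ..sub1 resolves normally.
from app.sub2.mod2 import main  # assumes mod2 is refactored to expose main()

if __name__ == '__main__':
    main()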
Another fix would be changing the launch configuration to add the '-m my.module' in the VM arguments (go to run > run configurations to do that -- but you have to do that for each main module you want to run, including unit-tests).
And the last fix would be changing PyDev itself (so, please create a ticket for that in the PyDev tracker: https://www.brainwy.com/tracker/PyDev/ -- or submit a pull request, which would make adding that feature much faster ;) )
Related
I have looked at several different topics on this matter but can't work out how to apply them to my situation. I don't have an __init__.py in my tests folder and I have tried to use conftest. I have a directory structure like this:
--app
  --app.py
  --src
    --__init__.py
    --module1.py
    --module2.py
    --module3.py
  --configs
    --config.json
    --non-default-config.json
  --tests
    --test1.py
    --conftest.py
where app.py imports module1, which then imports modules 2&3 (using import src.module2). I load up config.json in all the modules files (and app.py) using:
with open('configs/config.json') as f:
    CFG = json.load(f)
This works when I run app.py from the app directory. However, when I run pytest (which I believe should also be referencing from the app directory, since conftest.py is in the app directory) and it imports module1 (using import src.module1), it cannot find configs/config.json, but will find app/configs/config.json. I cannot use this as it will cause my app to break when I run app.py. However, Pytest can find the imports from within the src folder, even though this is on the same level as the configs folder.
If I move the conftest.py outside of the app directory and import module1 using import app.src.module1 then this import succeeds, but the import of module2 inside module1 then fails.
How can I resolve this issue? And is there a better way of structuring my project?
Solved this by running pytest from inside the app folder instead of from the base directory.
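An alternative that was not needed here, but removes the dependence on the working directory altogether, would be to build the config path from the module's own location. A minimal sketch, assuming the snippet lives in app.py (a module under src would need an extra os.pardir):

import json
import os

# Resolve configs/config.json relative to this file rather than the current
# working directory, so app.py and pytest find the same file no matter where
# they are launched from.
_HERE = os.path.dirname(os.path.abspath(__file__))

with open(os.path.join(_HERE, 'configs', 'config.json')) as f:
    CFG = json.load(f)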
I have two modules, both named connection.py in two separate environments listed below. Both of the folders containing connection.py are in my PYTHONPATH system environment variable.
However, if the spec path is not placed above the bvbot path, spec's test_connection.py attempts to import from bvbot's connection.py.
In cmd I can resolve this by moving the spec path above the bvbot path, but in Visual Studio Code spec's test_connection.py still imports from bvbot's connection.py.
The two environments of interest are:
C:\Users\You_A\Desktop\2016Coding\VirtualEnviroments\spec\spec_trading
C:\Users\You_A\Desktop\2016Coding\VirtualEnviroments\bvbot\Legacy_bvbot
Structure of the spec path above:
src/
    spec_trading/
        __init__.py
        connection.py
    tests/
        __init__.py
        connection.py
spec test_connection.py:
import pytest

from connection import Connection, OandaConnection


class TestConnection:

    def test_poll_timeout(self):
        connection = Connection()
        timeout = 10.0
        connection.set_poll_timeout(timeout)
        assert connection.poll_timeout == timeout
What am I doing wrong here? How can I resolve this, and fix the VS Code issue, without resorting to manually faffing with my system's environment variables?
The easiest solution is to not use implicit relative imports (I assume this is Python 2.7): use explicit relative imports and make sure they resolve within the package that contains them, instead of Python having to search sys.path for the module.
And if you are using Python 2.7, put from __future__ import absolute_import at the top of the file.
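For the layout in the question that could look roughly like this (a sketch only: it assumes the tests package is moved inside spec_trading, i.e. spec_trading/tests/test_connection.py, so the explicit relative import resolves within the spec package instead of whichever connection.py happens to come first on sys.path):

# spec_trading/tests/test_connection.py -- sketch under the assumed layout above
from __future__ import absolute_import

from ..connection import Connection  # explicitly spec_trading's connection, never bvbot's


class TestConnection:
    def test_poll_timeout(self):
        connection = Connection()
        timeout = 10.0
        connection.set_poll_timeout(timeout)
        assert connection.poll_timeout == timeout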
I think there is a bug with respect to how PyDev (version 4.6) recognizes intra-package imports when selecting Grammar 3.x for the project preferences. I have a project like this:
foobar
    mypack
        __init__.py
        mod1.py
        mod2.py
mod2.py simply says
from mod1 import fun1
mod1.py simply says
def fun1():
    print("Hey we are in fun1 in mod1")
If the project's Python preferences are set to use Grammar 3.0-3.5 with a Python 3.4 interpreter and I open mod2.py, the line from mod1 import fun1 is highlighted with the error Unresolved import: fun1. If I change the project preferences to use Grammar 2.7, close mod2.py and reopen it, the error disappears. Just by changing the grammar back and forth and closing/reopening the file, I can make the error appear and disappear.
So it seems that setting the Grammar to 3.x in PyDev causes intra-package imports to be incorrectly flagged as having an import error.
Any suggestions?
PyDev is doing the right thing... in Python 3, relative imports must be written as:
from .mod1 import fun1
If the import doesn't start with a dot, it is considered an absolute import (and the error is properly shown, since with that absolute path the imported module cannot be resolved).
So my real issue was getting PyDev not to report errors about the imports, and being able to debug modules buried in packages that have a main() in them for debugging. The solution for me was to use relative imports (as said in Fabio's answer) and then do the following for debugging purposes. Say I want to run a module pack1.subpack2.subpack3.subpack4.modtodebug in PyDev; it uses relative imports and has a main() function. At the top level of my project, I have a module debugmain.py that reads:
from pack1.subpack2.subpack3.subpack4.modtodebug import main as debugmain
if __name__ == '__main__':
    debugmain()
I then have one run configuration for debugmain.py and each time I want to debug a different module, I just have to edit the code in debugmain.py to point to that module.
I hope this helps someone else with this kind of problem.
I have a mixed Python/C++ library with test files mixed in amongst source files in the same directories. The layout looks like
/home/irving/geode
    geode
        __init__.py
        vector
            __init__.py
            test_vector.py
            ...
        ...
Unfortunately, the library is unusable in-place since it lacks .so extension modules. Question: Can I make py.test always use an installed version, even when run from /home/irving/geode or a subdirectory?
The test files have from __future__ import absolute_import, and run fine if executed directly as scripts. For example, if I do
cd geode/vector
./test_vector.py
which does import geode, it finds the installed version. However, if I run py.test in geode/vector, it finds the local copy of geode, and then dies.
I think you have two options:
run py.test --pyargs geode.vector.test_vector to make pytest interpret the argument as an import path and derive the file system path from it. This should run the test against the installed version.
move the tests out into a tests directory without an __init__.py file. This way you need to pip install -e . to work in-place, or you can do python setup.py install and then run py.test tests against the installed version.
This (or similar) question has been asked many times before, but none of the solutions offered work in my case.
My project structure is like this:
| project_2
    main.py
    __init__.py
    systems.py
    | config
        __init__.py
        options.py
    | database
        __init__.py
        database.py
        entity.py
    | tests
        __init__.py
        test_systems.py
        test_options.py
        test_database.py
        test_entity.py
Obviously I need to import all the modules in the test modules under the tests package. I tried relative imports with the dot syntax:
from ..systems import System
from ..config import options
from ..database.entity import Entity
This returns ValueError: Attempted relative import in non-package. I have tried it with a package structure where everything (including systems) is in its own package; it fails with the same message.
What really bothers me is that this is supposed to work according to PEP 328, but it does not. I really want to avoid having to append the packages to $PYTHONPATH or to use some insane method such as loading the modules with imp from the file path.
I read that part of the problem might be that systems.py is in the main package, but that does not explain why the rest of the relative imports do not work either.
P.S. I actually recreated the example from PEP 328 just to test it and it does not work.
You get that error when a Python file does a relative import but that file is not loaded as a module via an import in another module (e.g. it is run directly from the command line). Given this structure:
.
├── main.py
└── test
    ├── __init__.py
    ├── a.py
    └── b.py
main.py:
from test.a import A
print A
a.py:
from .b import B
A = B
if __name__ == '__main__':
    print A
b.py:
B = 'b'
Now try:
python main.py
result is
b
and with
python test/a.py
you get:
Traceback (most recent call last):
  File "test/a.py", line 1, in <module>
    from .b import B
ValueError: Attempted relative import in non-package
What does work is:
python -m test.a
If you simply add . to your Python path and run the script from the project_2 folder, imports such as config.options will work. Unfortunately, this requires an update to PYTHONPATH on every machine.
Tested on Python 2.7.14
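A rough illustration of that setup (commands only; the test modules are assumed to use absolute imports such as from config import options, which is what resolves once project_2 itself is on the path):

cd project_2
export PYTHONPATH=.            # on Windows: set PYTHONPATH=.
python tests/test_systems.py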