How to import python modules from parent and sibling packages

This (or similar) question has been asked many times before, but none of the solutions offered work in my case.
My project structure is like this:
project_2
├── main.py
├── __init__.py
├── systems.py
├── config
│   ├── __init__.py
│   └── options.py
├── database
│   ├── __init__.py
│   ├── database.py
│   └── entity.py
└── tests
    ├── __init__.py
    ├── test_systems.py
    ├── test_options.py
    ├── test_database.py
    └── test_entity.py
Obviously, the test modules under the tests package need to import the modules they test. I tried relative imports with the dot syntax:
from ..systems import System
from ..config import options
from ..database.entity import Entity
This raises ValueError: Attempted relative import in non-package. I have also tried a package structure where everything (including systems) is in its own package; it fails with the same message.
What really bothers me is that this is supposed to work according to PEP 328, but it does not. I really want to avoid appending the packages to $PYTHONPATH, or resorting to something drastic like loading the modules with imp from a file path.
I read that part of the problem might be that systems.py is in the main package, but that does not explain why the rest of the relative imports fail as well.
P.S. I actually recreated the example from PEP 328 just to test it, and it does not work.

You get that error when a Python file performs a relative import but is not itself loaded as a module via import from another module (e.g. it is run directly from the command line). Given this structure:
.
├── main.py
└── test
    ├── __init__.py
    ├── a.py
    └── b.py
main.py:
from test.a import A
print A
a.py:
from .b import B
A = B
if __name__ == '__main__':
    print A
b.py:
B = 'b'
Now try:
python main.py
result is
b
and with
python test/a.py
you get:
Traceback (most recent call last):
  File "test/a.py", line 1, in <module>
    from .b import B
ValueError: Attempted relative import in non-package
What does work is:
python -m test.a

If you simply add . to your Python path, absolute imports such as config.options will work, provided you run the script from the project_2 folder. Unfortunately, this requires updating PYTHONPATH on every machine.
Tested on Python 2.7.14
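As a sketch of that approach without touching PYTHONPATH (the layout is taken from the question; putting the path line at the top of each test module is my assumption), each test module can put the project_2 directory on sys.path before importing its siblings:

```python
# tests/test_systems.py (sketch, assuming the layout from the question):
# put the project_2 directory (the parent of tests/) on sys.path so the
# sibling packages can be imported with absolute imports.
import os
import sys

PROJECT_DIR = os.path.abspath(
    os.path.join(os.path.dirname(os.path.abspath(__file__)), "..")
)
if PROJECT_DIR not in sys.path:
    sys.path.insert(0, PROJECT_DIR)

# With project_2 on sys.path, these absolute imports resolve regardless of
# the current working directory:
# from systems import System
# from config import options
# from database.entity import Entity
```

With this in place the tests run as plain scripts; the cleaner alternative remains python -m project_2.tests.test_systems from the directory above project_2.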

Related

import from parent directory python

I am trying to run a pytest test for filea.py using the following directory structure
test_filea.py
from filea import *
def test_one_p_one():
    r = one_p_one()
    assert r == 2
filea.py
def one_p_one():
    return 1 + 1
When I have the following directory structure, everything works fine:
├── filea.py
├── test_filea.py
but when I move my tests into a subdirectory like this
├── filea.py
└── tests
    └── test_filea.py
I get the error:
test_filea.py:1: in <module>
    from filea import *
E   ModuleNotFoundError: No module named 'filea'
My editor seems to indicate that the import in the file in the subdirectory is OK (no red squiggly lines), but when I run this using pytest I get the error indicated above.
As per the pytest documentation on test discovery, try the following:
add an empty __init__.py file in the tests directory;
make sure that, when you run pytest ., the parent directory of filea.py and tests is the current working directory.
It depends where you run the tests from and how you invoke pytest. Calling pytest tests is different from calling python -m pytest tests; the latter adds the current working directory to sys.path, which makes the filea module importable.
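Another common option (a sketch, not from the answers above; the file name conftest.py is pytest's convention, the sys.path line is my assumption for this layout) is a conftest.py in the project root. pytest imports it before collecting tests, so the root can be put on sys.path there:

```python
# conftest.py, placed in the project root next to filea.py (sketch):
# pytest imports conftest.py before the tests, so inserting the root
# directory into sys.path here makes `from filea import *` work from
# tests/ without needing an __init__.py.
import os
import sys

ROOT = os.path.dirname(os.path.abspath(__file__))
if ROOT not in sys.path:
    sys.path.insert(0, ROOT)
```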

How to run protoc correctly with different file import?

My main file to generate has these imports:
import "protos/google_annotations.proto";
import "protos/nakama_annotations.proto";
import "protos/nakama_api.proto";
The folder structure:
.
├── lib
└── protos
    ├── google_annotations.proto
    ├── nakama_annotations.proto
    ├── nakama_api.proto
    └── apigrpc.proto   <--- this is the file to generate
The syntax highlighting is OK (Android Studio).
The two cases where I get an error are:
1. Command run in the protos directory:
protoc apigrpc.proto --java_out=. --proto_path=.
This gives the error:
protos/google_annotations.proto: File not found.
protos/nakama_annotations.proto: File not found.
protos/nakama_api.proto: File not found.
2. Specifying all the import files, again run in the protos directory:
protoc apigrpc.proto --java_out=. --proto_path=google_annotations.proto --proto_path=nakama_annotations.proto --proto_path=nakama_api.proto
This gives the error:
apigrpc.proto: File does not reside within any path specified using --proto_path
What did i do wrong?
I just found what's wrong. It's about the imports.
I had to remove the protos prefix because the imported files are at the same directory level.
So the import become this :
import "google_annotations.proto";
import "nakama_annotations.proto";
import "nakama_api.proto";
The reason I put protos in front was that the Android Studio plugin doesn't show a red highlight that way. After removing it, the plugin highlights the imports in red, but protoc works.
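An alternative sketch (assuming protos/ sits directly under the project root as shown above): keep the protos/ prefix in the imports and invoke protoc from the parent directory, so that the imports and the input file both resolve against the same --proto_path root:

```shell
# Run from the directory that contains protos/ (sketch):
# imports like "protos/google_annotations.proto" now resolve against
# --proto_path=., and apigrpc.proto resides inside that path as required.
protoc --proto_path=. --java_out=. protos/apigrpc.proto
```

--proto_path (also -I) names directories that act as roots for resolving both import statements and the input file paths, which is why passing individual .proto files to --proto_path fails.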

learning python packaging, the old ModuleNotFoundError

What am I doing wrong here???
My structure:
tst
├── setup.py
└── tst
    ├── __init__.py
    ├── mre.py
    └── start.py
contents of start.py
from mre import mre
def proc1():
    mre.more()
    return ('ran proc1')

if __name__ == "__main__":
    print('test')
    print(proc1())
contents of mre.py
class mre(object):
    def more():
        print('this is some more')
contents of setup.py
from setuptools import setup

setup(name='tst',
      version='0.1',
      description='just a test',
      author='Mr Test',
      author_email='test@example.com',
      entry_points={'console_scripts': ['tst=tst.start:proc1']},
      license='MIT',
      packages=['tst'],
      zip_safe=False)
nothing in __init__.py
When I run this from the command line all is fine, runs as expected.
However, when I package this up using pip and run it via tst, I get:
Traceback (most recent call last):
  File "/home/simon/.local/bin/tst", line 5, in <module>
    from tst.start import proc1
  File "/home/simon/.local/lib/python3.8/site-packages/tst/start.py", line 1, in <module>
    from mre import mre
ModuleNotFoundError: No module named 'mre'
I've read numerous posts and I just can't seem to figure this out. If I go into the installed code and change the line
from mre import mre
to
from tst.mre import mre
then it works, but that in turn doesn't work when running from the directory for development purposes... I'm obviously missing something obvious :) Is it a path issue, or am I missing a command in the setup.py?
Could someone point me in the right direction?
edit: do I need to do something different while developing a module that's going to be packaged, perhaps invoking the code some different way?
cheers
From my point of view, the absolute import from tst.mre import mre is the right thing. You could alternatively use from .mre import mre, but the absolute import is safer.
For development purposes:
Use pip's editable mode:
path/to/pythonX.Y -m pip install --editable .
This is similar to setuptools' develop mode (path/to/pythonX.Y setup.py develop), which is slowly heading towards deprecation.
Then run the console script, or the executable module:
tst
path/to/pythonX.Y -m tst.start
Without installation, it is often still possible to run the executable module:
path/to/pythonX.Y -m tst.start
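To see why the absolute import works once the package is installed, here is a self-contained sketch (the temp-directory scaffolding is mine; the package contents come from the question). It recreates the tst package on disk and shows that from tst.mre import mre resolves as soon as the directory containing tst/ is on sys.path, which is exactly what installation achieves:

```python
import os
import sys
import tempfile
import textwrap

# Recreate the tst package from the question in a temporary directory.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "tst")
os.makedirs(pkg)

open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "mre.py"), "w") as f:
    # Note: calling mre.more() on the class with no self argument
    # relies on Python 3 semantics (plain function lookup on the class).
    f.write(textwrap.dedent("""\
        class mre(object):
            def more():
                print('this is some more')
    """))
with open(os.path.join(pkg, "start.py"), "w") as f:
    f.write(textwrap.dedent("""\
        from tst.mre import mre

        def proc1():
            mre.more()
            return ('ran proc1')
    """))

# Installing the package effectively puts the directory that contains
# tst/ on sys.path; simulate that here.
sys.path.insert(0, root)

from tst.start import proc1
print(proc1())
```

The same absolute import also resolves without installation when the module is launched as python -m tst.start from the directory above tst/, which is why the absolute form is the safe choice for both situations.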

Cython project structure with dependent extension classes

I'm getting to the point with a project where I need a proper directory structure. I'm trying to arrange this and am getting ImportErrors when using my Cython extension classes.
The directory structure looks like:
.
├── __init__.py
├── Makefile
├── README.rst
├── setup.py
├── src
│   ├── foo.pxd
│   ├── foo.pyx
│   ├── __init__.py
│   └── metafoo.pyx
└── test
    ├── test_foo.py
    └── test_metafoo.py
The contents of all files can be found (in commit e635617 at time of writing) of this github repo.
My setup.py looks like the following:
from setuptools import setup, Extension, Command
from Cython.Build import cythonize

SRC_DIR = "src"
PACKAGES = [SRC_DIR]

ext_foo = Extension(SRC_DIR + ".foo",
                    [SRC_DIR + "/foo.pyx"])
ext_meta = Extension(SRC_DIR + ".metafoo",
                     [SRC_DIR + "/metafoo.pyx"])
EXTENSIONS = cythonize([ext_foo, ext_meta])

setup(
    name='minimalcriminal',
    packages=PACKAGES,
    ext_modules=EXTENSIONS
)
The complexity seems to lie in that extension classes in metafoo.pyx use extension classes from foo.pyx.
After building with python setup.py build_ext --inplace, the test_foo.py program runs ok:
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
import src.foo as foo
somefoo = foo.Foo(2)
somefoo.special_print()
When run from both the cyproj/test and cyproj directories:
/cyproj$ python test/test_foo.py
The value of somefield is: 2
and
/cyproj/test$ python test_foo.py
The value of somefield is: 2
But the test_metafoo.py crashes when run in the cyproj/test directory:
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
import src.foo as foo
import src.metafoo as metafoo
lotsafoo = [foo.Foo(i) for i in range(4)]
mf = metafoo.MetaFoo(lotsafoo)
mf.special_print()
With the message:
ubuntu@ubuntu-UX21E:/projects/cyproj/test$ python test_metafoo.py
Traceback (most recent call last):
  File "test_metafoo.py", line 6, in <module>
    import src.metafoo as metafoo
  File "cyproj/src/foo.pxd", line 6, in init cyproj.src.metafoo (src/metafoo.c:1154)
ImportError: No module named cyproj.src.foo
But runs properly from the parent cyproj directory:
/cyproj$ python test/test_metafoo.py
The value of somefield is: 0
The value of somefield is: 1
The value of somefield is: 2
The value of somefield is: 3
I don't really get what's driving the different behaviour of these errors. If I can't use import src.foo in test_metafoo.py why does it work in test_foo.py?
Similarly if I open up an interactive session in the parent directory and try to import all:
In [1]: from src import *
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-1-7b8bc2c1dfb9> in <module>()
----> 1 from src import *
/projects/cyproj/cyproj/src/foo.pxd in init cyproj.src.metafoo (src/metafoo.c:1154)()
ImportError: No module named cyproj.src.foo
When src/__init__.py looks like:
__all__ = ["foo", "metafoo"]
Which I thought would allow importing all...
I was able to compile and test your package after removing the __init__.py file from the project root directory and changing the sys.path manipulation in test_foo.py and test_metafoo.py to:
sys.path.append(os.path.abspath("."))
sys.path.append(os.path.abspath("../"))
With an __init__.py in the project root, Cython compiles the extensions under their fully qualified names (cyproj.src.foo, cyproj.src.metafoo, as the traceback shows), so the compiled metafoo looks for cyproj.src.foo, which is only importable when the directory above cyproj is on sys.path.

How to run a module inside a package, using relative imports?

This question is specific to PyDev. The package structure looks like this:
app
├── __init__.py
├── sub1
│   ├── __init__.py
│   └── mod1.py
└── sub2
    ├── __init__.py
    └── mod2.py
The mod1.py module:
from __future__ import print_function

def f():
    print('It works!')
The mod2.py module:
from __future__ import absolute_import
from ..sub1 import mod1

if __name__ == '__main__':
    mod1.f()
Everything works beautifully from the shell, the python -m app.sub2.mod2 command prints:
It works!
as expected, all is fine. (The from __future__ import absolute_import line seems to have no effect: I can comment it out and everything still works just fine.)
If I click on mod2 in the PyDev IDE and try to Run As > Python Run, I get
ValueError: Attempted relative import in non-package
which is not surprising, as the -m switch is not turned on by default. If I edit the Run/Debug settings for mod2 (Arguments > VM Arguments) and add -m there, the -m is most likely passed to the python interpreter, but now I get:
/usr/bin/python: Import by filename is not supported.
The from __future__ import absolute_import line seems to have no effect; it does not matter whether I comment it out or not; I am using Python 2.7.
I am out of ideas at this point.
In PyDev, how can I run a module inside a package that uses relative imports?
How should I change the settings once (globally) such that whenever I try to run a module inside a package, PyDev does the right thing? (That is, so I don't have to individually specify the settings for each module that I wish to run.)
The developer in person confirmed that it is not possible in PyDev yet; I have opened a ticket for it.
Running a module inside a package, using relative imports
UPDATE: As of Dec 2, 2016, the issue is resolved, see the accepted answer.
Edit:
In PyDev 5.4.0, there's now an option to run using the -m flag (which will import the module through its regular name and not as it was __main__ so that relative imports will work there).
You can enable it at: Preferences > PyDev > Run (i.e.: this will enable it for all runs -- maybe in the future there'll be an option to make it per run, but for now it's set globally for all launches).
Original answer:
The problem is that you have relative imports in your main module and PyDev executes the file with python path/to/file_to_execute.py instead of python -m my.module.
A simple fix is creating a separate main module which in turn imports a main() function from that module and runs it. Again, the module executed as __main__ cannot itself have relative imports: it runs under the name __main__, and a relative import cannot be resolved because the module was not imported under a package-qualified name that can anchor it.
Another fix would be changing the launch configuration to add the '-m my.module' in the VM arguments (go to run > run configurations to do that -- but you have to do that for each main module you want to run, including unit-tests).
And the last fix would be changing PyDev itself (so, please create a ticket for that in the PyDev tracker: https://www.brainwy.com/tracker/PyDev/ -- or submit a pull request, which would make adding that feature much faster ;) )
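The -m semantics that make the relative import work can also be reproduced programmatically with runpy. The following self-contained sketch (the temp-directory scaffolding is mine; the package contents come from the question) builds the app layout and runs mod2 the way python -m app.sub2.mod2 would:

```python
import os
import runpy
import sys
import tempfile
import textwrap

# Recreate the app/sub1 + app/sub2 layout from the question in a temp dir.
root = tempfile.mkdtemp()

def write(relpath, body):
    full = os.path.join(root, relpath)
    os.makedirs(os.path.dirname(full), exist_ok=True)
    with open(full, "w") as f:
        f.write(textwrap.dedent(body))

write("app/__init__.py", "")
write("app/sub1/__init__.py", "")
write("app/sub1/mod1.py", """\
    def f():
        print('It works!')
""")
write("app/sub2/__init__.py", "")
write("app/sub2/mod2.py", """\
    from ..sub1 import mod1

    if __name__ == '__main__':
        mod1.f()
""")

sys.path.insert(0, root)

# runpy sets __package__ to 'app.sub2' before executing mod2's code, which
# is exactly how `python -m` lets the relative import resolve even though
# the module runs under the name __main__.
runpy.run_module("app.sub2.mod2", run_name="__main__")  # prints: It works!
```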