VSCode settings for Pylance - visual-studio-code

I am running VSCode with the following components:
Version: 1.51.1 (user setup)
Commit: e5a624b788d92b8d34d1392e4c4d9789406efe8f
Date: 2020-11-10T23:34:32.027Z
Electron: 9.3.3
Chrome: 83.0.4103.122
Node.js: 12.14.1
V8: 8.3.110.13-electron.0
OS: Windows_NT x64 10.0.20270
Pylance 2.6
I have the following directory structure:
src
    m1.py
    .vscode
        settings.json
    lib
        m2.py
        .vscode
            settings.json
I use several linters with this environment when developing Python code. Mypy does not have a problem, but Pylance is unable to resolve imports.
I am trying to import the module m2.py from m1.py, which is where Pylance fails. My settings.json file under the src directory is:
{
    "python.autoComplete.extraPaths": [
        "*.lib"
    ]
}
Can anyone see how to resolve this problem?

Pylance uses python.analysis.extraPaths as opposed to python.autoComplete.extraPaths.
{
    "python.analysis.extraPaths": [
        "*.lib"
    ]
}
Have you tried that?
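Note that the "*.lib" entry from the question looks like a glob pattern; as far as I can tell, extraPaths entries are plain folder paths (relative to the workspace root), not globs. Assuming the workspace root is the src folder, something like this may be what was intended:
{
    "python.analysis.extraPaths": [
        "./lib"
    ]
}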

If your VSCode workspace folder is the parent of the src folder, it is normal for Pylance to complain, because by default the root of your project is your workspace folder. You can see that if I import src.lib.m2 Pylance doesn't complain, but it does if I use lib.m2.
Since you don't get a runtime error when running your code, I would say you are inside the src folder when you run m1.py.
If my assumptions are not true, you'll need to add more details (a code sample, how you run the m1.py file). A sketch of the difference is below.
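# Assuming the workspace folder is the parent of src:
import src.lib.m2   # Pylance resolves this
import lib.m2       # Pylance complains; works at runtime only if run from inside src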

Set the sub-folders up as proper Python Packages
This method provides conformance with standard Python project packaging guidelines
I recommend a setup that makes the subfolders all proper Python packages. To do that, add a blank __init__.py file to each sub-folder that has Python modules (i.e. .py files) in it.
With your original setup, ignoring the .vscode folders:
src/
    __init__.py
    m1.py
    lib/
        __init__.py
        m2.py
In this case, the imports would need to be from the src folder (it would be considered a package itself, since it has an __init__.py file in it):
import src.m1
import src.lib.m2
Make a proper scripts package
However, I recommend putting your scripts into their own package, not directly in the src folder:
src/
    scripts/
        __init__.py
        m1.py
    lib/
        __init__.py
        m2.py
This allows all your packages to be referenced with a proper package name rather than src, like import scripts.m1 and import lib.m2.
Side Notes
If you want these packages to be "sub-packages", you can keep an __init__.py in the src folder to make it the root folder for everything.
With that change, imports would be import src.scripts.m1 and import src.lib.m2.
Python will go up through the parent folders until it finds a folder without an __init__.py file, and then treats the chain of sub-folders below it that do have an __init__.py file as packages.
Any folders chained together as packages in this way can be imported locally without being added to the system or Python path.
How to import modules from the packages
Under this scheme, the m1.py script should be able to import m2.py with something like the following. Since src is not a package, it is the root from Python's perspective and is not included in import statements.
# In scripts/m1.py
import lib.m2 as m2

m2.function_1()
a = m2.function_2(m2.symbol_1)
or
from lib.m2 import function_1, function_2, symbol_1
function_1()
a = function_2(symbol_1)
If you add test files in this setup (say within a tests directory inside of scripts), then you can import the script functions as import scripts.m1 as m1 or from scripts.m1 import *.
This setup makes the project conform to the standard for Python packages, so if you want to make it installable, upload it to PyPI, or otherwise distribute it privately (with zip files or through a git repo), you can define and build the project with the setuptools package and a standard setup.py file. See Packaging Python Projects.
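As a rough illustration of that last point, a minimal setup.py for the recommended layout might look like the following sketch (the project name and version are hypothetical):
# setup.py -- minimal sketch for the src/ layout recommended above
from setuptools import setup, find_packages

setup(
    name='myproject',              # hypothetical project name
    version='0.1.0',
    package_dir={'': 'src'},       # packages (scripts/, lib/) live under src/
    packages=find_packages('src'),
)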

Your file structure seems to be the reason PyLance can't resolve the imports.
The best way out:
Create a Python virtual environment and activate it.
Linux
python -m venv env
source env/bin/activate
Windows PowerShell
py -3.6 -m venv env
.\env\Scripts\Activate
Final Step
Having activated your virtual environment,
just hit Ctrl+Shift+P,
search for "Python" and hit "Restart Language Server".
That should resolve all imports, thanks to the virtual environment.
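If the language server still picks up the wrong interpreter after the restart, you can also point the workspace at the venv explicitly. Around this VS Code release the relevant workspace setting was python.pythonPath (later versions renamed it to python.defaultInterpreterPath); a sketch, assuming the env folder created above on Windows:
{
    "python.pythonPath": ".\\env\\Scripts\\python.exe"
}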

Related

python setup.py install won't keep data_files

I am currently so confused about the installation of my own Python packages... Between setup.py, sdists and wheels, no matter what I do, I can't seem to achieve what I want: to have a bunch of non-.py data files installed and kept in the virtual environment, with the same structure I have in my project folder, after the installation.
I've read all kinds of documentation, and created a setup.py file that has a data_files field that contains all the data files I need in my installation.
I have the following structure:
.
|__ requirements.txt
|__ setup.py
|__ hr_datapool
    |__ __init__.py
    |__ data_needed
        |__ needed_folder
            |__ needed_file.notpy
    |__ one_module
        |__ __init__.py
        |__ misc_tools.py
        |__ tests
            |__ test_tools.py
    |__ other_module
    ...
And data_needed contains non-.py data files that are needed for misc_tools.py (and thus test_tools.py) to run.
Because of this, I added a data_files entry to my setup.py that contains all the folders and files I need. I have confirmed this: everything that should be there is there.
And yet, if I do any variation of pip install ., python setup.py install or the like, the data_files are completely ignored and/or placed in the wrong directory, and they don't appear anywhere in the created build or dist folders. Because of this, obviously all my tests fail, since they can't load files that are not there. And when I do sometimes succeed in copying them, they are not stored in the installation folder of the package in the venv, but rather in the root of the venv.
The funny thing is that the files are handled while installing; I keep getting console output like the following when installing with python setup.py install:
copying data_needed/needed_folder/needed_file.notpy -> /Users/.../venv/hr_datapool/data_needed/needed_folder/
but only if I use python setup.py install (not when using pip install .).
According to the documentation:
The directory should be a relative path. It is interpreted relative to the installation prefix (Python's sys.prefix for system installations; site.USER_BASE for user installations). Distutils allows directory to be an absolute installation path, but this is discouraged since it is incompatible with the wheel packaging format. No directory information from files is used to determine the final location of the installed file; only the name of the file is used.
Notice the highlights. However, in my example it doesn't install relative to the directory containing the package; it installs into its own folder in the root of the virtual environment, making it practically unreachable from within my code. I made sure I use relative paths in my setup.py, but still this happens.
How can I make sure the required data_files install within the target directory of the module, and not separately into the root of the virtual environment?
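For what it's worth, data_files is documented to install relative to sys.prefix by design, so files that must travel inside the package are usually declared with package_data instead. A hedged sketch, reusing the names from the question:
# setup.py -- sketch using package_data so the data lands inside hr_datapool
from setuptools import setup, find_packages

setup(
    name='hr_datapool',
    version='0.1.0',               # hypothetical version
    packages=find_packages(),
    include_package_data=True,
    package_data={
        'hr_datapool': ['data_needed/needed_folder/*'],
    },
)
The files can then be located at runtime relative to the package, e.g. with os.path.join(os.path.dirname(__file__), 'data_needed', ...).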

Cannot correctly import a package when packages have the same name in VSCode

In my workspace, there are several directories (projects). All the directories have the same structure, like:
project1:
    docs/
    src/
        __init__.py
        code1.py
    test/
project2:
    docs/
    src/
        __init__.py
        code2.py
    test/
project3:
    docs/
    src/
        __init__.py
        code3.py
...
# .env file in workspace
# PYTHONPATH=project1:project2:project3
When I want to import from code2, it fails; for example, in code3.py:
# code3.py
# from src import code2
I know that in PyCharm it is easy to deal with this situation by just marking the directories as source root directories.
How can I do this in VS Code?
VS Code version: Code 1.43.2 (0ba0ca5, 2020-03-24T07:34:57.037Z)
OS version: Darwin x64 18.0.0
I have solved it by installing extensions: Python / Python Extension Pack / MagicPython.
Mainly the Python Extension Pack.
When this extension is enabled, the others are enabled as well; when it is disabled, the others are disabled too.
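An alternative that roughly mirrors PyCharm's "mark as source root" is to list the project folders for the language server in the workspace settings.json; a sketch, assuming Pylance (or the Microsoft language server) is in use:
{
    "python.analysis.extraPaths": [
        "project1",
        "project2",
        "project3"
    ]
}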

Python importing from incorrect module (which bears the same name), VSC

I have two modules, both named connection.py in two separate environments listed below. Both of the folders containing connection.py are in my PYTHONPATH system environment variable.
However, if the spec path is not placed above that of bvbot, spec's test_connection.py attempts to import from the connection.py of bvbot.
If in cmd, I can resolve this by moving the path of spec above that of bvbot. But, in Visual Studio Code, spec's test_connection.py still imports from bvbot's connection.py.
The two environments of interest are:
C:\Users\You_A\Desktop\2016Coding\VirtualEnviroments\spec\spec_trading
C:\Users\You_A\Desktop\2016Coding\VirtualEnviroments\bvbot\Legacy_bvbot
Structure of the spec path above:
src/
    spec_trading/
        __init__.py
        connection.py
    tests/
        __init__.py
        connection.py
spec test_connection.py:
import pytest
from connection import Connection, OandaConnection

class TestConnection:
    def test_poll_timeout(self):
        connection = Connection()
        timeout = 10.0
        connection.set_poll_timeout(timeout)
        assert connection.poll_timeout == timeout
What am I doing wrong here? How can I resolve this without resorting to manually faffing with my system's environment variables, and resolve the VSC issue?
The easiest solution is to not use implicit relative imports (I assume this is Python 2.7). Basically, use explicit relative imports, and make sure the imports resolve within the package they are contained in instead of Python having to search sys.path for the module.
And if you are using Python 2.7, put from __future__ import absolute_import at the top of the file.
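A sketch of what that looks like in spec's test_connection.py, assuming the intended connection.py lives in the same package as the test file:
# test_connection.py -- explicit relative import (Python 2.7)
from __future__ import absolute_import
from .connection import Connection, OandaConnection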

How to run a module inside a package, using relative imports?

This question is specific to PyDev. The package structure looks like this:
app
├── __init__.py
├── sub1
│   ├── __init__.py
│   └── mod1.py
└── sub2
    ├── __init__.py
    └── mod2.py
The mod1.py module:
from __future__ import print_function

def f():
    print('It works!')
The mod2.py module:
from __future__ import absolute_import
from ..sub1 import mod1

if __name__ == '__main__':
    mod1.f()
Everything works beautifully from the shell, the python -m app.sub2.mod2 command prints:
It works!
as expected, all is fine. (The from __future__ import absolute_import line seems to have no effect: I can comment it out and everything still works just fine.)
If I click on mod2 in the PyDev IDE and try to Run As > Python Run, I get
ValueError: Attempted relative import in non-package
which is not surprising, as the -m switch is not turned on by default. If I edit the Run/Debug settings for mod2 (Arguments > VM Arguments) and add -m there, the -m is most likely passed to the python interpreter, but now I get:
/usr/bin/python: Import by filename is not supported.
The from __future__ import absolute_import line seems to have no effect; it does not matter whether I comment it out or not; I am using Python 2.7.
I am out of ideas at this point.
In PyDev, how can I run a module inside a package that uses relative imports? How should I change the settings once (globally) such that whenever I try to run a module inside a package, PyDev does the right thing? (That is, I don't have to individually specify the settings for each module that I wish to run.)
The developer in person confirmed that it is not possible in PyDev yet; I have opened a ticket for it.
Running a module inside a package, using relative imports
UPDATE: As of Dec 2, 2016, the issue is resolved, see the accepted answer.
Edit:
In PyDev 5.4.0, there's now an option to run using the -m flag (which imports the module through its regular name rather than as __main__, so that relative imports will work).
You can enable it at: Preferences > PyDev > Run (i.e.: this will enable it for all runs -- maybe in the future there'll be an option to make it per run, but for now it's set globally for all launches).
Original answer:
The problem is that you have relative imports in your main module and PyDev executes the file with python path/to/file_to_execute.py instead of python -m my.module.
A simple fix is creating a separate main module which in turn imports a main() function from that module and runs it. Again, the module executed as __main__ can't have relative imports: because that module is named __main__, a relative import can't be resolved, since the module wasn't imported under a name that the relative import can be resolved against. A sketch of this fix is below.
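With the __main__ body of mod2 moved into a main() function (the runner file name is hypothetical):
# app/sub2/mod2.py
from __future__ import absolute_import
from ..sub1 import mod1

def main():
    mod1.f()

# run_mod2.py -- at the project root, next to the app package
from app.sub2.mod2 import main

if __name__ == '__main__':
    main()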
Another fix would be changing the launch configuration to add '-m my.module' to the VM arguments (go to Run > Run Configurations to do that) -- but you have to do that for each main module you want to run, including unit tests.
And the last fix would be changing PyDev itself (so, please create a ticket for that in the PyDev tracker: https://www.brainwy.com/tracker/PyDev/ -- or submit a pull request, which would make adding that feature much faster ;) )

Force py.test to use installed version of module

I have a mixed Python/C++ library with test files mixed in amongst source files in the same directories. The layout looks like
/home/irving/geode
    geode
        __init__.py
        vector
            __init__.py
            test_vector.py
            ...
        ...
Unfortunately, the library is unusable in-place since it lacks the .so extension modules. Question: can I make py.test always use an installed version, even when run from /home/irving/geode or a subdirectory?
The test files have from __future__ import absolute_import, and run fine if executed directly as scripts. For example, if I do
cd geode/vector
./test_vector.py
which does import geode, it finds the installed version. However, if I run py.test in geode/vector, it finds the local copy of geode, and then dies.
I think you have two options:
Run py.test --pyargs geode.vector.test_vector to make pytest interpret the argument as an import path, deriving the file system path from it. This should run the test against the installed version.
Move the tests out into a tests directory without an __init__.py file. This way you need pip install -e . to work in-place, or you can do python setup.py install and run py.test to test against the installed version.
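A sketch of the layout the second option describes (folder names assumed):
/home/irving/geode
    setup.py
    geode/
        __init__.py
        vector/
            __init__.py
            ...
    tests/              # note: no __init__.py here
        test_vector.py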