VSCode: how to structure a simple Python package with a few modules and tests, with debugging and linting?

I'm having more trouble than I'd like to admit structuring a simple Python project for development in Visual Studio Code.
How should I structure, in my file system, a project that is a simple Python package with a few modules? Just a bunch of *.py files together. My requirements are:
I must be able to step-debug it in VSCode.
It has a bunch of unit tests written with pytest.
I must be able to select a specific test to debug from the VSCode Testing tab, and it must stop at breakpoints.
pylint must not show any false positives.
The test files must be in a different directory from the main module files.
I must be able to run all the tests from the console.
The package is run inside a virtual environment created with the standard library venv module.
The code will use type hints.
I may use another linter, or even another test framework.
Nothing fancy, but I'm really having trouble getting it right. I want to know:
How should I organize my subdirectories: a folder with the main files and a sibling folder with the tests? Or a subfolder with the code and a sub-subfolder with the tests?
Which directories must have an __init__.py file?
How should the tests import the files from the module? Should I use relative imports?
Should I create a pytest.ini file?
Should I create a .env file?
What should the contents of launch.json, the VSCode debugger configuration file, be?

Common dir structure:
app/
    __init__.py
    yourappcode.py
tests/    (pytest looks for this)
    __init__.py
    test_yourunittests.py
server.py    (if you have one)
.env
.coveragerc
README.md
Pipfile
.gitignore
pyproject.toml    (if you want)
.vscode/    (helpful)
    launch.json
    settings.json
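For illustration, a minimal sketch of how a test can import the app code with absolute imports under the layout above (the add() function and its test are made-up placeholders, not part of the original answer):

# app/yourappcode.py
def add(a: int, b: int) -> int:
    return a + b

# tests/test_yourunittests.py
from app.yourappcode import add   # absolute import, resolved from the project root

def test_add() -> None:
    assert add(2, 3) == 5

With this, running pytest from the project root discovers and runs the test. A pytest.ini at the root can pin discovery to the tests folder; the pythonpath option (pytest 7 or newer) additionally guarantees the root stays importable:

[pytest]
testpaths = tests
pythonpath = .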
Or you could do one better: ignore my structure and look at the GitHub pages of some famous Python projects. FastAPI, Flask, asgi, and aiohttp are some that I can think of right now.
Also:
I think absolute imports are easier to work with than relative imports, though I could be wrong.
VSCode is able to use pytest. Make sure you have testing support enabled; VSCode's Python extension has it built in, I'm pretty sure. You can configure it to use pytest and specify your test directory. You can also run your tests from the command line: if you're at the root, just running pytest will recognise your tests directory if it's named that, by default. Also, your actual test files need to start with the test_ prefix, I think.
The launch.json doesn't need to be anything special. When you click the settings button next to the play button in the debug panel, VSCode will ask what kind of app it is, e.g. if it's a Flask app, select Python and then Flask, and it will auto-generate a configuration file which you can tweak however you want to get your app to run, e.g. if you want to expose a different port or the commands to run your app are different.
It sounds to me like you just need to spend a bit of time configuring VSCode for your specific Python needs. For example, you can use a virtualenv and linting in whichever way you want; you just need a settings.json file in the .vscode folder of your repo where you specify your settings. Configurations for specifying the Python virtualenv and linting methods can be found online.
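To make that concrete, here is a minimal sketch of the two .vscode files for the setup described in the question; the interpreter path, the .venv name, and the pylint setting are assumptions to adapt, not the only valid contents:

// .vscode/settings.json
{
    "python.defaultInterpreterPath": "${workspaceFolder}/.venv/bin/python",
    "python.testing.pytestEnabled": true,
    "python.testing.unittestEnabled": false,
    "python.testing.pytestArgs": ["tests"],
    "python.linting.pylintEnabled": true
}

// .vscode/launch.json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Current File",
            "type": "python",
            "request": "launch",
            "program": "${file}",
            "console": "integratedTerminal",
            "justMyCode": true
        }
    ]
}

Debugging individual tests does not need its own launch.json entry: once pytest discovery works, the Testing tab offers a Debug Test action that stops at breakpoints.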

Related

How to import a CMake project that uses .sh scripts for building into VSCode?

I have a legacy code base which has a CMake configuration and .sh files that call build commands for each build type (release, relwithdebinfo, etc.) and also do a lot of other things.
I have to work on this codebase. So far I have used STM32CubeIDE: I imported the project as an existing Makefile project and then changed the C/C++ build directory to where the Makefile output is generated.
So whenever I made a change to the code, I hit the build command in the UI and it ran make in the path I configured above.
This works, but whenever I need to debug I have to use the .elf outputs with the Ozone J-Link debug program.
I have to build and debug in the same environment, but Eclipse is slow and making me struggle.
How can I make VSCode host my legacy code so that I can build, debug, and also navigate the code (i.e. go to the definition of a symbol) in VSCode, not only build it?
Please let me know if anything from the existing code base needs to be shared, or if anything is still unclear.
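One possible starting point, sketched only and not taken from the question (the build.sh name and the release argument are assumptions), is to wrap the existing legacy script in a VSCode build task:

// .vscode/tasks.json
{
    "version": "2.0.0",
    "tasks": [
        {
            "label": "legacy build (release)",
            "type": "shell",
            "command": "./build.sh",
            "args": ["release"],
            "options": { "cwd": "${workspaceFolder}" },
            "group": { "kind": "build", "isDefault": true },
            "problemMatcher": ["$gcc"]
        }
    ]
}

Code navigation usually comes from the C/C++ (cpptools) or clangd extension fed with a compile_commands.json, which CMake can emit via -DCMAKE_EXPORT_COMPILE_COMMANDS=ON; on-target debugging of the .elf is typically handled by an extension such as Cortex-Debug.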

Ignore submodules for pytest in vsc

I am using VSC with pytest to debug my code. This worked fine until I added a submodule which I had used in the past. The codebase in the submodule does not follow the same standards as my main module and causes the VSC test discovery to fail.
I already added:
"python.testing.pytestArgs": [
"--ignore=${workspaceFolder}/main_package/submodule/",
]
to my vsc settings, but it seems vsc doesn't care about the ignore.
This is slowing my progress down considerably because I don't really understand why the discovery fails. It might be something with the imports, but the code runs and Pylance also has no issue with the import.
The Output -> Python panel shows
E ModuleNotFoundError: No module named 'core'
core is at the root of the submodule, and the submodule itself is a subpackage of the main package.
Is it better to move the submodule out of the main package and use it as a separate package?
Your setup works for me in Visual Studio Code version 1.70.2.
Make sure that the path is typed correctly and check the Python output (Output -> Python). There you can see the command that VS Code actually runs to discover tests.
Note that your-args at the end of that command is what you defined in "python.testing.pytestArgs".
You could also try to define the folder to search for tests instead of ignoring a specific one; pytest's positional arguments are used for that.
If your tests are located in the folder tests at the top level of your repository, this should do it:
"python.testing.pytestArgs": [
    "tests",
]
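If VSCode still appears to drop the --ignore argument, an alternative sketch (assuming the layout from the question) is to let pytest itself skip the submodule during collection via a conftest.py at the repository root:

# conftest.py at the repository root
# pytest honours this during collection no matter how it is launched
# (terminal or VSCode test discovery); the glob is relative to this file
collect_ignore_glob = ["main_package/submodule/*"]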

Intellisense Support for Airflow Plugins in VSCode

I'm trying to find a way to make VSCode Python intellisense work with Airflow Plugins.
Following the code example, the import path of a plugin operator could be:
from airflow.operators import MyPluginOperator
VSCode cannot resolve this import because it will only be valid at runtime through the airflow plugin system.
Is there any way to configure VSCode to resolve this import?
Airflow loads plugins dynamically by searching the airflow/plugins folder for AirflowPlugin subclasses and adding them to the airflow namespace at runtime. Here is the code from airflow/operators/__init__.py:
# Imports operators dynamically while keeping the package API clean,
# abstracting the underlying modules
...
def _integrate_plugins():
    """Integrate plugins to the context"""
    from airflow.plugins_manager import operators_modules
    for operators_module in operators_modules:
        sys.modules[operators_module.__name__] = operators_module
        globals()[operators_module._name] = operators_module
VS Code can't handle it. Even "big" Python IDEs like PyCharm have problems with it. It is impossible for VS Code to know that a piece of code in a particular folder will be transformed into airflow.operators later. "python.autoComplete.extraPaths" will not help either. You can only hope that someone will write a VS Code extension for Airflow someday :)
Since version 2.0.0 the way that Airflow handles plugins imports has changed:
Importing operators, sensors, hooks added in plugins via airflow.{operators,sensors,hooks}.<plugin_name> is no longer supported, and these extensions should just be imported as regular python modules. For more information, see: Modules Management and Creating a custom Operator
The next important thing to consider is that Airflow adds the dags/, plugins/, and config/ directories to the PYTHONPATH environment variable (source).
Following the specifications from the docs, you could create your MyCustomOperator
in the airflow_home/plugins/custom_operator/ directory. Then you could use it like this:
from custom_operator.my_custom_operator import MyCustomOperator
with dag:
    sample_custom_task = MyCustomOperator(task_id='sample-task')
So far so good: the DAG will run, but in my experience the VSCode IntelliSense won't work yet. To make it work, you need to add the path to /plugins the same way Airflow does when it runs. Depending on your setup there are a few ways to do this; the goal is to add an extra path to the Python interpreter VSCode is using (make sure to select the interpreter associated with your project).
A common solution is to use the PYTHONPATH environment variable to append our path to the paths the interpreter already knows. In my case, I'm using a virtual environment, so following what is explained in this post, I created a .pth file with the path I wanted to add and placed it in airflow_home/venv/lib/my_python_version/site-packages/.
Following the path in our example, this creates such a file:
cd $(python -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())")
echo airflow_home/plugins > airflow_plugins_example.pth
Once that is done, reload VSCode (you could also just switch to another interpreter and then come back) and you should see IntelliSense working properly.
Hope that works!
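As an alternative to the .pth file, and only for the regular-module imports of Airflow 2.0+ shown above, a workspace-local sketch is to point Pylance at the plugins folder via settings.json (the path assumes plugins/ sits in the workspace root; adjust to your layout):

// .vscode/settings.json
{
    "python.analysis.extraPaths": ["${workspaceFolder}/plugins"]
}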

importing go files in same folder

I am having difficulty importing a local Go file into another Go file.
My project structure is something like the below:
-samplego
--pkg
--src
---github.com
----xxxx
-----a.go
-----b.go
--bin
I am trying to import a.go inside b.go. I tried the following:
import "a"
import "github.com/xxxx/a"
None of these worked. I understand I have to set up GOPATH, but I couldn't get it right. Presently my GOPATH points to samplego (/workspace/samplego). I get the error below:
cannot find package "a" in any of:
/usr/local/go/src/pkg/a (from $GOROOT)
/workspace/samplego/src/a (from $GOPATH)
Also, how does GOPATH work when these source files are imported into another project/module? Would the local imports be an issue then? What is the best practice in this case: is it to have just one Go file in a module (with associated tests)?
Any number of files in a directory are a single package; symbols declared in one file are available to the others without any imports or qualifiers. All of the files do need the same package foo declaration at the top (or you'll get an error from go build).
You do need GOPATH set to the directory where your pkg, src, and bin directories reside. This is just a matter of preference, but it's common to have a single workspace for all your apps (sometimes $HOME), not one per app.
Normally a GitHub path would be github.com/username/reponame (not just github.com/xxxx). So if you want to have main and another package, you may end up doing something like this under workspace/src:
github.com/
    username/
        reponame/
            main.go    // package main, importing "github.com/username/reponame/b"
            b/
                b.go   // package b
Note you always import with the full github.com/... path: relative imports aren't allowed in a workspace. If you get tired of typing paths, use goimports. If you were getting by with go run, it's time to switch to go build: run deals poorly with multi-file mains, and (I didn't bother to test, but heard from Dave Cheney here) go run doesn't rebuild dirty dependencies.
Sounds like you've at least tried to set GOPATH to the right thing, so if you're still stuck, maybe include exactly how you set the environment variable (the command, etc.), what command you ran, and what error happened. Here are instructions on how to set it (and make the setting persistent) under Linux/UNIX, and here is the Go team's advice on workspace setup. Maybe neither helps, but take a look and at least point to which part confuses you.
No import is necessary as long as you declare both a.go and b.go to be in the same package. Then you can use go run with multiple files:
$ go run a.go b.go
Another layout:
./main.go (in package main)
./a/a.go (in package a)
./a/b.go (in package a)
In this case main.go can import "./a" and call the functions in a.go and b.go whose names start with a capital letter.
If none of the above answers work, just try:
go run .
and for production:
go build
This will take care of all the .go files in the folder.
I just wanted something really basic to move some files out of the main folder, like user2889485's reply, but his specific answer didn't work for me. I didn't care whether they were in the same package or not.
My GOPATH workspace is c:\work\go and under that I have
/src/pg/main.go (package main)
/src/pg/dbtypes.go (package dbtypes)
in main.go I import "/pg/dbtypes"
As people mentioned previously, there is no need to use any imports.
A lot of people mention that go run works when you list the files explicitly; however, with multiple .go files in the same directory that gets cumbersome.
Therefore go run *.go is what I usually do.
As I understand it, for packages in your project's subfolders it is now possible to do this; you just need to add "." in front of the import, like
. "github.com/ilyasf/deadlock-train/common"
where github.com/ilyasf/deadlock-train is my main module name and common is just a package in the /common subfolder inside the project.
Go 1.19.1 here. I've just had the same issue and discovered that you can simply do: import (a "github.com/xxxx")
You can go to the correct folder and execute: $ go run *.go
As long as the code has only one main function across all the files, it works perfectly!

run pydev project from file-system (with imports from different packages)

I want to run my working PyDev project's Python code by double-clicking the main module (outside of Eclipse): xxx.py
The problem is that, due to my imports being in different packages:
from src.apackage.amodule import obj
when xxx.py is double-clicked it complains that it doesn't know where the imports are (even though when I run xxx.py in PyDev it magically knows what I'm importing).
A simple workaround is to remove all of the packages and move all of the modules into one directory (that obviously works but is very inconvenient).
How can I run my code from the file system without doing that workaround?
This page answers my question excellently:
http://blog.habnab.it/blog/2013/07/21/python-packages-and-you/
The bottom line is: always execute your code from the top-level, root directory (e.g. using a minimal main.py file that executes the main script of your program). Then always use absolute imports, and you will never have a missing-module issue, since you start the program from the top directory and all imports are resolved from that 'home' path.
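A minimal sketch of that idea, reusing the import from the question (main() here is a hypothetical entry point, not something from the original project):

# main.py, placed at the project root
from src.apackage.amodule import obj   # absolute import, resolved from the root

def main() -> None:
    print(obj)   # hypothetical: call whatever actually starts your program

if __name__ == "__main__":
    main()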
The problem you encountered is the natural behavior of most languages. A program only knows about its working path (the path it is started in), the paths registered in the environment variables, and relative paths.
The "magic" of the executable you created is therefore that it collects all the scripts/modules needed and copies/combines them next to or into the executable. The executable then runs within the directory where all the other scripts also reside, and voila.
If you are not happy with your workaround of creating an executable every time you want to run your project without PyDev, there are two alternatives.
The first, though not the one I would suggest, is registering the working path in the environment variables.
The second, and the one I think is much better: create a shortcut to the Python executable and alter the "Target:" text field, appending the path to the script you would like to run. Then alter the "Start in:" text field and enter the project directory. After you have done this you should be able to start your project with a simple double click.
(If you rely on external libraries which are neither on the path nor in your project, you could look into appending paths temporarily to the Python path via the sys module.)
I hope I could help a bit.
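A minimal sketch of the sys.path idea mentioned in the parenthetical above, assuming xxx.py sits one level below the project root (adjust the number of dirname() calls to your layout):

# at the very top of xxx.py, before the package imports
import os
import sys

project_root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.insert(0, project_root)

from src.apackage.amodule import obj   # now resolvable on double click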