When I activate a virtual environment via poetry shell, GPG signing breaks because $GPG_TTY is still set for the parent shell's TTY. So every time I use poetry shell I have to run GPG_TTY=$(tty) manually. Is there a way to do this automatically? Ideally I would set this up once globally rather than for every poetry project.
Poetry includes a plugin system from version 1.2 onwards, and .env support, if that works for your use case, was one of the first plugins written. You can activate it with
poetry plugin add poetry-dotenv-plugin
If plugins are not for you, this pattern works with most shells and applies globally for your user:
~/.bashrc
alias poetry='GPG_TTY=$(tty) poetry'  # single quotes so $(tty) is evaluated when the alias runs, not when .bashrc is sourced
Arne's answer is correct; however, with more recent versions of Poetry you should use poetry self add poetry-dotenv-plugin.
We have to use micromamba for our app because conda is prohibitively slow for installing our packages. We use a devcontainer to install micromamba and its packages. This works for the VS Code terminal but the editor still cannot find my packages.
I only see a way to activate the micromamba environment with a shell script snippet or shell rc file. That works for the terminal, but I don't see a way to activate it for the editor processes. The closest setting I found is specific to venvs.
The solution was to add a .env file that sets PYTHONPATH to the environment's packages, then set "python.envFile" in .vscode/settings.json to point to that .env file.
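A minimal sketch of that setup; the micromamba prefix, environment name, and Python version here are placeholders you would replace with your own:
.env
PYTHONPATH=/opt/micromamba/envs/myenv/lib/python3.10/site-packages
.vscode/settings.json
{
    "python.envFile": "${workspaceFolder}/.env"
}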
VSCode wouldn't show venvs created with Poetry in the "change kernel" window.
I tried this thread, but it didn't work; the topic is also two years old.
Even though I would like to avoid installing all the dependencies in a local folder for each project (one of the suggested answers), I still tried it, but VSCode doesn't see that either (a .venv folder created inside the project).
Right now Poetry saves all the venvs in ~/.cache/pypoetry/virtualenvs, which is what poetry shell activates for any of the projects. I added that path to settings.json with
"python.venvPath": "/home/gobsan/.cache/pypoetry/virtualenvs",
"python.venvFolders": [
"/home/gobsan/.cache/pypoetry/virtualenvs"
],
twice, just in case, but it is greyed out and doesn't seem to work.
I have also tried to change poetry config virtualenvs.path to
~/.local/share/pypoetry/venv/bin
~/.local/share/pypoetry/venv
~/.local/share/pypoetry
hoping VSCode would see it, since it can see something there.
My main goal is to be able to see and switch between different venvs inside Jupyter. Switching poetry venvs for python scripts is no problem.
Thank you for any help.
PS: working through WSL2.
You should end your path with a forward slash, like "/home/gobsan/.cache/pypoetry/virtualenvs/".
I was just researching Poetry and VSCode and ended up doing this as well.
Here's my own path and what my interpreter looks like after the settings update (screenshot omitted):
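For reference, a sketch of the corresponding settings.json entry, using the asker's path with the trailing slash added (adjust the home directory to your own):
{
    "python.venvPath": "/home/gobsan/.cache/pypoetry/virtualenvs/"
}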
The settings for the Python extension in VS Code have changed. I was able to select my Poetry virtual environment as my interpreter/ipynb kernel again after changing the dated python.pythonPath setting (yours might be python.venvPath) to python.defaultInterpreterPath in the VS Code settings.json file.
I don't know if changing this setting will allow VS Code to automatically pick up other Poetry virtual environment options listed under poetry env info --path in your CLI. The guidance here will hopefully be of use for that: https://code.visualstudio.com/docs/python/environments#_manually-specify-an-interpreter
{
    "python.defaultInterpreterPath": "/Users/myname/Library/Caches/pypoetry/virtualenvs/projectname-randomnumbers-py3.9/bin/python"
}
(my work computer is a Mac)
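If you need the interpreter path to put in that setting, the poetry env info --path command mentioned above prints the environment root; a sketch (the project directory is hypothetical):
cd yourproject
poetry env info --path   # prints the venv root; append /bin/python for the interpreter path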
Assume that VSCode is installed and an Anaconda environment is set up. The default conda environment contains language analysis and compilation tools like stack (for Haskell) and yamllint (for YAML). However, since conda 3.4 they live under <conda_install_dir>/bin, which is not included in the system PATH environment variable; the only executable on the PATH is conda itself.
After installing VSCode and its plugins, I frequently encounter error messages indicating that the executables for these tools cannot be found. E.g. for stack, the message is:
Project requires Cabal but it isn't installed
It appears that the only way to make it work is to override the PATH variable in VSCode so that it differs from the one the OS uses. Is there an option that allows this override?
Thanks a lot for your help
Have you tried updating your PATH to simply include it?
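For example, a sketch that appends the conda directory to PATH in your shell rc, so VSCode inherits it when launched from that shell; the install directory is a placeholder for your actual <conda_install_dir>:
~/.bashrc
export PATH="$HOME/anaconda3/bin:$PATH"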
I'm using a Python library which uses click autocompletion. Since I've installed the library in a conda env, I'd like the autocomplete to be associated with it. (Also, since it isn't installed in my primary Python env, adding eval "$(_FOO_BAR_COMPLETE=source_zsh foo-bar)" to my .zshrc doesn't work.) The documentation for the library I'm using says "if gradient was installed in a virtual environment, the following has to be added to the activate script":
eval "$(_GRADIENT_COMPLETE=source gradient)"
I originally added this to ~/miniconda3/envs/my_env/lib/python3.6/venv/scripts/common/activate, but the autocompletion didn't work. Running
source ~/miniconda3/envs/my_env/lib/python3.6/venv/scripts/common/activate
does work, but my shell prepends via __VENV_DIR__ to the prompt, and the fact that this doesn't happen automatically when I run conda activate my_env makes me think this is the wrong way to do it (for one, it isn't disabled when I run conda deactivate).
What I'm looking for is the canonical way to add a script to run upon conda activate x, then end upon conda deactivate x. This seems very close, but it's for adding shell variables with export and unset. Is there a way to do it with click's autocomplete?
Following the instructions in the docs with a small modification seemed to work for me: I placed the eval statement in env_vars.sh, and put nothing in deactivate.d.
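Concretely, a sketch using conda's per-environment activation hooks (the env name my_env and the miniconda path follow the question; etc/conda/activate.d is the directory conda sources on every activation):
mkdir -p ~/miniconda3/envs/my_env/etc/conda/activate.d
# sourced automatically on each `conda activate my_env`
echo 'eval "$(_GRADIENT_COMPLETE=source gradient)"' > ~/miniconda3/envs/my_env/etc/conda/activate.d/env_vars.sh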
My understanding is that export persists in the shell for the rest of the session and so must be undone with a corresponding unset, whereas eval only takes effect for that session, so as soon as the conda env is deactivated it no longer has an effect.
Would be happy to hear more from someone with a deeper understanding of bash/conda under the hood!
Background
I use Anaconda's IPython on my Mac and it's a great tool for data exploration and debugging. However, when I want to use IPython for programs that require a virtualenv (e.g. a Django web app), I don't want to have to reinstall IPython every time.
Question
Is there a way to use my local IPython while also using the rest of my virtualenv packages? (i.e. make IPython the one exception to the virtualenv packages so that the local IPython setup is available no matter what.) If so, how would you do this on a Mac? My guess is that it would take some nifty .bash_profile changes, but my limited knowledge of it hasn't been fruitful. Thanks.
Example Usage
Right now if I'm debugging a program, I'd use the following:
import pdb
pdb.set_trace() # insert this to pause program and explore at command line
This brings it to the command line (which I wish were IPython).
If you have a module in your local Python and not in the virtualenv, it will still be available in the virtualenv, unless you shadow it with another version installed in the virtualenv. Did you try to launch your local IPython from inside a running virtualenv that didn't have its own IPython? It should work.
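A sketch of that workflow; note the --system-site-packages flag is my assumption, since with a stock virtualenv the global packages would otherwise not be visible:
virtualenv --system-site-packages myenv  # keep global packages, including IPython, importable
source myenv/bin/activate
python -m IPython  # runs the global IPython module under the venv's interpreter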
Will, I assume you are using Anaconda's "conda" package manager (which combines the features of pip and virtualenv)? If so, you should be aware that many parts of it do not work exactly like the tools it replaces. E.g. if you use conda create -n myenv to create your virtual environment, this differs from a "normal" virtualenv in a number of ways. In particular, there are no "global/default" packages: even the default installation is essentially an environment ("root") like all the other environments.
To obtain the usual virtualenv behavior, you can create your environments by cloning the root environment: conda create -n myenv --clone root. However, unlike with regular virtualenv, any changes you later make to the default installation (the "root" environment in conda) are not reflected in environments created by cloning it.
An alternative to cloning the root is to keep an updated list of "default packages" that you want to be available in new environments. This is managed by the create_default_packages option in the condarc file.
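For example, a sketch of that option in the condarc file (the package list is illustrative):
~/.condarc
create_default_packages:
  - ipython
  - pip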
In summary: don't treat your conda environments like regular Python virtualenvs, even though they appear deceptively similar in many regards. Hopefully the two implementations will converge at some point.