Install conda package from Google Datalab - jupyter

I'm looking to use the ospc taxcalc package in a Google Datalab notebook. This package must be installed via conda.
Datalab doesn't have conda by default, so this method (from https://stackoverflow.com/a/33176085/1840471) fails:
%%bash
conda install -c ospc taxcalc
Installing via pip also doesn't work:
%%bash
pip install conda
conda install -c ospc taxcalc
ERROR: The install method you used for conda--probably either pip install conda or easy_install conda--is not compatible with using conda as an application. If your intention is to install conda as a standalone application, currently supported install methods include the Anaconda installer and the miniconda installer. You can download the miniconda installer from https://conda.io/miniconda.html.
Following that URL, I tried this:
%%bash
wget https://repo.continuum.io/miniconda/Miniconda2-latest-Linux-x86_64.sh
bash Miniconda2-latest-Linux-x86_64.sh
wget works, but the bash install command just stayed in a "Running..." state seemingly forever.
This seems to be due to the conda installer prompting for several Enter keystrokes to review the license, and then for a yes indicating acceptance of the license terms. So conda's silent mode installation looked promising:
%%bash
bash Miniconda2-latest-Linux-x86_64.sh -u -b -p $HOME/miniconda
This produced the following warning:
WARNING: You currently have a PYTHONPATH environment variable set. This may cause unexpected behavior when running the Python interpreter in Miniconda2. For best results, please verify that your PYTHONPATH only points to directories of packages that are compatible with the Python interpreter in Miniconda2: /content/miniconda
And it doesn't make the conda command available:
%%bash
conda install -c ospc taxcalc
bash: line 1: conda: command not found
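For what it's worth, the silent install itself appears to succeed; the conda binary just isn't on the cell's PATH, and each %%bash cell runs in its own subshell, so any PATH change must happen in the same cell that uses it. A sketch, assuming the -b -p $HOME/miniconda prefix used above:

```shell
# Each %%bash cell is its own subshell, so export PATH in the same cell
# that calls conda (prefix matches the -b -p flag used above):
export PATH="$HOME/miniconda/bin:$PATH"
command -v conda || echo "conda still not on PATH"
# Once it resolves, the original install should work:
# conda install -y -c ospc taxcalc
```

Calling the binary by its absolute path ("$HOME/miniconda/bin/conda") works the same way without touching PATH.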

There is an open GitHub issue tracking this work: https://github.com/googledatalab/datalab/issues/1376
I believe we will need to install conda and use it for Python, pip, and all other Python packages; in the interim it may not be possible to mix the two Python environments. However, someone with more experience with conda may know otherwise.

As of the 2018-02-21 release, Datalab supports Conda and kernels are each in their own Conda environment.

Related

I cannot install OpenMDAO

I have installed Anaconda, since it was recommended on the OpenMDAO website. After that I typed pip install 'openmdao[all]' as instructed. However, I get an error message saying, "ERROR: Invalid requirement: "'openmdao[all]'"". How can I solve this issue? I have no knowledge of Python or Anaconda, so I don't have a clue what to do about this situation. I searched the internet but didn't find a solution. Thanks in advance!
It's possible you're running the wrong pip executable. Try which pip - it should be located under your home directory instead of a system path such as /bin/pip or /usr/bin/pip.
Make sure you've created and activated an environment in Anaconda:
conda create -n myenv pip
conda activate myenv
python -m pip install 'openmdao[all]'
That command sequence gives the highest chance of success. The single quotes are required in shells like zsh.
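The quoting point deserves a footnote: in zsh, an unquoted [all] is treated as a glob pattern, and with no matching file the whole command aborts before pip ever runs. Single quotes keep the brackets literal in any shell, which is easy to see without installing anything:

```shell
# The quoted form passes the extras specifier through to pip untouched:
printf '%s\n' 'openmdao[all]'   # prints: openmdao[all]
```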
Remove the single quotes:
pip install openmdao[all]

Not able to install feature-engine module

I am trying to install the feature-engine module on Anaconda.
This is the error I am getting:
Package is not available from current channels
(channels searched: repo.anaconda win-64, noarch, etc.)
Can you please help me with the problem?
Thanks,
RD
To install from Anaconda:
conda install -c conda-forge feature_engine
I believe that feature-engine is not available through Anaconda channels for installation with conda install. I was able to install it via pip. Here is how I did it (on Windows):
open a CMD and run conda activate <<VIRTUALENV>>. This is the environment you created for your project. If you have not created one, use base, the default one.
cd to the location of your pip installation within that activated conda virtual environment (mine was within my user folder, in \AppData\Local\Continuum\anaconda3\envs\<<VIRTUALENV>>\Scripts).
in there, run pip install feature-engine
you should now be able to see it listed under pip freeze or pip list, but not under conda list.
Finally, go to your code location and run it. Remember to activate that same <<VIRTUALENV>> each time you open a new CMD.
Hope it helps.
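One way to double-check where a pip-installed package actually ended up is from Python itself. A small sketch (feature_engine is the import name of the package above; the helper name is my own):

```python
import importlib.util
import sys

def module_location(name):
    """Return the file a module would be imported from, or None if not found."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# Which interpreter (and therefore which environment) is running:
print(sys.executable)

# After `pip install feature-engine` in the activated env, this should be a
# path inside that env's site-packages; None means pip installed it into a
# different environment than the one this interpreter belongs to:
print(module_location("feature_engine"))
```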
If you are using Jupyter Notebooks, it might be the case that your Jupyter Notebook is not actually running the kernel from your (activated!) Anaconda environment (via this answer), but the generic Python 3 kernel, which can only import packages from your global Anaconda environment.
You can check for this by importing a package that is installed in your global environment (e.g., pandas), while running a notebook:
import pandas
pandas.__file__
If you see something like this (on Windows), you are indeed running the wrong kernel (as you would expect the package to be loaded from the activated environment):
'C:\\Users\\<user>\\Anaconda3\\lib\\site-packages\\pandas\\__init__.py'
Therefore, in your Anaconda Prompt, you have to create a new kernel with ipykernel (assuming cenv is your environment of interest):
$ conda activate cenv # . ./cenv/bin/activate in case of virtualenv
(cenv)$ conda install ipykernel
(cenv)$ ipython kernel install --user --name=<any_name_for_kernel>
(cenv)$ jupyter notebook
Now, in the restarted Jupyter Notebook you can change the kernel via the menu: Kernel > Change kernel > <any_name_for_kernel>
Importing the same package, like pandas, should show the following file path:
'C:\\Users\\<user>\\Anaconda3\\envs\\<cenv>\\lib\\site-packages\\pandas\\__init__.py'
and you should be able to import any package installed in that Anaconda environment.
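A kernel-agnostic version of the same check, for anyone who'd rather not rely on pandas being installed (just a sketch; cenv is the example environment name from above):

```python
import sys

# sys.prefix is the root of the environment the running kernel belongs to.
# For a conda env named cenv it ends in ...\envs\cenv (Windows) or
# .../envs/cenv (Linux/macOS); the base env has no envs component.
print(sys.prefix)
print("inside a named env:", "envs" in sys.prefix.replace("\\", "/").split("/"))
```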

jupyterhub and conda environments

I'm a little bit confused about the relationship between conda environments and JupyterHub.
As the JupyterHub documentation says, it can be installed from conda, so it is possible to use some conda environment (for example the "root" environment) and run "conda install jupyterhub" from it.
Jupyter will then "live" in the same environment, and installing nb_conda in this environment gives the ability to select kernels and other conda environments in notebooks.
My question is about software like nbextensions and ipywidgets. Where should they go: in the same environment as JupyterHub, or in the environment corresponding to a new notebook?
The relationship between conda and Jupyter can be a confusing one. Think of conda as your environment, and Jupyter as just another package - one that you can start a process with and then serve.
To answer your question: they should be installed in your conda environment. Unfortunately it is a little more complicated than that - these extensions will be available to all users. I haven't personally tested a single user having extra extensions in a different environment (I'm not sure it is possible, but I will update this answer if I test it).
If it helps, this is what the docs have to say for the matter:
To install the jupyter_contrib_nbextensions notebook extensions, three steps are required. First, the Python pip package needs to be installed. Then, the notebook extensions themselves need to be copied to the Jupyter data directory. Finally, the installed notebook extensions can be enabled, either by using built-in Jupyter commands, or more conveniently by using the jupyter_nbextensions_configurator server extension, which is installed as a dependency of this repo.
Assuming you installed the extensions via conda:
conda install -c conda-forge jupyter_contrib_nbextensions
then the --sys-prefix was used, which is good. From the docs:
--sys-prefix to install into python's sys.prefix, useful for instance in virtual environments, such as with conda.
So, to add an extension, the process should look like this:
$ sudo su -
$ pip install fileupload
$ jupyter nbextension install --sys-prefix --py fileupload
$ jupyter nbextension enable fileupload --py --sys-prefix
Since the title asks about conda environments, I'll go into that a little bit too. I've tested these methods on Ubuntu 18.04 LTS.
Very often you will want to allow users to share user-created environments without ever having root privileges. There are two good options I have seen (please comment if you know other methods): 1) share an environment, or 2) duplicate an environment from a requirements file. Don't forget you'll have to add the environment as a kernel as well.
method 1 - shared environment
Create a environment in a shared location, and then have both users add it as a kernel.
conda create -p /home/envs/test --clone root
You can clone root to copy the root env, or base for the base environment. /home/envs/test creates a "test" environment in the "envs" directory. Make sure envs has all the necessary permissions for the users who will be using the files.
From there as another user, just add the environment as a kernel.
$ sudo su - <user-to-install-kernel-to>
$ conda activate <test>
$ python -m ipykernel install --user --name test \
--display-name "Python (test)"
Note: I believe I had to update the kernelspec manually to get it to point to the correct Python environment 🤦‍♂️
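For reference, "updating the kernelspec manually" means editing the kernel.json that ipykernel wrote (under ~/.local/share/jupyter/kernels/<name>/ for a --user install on Linux) so that the first argv entry is the target environment's own python. A sketch of what the corrected file contains (the interpreter path is hypothetical):

```python
import json

# kernel.json in the shape ipykernel writes it; argv[0] must point at the
# target environment's interpreter (path below is hypothetical):
kernelspec = {
    "argv": [
        "/home/envs/test/bin/python",
        "-m", "ipykernel_launcher",
        "-f", "{connection_file}",
    ],
    "display_name": "Python (test)",
    "language": "python",
}
print(json.dumps(kernelspec, indent=1))
```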
method 2 - duplicated environment
Alternatively, just create a copy of the environment:
$ conda env export --name test > environment.yml
$ sudo su - customer
$ conda env create --name cust-env-copy --file environment.yml
$ python -m ipykernel install --user --name cust-env-copy \
--display-name "Python (test)"

jupyterhub: command not found

I am trying to run JupyterHub locally on Ubuntu 16.04; however, I cannot seem to run the jupyterhub command in the terminal - I get a command not found error. I installed jupyterhub by running the following commands:
sudo npm install -g configurable-http-proxy
pip3 install jupyterhub
pip3 install --upgrade notebook
All of the above packages install successfully. My PATH variable in /etc/environment has been set as follows:
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/root/bin"
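Worth noting: the PATH above contains no per-user script directory. pip3 run without sudo typically installs console scripts into ~/.local/bin, so a likely fix is simply (a sketch, assuming a user-level install):

```shell
# Add pip's per-user script directory to PATH, then look for the command:
export PATH="$HOME/.local/bin:$PATH"
command -v jupyterhub || echo "jupyterhub not in ~/.local/bin either"
```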
You need to start a Jupyter service. If you create environments with Anaconda, you can later use them inside Jupyter as kernels, so you can have as many library environments as you like.
There are also options to set resource guarantees or limits, or initial folders for your hub.
You can edit all of this in your jupyterhub_config.py file.
I'll leave you my GitHub repo with a complete guide to the configuration and installation:
Github/Jupyter
First, install Anaconda.
Then install the configurable proxy.
Run these commands:
conda install -c conda-forge jupyterhub # installs jupyterhub and proxy
conda install notebook # needed if running the notebook servers locally
Now try jupyterhub

There is no activate when I am trying to run my virtual env

1) I installed virtualenv using pip.
2) I ran the command virtualenv venv
3) Then I ran source venv/bin/activate but it says that there is no such file or directory.
When I cd into venv/bin I find 3 things - python, python2.7, and python3.5. Does anyone know the problem?
I have had the same problem, and what I did was just run the command
virtualenv env
again. More files were then generated under the env/bin directory, including the activate file. It's so weird.
I solved a similar problem by running python3.7 -m venv venv; you can change this to whatever version of Python is installed in your environment.
According to the Python docs, the installation steps are:
$ python3 -m pip install --user virtualenv
$ python3 -m venv env
The last command gives a warning message,
The virtual environment was not created successfully because ensurepip is not
available. On Debian/Ubuntu systems, you need to install the python3-venv
package using the following command.
apt-get install python3-venv
You may need to use sudo with that command. After installing the python3-venv
package, recreate your virtual environment.
$ sudo apt-get install python3-venv
Now, activate is available.
I solved a similar problem by naming it venv2 when I ran virtualenv. I already had a virtual environment named venv for another project. This allowed me to proceed.
I experienced this problem when using the --upgrade option. Removing the option made everything run as expected.
I suspect it was caused by a networking issue; I ran the command twice to get the activate script installed. Maybe the first time it couldn't connect to some source and just aborted the installation.
I had this happen on Raspbian when I hadn't installed python3-pip before creating the venv.
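A common thread in these answers: when venv creation fails partway (missing ensurepip, a network hiccup, a missing system package), bin/activate is simply never written. Once the underlying problem is fixed, deleting the half-created directory and recreating it regenerates the script. A sketch:

```shell
# After fixing the root cause (e.g. sudo apt-get install python3-venv),
# remove the half-created environment and recreate it:
rm -rf venv
python3 -m venv venv
if [ -f venv/bin/activate ]; then echo "activate is back"; fi
```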