I'm trying to import a self-defined module in a Jupyter notebook using PyCharm (2016.1). However, I always get "ImportError: No module named xxx". Importing packages like NumPy or Matplotlib works fine. The self-defined module and the notebook are in the same directory and I've tried to set the directory as sources root. How can I fix this? Thanks a lot!
If you run the following in your notebook...
import sys
sys.path
...and you don't see the path to the directory containing the packages/modules, there are a couple of ways around it. I can't say exactly why this happens in your case, but I have seen discrepancies in the results of sys.path when running Jupyter locally from PyCharm on OS X versus on a managed Linux service.
An easy, if hacky, workaround is to set the sys path in your notebook to reflect where the packages/modules are rooted. For example, if your notebook lives in a subdirectory of the directory containing the packages or modules, and sys.path only includes that subdirectory:
import sys
sys.path.append("../")
The point is that sys.path must include the directory the packages and modules are rooted in, so the path you append will depend on your layout.
Perhaps a more proper solution, if you are using a virtualenv as your project interpreter, is to create a setup.py for your project and install it as an editable package with pip (e.g. pip install -e .). Then, as long as Jupyter is running from that virtualenv, there shouldn't be any issues with imports.
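A minimal setup.py for this purpose might look like the following sketch; the project name and version are placeholders, and find_packages assumes your packages contain __init__.py files:

```python
# Minimal packaging config for `pip install -e .` (run from the project
# root). The name and version below are placeholders -- use your own.
from setuptools import setup, find_packages

setup(
    name="myproject",
    version="0.1.0",
    packages=find_packages(),  # picks up directories containing __init__.py
)
```

After pip install -e ., edits to the source tree are picked up without reinstalling.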
One ugly gotcha I ran into on OS X was Jupyter referencing the wrong virtualenv when started. This should also be apparent from inspecting the results of sys.path. I don't really know how I unintentionally managed to set this, but I presume it was due to futzing around the first time I got Jupyter working in PyCharm. Instead of starting Jupyter with the local virtualenv, it would run with the one defined in ~/Library/Jupyter/kernels/.python/kernel.json. I was able to clear it by cleaning out that directory, e.g. rm -r ~/Library/Jupyter/kernels/.python.
As stated by Thomas in the comments, make sure that your notebook-serving path and project path are the same. When you start your notebook in PyCharm you should see something like this:
Serving notebooks from local directory: <path to your project root folder>
Related
I installed SimpleITK in Anaconda using the command
conda install -c simpleitk simpleitk
and then followed the link https://github.com/SimpleITK/SimpleITKCondaRecipe to build it, but it's not connecting to itk.org to build.
import SimpleITK as sitk works in a Jupyter notebook, but sitk.Show() is not working. Moreover, when I tried to follow the commands from http://insightsoftwareconsortium.github.io/SimpleITK-Notebooks/Python_html/00_Setup.html,
from downloaddata import fetch_data, fetch_data_all is not working.
Even the command fetch_data_all(os.path.join('..','Data'), os.path.join('..','Data','manifest.json')) is not working. I am very new to SimpleITK and don't know whether this is because the build didn't complete. Please tell me how to solve these problems; I have been trying for many days. Also, how do I make ImageJ the default viewer for SimpleITK? I know it's a lot of questions, but I would be grateful if they were solved.
You seem to be having multiple problems, all of which have to do with setting up a working environment and are less specific to SimpleITK.
You installed SimpleITK using the conda install command, so there was no need to build it using the conda build command. Check that you have it installed correctly and see which version you have:
import SimpleITK as sitk
print(sitk.Version())
The functions fetch_data and fetch_data_all are part of a module found in the SimpleITK notebooks repository. To use the code from that repository you will need to clone it using git:
git clone https://github.com/InsightSoftwareConsortium/SimpleITK-Notebooks.git
Then you can run the notebooks or copy the relevant modules to your directory and work with them there.
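If you want to use the module from your own directory instead, one option is to put the repository's Python directory on sys.path (the clone location below is a placeholder; adjust it to wherever you ran git clone):

```python
import sys

# Hypothetical clone location -- adjust to wherever you cloned the repo.
# The downloaddata module lives in the repository's Python directory.
repo_python_dir = "/path/to/SimpleITK-Notebooks/Python"
sys.path.append(repo_python_dir)

try:
    from downloaddata import fetch_data, fetch_data_all
except ImportError:
    # Only importable once the repository is actually cloned and the
    # path above points at its Python directory.
    print("downloaddata not found -- check the clone path")
```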
The sitk.Show() command assumes that you have the ImageJ/Fiji program installed, which is likely why it is not working (I am guessing here, as you did not provide sufficient detail).
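As for making ImageJ the default viewer: SimpleITK's Show() consults the SITK_SHOW_COMMAND environment variable before falling back to its built-in ImageJ/Fiji search, so pointing that variable at your ImageJ executable should do it. The path below is a placeholder for your actual install location:

```python
import os

# SITK_SHOW_COMMAND tells SimpleITK's Show() which external viewer to
# launch. The path is hypothetical -- substitute your ImageJ location.
os.environ["SITK_SHOW_COMMAND"] = "/path/to/ImageJ"
```

With that set (and ImageJ actually installed at that path), sitk.Show(image) should open the image in ImageJ.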
The missing libcublas.so problem has been around for some time. The most common cause is that the $PATH and $LD_LIBRARY_PATH environment variables are not set properly. Solutions for command-line scenarios have been posted in the NVIDIA forum and here.
But no specific solution has been posted for the same symptoms in IPython or the notebook. Here is my own workaround.
The problem is still due to environment variables: IPython and the notebook do not inherit the $PATH and $LD_LIBRARY_PATH you set in your shell. So when this happens, the first thing to check is:
import os; print(os.environ.get('PATH')); print(os.environ.get('LD_LIBRARY_PATH'))
Most probably the CUDA bin and lib paths are missing from these variables.
To solve this for ipython, use sudo PATH=$PATH LD_LIBRARY_PATH=$LD_LIBRARY_PATH ipython when starting ipython.
And for notebook, add these lines to the end of jupyter_notebook_config.py:
import os
# Append the CUDA paths rather than overwriting whatever is already set.
os.environ['PATH'] += ':/usr/local/cuda/bin'
os.environ['LD_LIBRARY_PATH'] = (
    os.environ.get('LD_LIBRARY_PATH', '') + ':/usr/local/cuda/lib64'
).lstrip(':')
I'm using Python virtualenv and I'm having a problem with a module (installed inside the virtualenv).
First of all, I activate the virtualenv:
source path_to_virtualenv/bin/activate
The virtualenv is correctly activated (its name appears in the shell). Inside that virtualenv, I installed mininet (a network simulator): I'm sure it is correctly installed (it is listed by the command pip list).
However, when I try to run my application, I get the following error on a module of the Mininet API:
from mininet.net import Mininet
ImportError: No module named net
How is this possible? An IDE correctly detects all of Mininet's modules (in the same virtualenv); does anyone have any ideas?
Thanks
Check to see if your project is in the same directory as your virtual environment.
If not, activate your virtualenv in the command prompt and then cd to the project.
Hope that helps a little.
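A quick way to diagnose this kind of mismatch is to print which interpreter is actually running. One guess, given the details provided: Mininet usually needs root, and sudo resets the environment, so a script run under sudo may use the system Python instead of the virtualenv's. A minimal check:

```python
import sys

# If sys.executable does not point inside your virtualenv's bin
# directory, the activated environment is not the one running this
# script, and its site-packages (where mininet lives) won't be searched.
print(sys.executable)
print(sys.prefix)
for entry in sys.path:
    print(entry)
```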
I am running IPython Notebook on Enthought's Canopy 64 bit distribution, Ubuntu 14.04.
I've tried installing libtiff, but when I import it in IPython Notebook, the kernel always dies at the import statement. What could possibly be causing this? Canopy is my default Python distribution and my paths all seem to be set up appropriately, although I'm convinced that something in my Python setup is borked.
Any advice is appreciated.
EDIT: I'll be more specific. Output of sys.path:
['',
'/home/joe/Enthought/Canopy_64bit/User/src/svn',
'/home/joe/Canopy/appdata/canopy-1.4.1.1975.rh5-x86_64/lib/python27.zip',
'/home/joe/Canopy/appdata/canopy-1.4.1.1975.rh5-x86_64/lib/python2.7',
'/home/joe/Canopy/appdata/canopy-1.4.1.1975.rh5-x86_64/lib/python2.7/plat-linux2',
'/home/joe/Canopy/appdata/canopy-1.4.1.1975.rh5-x86_64/lib/python2.7/lib-tk',
'/home/joe/Canopy/appdata/canopy-1.4.1.1975.rh5-x86_64/lib/python2.7/lib-old',
'/home/joe/Canopy/appdata/canopy-1.4.1.1975.rh5-x86_64/lib/python2.7/lib-dynload',
'/home/joe/Enthought/Canopy_64bit/User/lib/python2.7/site-packages',
'/home/joe/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/PIL',
'/home/joe/opencv-2.4.9',
'/home/joe/Canopy/appdata/canopy-1.4.1.1975.rh5-x86_64/lib/python2.7/site-packages',
'/home/joe/Canopy/appdata/canopy-1.4.1.1975.rh5-x86_64/lib/python2.7/site-packages/IPython/extensions']
As for how to install Python packages, I assume I go to ~/Enthought/Canopy_64bit/User/lib/python2.7/site-packages and run pip, setup.py, or a shell script, per the specific package's instructions. Is that correct? The article that I linked has the following line: "To install a package which is not available in the Canopy / EPD repository, follow standard Python installation procedures from the OS command line.", which seems to imply that I install per package instructions.
In .bashrc, I have the following:
VIRTUAL_ENV_DISABLE_PROMPT=1 source /home/joe/Enthought/Canopy_64bit/User/bin/activate
export PYTHONHOME=/home/joe/Enthought/Canopy_64bit/User/bin
export PATH=/home/joe/Enthought/Canopy_64bit/User/bin
export PYTHONPATH=/home/joe/Enthought/Canopy_64bit/User/bin
From what I understand of the linked articles, this means I'm setting Canopy User as my default Python distribution. I'm sure I'm doing something a bit over my head here, but I can't understand what else I need to do to fix this issue.
Worse yet, now I'm getting an "ImportError: No module named site" with these .bashrc settings, when trying to start IPython notebook or python from the command line. I can run only from the Canopy GUI.
Closing this. I made it harder than necessary.
It turns out, the PYTHONHOME and PYTHONPATH .bashrc variables were causing some conflicts. Commenting them out seems to have resolved the issue.
Installing outside packages does, indeed, happen from the home (~) directory.
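For reference, a .bashrc that activates the Canopy User environment without the problematic variables might look like this (the paths are the ones from the question; the commented-out lines are the ones that caused the "No module named site" error):

```shell
# Activate Canopy's User environment; prepend to PATH rather than
# replacing it, so system commands stay available.
VIRTUAL_ENV_DISABLE_PROMPT=1 source /home/joe/Enthought/Canopy_64bit/User/bin/activate
export PATH="/home/joe/Enthought/Canopy_64bit/User/bin:$PATH"

# Leave these unset -- overriding them breaks the interpreter's own
# module resolution:
# export PYTHONHOME=...
# export PYTHONPATH=...
```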
Background
I use Anaconda's IPython on my Mac and it's a great tool for data exploration and debugging. However, when I wish to use IPython for programs that require a virtualenv (e.g. a Django web app), I don't want to have to reinstall IPython every time.
Question
Is there a way to use my local IPython while also using the rest of my virtualenv packages? (I.e., make IPython the one exception to the virtualenv so that the local IPython setup is available no matter what.) If so, how would you do this on a Mac? My guess is that it would involve some nifty .bash_profile changes, but my limited knowledge there hasn't been fruitful. Thanks.
Example Usage
Right now if I'm debugging a program, I'd use the following:
import pdb
pdb.set_trace() # insert this to pause program and explore at command line
This would bring it to the command line (that I wish was IPython)
If you have a module in your local Python and not in the virtualenv, it can still be available in the virtualenv, unless you shadow it with a different version installed in the virtualenv. Did you try launching your local IPython from a running virtualenv that didn't have its own IPython? It should work.
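Whether globally installed packages are visible inside an environment depends on how it was created: classic virtualenv exposes them via the --system-site-packages flag, and Python 3's stdlib venv mirrors that flag. A sketch using venv (the environment name and location here are arbitrary):

```python
import os
import subprocess
import sys
import tempfile

# Create an environment that can see globally installed packages (such
# as a globally installed IPython) while still isolating anything you
# pip-install into it.
target = os.path.join(tempfile.mkdtemp(), "myenv")  # placeholder name
subprocess.run(
    [sys.executable, "-m", "venv", "--system-site-packages", target],
    check=True,
)
print("created", target)
```

The classic-virtualenv equivalent is virtualenv --system-site-packages myenv.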
Will, I assume you are using Anaconda's "conda" package manager (which combines the features of pip and virtualenv)? If so, you should be aware that many parts of it do not work exactly like the tools it replaces. E.g. if you are using conda create -n myenv to create your virtual environment, this differs from "normal" virtualenv in a number of ways. In particular, there are no "global/default" packages: even the default installation is essentially an environment ("root") like all the other environments.
To obtain the usual virtualenv behavior, you can create your environments by cloning the root environment: conda create -n myenv --clone root. However, unlike for regular virtualenv, if you make changes to the default installation (the "root" environment in conda) these changes are not reflected in the environments that were created by cloning the root environment.
An alternative to cloning the root is to keep an updated list of "default packages" that you want to be available in new environments. This is managed by the create_default_packages option in the condarc file.
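A sketch of the relevant section of ~/.condarc (the package names are just examples):

```yaml
# Packages listed under create_default_packages are installed into every
# new environment made with `conda create` (unless the command is run
# with --no-default-packages).
create_default_packages:
  - ipython
  - pip
```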
In summary: Don't treat your conda environments like regular python virtualenvs - even though they appear deceptively similar in many regards. Hopefully at some point the two implementations will converge.