Reputation: 8550
I'm using Jupyter Notebook and I'm trying to import tensorflow. Here's the error I get:
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-15-64156d691fe5> in <module>()
----> 1 import tensorflow as tf
ModuleNotFoundError: No module named 'tensorflow'
I'm starting the notebook server by typing jupyter notebook inside my virtual environment:
(labs) Sahands-MBP:part1 sahandzarrinkoub$ jupyter notebook
tensorflow is definitely installed in the virtual environment:
(labs) Sahands-MBP:part1 sahandzarrinkoub$ python
Python 3.6.4 |Anaconda, Inc.| (default, Jan 16 2018, 12:04:33)
[GCC 4.2.1 Compatible Clang 4.0.1 (tags/RELEASE_401/final)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow
>>>
So why isn't it found? Where does Jupyter look for packages? I've even installed tensorflow outside of my virtualenv.
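For what it's worth, I understand one way to see which interpreter and search path the notebook is actually using is to run something like this in a cell:
import sys
print(sys.executable)  # the Python binary backing the running kernel
print(sys.path)        # directories searched when importing modules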
Upvotes: 3
Views: 11186
Reputation: 31
It could be that your Jupyter Notebook is using a different Python than the one where you installed tensorflow. You want to make sure Jupyter opens the correct Python through the correct kernel. One way to fix this is to install nb_conda by typing the following in your terminal:
conda install -c anaconda-nb-extensions nb_conda
You can then run Jupyter and, when creating a notebook from the New dropdown menu, make sure you select the Python that is connected to the kernel you want (the kernel of the environment that has all the libraries and dependencies you need, such as tensorflow).
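For reference, one way to see which kernels Jupyter already knows about is the following command (each listed directory contains a kernel.json whose argv shows the Python it launches):
jupyter kernelspec list
If the kernel you select launches a Python other than the one in the environment where tensorflow is installed, the import will still fail.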
Upvotes: 3
Reputation: 60321
It seems that you are trying to use the Jupyter installation that comes from your baseline Python, i.e. outside of your labs virtual environment.
A quick and easy way to remedy this is simply to additionally install Jupyter inside your virtual environment, i.e.
pip install jupyter
from inside labs.
A more general way, which also avoids multiple Jupyter installations (one per virtual environment), is to use Jupyter kernels; see my detailed answer here for the case of PySpark, which is straightforward to adapt to your case.
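Roughly, the kernel approach amounts to registering the labs environment as its own kernel via ipykernel (the display name below is just illustrative); something like:
(labs) $ pip install ipykernel
(labs) $ python -m ipykernel install --user --name labs --display-name "Python (labs)"
After that, any Jupyter installation should offer "Python (labs)" in the New menu and run that environment's Python, so packages installed inside labs (including tensorflow) become importable.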
Upvotes: 3