Reputation: 501
I am trying to install TensorFlow on a Spark HDInsight cluster, but I am facing issues.
I ran pip install tensorflow from the head node.
I am able to import tensorflow from the plain Python shell:
Python 2.7.12 (default, Dec 4 2017, 14:50:18)
[GCC 5.4.0 20160609] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow
>>> exit()
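For reference, this is the kind of check I can run in that shell to confirm which interpreter and install location it is using (a minimal diagnostic sketch; the printed paths will of course depend on the cluster):

# run inside the plain `python` shell where the import succeeds
import sys
import tensorflow as tf
print(sys.executable)   # path of the interpreter this shell is running
print(tf.__file__)      # site-packages location where pip placed TensorFlow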
But when I try to import tensorflow from the PySpark console on the head node, it throws an error:
SPARK_MAJOR_VERSION is set to 2, using Spark2
Python 2.7.12 |Anaconda custom (64-bit)| (default, Jul 2 2016, 17:42:40)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://anaconda.org
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.0.2.2.5.6.2-9
      /_/
Using Python version 2.7.12 (default, Jul 2 2016 17:42:40)
SparkSession available as 'spark'.
>>> import tensorflow
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named tensorflow
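Running the same check inside the PySpark shell should show whether it uses a different interpreter (the Anaconda banner above suggests it does), which would explain why the module is not found; a minimal sketch:

# run inside the pyspark shell where the import fails
import sys
print(sys.executable)   # presumably the Anaconda Python, not the one pip installed into
print(sys.path)         # site-packages directories this interpreter actually searches

If the two executables differ, I assume TensorFlow has to be installed into the Python that PySpark actually uses (or PYSPARK_PYTHON pointed at the interpreter that already has it), but I am not sure what the right way to do that is on HDInsight.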
Can anyone please help me with installing TensorFlow on a Spark HDInsight cluster?
Thanks.
Upvotes: 0
Views: 1090