Reputation: 13581
I've created a Spark cluster on EMR, but I'm unable to access PySpark when I open a notebook against it.
Configuration:
Example:
from pyspark import SparkContext
I also cannot access sc, which I was under the impression would be available:
sc.list_packages()
NameError: name 'sc' is not defined
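For context, this is the kind of manual setup I would expect to work if pyspark were importable in the kernel (just a sketch; the app name is made up):

from pyspark import SparkConf, SparkContext

# Create a context by hand if the kernel did not inject one.
conf = SparkConf().setAppName("emr-notebook-debug")
sc = SparkContext.getOrCreate(conf)
print(sc.version)  # should print the cluster's Spark version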
I feel like I'm missing something very basic here, but I'm completely new to EMR and have already spent a lot of time on this.
Any ideas on how I can debug this?
Upvotes: 2
Views: 759
Reputation: 13581
When I opened my notebook with "JupyterLab" instead of "Jupyter", all of the libraries were available.
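For anyone checking the same thing, once the notebook is attached to the PySpark kernel a quick sanity check like this should confirm that sc was injected (output will vary by cluster):

sc.version          # Spark version of the EMR cluster
sc.list_packages()  # Python packages available to the notebook kernel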
Upvotes: 1