Reputation: 339
I have just started learning Spark and have been using R & Python in Jupyter notebooks at my company.
Both Spark and Jupyter are installed locally on my computer and work perfectly fine individually.
Instead of writing a .py script for PySpark and running it from cmd every single time, could I connect Spark to my Jupyter notebook and run the scripts there interactively? I have seen many posts on how to achieve this on Linux and Mac, but unfortunately I am stuck with Windows 7 in this case.
Thanks! Will
Upvotes: 0
Views: 447
Reputation: 317
You could use the Sandbox from Hortonworks (http://hortonworks.com/downloads/#sandbox) and run your code in Apache Zeppelin, so no Spark setup on Windows is necessary. Install VirtualBox and run the sandbox, then access Zeppelin and Ambari via your host (Windows) browser, and you are good to go to run your %pyspark code. Zeppelin has a look and feel similar to Jupyter.
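If you still prefer the local-Jupyter route on Windows, the common trick is to have the `pyspark` launcher start Jupyter itself as its driver, by setting two environment variables that the launcher reads. A minimal sketch, assuming Spark is unpacked at `C:\spark` (a placeholder path, adjust to your install):

```python
import os

# Tell the pyspark launcher (bin\pyspark.cmd on Windows) to start the
# Jupyter notebook server as the driver instead of the plain Python REPL.
os.environ["PYSPARK_DRIVER_PYTHON"] = "jupyter"
os.environ["PYSPARK_DRIVER_PYTHON_OPTS"] = "notebook"

# SPARK_HOME must point at the local Spark install; C:\spark is a
# placeholder for wherever the distribution was unpacked.
os.environ["SPARK_HOME"] = r"C:\spark"

# With these set, launching %SPARK_HOME%\bin\pyspark.cmd opens a Jupyter
# notebook whose kernels already have the SparkContext `sc` defined.
```

In practice you would set these as permanent Windows environment variables (System Properties → Environment Variables) rather than from Python, and then just run `pyspark.cmd` from cmd to get a Spark-enabled notebook.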
Upvotes: 1