Sleepyhead

Reputation: 1021

"execution_count" error when running a job on a remote IPython cluster

I am running an IPython cluster (SSH) on a remote Linux machine and connecting to it from IPython on Mac OS X. In IPython on the Mac I write:

from numpy import arange
from IPython.parallel import Client
c = Client('~/ipcontroller-client.json', sshserver="me@remote_linux_machine")
dview = c[:]
dview.scatter('m', arange(100))

where '~/ipcontroller-client.json' is the file copied from remote_linux_machine. Everything works up to this point.

When I try to use parallel magic %px I get an error:

/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/IPython/parallel/client/client.pyc
in __init__(self, msg_id, content, metadata)
     80         self.msg_id = msg_id
     81         self._content = content
---> 82         self.execution_count = content['execution_count']
     83         self.metadata = metadata
     84 

KeyError: 'execution_count'

The same code works perfectly when I run the cluster on localhost instead.

Should parallel magic work at all in the remote SSH cluster case?

Upvotes: 1

Views: 902

Answers (1)

Sleepyhead

Reputation: 1021

The problem is now fixed: one needs to make sure the IPython versions on the cluster and on the machine you connect from are the same (mine are now both 0.13.2).
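In my case the mismatch was 0.13.2 on the Mac against 0.12.1 on the Linux box. A minimal sketch of the kind of sanity check you can do before connecting (the `versions_match` helper is my own illustration, not part of IPython; you would fill the strings in from `IPython.__version__` on each machine):

```python
# Versions as reported on each side (example values from my setup):
local_version = "0.13.2"   # IPython.__version__ on the Mac client
remote_version = "0.12.1"  # IPython.__version__ on the Linux cluster

def versions_match(a, b):
    """Compare version strings as tuples of integers, e.g. '0.13.2' -> (0, 13, 2)."""
    parse = lambda v: tuple(int(x) for x in v.split("."))
    return parse(a) == parse(b)

print(versions_match(local_version, remote_version))  # prints False: mismatch
```

If this prints False, upgrade or downgrade one side before starting the controller and engines.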

On the Linux machine I had to pin the version explicitly, since the stock apt package would have installed version 0.12.1:

sudo apt-get install ipython=0.13.2-1~ubuntu12.04.1

Upvotes: 2
