coolio

Reputation: 419

py2neo Batch Insert timing out for even 2k nodes

I'm just trying to do a simple batch insert test for 2k nodes and it's timing out. I'm sure it's not a memory issue because I'm testing on an EC2 xLarge instance and I've changed the Neo4j Java heap and datastore memory parameters. What could be going wrong?

Upvotes: 0

Views: 217

Answers (3)

Nigel Small

Reputation: 4495

There is an existing bug with large batches, due to the way the Python client handles the server's streaming format. A fix for this will be released in version 1.5 in a few weeks' time.

Upvotes: 1

coolio

Reputation: 419

Hey, so apparently there are some bugs in Neo4j version 1.8.x, which is what I was using. The link below may shed some light:

https://groups.google.com/forum/?fromgroups=#!topic/neo4j/Nqc9g1FZSD8

EDIT: nevermind, upgrading didn't help.

Upvotes: 0

Evgenii

Reputation: 3420

Can you try inserting only 300 nodes per batch?

For example:

nodes_to_insert = []
for n in my_nodes:
    nodes_to_insert.append(n)
    # Flush to the server once we have accumulated 300 nodes
    if len(nodes_to_insert) == 300:
        func_insert_by_batch(nodes_to_insert)
        nodes_to_insert = []
# Insert whatever remains after the loop (fewer than 300 nodes)
if nodes_to_insert:
    func_insert_by_batch(nodes_to_insert)

Upvotes: 0
