Ido Nadler

Reputation: 1

Neo4j outrageous write performance

I'm doing performance tests for Neo4j 3.0.7 using the new Bolt client, and I'm seeing some strange results. I'm trying a very simple scenario: upserting 1000 nodes (using the MERGE command).

I tried several approaches:

  1. 1000 synchronous transactions of 1 command each
  2. 1 transaction of 1000 commands
  3. 1000 asynchronous transactions of 1 command each (using 10 threads)

This is the query I'm executing (I have a uniqueness constraint on person.id):

MERGE (n:person {id:'123'}) SET n.name='Diana Kesha', n.address='aaa' .... RETURN n.id
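
For reference, a minimal sketch of what option 1 might look like with the 1.x Java Bolt driver; the connection details, credentials, and constraint statement are assumptions for illustration, not taken from the question:

import org.neo4j.driver.v1.*;

public class UpsertTest {
    public static void main(String[] args) {
        // Connection details are placeholders, not from the question.
        try (Driver driver = GraphDatabase.driver("bolt://localhost:7687",
                AuthTokens.basic("neo4j", "password"));
             Session session = driver.session()) {

            // Uniqueness constraint on person.id, as described above (Neo4j 3.x syntax).
            session.run("CREATE CONSTRAINT ON (p:person) ASSERT p.id IS UNIQUE");

            // Option 1: 1000 synchronous transactions of one MERGE each,
            // with literal values baked into the query string.
            for (int i = 0; i < 1000; i++) {
                try (Transaction tx = session.beginTransaction()) {
                    tx.run("MERGE (n:person {id:'" + i + "'}) " +
                           "SET n.name='Diana Kesha', n.address='aaa' RETURN n.id");
                    tx.success();
                }
            }
        }
    }
}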

Here is the execution plan of my query:

[execution plan image]

Here are the results:

  1. ~3 sec for 1000 synchronous transactions (note: that's 1000, not 1000k)
  2. ~2 sec for 1 transaction of 1000 commands
  3. ~2.5 sec for 1000 asynchronous transactions

Absolutely unacceptable results for a machine with 64 cores and 128 GB of RAM!

Now, going deeper, I noticed that Neo4j is using 25% of the RAM (which I think is fine) but only 1 core for the first two options, and 10 cores when using 10 threads.

I also noticed that changing Neo4j's worker thread setting has no impact on the number of cores actually being used.

What am I missing?

Upvotes: 0

Views: 1773

Answers (1)

Michael Hunger

Reputation: 41676

You have two conflicting numbers: do you mean 1000, or 1000k = 1M?

Is 2 s for 1M updates too slow?

I would recommend doing 1k to 10k updates per transaction and then parallelizing the transactions.

  1. You are not using parameters; use them instead of literal values
  2. Don't return anything you don't need
  3. Use capitalized labels (and adapt the constraint accordingly)

like:

MERGE (n:Person {id:{id}})
ON CREATE SET n.name={name}, n.address={address}
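
From the Java Bolt driver, passing those parameters could look like this (a sketch; the session setup is assumed):

import static org.neo4j.driver.v1.Values.parameters;

// The query text now stays constant, so Neo4j can cache its plan.
session.run("MERGE (n:Person {id:{id}}) " +
            "ON CREATE SET n.name={name}, n.address={address}",
            parameters("id", "123", "name", "Diana Kesha", "address", "aaa"));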

Even better: batch the updates from (2) into a single statement with a list parameter:

UNWIND {data} AS row
MERGE (n:Person {id:row.id})
ON CREATE SET n.name=row.name, n.address=row.address

So test 1000 parallel requests with 1000 updates each, sending a list of 1000 maps each time; see the sketch below.
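
A minimal sketch of one such batched request with the 1.x Java Bolt driver (batch contents and the session setup are assumptions):

import java.util.*;
import org.neo4j.driver.v1.*;
import static org.neo4j.driver.v1.Values.parameters;

// Build one batch: a list of 1000 maps, one map per node to upsert.
List<Map<String, Object>> batch = new ArrayList<>();
for (int i = 0; i < 1000; i++) {
    Map<String, Object> row = new HashMap<>();
    row.put("id", String.valueOf(i));
    row.put("name", "Diana Kesha");
    row.put("address", "aaa");
    batch.add(row);
}

// One transaction per batch; run such requests in parallel,
// each from its own session (sessions are not thread-safe).
try (Transaction tx = session.beginTransaction()) {
    tx.run("UNWIND {data} AS row " +
           "MERGE (n:Person {id:row.id}) " +
           "ON CREATE SET n.name=row.name, n.address=row.address",
           parameters("data", batch));
    tx.success();
}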

Upvotes: 2
