Reputation: 357
I am considering using Memgraph for low-latency graph queries. My graph is pretty huge, with more than 100M nodes and edges. Can a graph of this size be stored in memory? How can I estimate the amount of memory needed? Is there a way to spill over to disk?
Upvotes: 0
Views: 394
Reputation: 357
Here is a simple guide for calculating memory usage: https://memgraph.com/docs/memgraph/under-the-hood/storage If you have around 100M nodes and edges, you would need at least approximately 22GB of memory. Right now there is no way to spill over to disk, but Memgraph plans to add this feature in the future. Also, Memgraph's GQLAlchemy library provides an on-disk storage solution for large properties that aren't used in graph algorithms. This is useful when nodes or relationships carry metadata that isn't needed by any of the graph algorithms run in Memgraph but can still be fetched afterwards.
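To make the estimate concrete, here is a rough back-of-the-envelope sketch. The per-node and per-edge byte counts below are illustrative assumptions (not official Memgraph figures; see the storage docs linked above for the exact formula), and the 50M/50M split of "100M nodes and edges" is also just an example:

```python
# Back-of-the-envelope estimate of Memgraph's in-memory footprint.
# bytes_per_node / bytes_per_edge are ASSUMED overheads for illustration;
# real usage depends on labels, properties, and indexes.

def estimate_memory_gb(num_nodes: int, num_edges: int,
                       bytes_per_node: int = 260,
                       bytes_per_edge: int = 180) -> float:
    """Return an approximate memory footprint in GB (decimal)."""
    total_bytes = num_nodes * bytes_per_node + num_edges * bytes_per_edge
    return total_bytes / 1e9

# Example: 50M nodes + 50M edges with the assumed per-object sizes
print(round(estimate_memory_gb(50_000_000, 50_000_000), 1))  # -> 22.0
```

With these assumed sizes the example lands at roughly 22GB, in line with the ballpark above, but you should always measure against your actual property sizes.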
Upvotes: 1