Reputation: 4181
Is there any open-source alternative for Terracotta BigMemory?
Actually, I didn't even manage to find a commercial alternative. I'm interested in a pure-Java solution that works inside the JVM without any JNI or C-backed code.
Upvotes: 29
Views: 23418
Reputation: 1211
This implementation of a Java off-heap cache uses direct memory and provides good performance in a lightweight Java library.
Take a look at the benchmarking section for performance numbers. It's licensed under Apache 2.0.
Upvotes: 4
Reputation: 170
I've had this question myself, so I'm updating the previous answers with my findings.
I found this Quora thread, which discusses the same question:
http://www.quora.com/JVM/Whats-the-best-open-source-solution-for-java-off-heap-cache
The solutions that seem to be a good fit, besides directmemory (which has not really been updated in the last year), are
However, I would still be interested in finding a sufficiently large application that uses any of these three: directmemory, SpyMemcached, xmemcached. Should I find one, I will update this answer.
Upvotes: 11
Reputation: 5745
There is a very good cache solution named MapDB (formerly JDBM4). It supports both HashMap and TreeMap, but only as a library embedded in your application. It also supports a persistent file-based cache.
Example for off heap cache:
DB db = DBMaker.newDirectMemoryDB().make();
ConcurrentNavigableMap<Integer, String> map = db.getTreeMap("MyCache");
Or persistent file based cache:
DB db = DBMaker.newFileDB(new File("/home/collection.db")).closeOnJvmShutdown().make();
ConcurrentNavigableMap<Integer, String> map = db.getTreeMap("MyCache");
Upvotes: 19
Reputation: 1169
Although it isn't a solution in itself, Keith Gregory has written a guide to using ByteBuffers for your use case. Take a look at http://www.kdgregory.com/programming/java/ByteBuffer_JUG_Presentation.pdf for an overview and http://www.kdgregory.com/index.php?page=java.byteBuffer for the nitty-gritty details.
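As a taste of the techniques such a guide covers, here is a minimal pure-JDK sketch (the buffer size and values are arbitrary choices for illustration) that views an off-heap region as an int array via a view buffer:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.IntBuffer;

public class IntView {
    public static void main(String[] args) {
        // Allocate 1 KB off the Java heap and use native byte order.
        ByteBuffer bb = ByteBuffer.allocateDirect(1024).order(ByteOrder.nativeOrder());
        // View the same memory as an array of 256 ints.
        IntBuffer ints = bb.asIntBuffer();
        for (int i = 0; i < ints.capacity(); i++) {
            ints.put(i, i * i); // absolute puts leave position untouched
        }
        System.out.println(ints.get(10)); // prints 100
    }
}
```

The data lives outside the heap, so it is invisible to the garbage collector; only the small buffer objects themselves are on-heap.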
Upvotes: 4
Reputation: 533820
I am developing a solution that should be much faster, but I wouldn't suggest you use it just yet, as it is only a proof of concept at this stage.
http://vanillajava.blogspot.com/2011/09/new-contributors-to-hugecollections.html
However, if you have a specific requirement, it may be easier to code it yourself using direct ByteBuffers or memory-mapped files.
e.g.
// Using native byte order speeds access for values longer than one byte.
ByteBuffer bb = ByteBuffer.allocateDirect(1024 * 1024 * 1024).order(ByteOrder.nativeOrder());
int myInt = 12345;
double myDouble = 1.23;
// Start writing at some location.
bb.position(0);
bb.put((byte) 1);
bb.putInt(myInt);
bb.putDouble(myDouble);
// Rewind to read the values back.
bb.position(0);
byte b = bb.get();
int i = bb.getInt();
double d = bb.getDouble();
You can do the same with memory-mapped files. Memory-mapped files don't count towards your direct-memory limit and don't use up swap space.
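A minimal sketch of the memory-mapped variant (the temp-file name, 1 MB mapping size, and offsets are arbitrary choices for illustration):

```java
import java.io.IOException;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedExample {
    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("cache", ".dat");
        try (FileChannel ch = FileChannel.open(file,
                StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            // Map 1 MB of the file into memory; reads and writes go
            // straight to the OS page cache, off the Java heap.
            MappedByteBuffer mbb = ch.map(FileChannel.MapMode.READ_WRITE, 0, 1 << 20);
            mbb.order(ByteOrder.nativeOrder());
            mbb.putInt(0, 42);        // absolute write at offset 0
            mbb.putDouble(4, 3.14);   // absolute write at offset 4
            System.out.println(mbb.getInt(0));    // prints 42
            System.out.println(mbb.getDouble(4)); // prints 3.14
        } finally {
            Files.deleteIfExists(file);
        }
    }
}
```

Because the mapping is backed by the file rather than by swap, the OS can simply drop clean pages under memory pressure and re-read them from disk later.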
Are you sure BigMemory won't do the job for you?
Upvotes: 9
Reputation: 9934
It looks like there is a proposal at Apache:
http://wiki.apache.org/incubator/DirectMemoryProposal
Upvotes: 9