Void Star

Reputation: 2521

Avoid paging when allocating big blocks of memory in C?

I am writing an N-body simulation in C using the Barnes-Hut algorithm which requires using big blocks of memory. I am going for speed and efficiency. Is there any way to guarantee that these blocks of memory will stay in RAM and not get paged to the hard drive?

Edit: I would like to allocate as much as 2 GB; it is also conceivable that some simulations will need much more memory.

Edit: The solution should support Windows 7 (and perhaps Windows 8 when it comes out?) and Ubuntu.

Upvotes: 1

Views: 208

Answers (2)

Sergey L.

Reputation: 22542

For Linux: mlock(2) will do the job.

https://www.kernel.org/doc/man-pages/online/pages/man2/mlock.2.html

But beware that the amount of memory an unprivileged user may lock is normally limited on standard systems; see ulimit -l.

The Windows equivalent is VirtualLock. I do not know whether there is a limit or how it could be queried.

http://msdn.microsoft.com/en-us/library/windows/desktop/aa366895%28v=vs.85%29.aspx

Upvotes: 1

zwol

Reputation: 140659

There are operating system primitives that do what you want: mlock on Unix (of which Ubuntu is but one example¹), and VirtualLock on Windows. (Ignore the quibbling in the comments over the exact semantics of VirtualLock; they're irrelevant for your use case.)

The Unix primitive requires root privilege in the calling process (some systems permit locking down a small amount of memory without privilege, but you want far more than that). The Windows primitive appears not to require special privileges.

¹ "Linux is not UNIX" objection noted and ignored with prejudice.

Upvotes: 2
