Reputation: 386
This question is not really related to any specific code or even language.
If you allocate a huge amount of memory on Windows (exceeding physical memory), the entire operating system becomes completely unresponsive - including the mouse cursor, which normally keeps moving even when the rest of the system has crashed.
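For reference, a minimal sketch of the kind of allocation I mean (the sizes are purely illustrative - the point is committing and touching more memory than the machine physically has):

```cpp
#include <vector>
#include <cstddef>

int main() {
    // Hypothetical: commit far more than the installed RAM (e.g. 32 GiB on a 16 GiB box)
    const std::size_t bytes = 32ull * 1024 * 1024 * 1024;
    std::vector<char> huge(bytes);           // commit the memory
    for (std::size_t i = 0; i < bytes; i += 4096)
        huge[i] = 1;                         // touch every page -> heavy paging, system freeze
}
```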
The Working Set API does not seem to solve the problem - it appears that every application starts out with a maximum working set size already set to a rather low level.
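Roughly what I tried with the Working Set API (sizes and flags here are illustrative, not my exact code); the reported default maximum is already quite low, and raising it did not prevent the freeze:

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    HANDLE self = GetCurrentProcess();
    SIZE_T minWs = 0, maxWs = 0;
    DWORD  flags = 0;

    // Query the current (default) working set limits - they come back rather low.
    GetProcessWorkingSetSizeEx(self, &minWs, &maxWs, &flags);
    std::printf("min=%llu max=%llu flags=0x%lx\n",
                (unsigned long long)minWs, (unsigned long long)maxWs, flags);

    // Attempt to raise the limits (soft limits here; values are just examples).
    SetProcessWorkingSetSizeEx(self,
                               64ull * 1024 * 1024,        // new minimum
                               8ull * 1024 * 1024 * 1024,  // new maximum
                               QUOTA_LIMITS_HARDWS_MIN_DISABLE |
                               QUOTA_LIMITS_HARDWS_MAX_DISABLE);
}
```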
I hoped memory-mapped files (via the Boost API) would help the OS make better decisions about page loading/unloading - but again, even a single pass through the large data freezes the system.
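The pass looks roughly like this (shown here with boost::interprocess as an illustration - the file name is hypothetical); even a read-only sequential scan of a file much larger than physical RAM causes the described freeze:

```cpp
#include <boost/interprocess/file_mapping.hpp>
#include <boost/interprocess/mapped_region.hpp>
#include <cstddef>

int main() {
    namespace bip = boost::interprocess;
    bip::file_mapping  file("huge_data.bin", bip::read_only);
    bip::mapped_region region(file, bip::read_only);    // map the whole file

    const char* data = static_cast<const char*>(region.get_address());
    const std::size_t size = region.get_size();

    unsigned long long sum = 0;
    for (std::size_t i = 0; i < size; ++i)               // single sequential pass
        sum += static_cast<unsigned char>(data[i]);
    return static_cast<int>(sum & 0xff);
}
```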
Are there any magic WinAPI calls or other good programming practices (other than manually managing all committed memory and manually caching data in files) that would keep the operating system and other applications reasonably stable while working with such huge amounts of data?
Upvotes: 0
Views: 36