I need to allocate a lot of memory to emulate memory consumption by a .NET app. I expected that `new byte[1000*1024*1024]` would allocate all the memory at once, but that is not what happens.
For example, consider the following code:
var bytes = 1000 * 1024 * 1024;
var memory = new byte[bytes];
memory[bytes - 1] = 16;

// Step 1
Console.ReadLine();

for (int i = 0; i < memory.Length / 2; i++)
{
    memory[i] = 16;
}

// Step 2
Console.ReadLine();

for (int i = memory.Length / 2; i < memory.Length; i++)
{
    memory[i] = 16;
}
According to Process Explorer, no memory is allocated until Step 1, and before Step 2 only half of the expected memory is allocated. Only after Step 2 are all 1000*1024*1024 bytes allocated. This happens in both the Debug and Release VS configurations.
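To observe the same numbers without Process Explorer, the working set and committed (private) bytes can also be read from inside the process. This is a small sketch using the standard `System.Diagnostics.Process` class (the 100 MB size here is just an illustrative value, smaller than the one in the question):

```csharp
using System;
using System.Diagnostics;

class MemoryReport
{
    static void Main()
    {
        var memory = new byte[100 * 1024 * 1024];
        Print("after allocation");     // private bytes jump; working set barely moves

        for (int i = 0; i < memory.Length; i++) memory[i] = 16;
        Print("after touching pages"); // working set now includes the array

        GC.KeepAlive(memory);
    }

    static void Print(string label)
    {
        var p = Process.GetCurrentProcess();
        p.Refresh(); // discard cached counter values
        Console.WriteLine($"{label}: working set = {p.WorkingSet64 / (1024 * 1024)} MB, " +
                          $"private bytes = {p.PrivateMemorySize64 / (1024 * 1024)} MB");
    }
}
```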
So the questions are:
- Why is the memory not allocated to the process all at once?
- How can I force the process to allocate the memory (other than by iterating over the whole array)?
UPDATE:
I've inspected the memory consumption with the Resource Monitor tool: the "Commit" column shows the correct 1000 MB, but "Working Set" behaves as I described above. Since my task is to emulate real load, I suppose I need to occupy actual physical memory, not just virtual memory.
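If the goal is just to pull the whole array into the working set, iterating every byte is unnecessary: the OS backs the array with physical pages lazily, on first write, so writing a single byte per page faults the entire page in. A sketch of that idea (assuming `Environment.SystemPageSize` is available, i.e. a modern .NET; on older Frameworks you could hard-code 4096):

```csharp
using System;

class Program
{
    static void Main()
    {
        const int bytes = 1000 * 1024 * 1024;
        var memory = new byte[bytes];

        // Touch one byte per OS page instead of every byte. One write per
        // page is enough to bring that page into the working set, cutting
        // the iteration count by a factor of the page size (typically 4096).
        int pageSize = Environment.SystemPageSize;
        for (int i = 0; i < memory.Length; i += pageSize)
        {
            memory[i] = 16;
        }
        memory[bytes - 1] = 16; // make sure the final, partial page is touched too

        Console.ReadLine(); // inspect the working set now
    }
}
```

Note that the OS is still free to trim these pages from the working set later (e.g. under memory pressure), so for sustained load you may need to touch them periodically.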
The array writes happen before each `Console.ReadLine()`, so the compiler can't reorder the instructions in such a way as to violate that — at least not the .NET compiler (which I doubt would optimize so aggressively; indeed, in this program it could simply never allocate the array or run the loops, since they have no side effects!).