
I have to create a fairly large double array, roughly 12000 x 55000. Unfortunately, I get an out-of-memory exception. I used to develop in Java, where I could change the memory settings. Is this possible with C#, or is it just impossible? I am using VS 2008.

  • I would suggest you consider an "out of main memory" data structure (i.e. database). Why do you have to store such a large double array? Commented Jan 27, 2011 at 11:09
  • I did chuckle at the choice of word "fairly" too. Commented Jan 27, 2011 at 11:09
  • No worries I will persist stuff in a database. Commented Jan 27, 2011 at 11:15
  • Maybe one could also solve the problem by setting the LARGE_ADDRESS_AWARE PE flag. Commented Jan 1, 2016 at 11:02

4 Answers


Each double is 8 bytes, so you're trying to allocate a single array with just over 5GB. The CLR has a per-object limit of around 2GB IIRC, even for a 64-bit CLR. In other words, it's not the total amount of memory available that's the problem (although obviously you'll have issues if you don't have enough memory), but the per-object size.
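For reference, the arithmetic, using the dimensions from the question:

// 12000 x 55000 doubles at 8 bytes each:
long bytes = 12000L * 55000L * sizeof(double); // 5,280,000,000 bytes, far beyond a ~2GB per-object cap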

I suggest you split it into smaller arrays, perhaps behind a facade of some description. I don't believe there's any way to work around that limit for a single array.

EDIT: You could go for an array of arrays - aka a jagged array:

double[][] array = new double[12000][];
for (int i = 0; i < array.Length; i++)
{
    // Each row is a separate object, so no single allocation approaches the limit.
    array[i] = new double[55000];
}

Would that be acceptable to you?

(You can't use a rectangular array (double[,]) as that would have the same per-object size problem.)


10 Comments

Instead of a single array of doubles, I think he means a 2D array
Virtual Memory is not the issue. You're trying to allocate 5GB of memory with a 2GB limit. :)
Also, due to memory fragmentation, the OP might experience problems as soon as they try to allocate a contiguous block of several hundred MB.
@csetzkorn If the data is very sparse then perhaps it shouldn't be in an array at all. Use a Dictionary<int, double> to store the data; it will probably use less space as long as at least about half of the values are empty. (A sketch of this idea follows the thread.)
@Servy: Given the acceptance, I suspect the OP was actually fine with the fact that the jagged array used quite a lot of memory. These days 5GB isn't that much :)
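A minimal sketch of the sparse idea from the comment above (the long-key encoding is my own illustration, not from the thread):

using System.Collections.Generic;

// Only non-empty cells occupy memory; absent keys read back as 0.0.
var sparse = new Dictionary<long, double>();
int row = 3, col = 42;
long key = (long)row * 55000L + col; // encode (row, col) into a single key
sparse[key] = 1.5;

double d;
double value = sparse.TryGetValue(key, out d) ? d : 0.0;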

Since you can't create objects larger than 2GB, you can try using a MemoryMappedFile (in the System.IO.MemoryMappedFiles namespace, available from .NET 4) to work with a chunk of memory of the required size.


using System;
using System.IO.MemoryMappedFiles;

const long Rows = 12000;
const long Cols = 55000;

// Capacity and offsets are in bytes, so scale element counts by sizeof(double).
var data = MemoryMappedFile.CreateNew("big data", Rows * Cols * sizeof(double));
var view = data.CreateViewAccessor();
var rnd = new Random();

for (var i = 0L; i < Rows; ++i)
{
    for (var j = 0L; j < Cols; ++j)
    {
        var input = rnd.NextDouble();
        view.Write((i * Cols + j) * sizeof(double), input);
    }
}
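To read an element back, continuing the snippet above (ReadDouble takes the same byte offset):

var value = view.ReadDouble((3L * Cols + 42L) * sizeof(double)); // element [3, 42]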



Provided that your total memory is sufficient, you can prevent out-of-memory exceptions resulting from large object heap (LOH) fragmentation by creating a bunch of smaller arrays and wrapping them in a single IList<T>, or some other indexed interface; a sketch of the idea follows the links below.

Here is a link which describes it:

BigArray<T>, getting around the 2GB array size limit

Credits: this post (C# chunked array).
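A minimal sketch of the chunked approach, assuming fixed-size chunks (the class and names here are illustrative, not the BigArray<T> code from the link):

using System;

public sealed class ChunkedArray<T>
{
    // Keep each chunk comfortably under the per-object limit.
    private const int ChunkSize = 1 << 20; // ~1M elements per chunk
    private readonly T[][] chunks;

    public long Length { get; private set; }

    public ChunkedArray(long length)
    {
        Length = length;
        int chunkCount = (int)((length + ChunkSize - 1) / ChunkSize);
        chunks = new T[chunkCount][];
        for (int i = 0; i < chunkCount; i++)
        {
            long remaining = length - (long)i * ChunkSize;
            chunks[i] = new T[Math.Min(remaining, ChunkSize)];
        }
    }

    public T this[long index]
    {
        get { return chunks[index / ChunkSize][index % ChunkSize]; }
        set { chunks[index / ChunkSize][index % ChunkSize] = value; }
    }
}

For the question's matrix you would allocate new ChunkedArray<double>(12000L * 55000L) and index it as matrix[i * 55000L + j].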



Well, either you are genuinely out of memory (close some programs) or you're hitting the per-allocation limit (about 2GB), since that memory needs to be a contiguous block. You could use a 64-bit machine, in which case you'll have more address space available, or I think you can make the application large address aware (searching will tell you how to do this, if it's possible in this case).

I believe you add a /3GB switch to the Boot.ini file for large address awareness.
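A quick way to check whether your process is actually running as 64-bit (a pointer size of 8 bytes means a 64-bit address space):

Console.WriteLine("Pointer size: " + IntPtr.Size + " bytes"); // 4 = 32-bit, 8 = 64-bit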
