
I'm trying to compress a large 8 GB file and upload the compressed file to blob storage. The compressed file size comes to around 800 MB. When I try to upload it to Azure, I get the exception System.OutOfMemoryException. I'm compressing at least 3-4 files, of sizes 3 GB, 4 GB, and 8 GB, in parallel, and uploading each one to the blob.

Here is the upload code:

    public string UploadFile(string fileID, string fileName, string choice, Stream compressedFileStream)
    {
        // Read the entire compressed stream into one byte array (this is where it runs out of memory)
        byte[] data = new byte[compressedFileStream.Length];
        compressedFileStream.Read(data, 0, data.Length);
        long fileSize = compressedFileStream.Length;
        compressedFileStream.Dispose();

        // "blob" (CloudBlockBlob) and "startTime" are fields on this class
        blob.ServiceClient.WriteBlockSizeInBytes = 4 * 1024 * 1024;
        blob.ServiceClient.ParallelOperationThreadCount = 5;
        // this will break blobs up automatically after this size
        blob.ServiceClient.SingleBlobUploadThresholdInBytes = 12582912;
        startTime = DateTime.Now;

        using (MemoryStream ms = new MemoryStream(data))
        {
            ms.Position = 0;
            blob.UploadFromStream(ms);
        }

        return blob.Uri.ToString();
    }

I'm running on a 64-bit Windows Server 2008 machine with 4 GB of RAM. Is this a RAM issue or an address-space issue? Please help me with this.

-Mahender


  • It looks like you are trying to read the entire 8 GB file into memory as a single byte array. That probably isn't ever going to work. Instead, I would see if you can open a stream to your file and work in chunks. Commented Oct 21, 2012 at 17:26

2 Answers


You very rarely load a huge file into memory like that. What you should usually be doing is looping with a smallish buffer (8k, 16k, etc), and uploading it as a stream. Depending on what the full scenario is, perhaps just:

blob.UploadFromStream(compressedStream);

If you need to do pre-processing work (and you can't just pass the stream as-is), then work with a temporary file (again, doing the compression/decompression via a smallish buffer), then just hand the file-stream to the upload API.
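For example, here is a minimal sketch of that temp-file approach, assuming the classic storage client used in the question (Microsoft.WindowsAzure.StorageClient and CloudBlockBlob.UploadFromStream); the paths, buffer size, and blob reference are placeholders:

    using System.IO;
    using System.IO.Compression;
    using Microsoft.WindowsAzure.StorageClient;

    public class BlobUploader
    {
        public void CompressAndUpload(string sourcePath, CloudBlockBlob blob)
        {
            string tempPath = Path.GetTempFileName();
            try
            {
                // Compress to a temporary file via a small buffer;
                // only 64 KB is ever held in memory at once.
                using (FileStream source = File.OpenRead(sourcePath))
                using (FileStream target = File.Create(tempPath))
                using (GZipStream gzip = new GZipStream(target, CompressionMode.Compress))
                {
                    byte[] buffer = new byte[64 * 1024];
                    int read;
                    while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        gzip.Write(buffer, 0, read);
                    }
                }

                // Hand the file stream straight to the upload API;
                // the client reads and uploads it in blocks.
                using (FileStream compressed = File.OpenRead(tempPath))
                {
                    blob.UploadFromStream(compressed);
                }
            }
            finally
            {
                File.Delete(tempPath);
            }
        }
    }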

The entire point of a stream is that it is a hose not a bucket. You should not try to have it all in-memory at once.

There are object-size and array-size limits that prevent you from having an 8GB byte-array. When there are requirements to do so, there are evil ways of loading huge objects, but: that would be entirely inappropriate in this case. Simply - you just need to use the streaming API correctly.



There could be two issues here; it's not entirely clear from your post which one applies.

1) This could be a .NET Framework issue, if your code is not built to target the 64-bit architecture. In that case your CLR process has a memory limit of (very approximately) 3 GB.

2) Your code is a "native" 64-bit application, so you are simply allocating too much memory. Possible solutions:

a) Transmit the data in smaller chunks of memory (if that is possible in your case); see the sketch after this list.

b) Use a stream and do not load all of the data at once (if that is possible in your case).
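Here is a rough sketch of option (a), assuming the classic storage client library (Microsoft.WindowsAzure.StorageClient, as used in the question) and its PutBlock/PutBlockList API. The 4 MB block size and the names are illustrative; only one block is held in memory at a time:

    using System;
    using System.Collections.Generic;
    using System.IO;
    using Microsoft.WindowsAzure.StorageClient;

    public class ChunkedUploader
    {
        public void UploadInBlocks(Stream compressedStream, CloudBlockBlob blob)
        {
            var blockIds = new List<string>();
            byte[] buffer = new byte[4 * 1024 * 1024];   // one 4 MB block in memory at a time
            int blockNumber = 0;
            int read;

            while ((read = compressedStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Block IDs must be base64 strings of equal length within one blob.
                string blockId = Convert.ToBase64String(BitConverter.GetBytes(blockNumber++));
                using (var blockData = new MemoryStream(buffer, 0, read))
                {
                    blob.PutBlock(blockId, blockData, null);
                }
                blockIds.Add(blockId);
            }

            // Commit the uploaded blocks, in order, to form the final blob.
            blob.PutBlockList(blockIds);
        }
    }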

EDIT

In addition, as Marc noticed, there is another memory limit on arrays in the CLR (independent of the architecture) that could potentially be exceeded in your case too.

Hope this helps.

1 Comment

Even x64 still has a 2 GB limit on a single byte-array (even with the allow-large-objects flag set)
