
BACKGROUND:

In running my app through a profiler, it looks like the hotspots are all involved in allocating a large number of temporary new byte[] arrays.

In one run under CLR Profiler, a few short bursts of work (3-5 seconds of CPU time outside the profiler) produced over a gigabyte of garbage, the majority of it byte[] allocations, and this triggered over 500 collections.

In some cases it appears that the application is spending upwards of 10% of its CPU time performing collections.

Clearly a rewrite is in order.

So, I am thinking of replacing the new byte[] allocations with a pool class that could reuse the buffer at a later time.

Something like this ...

{
    byte[] temp = Pool.AllocateBuffer(1024);
    ...
}

QUESTION:

How can I force the application to call code in the routine Pool.deAllocate(temp) when temp is no longer needed?

In the above code fragment, temp is a Pool-allocated byte[] buffer; when it goes out of scope it simply gets collected. That's not a real problem, but it means the buffer never gets reused by the pool.

I know I could replace the "return 0;" with "Pool.deAllocate(temp); return 0", but I'm trying to force the recovery to occur.
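For context, C# has no way to run code automatically when a local goes out of scope, but a try/finally block does guarantee the release call runs on every exit path (fall-through, early return, or exception). A minimal sketch, assuming a Pool class shaped like the one in the question (the stand-in implementation here is hypothetical):

```csharp
using System;
using System.Collections.Generic;

// Minimal stand-in for the Pool sketched in the question (hypothetical).
public static class Pool
{
    private static readonly Stack<byte[]> free = new Stack<byte[]>();

    public static byte[] AllocateBuffer(int size)
    {
        // Reuse a pooled array if one is big enough; otherwise allocate.
        return (free.Count > 0 && free.Peek().Length >= size)
            ? free.Pop()
            : new byte[size];
    }

    public static void deAllocate(byte[] buffer)
    {
        free.Push(buffer);
    }
}

public static class Demo
{
    public static void UseBuffer()
    {
        byte[] temp = Pool.AllocateBuffer(1024);
        try
        {
            // ... use temp ...
        }
        finally
        {
            // Runs on every exit path, so the buffer always goes back.
            Pool.deAllocate(temp);
        }
    }
}
```

This doesn't force callers to do anything, but it does make the return-to-pool call unskippable within a routine that adopts the pattern.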

Is this even remotely possible?

  • Are they your own code's array allocations or are they caused by .NET Framework methods? Commented Sep 20, 2009 at 21:50
  • My own code array allocations Commented Sep 20, 2009 at 21:52
  • +1 for profiling before optimizing! :) Commented Sep 21, 2009 at 1:09

1 Answer


You could implement a Buffer class which implements IDisposable and returns the buffer to the pool when it's disposed. You can then give access to the underlying byte array, and so long as everyone plays nicely you can take advantage of reuse.
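A sketch of what such a disposable wrapper might look like; the names Pool and PooledBuffer here are illustrative, not from any real library:

```csharp
using System;
using System.Collections.Generic;

// Illustrative pool; the Pool and PooledBuffer names are hypothetical.
public static class Pool
{
    private static readonly object sync = new object();
    private static readonly Stack<byte[]> buffers = new Stack<byte[]>();

    public static PooledBuffer AllocateBuffer(int size)
    {
        lock (sync)
        {
            // Reuse a pooled array if one is big enough; otherwise allocate.
            if (buffers.Count > 0 && buffers.Peek().Length >= size)
            {
                return new PooledBuffer(buffers.Pop());
            }
        }
        return new PooledBuffer(new byte[size]);
    }

    internal static void Return(byte[] buffer)
    {
        lock (sync)
        {
            buffers.Push(buffer);
        }
    }
}

public sealed class PooledBuffer : IDisposable
{
    private byte[] data;

    internal PooledBuffer(byte[] data)
    {
        this.data = data;
    }

    // Callers use this array, but must not hold onto it after Dispose.
    public byte[] Data
    {
        get
        {
            if (data == null)
            {
                throw new ObjectDisposedException("PooledBuffer");
            }
            return data;
        }
    }

    public void Dispose()
    {
        byte[] buffer = data;
        data = null;            // makes double-Dispose harmless
        if (buffer != null)
        {
            Pool.Return(buffer);
        }
    }
}
```

With this in place, a using statement returns the buffer deterministically:

```csharp
using (PooledBuffer buffer = Pool.AllocateBuffer(1024))
{
    byte[] temp = buffer.Data;
    // ... use temp ...
}   // Dispose runs here and the array goes back to the pool
```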

Be warned though:

  • Your buffers will quickly end up in gen 2, which may not be ideal for other reasons
  • If a malicious piece of code keeps a reference to the byte array, they could spy on data used by other code
  • You need to remember to dispose of buffers at the right time.

I actually have some code in MiscUtil to do this - see CachingBufferManager, CachedBuffer etc. I can't say I've used it much, mind you... and from what I remember, I made it a bit more complicated than I really needed to...

EDIT: To respond to the comments...

  • You can't force application code to release buffers, no. There's no automatic release mechanism in C# - a using statement is the closest we've got.
  • You could implement an implicit conversion to byte[] in your buffer class to allow you to call methods which have byte array parameters. Personally I'm not much of a fan of implicit conversions, but it's certainly available as an option.
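A sketch of such a conversion, using a hypothetical u8Buffer wrapper class (the pooling details are omitted):

```csharp
using System;

// Hypothetical wrapper; pooling details omitted for brevity.
public sealed class u8Buffer : IDisposable
{
    private readonly byte[] data;

    public u8Buffer(int size)
    {
        data = new byte[size];   // in real code, fetch from the pool instead
    }

    // Lets a u8Buffer be passed directly to any method taking byte[].
    public static implicit operator byte[](u8Buffer buffer)
    {
        return buffer.data;
    }

    public void Dispose()
    {
        // in real code, return data to the pool here
    }
}
```

An existing method such as int ProcessData(byte[] input) then accepts the wrapper with no changes at the call site: ProcessData(myU8Buffer) compiles and receives the underlying array.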

6 Comments

The disposable pattern makes it more obvious that the programmer should do something after use (dispose), but the original poster asked for a way to FORCE that. I don't believe that is possible.
I was trying to get around using a different class, because I call a large number of routines with byte[] in the parameter list. When I tried to use a u8Buffer class, I would get the "Cannot implicitly convert type 'byte[]' to 'u8Buffer[]' error. Is there an easy workaround?
@Noah: You could provide an implicit conversion to byte[] in your class. I'm not sure I'd recommend it, mind you.
Time I learned a bit more about GC generations. Any good resource on the subject?
CLR via C#? I haven't checked, but I'd be mortified if it didn't explain the GC in a lot of detail :)
