
I am trying to export SQL table data to a text file with a '~' delimiter in C# code. When the data is small it's fine, but when it's huge, the code throws an OutOfMemoryException.

My Code:

public static void DataTableToTextFile(DataTable dtToText, string filePath)
{
   int i = 0;
   StreamWriter sw = null;

   try
   {
       sw = new StreamWriter(filePath, false);

       /* Write the column names */
       for (i = 0; i < dtToText.Columns.Count - 1; i++)
       {
           sw.Write(dtToText.Columns[i].ColumnName + '~');
       }
       sw.Write(dtToText.Columns[i].ColumnName);
       sw.WriteLine();

       /* Write the data in the rows */
       foreach (DataRow row in dtToText.Rows)
       {
          object[] array = row.ItemArray;
          for (i = 0; i < array.Length - 1; i++)
          {
              sw.Write(array[i].ToString() + '~');
          }
          sw.Write(array[i].ToString());
          sw.WriteLine();
       }
    }
    finally
    {
       if (sw != null)
       {
           sw.Close();
       }
    }
 }

Is there a better way to do this in a stored procedure or BCP command?

  • Have you tried writing, say, 1000 lines out, closing, then reopening the file? Commented Jan 24, 2011 at 1:15
  • What's huge? What's the size of the record set when it bails? Commented Jan 24, 2011 at 6:20

2 Answers


If there's no specific reason for using the ~ delimiter format, you might try the DataTable.WriteXml method (http://msdn.microsoft.com/en-us/library/system.data.datatable.writexml.aspx)

For example: dtToText.WriteXml(@"c:\data.xml")

If you need to convert the text back to a DataTable later, you can use DataTable.ReadXml (http://msdn.microsoft.com/en-us/library/system.data.datatable.readxml.aspx)

If you really need to make the existing code work, I'd probably try closing and calling Dispose on the StreamWriter at a set interval, then reopen and append to the existing text.
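A rough sketch of that interval-based approach (the batch size of 1000 is an arbitrary choice, and the class and method names here are illustrative, not from the question):

```csharp
using System;
using System.Data;
using System.IO;
using System.Linq;

public static class ChunkedExport
{
    public static void DataTableToTextFileChunked(DataTable dt, string filePath)
    {
        const int batchSize = 1000; // arbitrary; tune for your data

        // Create the file and write the '~'-delimited header row.
        using (var sw = new StreamWriter(filePath, false))
        {
            sw.WriteLine(string.Join("~",
                dt.Columns.Cast<DataColumn>().Select(c => c.ColumnName)));
        }

        // Write the rows in batches, disposing the writer between batches
        // so no single StreamWriter instance accumulates a huge buffer.
        for (int start = 0; start < dt.Rows.Count; start += batchSize)
        {
            using (var sw = new StreamWriter(filePath, true)) // append mode
            {
                int end = Math.Min(start + batchSize, dt.Rows.Count);
                for (int r = start; r < end; r++)
                {
                    sw.WriteLine(string.Join("~", dt.Rows[r].ItemArray));
                }
            }
        }
    }
}
```

Note this appends to the same file across batches; the using blocks guarantee each writer is flushed and disposed before the next one opens.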


I realize that this question is years old, but I recently experienced a similar problem. The solution: briefly, I think you're running into problems with the .NET Large Object Heap. A relevant link: https://www.simple-talk.com/dotnet/.net-framework/the-dangers-of-the-large-object-heap/

To summarize the above article: when you allocate chunks of memory larger than about 85 KB (which seems likely to happen behind the scenes in your StreamWriter object if the values in your DataTable are large enough), they go onto a separate heap, the Large Object Heap (LOH). Memory chunks in the LOH are deallocated normally when their lifetime expires, but the heap is not compacted. The net result is that a System.OutOfMemoryException is thrown, not because there isn't actually enough memory, but because there isn't enough contiguous memory in the heap at some point.

If you're using .NET Framework 4.5.1 or later (which won't work in Visual Studio 2010 or earlier; it might work in VS2012), you can use this command:

System.Runtime.GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;

This command forces LOH compaction to happen at the next garbage collection. Just put that command as the first line in your function; it will be set to CompactOnce every time this function is called, which will cause LOH compaction at some indeterminate point after the function is called.

If you don't have .NET 4.5.1, it gets uglier. The problem is that the memory allocation isn't explicit; it's happening behind the scenes in your StreamWriter, most likely. Try calling GC.Collect(), forcing garbage collection, from time to time--perhaps every 3rd time this function is called.
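As a sketch of that fallback (the call counter and the every-third-call threshold are arbitrary heuristics, and the class and method names are illustrative):

```csharp
using System;

public static class GcWorkaround
{
    private static int _callCount = 0;

    // Call this at the top of the export function on frameworks older
    // than .NET 4.5.1, where LargeObjectHeapCompactionMode is unavailable.
    public static void MaybeCollect()
    {
        _callCount++;
        if (_callCount % 3 == 0)            // every 3rd call, a rough heuristic
        {
            GC.Collect();                   // force a full collection
            GC.WaitForPendingFinalizers();  // let finalizers release resources
        }
    }
}
```

Forcing collection this way trades throughput for a chance to reclaim contiguous memory before the next large allocation, which is why it should be done sparingly rather than on every call.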

A warning: Lots of people will advise you that calling GC.Collect() directly is a bad idea and will slow down your application--and they're right. I just don't know a better way to handle this problem.

