
I wrote this code:

SqlConnection conn = new SqlConnection("ConnectionString");
conn.Open();

SqlCommand cmd = new SqlCommand("SELECT * FROM tablename", conn);

SqlDataReader dr = cmd.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dr);

But I get the exception shown below when I load the data into the DataTable, because I have an XML column with a large size (102 MB).

Exception of type 'System.OutOfMemoryException' was thrown.

I'd be very grateful if someone could give me a solution for this exception.

  • Do you need the XML column? Commented Jan 20, 2021 at 15:30
  • It breaks because data is too big to fit your memory. select * grabs all the data in the table, you can't handle huge datasets in such a simple way. Commented Jan 20, 2021 at 15:31
  • @DavidG yes, I want to retrieve data from the content of the XML column Commented Jan 20, 2021 at 15:31
  • 1
    basically this happens when it exceeds the limited memory which varys from devloper edition to entrprise edition SSMS is a 32-bit process. Therefore, it is limited to 2 GB of memory. SSMS imposes an artificial limit on how much text that can be displayed per database field in the results window. This limit is 64 KB in "Grid" mode and 8 KB in Text mode. If the result set is too large, the memory that is required to display the query results may surpass the 2 GB limit of the SSMS process. Therefore, a large result set can cause the error that is mentioned in the Symptoms section. Commented Jan 20, 2021 at 15:43
  • 4
    You should be using the reader to read one row at a time in a loop. Process each row as you go along. And dispose your connection and reader objects with using Commented Jan 20, 2021 at 15:58
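The row-at-a-time approach from the last comment might look like the sketch below. The connection string, table name, and column names are placeholders; `CommandBehavior.SequentialAccess` is what lets a very large column be streamed instead of buffered with the rest of the row.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

// Sketch: process one row at a time instead of loading everything into a DataTable.
using (SqlConnection conn = new SqlConnection("ConnectionString"))
using (SqlCommand cmd = new SqlCommand("SELECT id, xmlcolumn FROM tablename", conn))
{
    conn.Open();
    // SequentialAccess streams large columns; columns must then be read in ordinal order.
    using (SqlDataReader dr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        while (dr.Read())
        {
            int id = dr.GetInt32(0);
            // For a 100+ MB XML column, prefer GetXmlReader/GetTextReader over GetString
            // so the value is streamed rather than materialized as one giant string.
            using (var xml = dr.GetXmlReader(1))
            {
                // process this row's XML here, then move on to the next row
            }
        }
    }
}
```

With this pattern only one row's worth of data is in flight at a time, so the 102 MB column never has to share memory with the rest of the table.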

2 Answers


Solutions to your problem:

  1. Normalize your database so that, rather than raw XML, an appropriate relational model is stored. That is how SQL works in its essentials. Hopefully, that will solve your problem.
  2. However, no matter how well your database is normalized, there are limits beyond which the data simply does not fit in available memory. If that is your case, you need to abandon the all-at-once select * approach and reimplement it in a fetch-next-batch style, i.e. repeatedly fetch batches of a fixed, predefined size, process them, mark them as processed somewhere, and go on.
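The fetch-next-batch idea in point 2 can be sketched with OFFSET/FETCH paging. This is an illustrative sketch, not the answerer's code: the table, key column, and batch size are assumptions, and the ORDER BY on a stable key is required for OFFSET/FETCH to page deterministically.

```csharp
using System.Data.SqlClient;

const int batchSize = 100; // assumed batch size; tune to your memory budget
int offset = 0;
bool gotRows = true;

using (SqlConnection conn = new SqlConnection("ConnectionString"))
{
    conn.Open();
    while (gotRows)
    {
        gotRows = false;
        using (SqlCommand cmd = new SqlCommand(
            "SELECT id, xmlcolumn FROM tablename ORDER BY id " +
            "OFFSET @offset ROWS FETCH NEXT @batch ROWS ONLY", conn))
        {
            cmd.Parameters.AddWithValue("@offset", offset);
            cmd.Parameters.AddWithValue("@batch", batchSize);
            using (SqlDataReader dr = cmd.ExecuteReader())
            {
                while (dr.Read())
                {
                    gotRows = true;
                    // process one row of the current batch here
                }
            }
        }
        offset += batchSize; // advance to the next batch
    }
}
```

As the comment below notes, repeated selects can be slow; a keyset approach (WHERE id > @lastId) or a single streaming reader avoids re-scanning skipped rows.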

3 Comments

Or work with it in streaming fashion, i.e. use the reader the way it is supposed to be used: one row at a time.
@Charlieface that’s exactly the second point.
Not really; it sounds more like you are saying to run lots of selects one after another, which can be dog slow. You should clarify. Another option is to pull out the relevant XML values with XQuery directly in the SQL.

Here is a conceptual example. T-SQL shreds the XML data type column into a rectangular (relational) format, so the C# side of the equation will not have any problem.

SQL

-- DDL and sample data population, start
DECLARE @tbl TABLE (ID INT IDENTITY PRIMARY KEY, product VARCHAR(20), xmldata XML);
INSERT INTO @tbl (product, xmldata) VALUES
('vase',  N'<root>
    <r>red</r>
    <r>blue</r>
    <r>yellow</r>
</root>'),
('phone',  N'<root>
    <r>black</r>
    <r>white</r>
</root>');
-- DDL and sample data population, end

SELECT id, product
, c.value('(./text())[1]', 'VARCHAR(20)') AS color
FROM @tbl CROSS APPLY xmldata.nodes('/root/r') AS t(c);

Output

+----+---------+--------+
| id | product | color  |
+----+---------+--------+
|  1 | vase    | red    |
|  1 | vase    | blue   |
|  1 | vase    | yellow |
|  2 | phone   | black  |
|  2 | phone   | white  |
+----+---------+--------+
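On the C# side, the shredding query can then be consumed row by row; each (id, product, color) row is small, so memory is no longer a concern. A hedged sketch: the connection string is a placeholder, and `dbo.tbl`/`xmldata` stand in for a real table, since the answer demonstrates with a table variable.

```csharp
using System;
using System.Data.SqlClient;

string sql = @"
    SELECT id, product,
           c.value('(./text())[1]', 'VARCHAR(20)') AS color
    FROM dbo.tbl CROSS APPLY xmldata.nodes('/root/r') AS t(c);";

using (SqlConnection conn = new SqlConnection("ConnectionString"))
using (SqlCommand cmd = new SqlCommand(sql, conn))
{
    conn.Open();
    using (SqlDataReader dr = cmd.ExecuteReader())
    {
        while (dr.Read())
        {
            // Each row is already flat and tiny; no 100 MB XML blob ever reaches the client.
            Console.WriteLine($"{dr.GetInt32(0)} {dr.GetString(1)} {dr.GetString(2)}");
        }
    }
}
```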
