
I am generating XML from 60 tables and storing this XML in a table.

Table Name : Final_XML_Table

PK   FK    XML_Content (type xml)
1     1     "XML that I am generating from 60 tables"

When I run the query below, it throws a memory exception:

Select * from Final_XML_Table

Things I have tried:

1. Results to Text: I get only a few lines of the XML as text in the output window.
2. Results to File: I get only a few lines of the XML in the file.

Please suggest a solution, and also: if any setting has to change, will I have to make the same change on the server's SQL Server as well during deployment?

I have also set the XML data option to Unlimited (see the attached snapshot).

  • Try right-clicking in the query window, select "Options" from the context menu and open the "Grid" area under "Results". There is an option for XML data. You can set this to "Unlimited". This will use all the memory that can be handled. The default might be too small for your XML... Commented Aug 8, 2018 at 10:31
  • I have set XML data to Unlimited; it is still giving the same error. Commented Aug 8, 2018 at 11:06
  • Did you restart SSMS? Commented Aug 8, 2018 at 11:31
  • Yes, but no luck :( ... still getting the same error. Commented Aug 8, 2018 at 12:12

2 Answers


This is not an answer, but it is too much for a comment...

The fact that you are able to store the XML shows clearly that the XML is not too big for the database.

The fact that you get an out-of-memory exception with Select * from Final_XML_Table shows clearly that SSMS has a problem reading/displaying your XML.

You might try to check the length like this:

DECLARE @tbl TABLE (x XML);
INSERT INTO @tbl VALUES('<root><test>blah</test><test /><test2><x/></test2></root>');

SELECT * FROM @tbl;              --This does not work for you
SELECT DATALENGTH(x) FROM @tbl;  --This returns just "82" in this case

It might be that, due to a logical error in your XML's creation (a wrong join?), the XML contains multiple/repeated elements. You might try a query like this to get a count of nodes in order to check whether this number is realistic:

SELECT x.value('count(//*)','int') FROM @tbl

For the example above this returns "5".

You might do the same with your original XML.

With a query like the following you can retrieve all node names of the first level, the second level and so on. You can check if this looks okay:

SELECT firstLevel.value('local-name(.)','varchar(max)') AS l1_node
      ,secondLevel.value('local-name(.)','varchar(max)') AS l2_node
      --add more
FROM @tbl
OUTER APPLY x.nodes('/*') AS A(firstLevel)
OUTER APPLY A.firstLevel.nodes('*') AS B(secondLevel)
--add more

And, of course, you might open Resource Monitor to look at the actual memory usage...

Come back with more details...


5 Comments

Thank you so much... DATALENGTH is 37708889 and the node count is 928720. I checked down to the 11th level; it looks good, and up to the 11th level it gives around 710145 records.
The first and second levels are exactly what I am expecting.
@JFI The fact that my statements work shows clearly: SSMS can deal with larger XML, as the processing is done by SQL Server, but it cannot display this size, as it is limited to 32-bit addresses.
Yeah, that's true... but in that case how am I able to see the original XML in SSMS, the one I am using to insert the data into the tables? And the expectation is that this XML is the same as the one I am generating.
@JFI Did you check DATALENGTH of the original?
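For that check, a minimal sketch of the comparison might look like this (the file path and the PK filter are assumptions, and OPENROWSET (BULK ...) needs the file to be readable by the SQL Server service account; note that DATALENGTH on an xml column measures SQL Server's internal binary representation, so it will not match the raw file size byte for byte):

-- Hypothetical file path and key value; adjust to your environment
DECLARE @original VARBINARY(MAX) =
    (SELECT BulkColumn
     FROM OPENROWSET(BULK 'C:\temp\original.xml', SINGLE_BLOB) AS f);

SELECT DATALENGTH(@original)     AS OriginalFileBytes   -- size of the source file
      ,DATALENGTH(t.XML_Content) AS GeneratedXmlBytes   -- internal size of the regenerated XML
FROM Final_XML_Table AS t
WHERE t.PK = 1;                  -- assumed key value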

That error isn't a SQL Server error; it's from SSMS. It means that SSMS has run out of memory.

SSMS is only a 32-bit application, so it can only address 2 GB of RAM. If it tries to address more than that, the error will occur. If you've had SSMS open and returned some very large datasets, that RAM is going to get used up.

In all honesty, if you're running a query like SELECT * FROM Final_XML_Table then I would hazard a guess that the dataset is huge. Add a WHERE clause, or don't return the dataset on screen. If you really need to view the data (all of it), export it to something else. But I very much doubt you need to look at every row if you're returning around 2 GB of data.
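As a rough illustration of that idea (table and column names taken from the question), you can inspect the table's keys and sizes without forcing SSMS to render the XML at all:

-- Row keys and XML sizes only; nothing here asks SSMS to display the XML value
SELECT PK, FK, DATALENGTH(XML_Content) AS XmlBytes
FROM Final_XML_Table;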

3 Comments

The table only has 1 record, and it contains XML of around 58k lines.
@JFI If you've been running other queries, that would also increase the RAM that SSMS is using (as it sometimes has a habit of not releasing it). 58K lines means nothing on its own, though. If every line only had 1-10 characters it wouldn't be that big, but if each one had 500-1,000 (white space (' ') counts as a character too) then the size is significantly bigger. Just like a table with 1 row and 1 column of nvarchar(MAX) filled to capacity is far larger than a table with 8 columns of the datatype bit with 80 million rows. (The former is 2 GB, while the latter is ~80 MB.)
I have a 3 MB XML file. I have inserted this XML's data into 60 tables, and now I am regenerating XML from the tables. I have generated the XML for 59 tables correctly, and when I generate it for the final table, which should contain the full XML, it gives the error. So the expected size of the XML is 3 MB, which is what was inserted into the xml-typed column.
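If the regenerated XML really matched the 3 MB source, a quick size check (PK value assumed) should come back in that ballpark rather than the roughly 36 MB implied by the DATALENGTH reported above:

-- Approximate size in MB of the stored XML; compare against the 3 MB source file
SELECT DATALENGTH(XML_Content) / 1048576.0 AS ApproxSizeMB
FROM Final_XML_Table
WHERE PK = 1;   -- assumed key value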
