
I'm currently hitting out-of-memory errors with the code shown here, and I want to switch my SQLiteDataReader to CommandBehavior.SequentialAccess to see if that helps.

I originally stumbled across this via the following answer - https://stackoverflow.com/a/15146498

public static void PopulateDataTable(DataTable dt, SQLiteDataReader reader, TableSchema schema)
{
    while (reader.Read())
    {
        var row = dt.NewRow();

        for (int i = 0; i < schema.ColumnCount; ++i)
        {
            if (schema[i].IsBigInteger)
            {
                row[i] = new BigInteger((byte[])reader[i]);
            }
            else
            {
                row[i] = reader[i];
            }
        }

        dt.Rows.Add(row);
    }
}

I'm pretty sure that for the string columns I can use

row[i] = reader.GetString(i);

but I'm unsure how best to convert the BigInteger branch. Should I actually be utilising GetBytes() with a buffer and offset to see any benefit?

Any advice is greatly appreciated.
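For reference, a minimal sketch of what the GetBytes approach might look like, assuming the column is a BLOB holding the little-endian bytes that the BigInteger(byte[]) constructor expects, and that the reader was opened with CommandBehavior.SequentialAccess (the command and schema variables are illustrative, not from the original code; note the comment below questioning whether this actually saves memory with SQLite):

```csharp
// Sketch only: requires an open SQLiteCommand; SequentialAccess means
// columns must be read in ascending index order, which this loop does.
using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
{
    while (reader.Read())
    {
        var row = dt.NewRow();

        for (int i = 0; i < schema.ColumnCount; ++i)
        {
            if (schema[i].IsBigInteger)
            {
                // Calling GetBytes with a null buffer returns the BLOB
                // length; a second call then copies the bytes in one pass.
                long length = reader.GetBytes(i, 0, null, 0, 0);
                var buffer = new byte[length];
                reader.GetBytes(i, 0, buffer, 0, (int)length);
                row[i] = new BigInteger(buffer);
            }
            else
            {
                row[i] = reader[i];
            }
        }

        dt.Rows.Add(row);
    }
}
```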

  • What is the actual data type of the column coming in, is it BLOB? Commented May 14, 2024 at 19:53
  • What you are doing currently is inefficient for other reasons, and your purported solution of GetString is worse. Instead maybe you can try GetStream although I don't think it will be more efficient. Note the linked answer is for SqlClient for SQL Server, not SQLite. DataTable is already inefficient because it boxes everything, and it dynamically resizes. You would be better off using a proper object model. Creating a BigInteger from a byte array is going to cause an array copy, but there is nothing you can do about that (unless you use nasty reflection). Commented May 14, 2024 at 20:07
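To illustrate the second comment's suggestion of a proper object model in place of DataTable, here is a hedged sketch: a typed class avoids boxing every value into object. All names here (AccountRow, Name, Balance, and the two-column layout) are hypothetical, not from the original code:

```csharp
// Sketch of a typed row replacing DataTable; assumes column 0 is TEXT
// and column 1 is a BLOB of little-endian BigInteger bytes.
public sealed class AccountRow
{
    public string Name { get; set; }
    public BigInteger Balance { get; set; }
}

public static List<AccountRow> ReadRows(SQLiteDataReader reader)
{
    var rows = new List<AccountRow>();

    while (reader.Read())
    {
        rows.Add(new AccountRow
        {
            Name = reader.GetString(0),
            // The cast still copies the byte[] into the BigInteger,
            // which, per the comment above, is unavoidable here.
            Balance = new BigInteger((byte[])reader[1])
        });
    }

    return rows;
}
```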

