I'm currently hitting out-of-memory errors with the code shown here, and I want to switch my SQLiteDataReader to CommandBehavior.SequentialAccess to see if that helps.
I originally stumbled across this via the following answer - https://stackoverflow.com/a/15146498
public static void PopulateDataTable(DataTable dt, SQLiteDataReader reader, TableSchema schema)
{
    while (reader.Read())
    {
        var row = dt.NewRow();
        for (int i = 0; i < schema.ColumnCount; ++i)
        {
            if (schema[i].IsBigInteger)
            {
                row[i] = new BigInteger((byte[])reader[i]);
            }
            else
            {
                row[i] = reader[i];
            }
        }
        dt.Rows.Add(row);
    }
}
I'm pretty sure that for the text columns I can use

row[i] = reader.GetString(i);

but I'm unsure how best to convert the line that reads the BigInteger, since that column is a byte[] BLOB. Should I actually be using GetBytes() with a buffer and offset to see any benefit?
Any advice is greatly appreciated.
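For reference, here's roughly what I had in mind (untested sketch; I'm assuming GetBytes() with a null buffer returns the BLOB length, and reusing the TableSchema type from my code above):

```csharp
using System;
using System.Data;
using System.Data.SQLite;
using System.Numerics;

public static class SequentialLoader
{
    public static void PopulateDataTableSequential(DataTable dt, SQLiteCommand cmd, TableSchema schema)
    {
        // SequentialAccess means columns must be read in increasing ordinal
        // order, and each column can only be read once per row.
        using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            while (reader.Read())
            {
                var row = dt.NewRow();
                for (int i = 0; i < schema.ColumnCount; ++i)
                {
                    if (schema[i].IsBigInteger)
                    {
                        // Null buffer asks for the total BLOB length, then the
                        // second call copies the bytes out in one go.
                        long len = reader.GetBytes(i, 0, null, 0, 0);
                        var bytes = new byte[len];
                        reader.GetBytes(i, 0, bytes, 0, (int)len);
                        row[i] = new BigInteger(bytes);
                    }
                    else
                    {
                        row[i] = reader[i];
                    }
                }
                dt.Rows.Add(row);
            }
        }
    }
}
```

Is that the right shape, or is the length-probe call wasted effort here?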
BLOB? GetString is worse. Instead maybe you can try GetStream, although I don't think it will be more efficient. Note that the linked answer is for SqlClient for SQL Server, not SQLite.

DataTable is already inefficient because it boxes everything, and it dynamically resizes; you would be better off using a proper object model. Creating a BigInteger from a byte array is going to cause an array copy, but there is nothing you can do about that (unless you use nasty reflection).
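To illustrate the object-model point, here is a minimal sketch; the Record class and its two columns are invented for the example, not taken from your schema:

```csharp
using System.Collections.Generic;
using System.Data.SQLite;
using System.Numerics;

// A typed row: each field has a concrete type, so there is no per-cell
// boxing and no DataTable resizing as rows accumulate.
public sealed class Record
{
    public BigInteger Value; // hypothetical BigInteger BLOB column (ordinal 0)
    public string Name;      // hypothetical text column (ordinal 1)
}

public static class RecordReader
{
    public static List<Record> ReadRecords(SQLiteDataReader reader)
    {
        var list = new List<Record>();
        while (reader.Read())
        {
            list.Add(new Record
            {
                // Still one unavoidable array copy into BigInteger.
                Value = new BigInteger((byte[])reader[0]),
                Name = reader.GetString(1)
            });
        }
        return list;
    }
}
```

If you know the row count up front, passing a capacity to the List constructor also avoids the dynamic resizing you currently pay for inside DataTable.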