If I execute the following PowerShell command:
Invoke-Sqlcmd `
-ServerInstance '(local)' -Database 'TestDatabase' `
-Query "select top 1000000 * from dbo.SAMPLE_CUSTOMER" | Out-Null
I see my memory usage go through the roof: the process uses about 1 GB of memory.
If I run the command again, memory grows to 1.8 GB, then drops to about 800 MB (garbage collection?) and starts growing again.
I tried to reduce the memory footprint of the PowerShell shell and plugins to 100MB by following the steps in the article http://blogs.technet.com/b/heyscriptingguy/archive/2013/07/30/learn-how-to-configure-powershell-memory.aspx, but memory still grows far beyond the configured 100MB.
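For reference, the configuration step from that article boils down to lowering the WSMan shell quota (a sketch; note that this quota governs remote WinRM shells, not necessarily a local console session, which may explain why it appears to be ignored here):

```powershell
# Lower the per-shell memory quota for WinRM-hosted PowerShell sessions.
# This is the MaxMemoryPerShellMB setting referenced above.
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 100

# Restart the WinRM service so the new quota takes effect.
Restart-Service WinRM
```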
I have some questions:
- Why does PowerShell not respect the memory limit set by MaxMemoryPerShellMB?
- Why does Invoke-Sqlcmd eat memory, and why doesn't it "forget" the records already processed in the pipeline?
- Why does the PowerShell process not reclaim memory automatically when it finishes processing?
- How can I process many SQL records without a large memory footprint?
You can't do this with Invoke-SqlCmd without severely pressuring memory. That cmdlet instantiates every record in memory before it passes anything to Out-Null. It has to put that data somewhere, so it will use memory to do it, and the objects won't disappear until garbage collection gets to them, which is why you see so much memory usage. Invoke-SqlCmd just works this way: it packs all retrieved rows into a single DataSet object, which is why they don't get "discarded on the fly" as the pipeline processes them.

If you really need to process this many rows, use System.Data.SqlClient.SqlDataReader instead.

Regarding MaxMemoryPerShellMB: the behavior is unsupported if MaxMemoryPerShellMB is set to a value that is less than the default.
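A minimal sketch of the SqlDataReader approach, reusing the server, database, and table names from the question (the connection string and per-row handling are illustrative assumptions):

```powershell
# Stream rows one at a time instead of materializing them all in a DataSet.
# Assumes integrated security; adjust the connection string as needed.
$conn = New-Object System.Data.SqlClient.SqlConnection(
    'Server=(local);Database=TestDatabase;Integrated Security=True')
$conn.Open()
try {
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'SELECT TOP 1000000 * FROM dbo.SAMPLE_CUSTOMER'
    $reader = $cmd.ExecuteReader()
    while ($reader.Read()) {
        # Each iteration sees exactly one row; previous rows are not retained,
        # so memory stays flat regardless of how many rows the query returns.
        $firstColumn = $reader.GetValue(0)
        # ... process $firstColumn (or other columns) here ...
    }
    $reader.Close()
}
finally {
    $conn.Close()
}
```

Because SqlDataReader is a forward-only stream, nothing accumulates between iterations, which is exactly the "forgetting" behavior the question expects from the pipeline.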