I have a performance issue with the code below. I want to extract some information from a JSON file and write it to a CSV. The JSON file itself has around 200k lines, and the conversion currently takes over an hour.
I think the problem might be the Add-Content call, since it runs once per record and the output goes to a normal HDD (I've put a rough idea of what I mean after the code below). Could you please let me know if you see any improvements or changes I could make?
$file = "$disk\TEMP\" + $mask
$res = (Get-Content $file) | ConvertFrom-Json
$file = "$disk\TEMP\result.csv"
Write-Host "Creating CSV from JSON" -ForegroundColor Green
Add-Content $file ("{0},{1},{2},{3},{4}" -f "TargetId", "EventType", "UserId", "Username", "TimeStamp")
$l = 0
foreach ($line in $res) {
    if ($line.EventType -eq 'DirectDownloadCompleted' -and $line.TargetDefinition -eq 'GOrder') {
        # skip GOrder download events, nothing is written for them
    } elseif ($line.EventType -eq 'DirectDownloadCompleted' -and $line.TargetDefinition -eq 'GFile') {
        Add-Content $file ("{0},{1},{2},{3},{4}" -f $line.AssetId, $line.EventType, $line.UserId, $line.UserName, $line.TimeStamp)
        $l = $l + 1
    } else {
        Add-Content $file ("{0},{1},{2},{3},{4}" -f $line.TargetId, $line.EventType, $line.UserId, $line.UserName, $line.TimeStamp)
        $l = $l + 1
    }
}
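One idea I had, based on my suspicion that reopening the file for every Add-Content is the bottleneck: open the output once with a System.IO.StreamWriter and write each line through it. This is a rough, untested sketch of what I mean, with the same filtering logic as above:

$writer = New-Object System.IO.StreamWriter "$disk\TEMP\result.csv"
try {
    $writer.WriteLine("TargetId,EventType,UserId,Username,TimeStamp")
    foreach ($line in $res) {
        # skip GOrder download events, as in the original loop
        if ($line.EventType -eq 'DirectDownloadCompleted' -and $line.TargetDefinition -eq 'GOrder') { continue }
        # GFile download events carry their id in AssetId instead of TargetId
        $id = if ($line.EventType -eq 'DirectDownloadCompleted' -and $line.TargetDefinition -eq 'GFile') { $line.AssetId } else { $line.TargetId }
        $writer.WriteLine(("{0},{1},{2},{3},{4}" -f $id, $line.EventType, $line.UserId, $line.UserName, $line.TimeStamp))
        $l = $l + 1
    }
} finally {
    $writer.Close()
}

I also wondered whether reading the file with Get-Content -Raw (so ConvertFrom-Json receives one string instead of an array of lines) would speed up the parsing step, but I haven't measured either change. Is this the right direction, or is there a more idiomatic fix?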