I need to parse some CSV data and store it in the database. The input file has about 15,000 rows. The app is written in the Laravel framework. The DB request takes about 0.2s, so it seems the problem is in the CSV parser. Can somebody tell me how to optimize this code in PHP? The code looks like this:
    protected function csv2array(string $csv)
    {
        try {
            $return = [
                'headers' => [],
                'rows' => [],
            ];
            $rows = explode(PHP_EOL, $csv);
            $headers = str_getcsv(strtolower(array_shift($rows))); // Headers + strtolower()
            $return['headers'] = $headers;
            foreach ($rows as $row) {
                $items = str_getcsv($row);
                if (count($items) !== count($headers)) {
                    continue;
                }
                $items[2] = new Carbon($items[2]); // Third item is UTC datetime
                $items[3] = new Carbon($items[3]); // Fourth item is UTC datetime
                $items = array_combine($headers, $items);
                $return['rows'][] = $items;
            }
            return $return;
        } catch (Exception $e) {
            Log::error($e->getMessage());
            throw $e;
        }
    }
The parent code which calls csv2array() looks like this:
    $csv = $request->getContent();
    $csv = trim($csv);
    $csvArray = $this->csv2array($csv);
    $insertArray = $this->addDeviceIdToArray($csvArray['rows'], $device); // This is fast, about 0.2s
Answer: `$rows = explode(PHP_EOL, $csv);` will keep the complete file content, in array form, in memory. Using `fgetcsv()`, which reads line by line from a file, might perform better. It looks like you don't have a file here, but receive the CSV content posted to your app directly? Then you could either use `php://input` to access it as a readable stream (depending on what exactly you are sending), or write the data to a temporary file first.
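A minimal sketch of the stream-based idea, assuming the CSV still arrives as the raw request body and the same Carbon import as in the question. The method name streamCsv2array and the use of a `php://temp` stream are illustrative, not part of the original code:

    use Carbon\Carbon;

    protected function streamCsv2array(string $csv): array
    {
        // Copy the posted body into a temp stream so fgetcsv() can read it
        // row by row, instead of exploding all 15,000 rows into one array.
        $handle = fopen('php://temp', 'r+');
        fwrite($handle, $csv);
        rewind($handle);

        $return = ['headers' => [], 'rows' => []];

        // Assumes the stream contains at least a header row.
        $headers = array_map('strtolower', fgetcsv($handle));
        $return['headers'] = $headers;

        while (($items = fgetcsv($handle)) !== false) {
            if (count($items) !== count($headers)) {
                continue; // Skip malformed or blank lines
            }
            $items[2] = new Carbon($items[2]); // Third item is UTC datetime
            $items[3] = new Carbon($items[3]); // Fourth item is UTC datetime
            $return['rows'][] = array_combine($headers, $items);
        }

        fclose($handle);

        return $return;
    }

This still receives the body as a string once. To avoid even that copy, `$request->getContent(true)` (Symfony's HttpFoundation, which Laravel uses) returns the body as a stream resource that could be passed to `fgetcsv()` directly, or the body could be written to a temporary file and opened with `fopen()`.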