
I'm running an artisan command that starts a Laravel Excel import, but it exhausts the memory limit after a while. I'm using chunked reading and it worked before, but now I'm struggling to make it work. The input is a group of files located in a folder on the filesystem.

This is the command that I run with artisan:

public function handle()
{
    $directory = 'pv';
    // all files in the "pv" directory on the default disk
    $files = Storage::allFiles($directory);
    \Log::info('Process started.');
    $start = microtime(true);
    ini_set('max_execution_time', 600);

    foreach ($files as $file) {
        $fname = basename($file);
        \Log::info('Processing', [$fname]);
        // the date is taken from the third space-separated segment of the filename
        $arr = explode(" ", $fname);
        $day = substr($arr[2], 0, 10);
        $date = Carbon::parse($day);
        Excel::queueImport(new POSImport($date), $file);
    }

    $time = microtime(true) - $start;
    $me = '[email protected]';
    $msg = 'Process finished in ' . $time . ' secs.';
    Mail::to($me)->queue(new TasksFinished($msg));

    $this->call('calcular:previos', [
        '--queue' => 'default',
    ]);
}

It runs out of memory.

This is the import class:

<?php

namespace App\Imports;

use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Illuminate\Contracts\Queue\ShouldQueue;
use App\Pos;
use App\Device;
use DateTime;

class POSImport implements ToCollection, WithHeadingRow, WithChunkReading, ShouldQueue
{
    public $tries = 3;

    /** @var DateTime date parsed from the filename */
    protected $date;

    public function __construct(DateTime $date)
    {
        $this->date = $date;
    }

    /**
     * Imports data from the Banco Estado "puntos vigentes" (active points) spreadsheet.
     *
     * Works on Device (equipment) and Pos records.
     *
     * @param Collection $rows
     */
    public function collection(Collection $rows)
    {
        ini_set('max_execution_time', 600);

        foreach ($rows as $row) {
            // skip rows without a "marca" value
            if (!isset($row['marca'])) {
                return null;
            }

            // look up the POS (location) in the database; create it if it doesn't exist yet
            $pos = Pos::where('id', $row['pos'])->first();
            if (!$pos) {
                $pos = new Pos;
                $pos->id = $row['pos'];
            }
            $pos->vigente = ($row['estado'] == 'VIGENTE');
            $pos->save();

            // strip leading zeros from the serial number
            $serial = ltrim($row['serie_equipo'], '0');

            // look up the device by serial in the database
            $device = Device::where('serial', $serial)
                    ->where('fecha_recepcion', '<', $this->date)
                    ->where('customer_id', 1)
                    ->orderBy('fecha_recepcion', 'asc')
                    ->first();

            if ($device && $device->pos_id != $row['pos'] && $device->fecha_instalacion != $this->date) {
                // update the device's POS and installation date
                $device->pos_id = $pos->id;
                $device->fecha_instalacion = $this->date;
                $device->save();

                $device->pos()->attach($pos);
            }
        }
    }

    public function chunkSize(): int
    {
        return 2000;
    }
}

As you can see, I'm using WithChunkReading and ShouldQueue. When I ran this process in the past it just processed the chunks, but now the queue shows lots of QueueImport entries as well.

I'm using the database as the queue driver.

I hope you can help me out with this.

Error in the command:

Symfony\Component\Debug\Exception\FatalErrorException  : Allowed memory size of 536870912 bytes exhausted (tried to allocate 175747072 bytes)

  at C:\laragon\www\reportes\vendor\laravel\framework\src\Illuminate\Queue\Queue.php:138
    134|
    135|         return array_merge($payload, [
    136|             'data' => [
    137|                 'commandName' => get_class($job),
  > 138|                 'command' => serialize(clone $job),
    139|             ],
    140|         ]);
    141|     }
    142|


   Whoops\Exception\ErrorException  : Allowed memory size of 536870912 bytes exhausted (tried to allocate 175747072 bytes)

  at C:\laragon\www\reportes\vendor\laravel\framework\src\Illuminate\Queue\Queue.php:138
    134|
    135|         return array_merge($payload, [
    136|             'data' => [
    137|                 'commandName' => get_class($job),
  > 138|                 'command' => serialize(clone $job),
    139|             ],
    140|         ]);
    141|     }
    142|

It's a lot of data, which is why I'm using chunks and queues, but I still have this problem.

1 Answer

You can set a timeout on the queued import job itself:
class POSImport implements ShouldQueue
{
    /**
     * The number of seconds the job can run before timing out.
     *
     * @var int
     */
    public $timeout = 120;
}

Also, if you want to increase your queue worker's timeout, you can use the --timeout flag (the default is 60 seconds):

php artisan queue:work --timeout=300
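A related detail with the database queue driver: the retry_after value in config/queue.php generally needs to be larger than the worker's --timeout, or long-running chunk jobs may be picked up a second time while still processing. A minimal sketch of that setting (the number is just an example):

// config/queue.php — "database" connection
'database' => [
    'driver' => 'database',
    'table' => 'jobs',
    'queue' => 'default',
    'retry_after' => 600, // should exceed the queue:work --timeout value
],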


I'm not sure about this, but it may also work:

$this->call('calcular:previos', [
    '--queue' => 'default',
    '--timeout' => '300'
]);


9 Comments

The problem is it still exhausts the memory limit, and it didn't before, even though the files themselves haven't changed.
[2019-03-18 14:56:59][312] Processed: Maatwebsite\Excel\Jobs\QueueImport
[2019-03-18 14:57:04][313] Processing: Maatwebsite\Excel\Jobs\ReadChunk
[2019-03-18 14:57:56][313] Processed: Maatwebsite\Excel\Jobs\ReadChunk
[2019-03-18 14:58:00][314] Processing: Maatwebsite\Excel\Jobs\QueueImport
whereas before only ReadChunk jobs were executed.
Whoops\Exception\ErrorException : Allowed memory size of 536870912 bytes exhausted (tried to allocate 175747072 bytes) on the command and Symfony\Component\Process\Exception\ProcessTimedOutException : The process "C:\laragon\bin\php\php-7.2.11-Win32-VC15-x64\php.exe artisan queue:work --once --queue=default --delay=0 --memory=512 --sleep=3 --tries=3" exceeded the timeout of 60 seconds. on the queue
Hmm, I suspect it might be a fastcgi issue. In your nginx config, inside the location ~ \.php$ { } block, add fastcgi_read_timeout 3600; at the bottom.
I'm using Apache. Now the chunks are getting read, but the command still runs out of memory.
Can you try ini_set('memory_limit', '512M');? It's related to PHP's memory allocation.
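A minimal sketch of where that suggestion could go, assuming it's placed at the top of the command's handle() next to the existing max_execution_time call. Note the error above already shows a 512M limit (536870912 bytes), so the value would need to be higher than that to change anything; 1024M below is just an example:

public function handle()
{
    // raise limits for this long-running CLI process (values are examples)
    ini_set('max_execution_time', 600);
    ini_set('memory_limit', '1024M');

    // ... rest of handle() as shown above
}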
