
I'm trying to insert data (160,000+ rows) using INSERT INTO and PHP PDO, but I have a bug.

When I run the PHP script, more rows end up in my database than there are lines in my CSV file.

Can someone tell me whether my loop is incorrect, or if something else is wrong?

Here is the code I have:

$bdd = new PDO('mysql:host=<myhost>;dbname=<mydb>', '<user>', '<pswd>');

// I clean the table
$req = $bdd->prepare("TRUNCATE TABLE lbppan_ticket_reglements;");
$req->execute();

// I read the CSV file and import it line by line
$handle = fopen('<pathToMyCsvFile>', "r");

while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    $reqImport =
        "INSERT INTO lbppan_ticket_reglements
        (<my31Columns>)
        VALUES
        ('$data[0]','$data[1]','$data[2]','$data[3]','$data[4]','$data[5]','$data[6]','$data[7]','$data[8]',
        '$data[9]','$data[10]','$data[11]','$data[12]','$data[13]','$data[14]','$data[15]','$data[16]',
        '$data[17]','$data[18]','$data[19]','$data[20]','$data[21]','$data[22]','$data[23]','$data[24]',
        '$data[25]','$data[26]','$data[27]','$data[28]','$data[29]','$data[30]')";

    $req = $bdd->prepare($reqImport);
    $req->execute();
}

fclose($handle);

The script partly works, because the data does end up in the table, but I don't know why it inserts extra rows. I suspect that, maybe because of the file size (18 MB), the script crashes and restarts, inserting the same rows again.

I can't use LOAD DATA on the server I'm using.

Thanks for your help.

  • Unlikely to be it, but you have <my30Columns> and 31 values. Commented Dec 21, 2015 at 13:27
  • Instead of working with one big file, break it up and see if you can catch the error. Commented Dec 21, 2015 at 13:27
  • @NeilMasters My bad, I have 31 columns. Commented Dec 21, 2015 at 13:29
  • Can you import an .sql file on your server? Commented Dec 21, 2015 at 13:34
  • @tq I tried with a smaller file and it works perfectly; maybe I will look for a solution along those lines... Commented Dec 21, 2015 at 13:36

2 Answers


This is not an answer, but adding this much into comments is quite tricky.

Start by upping the maximum execution time (PHP's max_execution_time setting).
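For example, a minimal sketch (the exact limit is an assumption and depends on your hosting; put it at the top of the script):

set_time_limit(0);                       // 0 = no time limit for this run
// or raise the limit to something generous instead:
// ini_set('max_execution_time', '600'); // seconds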

If that does not solve your issue, start working your way through the code line by line and handle every exception you can think of. For example, you are truncating the table BUT you say you have loads more data after execution: could the truncate be failing?

try {
    $req = $bdd->prepare("TRUNCATE TABLE lbppan_ticket_reglements;");
    $req->execute();
} catch (\Exception $e) {
    exit($e->getMessage()); // Die immediately for ease of reading
}

Not the most graceful of try/catches, but it will allow you to easily spot a problem. You can also apply this to the insert query that follows...

try {
    $req = $bdd->prepare($reqImport);
    $req->execute();
} catch (\Exception $e) {
    exit($e->getMessage());
}
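One more thing worth checking (an assumption about your PHP/PDO defaults): a catch block around PDO calls only fires if the connection is set to throw exceptions, and on older PHP versions the default error mode is silent. Right after creating $bdd you could add:

// Make PDO throw a PDOException on every error instead of failing silently
$bdd->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);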

Also stick in some diagnostics: are you actually inserting 160k rows? You could optionally echo out $i on each loop and see if you can spot any breaks or abnormalities.

$i  = 0;
while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    // ... your stuff
    $i++;
}

echo "Rows inserted " . $i . "\n\n"; 

Going beyond that, you can have the loop print out the SQL it builds so you can look at it manually; perhaps it's doing something weird and fruity.
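For instance, a sketch reusing the $reqImport variable from the question:

while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    // ... build $reqImport exactly as before ...
    echo $reqImport . "\n"; // inspect the generated SQL before (or instead of) executing it
}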

Hope that helps.


3 Comments

I have a global try/catch in my script, but the page stops with no errors; that's why I was stuck. I think I will generate smaller files to avoid the execution time problem. Not a perfect solution, but it will probably work.
Try it out and see if you spot the issue with just one of the smaller files; if you can narrow it down it will help you massively. Trying to spot an issue in 160k SQL statements with 31 columns is fairly brutal :D
That's handy when you have nothing to say and have to deal with a weird boss. By the way, thanks for your help; the smaller files are working perfectly.

Assuming $data[0] is the unique identifier, you can try this to spot the offending row(s):

$i = 0;

while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    echo 'Row #'.++$i.' - '.$data[0]."\n"; // one line per CSV row, so duplicates or gaps stand out
}

Since you are interpolating the $data values directly into the SQL string rather than binding them as parameters, it is very possible that one of the $data array items is causing a double insert or some other unknown issue.
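As a sketch of what binding parameters could look like here (the <my31Columns> placeholder is from the question, and 31 values per row are assumed):

// Prepare once with 31 positional placeholders, then bind each CSV row.
$placeholders = implode(',', array_fill(0, 31, '?'));
$stmt = $bdd->prepare("INSERT INTO lbppan_ticket_reglements (<my31Columns>) VALUES ($placeholders)");

while (($data = fgetcsv($handle, 0, ',')) !== FALSE) {
    $stmt->execute(array_slice($data, 0, 31)); // PDO quotes/escapes the bound values
}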

