
I'm running a foreach loop in PHP which takes longer to execute than my maximum execution time of 30 seconds. The loop sends individual emails to users.

Instead of running cron jobs every 30 seconds and creating queues for records, is it unethical to just restart the counter in the loop using set_time_limit(30)?

$i = 0; //start count from 0

foreach ($users as $user): 

    //limit emails sent
    if(++$i == 100) break; //ends execution of loop

    set_time_limit(30); //restart timeout counter

    send_email($user); //send email to user

endforeach;

I'm new to this, but with the code above I think I'm giving each email 30 seconds to complete, while also breaking the loop once 100 emails are sent so the script doesn't run forever.

Update: set_time_limit(0) goes against my host's TOS. I had believed that restarting the timeout counter would restart the script itself, the way cron would.
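For what it's worth, here is a slightly more defensive sketch of the loop above. The stub send_email() returning a boolean and the sample data are assumptions for illustration only:

```php
<?php
// Sketch: reset the timeout before each send and record failures,
// so one slow or failing message doesn't abort the whole batch.
// send_email() here is a stub standing in for the real mailer.
function send_email(array $user): bool {
    // real code would call mail() or an SMTP library here
    return $user['email'] !== '';
}

$users = [
    ['id' => 1, 'email' => 'a@example.com'],
    ['id' => 2, 'email' => ''],              // will fail to send
    ['id' => 3, 'email' => 'c@example.com'],
];

$sent = 0;
$failed = [];
foreach ($users as $user) {
    if ($sent >= 100) {
        break;                   // per-run cap from the question
    }
    set_time_limit(30);          // fresh 30-second budget from this point
    if (send_email($user)) {
        $sent++;
    } else {
        $failed[] = $user['id']; // record failures instead of aborting
    }
}
echo "sent=$sent failed=" . count($failed) . "\n";
```

Note that set_time_limit(30) restarts the counter relative to the point where it is called, so each iteration effectively gets a fresh 30-second budget for the rest of the script.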

2 Comments
  • Unethical? No. Against your host's TOS? Maybe. Commented Jul 24, 2012 at 17:32
  • Are you able to run a cron job? A mail queue? Commented Jul 24, 2012 at 17:43

3 Answers


Running set_time_limit inside a foreach loop both brings and solves a few problems at the same time.

I see the greatest pro of this solution in making sure that no single request takes more than 30 seconds (and when you have a full queue, I believe it's even desirable to cut off every script that runs that long).

The problem it brings is that not all jobs will necessarily be executed. You may hit a problem in the middle of the job queue, and everything after it will fail.

I would go with this:

# crontab: run the script at minute 0 and 30 of every hour
0,30 * * * * php /path/to/your/script.php

And would use your script.
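The cron-driven worker could be sketched roughly like this. Everything here (the pending_emails table, the column names, the stub mailer, and the in-memory SQLite database standing in for a persistent one) is an assumption for illustration:

```php
<?php
// Sketch of a cron-safe worker: pull a bounded batch of unsent
// messages, send them, mark them sent, and report what is left.
$db = new PDO('sqlite::memory:'); // a real worker would use a persistent DB
$db->exec("CREATE TABLE pending_emails (id INTEGER PRIMARY KEY, address TEXT, sent INTEGER DEFAULT 0)");
$db->exec("INSERT INTO pending_emails (address) VALUES ('a@example.com'), ('b@example.com')");

// Stub mailer for illustration; real code would call mail() or an SMTP library.
function send_email(string $address): bool {
    return $address !== '';
}

$rows = $db->query("SELECT id, address FROM pending_emails WHERE sent = 0 LIMIT 100")
           ->fetchAll(PDO::FETCH_ASSOC);
foreach ($rows as $row) {
    set_time_limit(30); // fresh 30-second budget per message
    if (send_email($row['address'])) {
        $db->prepare("UPDATE pending_emails SET sent = 1 WHERE id = ?")
           ->execute([$row['id']]);
    }
}

$left = (int) $db->query("SELECT COUNT(*) FROM pending_emails WHERE sent = 0")->fetchColumn();
echo "remaining=$left\n"; // a real worker would exit($left > 0 ? 1 : 0)
```

Exiting non-zero while work remains is what lets a wrapper (cron, or the bash loop below) simply re-run the script until the queue is drained.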

If you need to execute jobs as fast as possible, I'd create a bash script that keeps executing the PHP script (without any timeout) until it finishes with exit(0) (all jobs executed successfully), or until it prints "Done!" or whatever you like.

Example bash script:

#!/bin/bash
# Re-run the PHP script until it exits with status 0 (all jobs done).
# "false" seeds $? with 1 so the loop body runs at least once.
false
while [ $? -ne 0 ]; do
    php /path/to/your/script.php >> log.log
done

And if you need to make sure no two instances run at the same time, look into one of these (just off the top of my head):

  • .pid file
  • MySQL LOCK TABLES

PS: Whichever method you use, make sure your script still works if it crashes in the middle of the queue.
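A minimal sketch of the lock-file idea in PHP, using flock() rather than a hand-rolled .pid file; the lock path is an assumption:

```php
<?php
// Sketch: take an exclusive, non-blocking lock on a file so that a
// second copy of the script exits immediately instead of doubling
// up on the mail queue. The lock path is illustrative.
$lockPath = sys_get_temp_dir() . '/send_emails.lock';
$lock = fopen($lockPath, 'c');

if (!flock($lock, LOCK_EX | LOCK_NB)) {
    fwrite(STDERR, "Another instance is already running\n");
    exit(1);
}

// ... process the mail queue here ...

flock($lock, LOCK_UN); // the lock is also released automatically on exit
fclose($lock);
```

Because the lock is advisory and tied to the process, a crashed worker releases it automatically, which avoids the stale-PID problem of plain .pid files.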


2 Comments

Thank you for your answer! If an error occurs for one email, the loop continues to the next email, and I'm recording the errors. I will look into the bash scripts you suggested; I don't have experience with that.
@CyberJunkie I've added bash example and some relevant sources

Just disable the time limit altogether at the start of your script:

set_time_limit(0);

2 Comments

Or set it to a more reasonable length of time.
I like his original solution better; it avoids any possible problems with a hanging mail server.

If your host's TOS precludes unlimited scripts, they're almost certainly going to object to resetting the time limit as well. The only choices would be to send the emails in parallel, or move to a different host.

1 Comment

I think I'll eventually use frequent cron jobs.
