I have an array stored in the site options in WordPress which will potentially contain up to 3000 email addresses. Every night I need to run some checks on each of those addresses, and I'm worried about what happens if the job fails partway through.
For safety I decided to do this in batches: at midnight I create a scheduled event which then runs every 2 minutes, processing 50 addresses per run.
Currently the function copies 50 items into a new array, which I then loop over to run the various processing bits I need to do:
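For context, the scheduling looks roughly like this (hook and schedule names here are placeholders, not my real ones; since WordPress has no built-in two-minute interval, one is registered through the `cron_schedules` filter):

```php
// Register a custom two-minute cron interval.
add_filter('cron_schedules', function ($schedules) {
    $schedules['tw_two_minutes'] = array(
        'interval' => 120, // seconds
        'display'  => 'Every two minutes',
    );
    return $schedules;
});

// At midnight, start the recurring event if it isn't already queued.
if (!wp_next_scheduled('tw_daily_list_hook')) {
    wp_schedule_event(time(), 'tw_two_minutes', 'tw_daily_list_hook');
}
add_action('tw_daily_list_hook', 'tw_process_daily_list_chunk');
```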
function tw_process_daily_list_chunk() {
    $tw = get_option("tw_settings");
    // Take the last 50 entries off the stored list and save the remainder back.
    $daily_list_chunk = array_splice($tw["daily_list"], -50);
    update_option("tw_settings", $tw);
    foreach ($daily_list_chunk as $playeremail) {
        do_various_things();
    }
    if (count($tw["daily_list"]) == 0) {
        do_stop_processing_stuff();
    }
}
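To double-check what that splice actually returns, here is its behaviour in plain PHP with no WordPress involved (a negative offset of -50 removes and returns the last 50 entries; a length of `count($list)`, or no length at all, just runs to the end of the array):

```php
<?php
// Stand-in for the stored daily_list: 120 dummy entries.
$list = range(1, 120);

// Remove the last 50 in place and get them back as a new array.
$chunk = array_splice($list, -50);

// $chunk now holds the values 71..120 (reindexed from 0);
// $list keeps the first 70 values.
```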
I'm now concerned that if this fails, the next time the scheduled event runs it will think the previous batch was processed when it may not have been. Is there any downside to working directly on the larger array from get_option, and updating that option after each address is handled, like this?
function tw_process_daily_list_chunk() {
    $tw = get_option("tw_settings");
    $chunksize = 50;
    $iteration = 0;
    foreach ($tw["daily_list"] as $key => $playeremail) {
        do_various_things();
        // Remove this address and save immediately, so a failure
        // loses at most the one entry currently being processed.
        unset($tw["daily_list"][$key]);
        update_option("tw_settings", $tw);
        $iteration++;
        if ($iteration == $chunksize) {
            break;
        }
    }
    if (count($tw["daily_list"]) == 0) {
        do_stop_processing_stuff();
    }
}
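A middle ground I've sketched but not yet tried (same option layout and helper functions as above): process the whole batch first, then persist the shortened list once. If anything fails mid-batch the option is untouched, so the worst case is re-running one batch of 50 rather than the whole list, and it avoids 50 update_option calls per run.

```php
function tw_process_daily_list_chunk() {
    $tw = get_option("tw_settings");
    $chunk = array_splice($tw["daily_list"], -50);
    foreach ($chunk as $playeremail) {
        do_various_things();
    }
    // Save the shortened list only after the whole batch succeeded;
    // a crash above leaves the option unchanged and the batch re-runs.
    update_option("tw_settings", $tw);
    if (count($tw["daily_list"]) == 0) {
        do_stop_processing_stuff();
    }
}
```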