
I'm trying to take a list of 40,000 items and make a JSON request for each one. The process is very slow, which I suspect is because each JSON request finishes before the next one starts.

$result = mysql_query("SELECT * FROM $tableName");

while ($row = mysql_fetch_array($result)) {
    checkRank($row['Domain']);
}

checkRank is running something to the effect of:

$json = file_get_contents($jsonurl);

I'm thinking I could run 10 checkRanks at a time to speed the process up. Any other ideas or suggestions?

UPDATE:

For example, this loop runs through my array in 27 seconds.

for ($i = 0; $i <= 100; $i++) {
    checkRank($domains[$i]);
    $i++;
    checkRank($domains[$i]);
    $i++;
    checkRank($domains[$i]);
    $i++;
    checkRank($domains[$i]);
    $i++;
    checkRank($domains[$i]);
    $i++;
    echo "checking " . $i . "<br/>";
}

The loop below takes over 40 seconds with the same array.

for ($i = 0; $i <= 100; $i++) {
    checkRank($domains[$i]);
    echo "checking " . $i . "<br/>";
}

  • Is $jsonurl under your control or is it an external service? Commented Sep 11, 2012 at 19:29
  • stackoverflow.com/questions/124462/asynchronous-php-calls Commented Sep 11, 2012 at 19:30
  • @user1647347: Your code is vulnerable and not safe. Please read my recent post about SQL injections, you could find it interesting: stackoverflow.com/questions/11939226/… Commented Sep 11, 2012 at 19:32
  • $tableName is not a POST or GET variable; it's listed in my config.php and gets included. I don't think SQL injection is an issue here Commented Sep 11, 2012 at 19:37
  • $jsonurl is external, from the compete.com API Commented Sep 11, 2012 at 19:42

4 Answers


Not sure if this will help much because I don't work with PHP, but I did find this.

How can one use multi threading in PHP applications


1 Comment

Looks interesting, I'll have to test it out.

Unless there is something else that you haven't mentioned, the best way to do this would be to make one JSON request, pass all your items to it, and get an equal number of results back. That way, you minimize the server response time. I am not sure you want to send all 40,000 items at once, though; you might want to split it into batches, but you can test that later.

so your checkRank() would look something like the following (hypothetical: file_get_contents() does not take an array of data as its second argument, so the batch has to be encoded into the URL, assuming the API accepted a list of domains):

function checkRank($jsonurl, $domainsArray) {
    // Hypothetical: encode the whole batch into one request URL
    $url = $jsonurl . '?domains=' . urlencode(implode(',', $domainsArray));
    $json = file_get_contents($url);
}

6 Comments

Not sure exactly what you mean? Isn't that what I'm doing now? Can you elaborate please?
Right now you are calling a web service (or a server page) 40,000 times, which makes your script very slow. I would suggest modifying the server page that you call to return multiple results, not one.
I don't have control over that. The JSON URL is the compete.com API. I wrote the same script using Ajax and jQuery and it was much faster because I performed 10 Ajax requests at once. I was hoping to be able to do the same thing here.
Well, 10 requests is not 40,000. Consider changing your approach, since your scenario has changed, quantitatively speaking.
No, I'm not saying make 10 requests. Make 10 at a time instead of 1 at a time.

http://www.phpied.com/simultaneuos-http-requests-in-php-with-curl/

This seems to be a nice way to speed up the processing. Thanks for all the input guys.
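To make that concrete, here is a rough sketch of the approach from that article: drive a pool of cURL handles with curl_multi so that 10 requests are in flight at once. The function name and batch size are illustrative; the URLs would be whatever compete.com endpoints you are actually calling.

```php
<?php
// Sketch: fetch many URLs concurrently with curl_multi,
// processing them in batches of $batchSize instead of one at a time.
function fetchConcurrently(array $urls, $batchSize = 10) {
    $results = [];
    foreach (array_chunk($urls, $batchSize, true) as $batch) {
        $mh = curl_multi_init();
        $handles = [];
        foreach ($batch as $key => $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 10);
            curl_multi_add_handle($mh, $ch);
            $handles[$key] = $ch;
        }
        // Drive all handles in this batch until every transfer finishes
        do {
            $status = curl_multi_exec($mh, $running);
            if ($running) {
                curl_multi_select($mh);
            }
        } while ($running && $status == CURLM_OK);
        foreach ($handles as $key => $ch) {
            $results[$key] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    return $results;
}
```

With an external API you can't change, this is usually the practical option: total time drops from roughly (number of domains × per-request latency) to roughly (number of batches × per-request latency).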



... it's a shame, I read on Stack Overflow today that PHP could not support threading as it would require fundamental changes to the language ...

https://github.com/krakjoe/pthreads
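For reference, the linked pthreads extension does enable real threads, but only on a thread-safe (ZTS) build of PHP with the extension loaded, so the following is a sketch under that assumption; the class name and the source of $urls are made up for illustration:

```php
<?php
// Sketch only: requires a ZTS PHP build with the pthreads extension.
// $urls is assumed to be your list of request URLs.
class RankWorker extends Thread {
    public $url;
    public $json;

    public function __construct($url) {
        $this->url = $url;
    }

    public function run() {
        // Each worker fetches its own URL in parallel
        $this->json = file_get_contents($this->url);
    }
}

// Start 10 workers at once, then collect the results
$workers = [];
foreach (array_slice($urls, 0, 10) as $url) {
    $w = new RankWorker($url);
    $w->start();
    $workers[] = $w;
}
foreach ($workers as $w) {
    $w->join();
    // $w->json now holds the response body
}
```

Given the extension requirement, the curl_multi approach above is the easier fit for a stock PHP setup.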

