
I'm trying to figure out a way to make some web pages more efficient when processing and formatting the data from very large queries (just from the coding side of things). Currently PHP is being used to process the data, and my theory is that incorporating JavaScript and jQuery will significantly speed this up because it will offload processing from the server side to the client side. Is this true?

Here is a sample of the kind of processing that is going on in my webpage:

$query = "Select * from large_table";

// run the query
$result = maxdb_query($link, $query); 

// very large number of rows...
while($row = maxdb_fetch_assoc($result))
{
  echo "<tr>";

  $calculation1 = $row['col1'] + $row['col2'] / $row['col3'];
  $calculation2 = row['col1'] / row['col2'] - row['col4'];
  //... more calculations

  echo "<td> " + $row['col1'] + "</td>";
  echo "<td> " + $calculation1 + "</td>";
  echo "<td> " + $calculation2 + "</td>";
  // ... etc.

  echo "</tr>";
}

Thanks for the help.

1 Comment
  • If you are certain that JavaScript is going to be enabled on the client side, then I would definitely off-load the calculation processing to the client, but just understand that if they truly have an awful, old CPU then they will blame your site for being "slow". Commented Mar 12, 2014 at 19:34

4 Answers


The processing involved in the basic maths is insignificant compared to the processing involved in shifting large amounts of data around. You may see some improvement in bandwidth usage if you just send the numbers (as a JSON string, say) and have JavaScript render them into a table; as far as processing goes, however, it will make little to no difference.

Consider paginating your data instead, with LIMIT X,Y in MySQL.
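For example (a sketch only; the column names and page size are invented, and this assumes a database that understands MySQL's LIMIT syntax):

// fetch rows 201-300, i.e. page 3 at 100 rows per page
// MySQL's LIMIT takes "offset, row count"
$query = "SELECT col1, col2, col3, col4 FROM large_table LIMIT 200, 100";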


1 Comment

+1! OP, also consider not doing SELECT *, because most of the time you do not need all of the columns and it unnecessarily causes significantly more reading from the HDD.

Honestly the biggest improvement is going to be in limiting the number of rows you get back in your query. You can obviously do this with a WHERE clause if you can filter it down. But that might not help much if you have hundreds of thousands or millions of rows.

Instead I recommend limiting the output to 100 to 500 rows per call. Then if you need to see more you can simply page to the next set of results in several different ways. From an SQL query standpoint you can use LIMIT and OFFSET (or use "LIMIT {offset}, {row count}").
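The offset arithmetic for that could look like this (a sketch; the page parameter name is made up for the example):

$perPage = 200;  // rows per call, in the 100-500 range suggested above
$page    = isset($_GET['page']) ? max(1, (int)$_GET['page']) : 1;
$offset  = ($page - 1) * $perPage;  // page 1 starts at row 0

$query = "SELECT col1, col2, col3, col4 FROM large_table LIMIT $offset, $perPage";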

On the front-end side you can do any of several things. You can just do simple pagination, where PHP builds each page of results and lists links to the various page numbers at the bottom.

You could have a single page that simulates this by pulling in the data via a JavaScript AJAX call to an end-point that returns each set of data, then just display it on the page as needed.

Or you could do essentially the same thing with JavaScript but in a continuous-scrolling format, where more items are appended to the end of the list as the user scrolls down.

All of these options would greatly reduce the time before the user is shown the data and would noticeably improve the experience.

To do the AJAX options you can use jQuery's ajax function (http://jqapi.com/#p=jQuery.ajax) or one of its other send/receive functions to send the data request to the server. Then just process the request in PHP and return a JSON object for jQuery to process and add to the DOM.
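As a rough sketch of that round trip (the end-point name data.php, the element id, and the field names are all invented for the example), the PHP end-point could look like:

// data.php -- return one page of rows as JSON
$rows = array();
while ($row = maxdb_fetch_assoc($result)) {
  $rows[] = $row;
}
header('Content-Type: application/json');
echo json_encode($rows);

and the jQuery side could be:

// request a page of data and append it to an existing <table id="results">
$.ajax({
  url: 'data.php',
  data: { page: 2 },
  dataType: 'json'
}).done(function (rows) {
  $.each(rows, function (i, row) {
    $('#results tbody').append(
      '<tr><td>' + row.col1 + '</td><td>' + row.col2 + '</td></tr>');
  });
});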

On a side note, you should really collect the data into an output array first and then emit the HTML in one part of the code dedicated to output, instead of mixing logic and presentation in the same place.
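For instance (a sketch of that separation, reusing the question's column names):

// logic: collect the computed rows first
$output = array();
while ($row = maxdb_fetch_assoc($result)) {
  $output[] = array(
    'col1'  => $row['col1'],
    'calc1' => $row['col1'] + $row['col2'] / $row['col3'],
  );
}

// presentation: one dedicated block that only emits HTML
foreach ($output as $r) {
  echo "<tr><td>" . $r['col1'] . "</td><td>" . $r['calc1'] . "</td></tr>";
}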

2 Comments

There is no LIMIT clause in MaxDB, which is a huge bummer.
There isn't? I thought it was touted as an enterprise-level RDBMS solution.

Both ways are going to access the same amount of data and require the same interaction with the SQL database. If you are having performance issues and want to serve the data more quickly with fewer SQL round trips, I would look into caching.
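A minimal sketch of query-result caching (APCu is used here purely as an illustration; any cache store would do, and the key and lifetime are made up):

$cacheKey = 'large_table_rows';
$rows = apcu_fetch($cacheKey, $hit);

if (!$hit) {
  // cache miss: run the expensive query once...
  $rows = array();
  $result = maxdb_query($link, "SELECT col1, col2, col3, col4 FROM large_table");
  while ($row = maxdb_fetch_assoc($result)) {
    $rows[] = $row;
  }
  // ...and keep the result for five minutes
  apcu_store($cacheKey, $rows, 300);
}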



Assuming you need to transfer all of the data anyway (for display), there could be a small efficiency saving in doing the calculations client-side.

More importantly, minimising the data generated and transferred by PHP could deliver a much bigger saving. So fire your query output straight into a JSON string (or similar), then render the full HTML, including the calculations, on the client side.
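A sketch of what that could look like (the element id and variable names are invented, and values decoded from JSON often arrive as strings, hence the Number() coercion). PHP emits the raw rows as JSON:

// PHP: hand the unformatted rows to the client
echo "<script>var data = " . json_encode($rows) . ";</script>";

and JavaScript does the calculations and builds the table:

// JavaScript: compute and render client-side
var html = '';
for (var i = 0; i < data.length; i++) {
  var r = data[i];
  var calc1 = Number(r.col1) + Number(r.col2) / Number(r.col3);
  html += '<tr><td>' + r.col1 + '</td><td>' + calc1 + '</td></tr>';
}
document.getElementById('results').innerHTML = html;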

But before doing that I'd recommend running the query without the calculations, just to see if the bottleneck is in the SQL query itself.

