
Let's say I want to lazy load comments on a blog, such that only the 10 newest ones are shown initially, and the next batch of 10 is lazy loaded on scroll.

Option A: send the entire array of results from the server on page load, and handle rendering 10 at a time client side.

Option B: load 10 on page load and send ajax requests for 10 more as needed.

Suppose the number of objects can reach 40,000 - is it OK to send such a large JSON array to the client on page load?

I see the trade-offs as: with Option A, you minimize trips to the database, but you send potentially too much data to the client (a 40,000-record JSON array is roughly 7 MB), which is not ideal for mobile users. With Option B, you may have too many trips to the database.

Is there a best practice for this? Maybe a compromise somewhere in the middle? That is, eagerly load 200 objects to the client, render only 10 initially, and render 10 more at a time until the 200 objects are used up, then lazy load the next batch from the server as needed? I wanted to see how others are addressing this.
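To make the compromise concrete, here is a rough sketch of the buffering I have in mind. The `/comments` endpoint, the `Comment` shape, and the batch sizes are placeholders of my own, not an existing API:

```typescript
// Rough sketch of the hybrid approach: keep a client-side buffer of
// pre-fetched comments, reveal 10 at a time from it, and only go back
// to the server when the buffer runs dry.
interface Comment {
  id: number;
  author: string;
  body: string;
}

const RENDER_SIZE = 10;  // comments revealed per scroll step
const FETCH_SIZE = 200;  // comments pulled from the server per request

let buffer: Comment[] = [];
let serverOffset = 0;

// Placeholder endpoint; any paginated comments API would do here.
async function fetchBatch(): Promise<void> {
  const res = await fetch(`/comments?offset=${serverOffset}&limit=${FETCH_SIZE}`);
  const batch: Comment[] = await res.json();
  buffer = buffer.concat(batch);
  serverOffset += batch.length;
}

// Called once on page load, then again whenever the user nears the bottom.
async function showMore(render: (batch: Comment[]) => void): Promise<void> {
  if (buffer.length < RENDER_SIZE) {
    await fetchBatch(); // buffer exhausted: lazy load the next batch
  }
  render(buffer.splice(0, RENDER_SIZE));
}
```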

  • You have 40,000 comments on a blog post? And what does "too many trips to the database" mean - is that an issue for you right now? Commented Sep 23, 2017 at 17:30
  • The blog post is just an example. Trips to the database are expensive operations: to have thousands of users re-querying the database for small chunks of data is not ideal when it can be handled client-side, but I'm trying to find the middle ground without loading too much upfront. Commented Sep 23, 2017 at 17:34
  • With a 7 MB JSON payload, you'll definitely freeze the browser, mobile and desktop alike. It takes a while to parse. Please try that on your computer. Commented Sep 23, 2017 at 17:57

2 Answers


I would go exactly as you suggested, with a small amortisation improvement.

First, pre-load 200 items and render them in groups of 10. When half of the loaded comments have been read, pre-load the next 200.

Depending on your task, you can go even further and dynamically calculate the size of the next batch of comments. If the user is a slow reader, decrease the batch size to 100; if they scroll quickly, increase it to 1,000.
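A minimal sketch of that heuristic, assuming you can time how long the reader takes to get through half a batch; the thresholds and bounds below are illustrative guesses, not tuned values:

```typescript
// Sketch of dynamic batch sizing: measure how fast the reader is
// consuming comments and grow or shrink the next pre-fetch accordingly.
const MIN_BATCH = 100;
const MAX_BATCH = 1000;

let batchSize = 200;
let lastFetchTime = Date.now();

// Call this whenever half of the current buffer has been read,
// right before issuing the next pre-fetch.
function nextBatchSize(): number {
  const secondsSinceLastFetch = (Date.now() - lastFetchTime) / 1000;
  lastFetchTime = Date.now();

  if (secondsSinceLastFetch > 120) {
    // Slow reader: took over two minutes to get through half a batch.
    batchSize = Math.max(MIN_BATCH, Math.floor(batchSize / 2));
  } else if (secondsSinceLastFetch < 15) {
    // Fast scroller: burned through half a batch in seconds.
    batchSize = Math.min(MAX_BATCH, batchSize * 2);
  }
  return batchSize;
}
```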



Somewhat Option B. You can make a dedicated API endpoint on your server; if you are using Node.js or a similar technology, this is very easy. You just need to fire a scroll event handler when the user is about to reach the end of the page, then ping the server with the index of the next 10 or 20 elements to get the desired data. The key here is estimating the size and type of the data you are requesting, along with the architecture of your database: if it is well engineered, you won't face performance issues even when scaled up to thousands of parallel users.
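As a rough illustration, here is what such an endpoint could look like with Express; the route name, the page-size cap, and the in-memory data are assumptions standing in for a real database query:

```typescript
import express from "express";

const app = express();

// In-memory stand-in for the comments table; a real implementation
// would run a LIMIT/OFFSET query against an indexed column instead.
const comments = Array.from({ length: 40000 }, (_, i) => ({
  id: i,
  body: `comment ${i}`,
}));

app.get("/api/comments", (req, res) => {
  const offset = Number(req.query.offset) || 0;
  // Cap the page size so one request can't pull all 40,000 rows.
  const limit = Math.min(Number(req.query.limit) || 10, 100);
  res.json(comments.slice(offset, offset + limit));
});

app.listen(3000);
```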
