
So, I have a big menu with a couple of sub-items under each menu item. Each sub-item needs JavaScript includes, and quite often there are around 20 of them. This obviously hurts performance because of all the HTTP request overhead.

My thought is the following: I'll create a merger file (in PHP) that handles all the JS includes and combines them into one big file. But I have some questions.

There are two cases.

  1. A merger file is created for each sub-item and stored in a special folder. Each sub-item then includes that file, and only that file, which is its own JS file.

Pros: It's very clear that a certain sub-item has its own JS file. Cons: A lot of files, and every time you change a JS file you have to update the merged file manually.

  2. A merger file is created on the fly for each sub-item depending on some parameters (js.php?modules=jquery-jqueryui-plugin1-plugin2); a rough sketch of such a js.php follows this list.

Pros: Easy to work with, and all changes take effect instantly since the file is generated at runtime. Cons: The processing time needed to build the merged file on every request might cancel out the savings from eliminating ~20 HTTP requests.
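
For option 2, something along these lines is what I have in mind for js.php. The js/ directory layout and the module names in the whitelist are just assumptions for illustration:

```php
<?php
// js.php: minimal sketch of the on-the-fly merger (option 2).
// The js/ directory and the whitelist entries are assumptions.

$allowed   = ['jquery', 'jqueryui', 'plugin1', 'plugin2']; // whitelist of known modules
$requested = isset($_GET['modules']) ? explode('-', $_GET['modules']) : [];

header('Content-Type: application/javascript');

foreach ($requested as $module) {
    // Serve only whitelisted names so the query string can't be abused for path traversal.
    if (in_array($module, $allowed, true)) {
        readfile(__DIR__ . '/js/' . $module . '.js');
        echo "\n;\n"; // separator so a missing semicolon in one file can't break the next
    }
}
```

The page would then need only a single tag like `<script src="js.php?modules=jquery-jqueryui-plugin1-plugin2"></script>` instead of ~20 separate includes.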

So, this is a question about performance. Thoughts?

1 Answer


You could also cache the dynamic file. For a given, unique URL you'd only generate the file once, and write it to a physical file. You could then redirect any subsequent request to that physical file.
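
A minimal sketch of that idea, assuming a cache/ directory that PHP can write to and the web server can serve; the build_merged_js() helper, the paths, and the whitelist are made up for illustration:

```php
<?php
// Sketch of the caching approach: build the merged file once per unique
// module list, write it to disk, and send later requests to the static copy.

function build_merged_js($modules)
{
    $allowed = ['jquery', 'jqueryui', 'plugin1', 'plugin2']; // example whitelist
    $out = '';
    foreach (explode('-', $modules) as $module) {
        if (in_array($module, $allowed, true)) {
            $out .= file_get_contents(__DIR__ . '/js/' . $module . '.js') . "\n;\n";
        }
    }
    return $out;
}

$modules   = isset($_GET['modules']) ? $_GET['modules'] : '';
$cacheFile = __DIR__ . '/cache/' . md5($modules) . '.js';

// Generate and store the merged file only on the first request for this module list.
if (!file_exists($cacheFile)) {
    file_put_contents($cacheFile, build_merged_js($modules));
}

// Redirect so the web server serves the physical file directly from now on.
header('Location: /cache/' . basename($cacheFile), true, 302);
```

Instead of redirecting, you could also readfile() the cached copy directly, which avoids the extra round trip but keeps PHP in the loop for every request; the redirect lets the web server take over entirely.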
