
I was thinking about creating a script that would do the following:

  1. Get all JavaScript files from the JS directory used on the server
  2. Combine all scripts into one, so the browser makes a single request instead of several
  3. Minify the combined script
  4. Cache the resulting file

Let's say that the order in which the files need to be loaded is written in a config file somewhere.

Now when I load myexamplepage.com I actually use jQuery, Backbone, MooTools, Prototype and a few other libraries, but instead of asking the server for each of these files, I call myexamplepage.com/js/getjs and what I get back is a single combined and minified JS file. That way I eliminate the extra requests to the server. And as I read on the net about speeding up websites, I found out that the more requests you make to the server, the slower your site becomes.
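To make the idea concrete, here's a rough PHP sketch of what I imagine the getjs endpoint doing (js-order.php and minify_js() are placeholders for whatever config format and minifier would actually be used):

    <?php
    // getjs.php - rough sketch of the combine/minify/cache endpoint.
    // js-order.php is assumed to contain: <?php return ['jquery.js', ...];

    $jsDir = __DIR__ . '/js';
    $order = require __DIR__ . '/js-order.php'; // filenames in load order
    $cache = __DIR__ . '/cache/combined.min.js';

    // Find the newest source file so the cache is rebuilt only when needed.
    $newest = 0;
    foreach ($order as $file) {
        $newest = max($newest, filemtime("$jsDir/$file"));
    }

    if (!is_file($cache) || filemtime($cache) < $newest) {
        $combined = '';
        foreach ($order as $file) {
            // The extra ';' guards against files that omit a trailing semicolon.
            $combined .= file_get_contents("$jsDir/$file") . ";\n";
        }
        file_put_contents($cache, minify_js($combined)); // placeholder minifier
    }

    header('Content-Type: application/javascript');
    readfile($cache);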

Since I'm pretty new to the programming world, I know that many things I think of already exist, and I don't think this is an exception.

So please list anything you know of that does exactly (or roughly) what I described. (Note that the goal is not to run a minifier or other third-party software by hand every time a script changes; you keep the original file structure and only use a helper class.)

P.S. I think the same method could be used for CSS files as well.

I'm using PHP and Apache.

  • What server-side technology are you using for your website? Commented Apr 19, 2012 at 12:25
  • possible duplicate of What do you use to minimize and compress JavaScript libraries? Commented Apr 19, 2012 at 12:28
  • @Richard Ev - at the moment PHP, but I'm thinking about the concept itself, and I would like to read about solutions for any platform and what their approach is. Commented Apr 19, 2012 at 12:35
  • @epascarello you probably didn't read the whole post, because I said that it's not about that. Commented Apr 19, 2012 at 12:36
  • Do you not understand the idea behind Make, Ant, or Maven? They control the process of building the files to push out. If you change one file, the make script takes care of it and rebuilds only the files that are affected. Doing this at runtime will lead to bad performance and unprimed caches. Commented Apr 19, 2012 at 18:03

3 Answers


Rather than having the server do this on-the-fly, I'd recommend doing it in advance: Just concatenate the scripts and run them through a non-destructive minifier, like jsmin or Google Closure Compiler in "simple" mode.
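For example, the whole build step can be a small PHP script you run whenever the sources change (the file names here are illustrative, and it assumes Closure Compiler's compiler.jar is on hand):

    <?php
    // build-js.php - concatenate in order, then minify with Closure Compiler
    // in "simple" (non-destructive) mode.
    $sources = ['js/jquery.js', 'js/backbone.js', 'js/app.js']; // load order matters

    // The ';' joiner guards against source files that omit a trailing semicolon.
    $combined = implode(";\n", array_map('file_get_contents', $sources));
    file_put_contents('build/combined.js', $combined);

    exec(sprintf(
        'java -jar compiler.jar --compilation_level SIMPLE_OPTIMIZATIONS --js %s --js_output_file %s',
        escapeshellarg('build/combined.js'),
        escapeshellarg('public/all-my-js.min.js')
    ));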

This also gives you the opportunity to put a version number on that file, and to give it a long cache life, so that users don't have to re-download it each time they come to the page. For example: Suppose the content of your page changes frequently enough that you set the cache headers on the page to say it expires every day. Naturally, your JavaScript doesn't change every day. So your page.html can include a file called all-my-js-v4.js which has a long cache life (like, a year). If you update your JavaScript, create a new all-in-one file called all-my-js-v5.js and update page.html to include that instead. The next time the user sees page.html, they'll request the updated file; but until then, they can use their cached copy.
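Since your pages are generated with PHP, the version bump in page.html can be automated too. A minimal sketch, assuming your build step writes the current number to a file called js-version.txt (an illustrative name):

    <?php
    // Emit a script tag pointing at the current versioned bundle. Because the
    // filename changes with every release, the bundle itself can be served
    // with a very long cache lifetime without anyone seeing a stale copy.
    $v = trim(file_get_contents(__DIR__ . '/js-version.txt'));
    echo '<script src="/js/all-my-js-v' . htmlspecialchars($v) . '.js"></script>';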

If you really want to do this on the fly and you're using Apache, you could use mod_pagespeed.
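Enabling it is just a couple of directives in the Apache config (a sketch, assuming the module is installed; on shared hosting that's usually not an option):

    ModPagespeed on
    ModPagespeedEnableFilters combine_javascript,rewrite_javascript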


4 Comments

I understand what you're talking about and I know about third-party minifiers and combiners, but if I decided to add another line to a JS file I would need to repeat the whole operation for all files - that takes some time, doesn't it? edit: I read a little bit about mod_pagespeed, it seems quite interesting and I will come back to it later. But it wouldn't really be the same approach, because it requires installing the module on the server, and I suppose you couldn't do that if you're sitting behind a shared server.
@VytautasButkus: "that takes some time, doesn't it?" Not in any real way. As an experiment, I just ran 8M of (largely unrelated) JavaScript files through jsmin. Took a quarter of a second. Closure Compiler was slower, 34 seconds, but that's because it does a lot more work (and gets better results). And I'm guessing you have a lot less than 8M of JavaScript. (It does the jQuery 1.7.2 file in 6.5 seconds.) This is on my dual-core 2.4GHz machine, so not a hyper-charged monster at all.
I wasn't talking about the time the software takes to compress the files; I was talking about the time you need to feed all 5 or more files into the software and then upload the results. And suppose you do daily updates - you'd need to do it every day. How much time do you waste on that per week? Suppose you could skip this process entirely.
@VytautasButkus: If you have a script that triggers the process and increments a build number, the time spent is near zero. But no, I'm not aware of any solution that makes it completely automatic, including (say) updating page.html to refer to the newest version. (If your page is created via PHP, of course you can use PHP to do that part automatically by reading the version number from whatever file your script uses.)

If you're using .NET, I can recommend Combres. It does combination and minification of JavaScript and CSS files.



I know this is an old question, but you may be interested in this project: https://github.com/OpenNTF/JavascriptAggregator

Assuming you use AMD modules for your JavaScript, this project will create highly cacheable layers on demand. It has other features you may be interested in as well.

