
I have a website that loads multiple external JavaScript files, and I would like to reduce these to improve performance.

The external scripts include Typekit, Google Analytics, the LinkedIn JS API, the Twitter API, and some others. With these third-party external JavaScript files, what is the best approach to including them without increasing HTTP requests? Keep in mind that some of these libraries, like Typekit, need to be loaded in the head tag.

I have tried downloading, concatenating, and hosting them locally, but they don't seem to play nicely with my minified JavaScript or with other vendors' JavaScript. Is there a specific approach to this problem?

I have looked into libraries like RequireJS, but I'm not sure whether they would solve this problem.

  • You can defer many of the scripts (like GA, Twitter, etc.), which would give the same "speedup" as using something like yepnope/RequireJS/whatever; see the sketch below. Commented May 18, 2015 at 21:27
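For illustration, a minimal sketch of what that deferred setup might look like in markup. The vendor URLs are the standard public ones, the Typekit kit ID is a placeholder, and whether a given vendor script tolerates `defer` still needs testing case by case:

```html
<!-- In <head>: Typekit wants to run before first paint, so it stays blocking.
     "xxxxxxx" is a placeholder for your kit ID. -->
<script src="https://use.typekit.net/xxxxxxx.js"></script>
<script>try { Typekit.load(); } catch (e) {}</script>

<!-- Analytics and social widgets don't need to block rendering:
     async fetches in parallel and executes on arrival,
     defer executes only after the document has been parsed. -->
<script async src="https://www.google-analytics.com/analytics.js"></script>
<script defer src="https://platform.twitter.com/widgets.js"></script>
```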

1 Answer


Don't aggregate them!

Actually, even if you have more HTTP requests, the content is added to the browser cache and stays there, since those files rarely change.

Moreover, even when you visit the website for the first time, the file could already be in the cache, since another website may have loaded it from the same CDN URL.

Finally, keep in mind that your own script changes each time you release a new version, and consequently you will need to evict it from the browser cache. External scripts won't change with each of your releases, so they can stay in the browser cache, which is only possible if they are not aggregated with your own scripts.
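A sketch of what this answer is advocating; the local bundle path and version parameter are made up for illustration:

```html
<!-- Third-party libraries stay on their canonical CDN URLs, so the
     browser cache survives your releases (and may even already be warm
     from another site referencing the same URL). -->
<script src="https://platform.twitter.com/widgets.js"></script>
<script src="https://platform.linkedin.com/in.js"></script>

<!-- Only your own bundle carries a version stamp; a new release evicts
     just this one file from the cache. -->
<script src="/js/app.min.js?v=42"></script>
```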


4 Comments

Good view, but overly dogmatic in something as complicated as web perf. Each external source can require its own DNS lookup, which can slow down loading more than waiting on a transfer. Gzip also tends to compress better on larger files. Most importantly, we should guard strongly against SPOFs (head scripts especially) that can stop our whole site from showing up, or leave it waiting 28 seconds for a timeout on an external CDN. In short, while I tend to agree with you, perf is just not as one-sided as you make it seem, and it requires testing and homework to max out.
Well, I agree with you: "best practices" need to be challenged by real-world perf testing. I'm still convinced that those tests must consider the most common case: a page whose visitors already have those scripts cached. That said, I agree the client cache alone may not be enough.
Hmmm. It would be interesting to use performance.getEntries() to see how many CDN scripts are indeed pre-cached when visitors arrive on a given site (see the sketch after these comments). I'm not saying it wouldn't be most, but real data on that "cache hit rate" would be interesting.
Definitely a strong argument. I have been able to concat and minify these scripts into one file, inclusive of my own and excluding only Typekit; however, it does add to the load time, which I expect is due to this new massive version not being cached.
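A rough sketch of the cache-hit probe suggested in the comments, using the Resource Timing API. Note this is an estimate, not an exact hit rate: cross-origin entries report zero sizes unless the CDN sends a Timing-Allow-Origin header, so cached and opaque responses can look alike:

```javascript
// List every script resource the page loaded and guess whether it
// came from the HTTP cache: a cached resource reports transferSize === 0.
var scripts = performance.getEntriesByType('resource')
  .filter(function (entry) { return entry.initiatorType === 'script'; });

scripts.forEach(function (entry) {
  console.log(
    entry.name,
    entry.transferSize === 0 ? 'probably cached' : 'fetched over the network'
  );
});
```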
