Update

[8/12/2010: I've posted a performance benchmark test to try and gather some numbers from the crowd-at-large on how single large concat'd files perform compared to chunking that big file into a couple of smaller files to download in parallel. I encourage you to go right now and click each of the buttons a few times, and then spread the link on to others. Concat Performance Benchmark Test #1]

By far the most prevalent question circulating right now in the wake of LABjs’ initial public launch (IPL, I guess!?) is the understandable: “Why do I need a loader like LABjs for multiple files when I can just concat everything into one file?”

Fair enough, this is an important question to address. You shouldn’t just take my implied assumption at face value; we need to dig into this so you understand why I feel so strongly that LABjs offers something better than what you are probably currently doing on your sites.

The short answer is, BOTH! You should concat files together when possible, and you should load your file(s) with a loader like LABjs. If you only do one or the other, you have missed out on the bigger picture of page-load optimization.
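Concretely, the combination looks like this: concatenate your scripts into a small number of bundles, then let the loader download them in parallel and defer dependent code until they have all executed. The toy model below captures just those chaining semantics — it is a deliberate simplification for illustration, not LABjs’s actual implementation (the bundle names are hypothetical, and fetchScript stands in for real dynamic script injection).

```javascript
// Toy model of loader chaining (NOT the real LABjs implementation).
// fetchScript(src, done) is injected so the model stays environment-neutral;
// in a browser it would perform dynamic script injection.
function makeChain(fetchScript) {
  var pending = 0;
  var waiters = [];
  var api = {
    script: function (src) {
      pending++;                 // downloads start immediately, in parallel
      fetchScript(src, function () {
        pending--;
        if (pending === 0) {     // last outstanding script finished:
          waiters.splice(0).forEach(function (fn) { fn(); });
        }
      });
      return api;                // chainable
    },
    wait: function (fn) {
      if (pending === 0) { fn(); } else { waiters.push(fn); }
      return api;
    }
  };
  return api;
}

// Usage shape (bundle names hypothetical):
//   makeChain(injectScript)
//     .script("libs.min.js")   // concatenated third-party libraries
//     .script("app.min.js")    // concatenated site code
//     .wait(function () { /* both bundles have executed here */ });
```

The point of the model: the two HTTP requests overlap in flight, but the wait() callback still runs only after every previously chained script has finished.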

YSlow/PageSpeed only value reducing HTTP requests

Before I get into specifics about why I think LABjs brings something new and beneficial to the table, let’s tackle what I consider to be an interesting but perhaps misleading “truth”. I am firmly in a minority camp in my skepticism that reduction of HTTP requests is the only answer to the question of how to load a site more efficiently — actually, to be more accurate, how to load a site’s JavaScript more efficiently.

I bet that you are recalling the YSlow rules and the tried-and-true practice that concat’ing all your JS files into one file for production is the surest way to speed up your page loads. And you’re probably doubting my credibility because I would question such a fundamental “truth” as HTTP request reduction.

It is true that, especially for the largest sites on the internet (millions of page-views per month and more), the sheer volume of HTTP requests from a typical page profile (dozens of images, multiple style sheets, half a dozen JS files, etc) quickly overloads internet connections, server load-balancers, the user’s device, and even the browser itself. And anything you can do to reduce that overload will go a long way toward helping the site load with acceptable performance.

Sites like Yahoo and Google have taught us the basic principles of HTTP request reduction as an effective way to cut down on the overall page loading delays. For instance, consider Spriting to combine images, CSS concatenation, and, yes, JavaScript concatenation. Some sites, like Digg, have even experimented with the extreme — inlining all page resources, even binary image data, into a single multi-part request! And just recently, we saw a proposal for Resource Packaging as another way to combine multiple types of data into single responses for HTTP request reduction.

If you run a site that has well over 50 million page views per month (I sure don’t!), you probably would assert that this is the easiest and most effective page-load optimization. And I’d be hard-pressed to go up against the brightest and the best at the world’s largest web properties to suggest that it’s not true, at least at their page-view volume.

What’s true is true. Period.

Here’s what I will suggest: maybe, just maybe, what’s true for Google at a hundred-million page views may not identically and exactly be true for the vast majority of us running sites in the tens-of-thousands page view realm. The “big boys” are up in rarefied air, and from there they certainly have the request volume to test, re-test, and “prove” these various rules of optimization. But in the same way that water behaves in strange, almost contradictory ways at extremes of hot or cold temperature, maybe the art of optimizing page loads isn’t as simple as “1 http request is always better than 2.”

Even for small sites (like the one you’re currently reading!), it’s still vitally important to reduce page load time and optimize for faster page views. So, I’ll readily and eagerly admit that a key strategy in that effort is for all of us to work to reduce our HTTP request loads.

OK, so you agree concat’ing is best?

Not so fast. Like I said, this issue is more complex than I think most of us are ready to admit. Too easily, we run the YSlow plugin tests on our site and, because it tells us to, reduce HTTP requests by concat’ing our JS files. Not only do our perceived page-load times decrease, but our YSlow grade goes up! Success! We’ve proved the hypothesis, there’s nothing more to say, right?

I don’t think this in and of itself means you have achieved the pinnacle of page-load performance; you’ve simply taken the first of many steps toward nirvana.

In my opinion, loading your JavaScript in an efficient and optimized way cannot be boiled down into a simple set of “rules” — you can take such “truths” as good places to start, but you still have to refine the strategies to fit your particular site and audience. Indeed, this will start to look more like an art than a science, if done correctly.

I’m not so sure…

Maybe by this point, you’re still not convinced there’s more to the story, and you’re probably starting to feel like LABjs doesn’t really have that much to offer you.

Before you go, let me just say this: even if you only ever have one JavaScript file on your site because concat’ing is the way to go for you, and you think that a single <script> tag at the bottom of your page is the ultimate JavaScript loading solution — there is still some benefit to using LABjs to load that one file.

Why? A regular <script> tag in the main HTML, even at the foot of your document, will always block the page’s loading completion, which means that users will still be “stuck” waiting for that file to complete before they can do anything with your page. Now, for some pages, you may really not want the users to do anything until the JavaScript fully processes everything, but most people agree that designing sites for progressive-enhancement is a positive thing.

So, if you can design your sites so that there’s something meaningful for the users to see or even do while they wait for the JS to kick in, they will probably be happier with your site than if you just make them wait. And that’s where LABjs comes in. Loading JavaScript dynamically unblocks the rest of the page’s resources (and DOM-ready!), freeing the user from the iron grip of the hanging “almost loaded but not yet” feeling that most sites force upon their users.

Include the small 4k of LABjs (via an inline <script> tag) in your page, use it to load your single JavaScript file, and I believe your users will see an improved user-experience on page-load, even if only slightly, as long as you’re careful about it.
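The underlying technique is simple enough to sketch. This is not LABjs itself — LABjs adds chaining, execution-order control, and cross-browser workarounds on top — just a minimal illustration of why a dynamically injected script doesn’t hold up the page the way a parser-inserted <script> tag does; the filename is hypothetical.

```javascript
// A minimal sketch of the dynamic script injection that loaders like
// LABjs are built on. A script element injected this way downloads
// without blocking the parser (or DOM-ready) the way a parser-inserted
// <script> tag does.
function loadScript(src, callback) {
  var s = document.createElement("script");
  s.src = src;
  s.async = true;       // explicitly non-blocking
  s.onload = callback;  // fires once the file has downloaded and executed
  document.head.appendChild(s);
}

// Usage (filename hypothetical): load the single concatenated bundle,
// then run any code that depends on it.
//
//   loadScript("all.min.js", function () { initPage(); });
```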

OK, so maybe there is more to this loading thing

Keep reading if you’re ready to take the Red Pill and dive deeper down the rabbit hole.


This entry was written by getify, posted on Friday, November 27, 2009 at 11:11 pm, filed under JavaScript, Performance Optimization, UI Architecture.

8 Responses to “LABjs: why not just concat?”

  • Hello. I’m not convinced by that “huge is ok” approach.
    I’m fond of the download-the-code-when-needed approach. Meaning, you click a button and the needed code is downloaded and executed. If the button isn’t clicked the code is not downloaded. The REST architecture itself praises “Code-On-Demand”.

    What’s your opinion on this?

  • getify says:

    I’m definitely a big fan of load-on-demand and load-when-needed approaches. For instance, Facebook now has a “bootstrapper” script (called “Primer”) that they load at the beginning of the page, which can handle default actions, and then they layer in additional functionality with further downloads as the page loads and gets more complicated.

    You do have to balance on-demand with potential slow-downs that can occur if it takes a second to download code when someone clicks a button to do some action. If the user can see these delays when they are interacting with elements, it can detract from the UX. So, I’m in favor of an approach that progressively downloads more behavioral JS in the background so there’s less chance that a user will have to wait on the JS.

  • You make a good point, but I’m not sure that’s a problem because we’re talking about a one-second delay (give or take, on a bad day) only on the first click. But that’s something for me to think about. Thanks for responding.

    Filipe Martins

  • getify says:

    “One second” was kind of an exaggeration. It’s been shown that users can perceive a delay if it’s over 200ms, which is quite possible if there’s a decent sized JavaScript file (or several) that need to be loaded when a user clicks a button. Of course, even a really small file could take 200ms if there’s high network latency involved. Because you can’t really guarantee someone’s connection speeds, I’d tend to assume loading files when a button is clicked could have a noticeable delay and would avoid it.

    But certainly if you are going to do this, give some sort of visual indication (like a loader/spinner) even for the short loads (so that it’s present if the load is really long) so users aren’t multiple-clicking things because they click and don’t see a response quickly enough.

  • I’m not claiming that a 200ms or 1s delay isn’t noticeable. I’m saying that it probably won’t be important because it’s a one time only thing (don’t we love the cache?).
    I do agree with you about the visual feedback, though.

  • klevak says:

    For developers with web sites that DO get over 160,000,000 page views a month, this post appears to concede that using LAB.js versus concatenating and minifying JavaScript into a single file adds little value. Considering that the majority of our visitors spend 10-15 minutes surfing our site, is it enough to rely on the browser’s caching mechanisms? Am I missing something?

  • Kevin says:

    @Klevak

    If you scroll to the beginning of the article, you’ll see this paragraph:

    The short answer is, BOTH! You should concat files together when possible, and you should load your file(s) with a loader like LABjs. If you only do one or the other, you have missed out on the bigger picture of page-load optimization.

    That way you reduce HTTP requests, and you prevent the single, large JS file from blocking the rendering of the page as it loads.

  • Dan says:

    Surely the other benchmark you need to run here for the full picture is when caching is used. I notice that you’ve deliberately disabled caching in your requests to show the actual download time between a concatenated file and your load on demand. Interestingly, my results are sometimes longer for the labJS solution.

    A more realistic scenario IMHO would be to compare any typical homepage which may have around 10 different JS files (e.g. jQuery, jQuery UI, Knockout, Modernizr, etc.) and illustrate that as one file vs. many files when caching is optimised. I think you’ll see that for all but the first request, a single concat file will be quicker the more individual files are included. There are only so many that will load in parallel.

    Where I think this way of loading really wins is only loading what you actually need. Personally, I’m less worried about how long a script takes to download (given I can optimise that) than I am about whether I actually need the script at all. There’s obviously no point spending time processing a script I’m not using. I quite like the fact that you can customize many libraries; for instance, jQuery UI lets you build a JS file containing only those parts of the UI you’re actually using on your site.
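The load-on-demand pattern debated in this thread — download at most once, ignore repeat clicks while a request is in flight, and show visual feedback meanwhile — can be sketched as a small guard around the click handler. All names here are hypothetical scaffolding: loadScript stands in for a dynamic loader (or a $LAB chain), and showSpinner/hideSpinner stand in for whatever feedback the page provides.

```javascript
// A sketch of click-triggered load-on-demand (all names hypothetical).
// The guard downloads the script at most once, ignores repeat clicks
// while the request is in flight, and shows a spinner during the load.
function makeOnDemandHandler(loadScript, showSpinner, hideSpinner, runFeature) {
  var loaded = false;
  var loading = false;
  return function onClick() {
    if (loaded) { runFeature(); return; }   // already here: no download
    if (loading) { return; }                // in flight: ignore extra clicks
    loading = true;
    showSpinner();
    loadScript("feature.js", function () {  // dynamic load, as discussed above
      loading = false;
      loaded = true;
      hideSpinner();
      runFeature();
    });
  };
}
```

Wiring it up is then a matter of passing in the page’s real loader and spinner functions and attaching the returned handler to the button.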
