
Even Faster Web Sites

Michael J. Ross writes "Slow Web page loading can discourage visitors to a site more than any other problem, regardless of how attractive or feature-rich the given site might be. Consequently, many Web developers hope to achieve faster response times using AJAX (Asynchronous JavaScript and XML), since only portion(s) of an AJAX page need to be reloaded. But for many rich Internet applications (RIAs), such potential performance gains can be lost as a result of non-optimized JavaScript, graphics, and CSS files. Steve Souders — a Web performance expert previously at Yahoo and now with Google — addresses these topics in his second book, Even Faster Web Sites: Performance Best Practices for Web Developers." Read on for the rest of Michael's review.
Even Faster Web Sites
author: Steve Souders
pages: 254
publisher: O'Reilly Media
rating: 8/10
reviewer: Michael J. Ross
ISBN: 978-0596522308
summary: Advanced techniques for improving website performance.
The book was published by O'Reilly Media on 18 June 2009, under the ISBN 978-0596522308. The publisher maintains a Web page where visitors can purchase the print and electronic versions of the book (as well as a bundle of the two), read the book online as part of the Safari library service, and check the reported errata — comprising those confirmed by the author (of which there are currently two) and the unconfirmed errors (all six of which are valid, though the fifth may be coincidental). In a break with current practice among technical publishers, there is no sample chapter available as of this writing.

In many ways, this second book is similar to Steve's previous one, High Performance Web Sites: it presents methods of enhancing the performance of websites, with a focus on client-side factors. It is fairly slender (254 pages) relative to most programming books nowadays, and the material is organized into 14 chapters. However, unlike its predecessor, Even Faster Web Sites emphasizes generally more advanced topics, such as script splitting, coupling, blocking, and chunking (which to non-developers may sound like a list of the more nefarious techniques in professional hockey). This second book also employs a team approach to authorship, with six of the chapters written by contributing authors. In his preface, Steve notes that the 14 chapters are grouped into three broad areas: JavaScript performance (Chapters 1-7), network performance (Chapters 8-12), and browser performance (Chapters 13-14). The book concludes with an appendix in which he presents his favorite tools for performance analysis, organized into four types of applications: packet sniffers, Web development tools, performance analyzers, and some miscellaneous applications.

In the first chapter, "Understanding Ajax Performance," guest author Douglas Crockford briefly describes some of the key trade-offs and principles of optimizing applications, and how JavaScript now plays a pivotal role in that equation — as websites nowadays are designed to operate increasingly like desktop programs. On pages 2 and 3, he uses some figures to illustrate fixed versus variable overhead, and the dangers of attempting to optimize the wrong portions of one's code. By the way, the so-called "axes" are not axes, or even Cartesian grid lines, but simply levels. Aside from its choppy narrative style and a pointless religious reference in the first paragraph, the material serves as a thought-provoking springboard for what follows. Chapter 2, titled "Creating Responsive Web Applications," was written by Ben Galbraith and Dion Almaer, who discuss response times, user perception of them, techniques for measuring latency, browser threads, Web Workers, Google Gears, timers, and memory issues. The material is neatly explained, although Figure 2-2 is quite confusing; moreover, neither of the figures on that page should have been made Mac- and Firefox-specific.
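The timer-based latency measurement that Galbraith and Almaer discuss can be sketched in a few lines of JavaScript. This is an illustrative sketch only — the function name, interval, and threshold below are my own choices, not code from the book: a periodic timer records how late each tick fires, and a tick that arrives much later than scheduled means something blocked the single UI thread.

```javascript
// Sketch: detect when the JavaScript thread has been blocked by measuring
// how late a periodic timer fires. Names and numbers are illustrative.
function monitorResponsiveness(intervalMs, lagThresholdMs, onLag) {
  var last = Date.now();
  return setInterval(function () {
    var now = Date.now();
    var lag = now - last - intervalMs;    // how late did this tick fire?
    if (lag > lagThresholdMs) onLag(lag); // something blocked the thread
    last = now;
  }, intervalMs);
}

// Usage: report any stretch where the thread was blocked for over 100 ms.
var monitorId = monitorResponsiveness(25, 100, function (lag) {
  console.log('UI thread blocked for roughly ' + lag + ' ms');
});
```

The same idea underlies several commercial monitoring tools; the appeal is that it measures what the user actually experiences, not what the server thinks it sent.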

In the subsequent four chapters, Steve dives into the critical topic of how to optimize the performance of JavaScript-heavy pages through better script content and organization — specifically, how and when to split up large scripts into smaller ones, how to load scripts without blocking one another or breaking dependencies within the code, and how best to inline scripts, when called for. Each of the four chapters follows an effective methodology: first, the author delineates a particular performance mistake made by even some of the most popular websites, with the statistics to back it up. He then presents one or more solutions, including any relevant tools, again with waterfall charts illustrating how well the solutions work. Lastly, he explains any browser-specific issues, oftentimes with a handy chart showing which possible method would likely be optimal for the reader's given situation, such as expected browser choices in the site's target audience. When there are potential pitfalls, Steve points them out, with helpful workarounds. He generally provides enough example source code to allow any experienced developer to implement the proposed solutions. Unfortunately, the example code does not appear to be available for download from O'Reilly's website.
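The non-blocking loading techniques in these chapters mostly revolve around creating script elements from code rather than writing blocking script tags in markup. As a rough sketch of that general "script DOM element" pattern (this is not code reproduced from the book; the helper name and the optional doc parameter are mine):

```javascript
// Sketch of the "script DOM element" approach to non-blocking loading:
// a script element created from code downloads without blocking the HTML
// parser the way a literal <script src="..."> tag in the markup does.
// The doc parameter exists only so the helper is easy to exercise outside
// a browser; in a real page you would just use the global document.
function loadScript(src, onLoad, doc) {
  doc = doc || document;
  var script = doc.createElement('script');
  script.src = src;
  if (onLoad) script.onload = onLoad;  // run dependent code only when ready
  doc.getElementsByTagName('head')[0].appendChild(script);
  return script;
}
```

The hard part, as the book stresses, is not this helper but preserving execution order among scripts that depend on one another — which is exactly where the browser-specific charts earn their keep.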

The discussion of JavaScript optimization is capped off by the seventh chapter, written by Nicholas C. Zakas, who explains variable scope within JavaScript code, the advantages of choosing local variables as much as possible, scope chain augmentation, the performance ramifications of the four major data types (literal values, variables, arrays, and objects), optimizing flow control statements, and string concatenation. He outlines what sorts of problems can cause the user's Web browser to freeze up, and the differing responses she would see depending upon her chosen browser. Nicholas concludes his chapter by explaining how to utilize timer code to force long-running scripts to yield, in order to avoid these problems. By the way, in Figures 7-2 and 7-3, the data point symbols need to be enlarged so as to be distinguishable; as it is, they are quite difficult to read. More importantly, on page 93, the sentence beginning "This makes array lookup ideal..." is either misworded or mistaken, since array lookup cannot be used for testing inclusion in ranges.
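The yielding pattern Nicholas describes can be sketched roughly as follows — the function name, batch size, and 25 ms delay are illustrative choices of mine, not the book's code: process a long array in small batches, and hand control back to the browser between batches with setTimeout so the page can repaint and respond to input.

```javascript
// Sketch of timer-based yielding: process items in small batches so the
// browser is never blocked long enough to appear frozen.
function processInChunks(items, processFn, doneFn, chunkSize) {
  chunkSize = chunkSize || 100;          // illustrative default
  var i = 0;
  (function step() {
    var end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) processFn(items[i]);
    if (i < items.length) {
      setTimeout(step, 25);              // yield before the next batch
    } else if (doneFn) {
      doneFn();
    }
  })();
}
```

The trade-off is total running time (each yield adds delay) in exchange for a page that stays responsive throughout — which is usually the right trade for anything user-facing.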

With the eighth chapter, the book shifts gears to focus on network considerations — namely, how to improve the site visitor's experience by optimizing the number of bytes that must be pushed down the wire. In "Scaling with Comet," Dylan Schiemann introduces an emerging set of techniques that Steve Souders describes as "an architecture that goes beyond Ajax to provide high-volume, low-latency communication for real-time applications such as chat and document collaboration" — specifically, by reducing the server-side resources per connection. In Chapter 9, Tony Gentilcore discusses a rather involved problem with using gzip compression — one that negatively impacts at least 15% of Internet users. Even though videos, podcasts, and other audiovisual files consume a lot of the Internet's bandwidth, images are still far more common on websites, and this alone is reason enough for Chapter 10, in which Stoyan Stefanov and Nicole Sullivan explain how to reduce the size of image files without degrading visible quality. They compare the most popular image formats, and also explain alpha transparency and the use of sprites. The only clear improvement that could be made to their presentation is on page 157, where the phrase "named /favicon.ico that sits in the web root" should instead read something like "usually named favicon.ico," since a favicon can have any filename, and can be located anywhere in a site's directory structure.

The lead author returns in Chapter 11, in which he explains how to best divide resources among multiple domains (termed "sharding"). In the subsequent chapter, "Flushing the Document Early," Steve explores the approach of utilizing chunked encoding in order to begin rendering the Web page before its full contents have been downloaded to the browser. The third and final section of the book, devoted to Web browser performance, consists of two chapters, both of whose titles neatly summarize their contents: "Using Iframes Sparingly" and "Simplifying CSS Selectors." That last chapter contains some performance tips that even some of the most experienced CSS wizards may have never heard of before. As with most of the earlier chapters, the narrative tends to be stronger than the illustrations. For instance, Figure 14-5, a multiline chart, is quite misleading, because it appears to depict three values varying over time, when actually each of the ten x-axis coordinates represents a separate major website. A bar chart would obviously have been a much better choice.

Like any first edition of a technical book, this one contains a number of errata (aside from those mentioned earlier): In Figure 1-1, "iteration" is misspelled. On page 23, in the sentence beginning "Thus, if...," the term "was" should instead read "were." In Figures 7-1 and 7-4, the "Global object" box should not contain "num2." On page 95, in the phrase "the terminal condition evaluates to true," that instead should read "false." On page 147, in the sentence beginning "However, the same icon...," the "was" should instead read "were." On page 214, "Web-Pagetest. AOL" should instead read "Web-Pagetest, then AOL," because the first sentence is one long absolute phrase (i.e., lacking a subject and finite verb).

All of these defects can be easily corrected in future printings. What will probably need to wait for a second edition are improvements to the figures that need replacement or clarification. What the publisher could rectify immediately — should the author and O'Reilly choose to do so — is making all of the example source code available for download.

Even though this book is decidedly longer than High Performance Web Sites, and has many more contributing authors, it does not appear to contain as much actionable information as its predecessor — at least for small- to medium-sized websites, which probably make up the majority of all sites on the Web. Although such technologies as Comet, Doloto, and Web Workers appear impressive, one has to wonder just how many real-world websites can justify the development and maintenance costs of implementing them, and whether their overhead could easily outweigh any possible benefits. Naturally, these are the sorts of questions that are best answered through equally hard-nosed experimentation — as exemplified by Steve Souders's admirable emphasis upon proving what techniques really work.

Fortunately, none of this detracts from the application development and optimization knowledge presented in the book. With its no-nonsense analysis of Internet performance hurdles, and balanced recommendations of the most promising solutions, Even Faster Web Sites truly delivers on its title's promise to help Web developers wring even more speed out of their websites.

Michael J. Ross is a freelance Web developer and writer.

You can purchase Even Faster Web Sites from the publisher. Slashdot welcomes readers' book reviews — to see your own review here, read the book review guidelines, then visit the submission page.


  • Slow sites lose me (Score:5, Interesting)

    by WiiVault ( 1039946 ) on Wednesday July 22, 2009 @12:54PM (#28784675)
I used to go to VGChartz all the time until a few months ago they "updated" their site with seemingly dozens of constant flash ads, talking, moving, popping up on the forums, etc. It got to the point where my Core2Duo desktop was literally pausing for 5 seconds every time I navigated to a new page. Pretty quickly I realized that the content on the site wasn't worth my time. I see this happening more and more. Sure I could have used adblock, but frankly they were asking for a price (getting attacked by ads) and I chose not to pay it (leaving the site). I was glad to see a few ads, but abuse is a sure sign you have little respect for your readers.
  • by Phroggy ( 441 ) <slashdot3&phroggy,com> on Wednesday July 22, 2009 @12:56PM (#28784715) Homepage

    It's fast enough for me.

    The CSS is horribly broken, but I have no complaints about the speed. Posting is certainly faster for me than it used to be before they switched to AJAX. It just looks like crap.

  • by intx13 ( 808988 ) on Wednesday July 22, 2009 @01:00PM (#28784777) Homepage
In my opinion, tuning a JavaScript-laden website for speed is an exercise in futility. The Web's great difference is user-extensibility; practically anyone can throw together some HTML and make a website. It may be a MySpace page, but hey, it's something, and it contributes to the culture of the Internet. Speed, however, is not a feature of the HTML/JavaScript/CSS world.

    Were I to be developing a new AJAX-driven Web application I would focus first on simplicity. I feel that if you have so much AJAX stuff going on that you need to resort to crazy tricks you have already lost. Take the following, quoted from the review:

    In the subsequent chapter, "Flushing the Document Early," Steve explores the approach of utilizing chunked encoding in order to begin rendering the Web page before its full contents have been downloaded to the browser.

    While good advice, something like this should be implemented as a natural part of the specification, or not at all. This rings to me as an attempt to manhandle HTML/Javascript/CSS into a use case for which it is not intended.

I want to see a real protocol for webpages - something between Postscript (except less document-oriented; and yes I know about NeXT's work) and a windowing environment (except more constrained). Then, to preserve the ease of user input, a simple HTML/XML-like layer. For 99% of the sites that are constructed directly by the user, original HTML with italics, colors, fonts, etc. is sufficient. For projects beyond the scope of Joe Facebook, a true system is needed that allows separation of design and content. But all attempts thus far to do both, frankly, suck.

  • "Preview Post" lag (Score:5, Interesting)

    by electrosoccertux ( 874415 ) on Wednesday July 22, 2009 @01:03PM (#28784827)

    The only part that bothers me is the "Preview Post" lag lasting for 20-35 seconds. I love everything else about the navigation on the site, though.

  • Re:No... (Score:3, Interesting)

    by mcgrew ( 92797 ) on Wednesday July 22, 2009 @01:13PM (#28785015) Homepage Journal

I'd say the greatest barrier to website use is advertising, which isn't information; it's useless data. As to the book, AJAXing your site makes it slow. If you want a fast site, don't put anything on any page that doesn't absolutely need to be there, and have fast pipes and enough fast servers.

If I see a really fancy layout on a web site, I always wonder why its developer thought the content was so bad that they needed to disguise it with slickness.

    bing comes to mind. I've tried it quite a few times, and after seeing the "information overload" ad I finally got it - it doesn't index as many pages as Google and its results aren't as relevant. If you have an inferior product, making your product pretty and marketing the hell out of it is your only chance to obtain any market share.

Google could indeed be improved on, and there will indeed be a better search engine eventually, but bing ain't it. Google is an excellent example of good web design.

    BTW, shouldn't that be TLTR instead of TLDR? Oh, too long DIDN'T read. Gotcha. You wouldn't like most of my journals, or anybody's books.

  • Re:AJAX (Score:3, Interesting)

    by Chabo ( 880571 ) on Wednesday July 22, 2009 @01:35PM (#28785337) Homepage Journal

    Conversely, some of the fast websites use basic TEXT and skimp on the graphics.

    As I said in another thread the other day: Whether or not you like his writing, I think Maddox hit the peak of usable web design: dark background, with large-font bright text. It's the easiest webpage on the internet to read, and despite having some graphics, it loads very quickly because he uses the graphics as actual content, not just filler.

  • by commodore64_love ( 1445365 ) on Wednesday July 22, 2009 @01:42PM (#28785419) Journal

    >>>>>begin rendering the Web page before its full contents have been downloaded to the browser.

    >>This rings to me as an attempt to manhandle HTML/Javascript/CSS into a use case for which it is not intended.

    I disagree. Today's websites don't do it, but in the simpler 1990s era of pure HTML, the website DID render before completing download. The browser was expected to grab the HTML first, render the page with "X" placeholders, and then download the images last. That way the user could read the website even with the image only partially present.

    So yes prerendering the webpage before download was completed *was* the original intent of the web. It is only lately that webpages have shifted away from that, and I for one would like to see them restore it.

Not to mention on dialup all the extra offsite loading is uberslow! Not everybody in the USA can get broadband, you know, and sadly I have members of my family trapped on the evil 28k dialup. It is getting to the point that even with ABP and NoScript page loading is God awful. And whatever they did to FF3 with the "upgrade" to 3.5 makes it completely unusable for dialup. I am gonna have to bring out my family FF3.0.1.2 and hope that they fix the problems with the 3.5 branch before 3.0 is abandoned in Jan, otherwise it is Kmeleon and Opera for them.

You know, there really ought to be code added to detect dialup like they do IE. Something like "If speed = 64k then offsite linking = 0" or something. Because until/unless Obama rolls out nationwide broadband, many like my mom will never ever see it. Nearly 40 homes are on the 1 mile stretch she lives on, and she can see the end of the cable from her front door. It was a block away when she and dad built the house 29 years ago, with a "we plan to be out to run line that way in about six months." Well, 29 years later, it has moved exactly 0 inches from where it was. So until we actually get nationwide broadband like most western countries, it would be nice if the websites didn't bloat the shit out of everything with offsite linking. Sorry about the rant, but with Internet access becoming more and more of a requirement to get anything done, it really sucks how websites are bloating the living hell out of their code expecting everyone to have high speed.

  • by WebmasterNeal ( 1163683 ) on Wednesday July 22, 2009 @04:56PM (#28788531) Homepage
My personal website has been dying a slow death, not because of the fancy graphics I use on it but because I'm stuck on shared hosting with 100+ other websites (1and1). That alone can make the difference. Downloading 100K to 300K isn't a big deal nowadays.

Also, for instance, try tallying up all of Facebook's JS, CSS, and HTML files, and you'll discover the site clocks in at over 1MB. Now consider how fast the pages on the site load (super fast). See the difference that how your site is hosted can make?
