Facebook: Sea Cow of the Internet

Update: Schrep (I’m far back right, he’s in the middle…note the ripped shirt, sorry Schrep!) was kind enough to point out that, as of bug 423377 being resolved, Firefox 3 defaults to 6 simultaneous connections.  Modern browsers all use different numbers, the lowest being IE 7 with 2 (all older browsers also use 2).

One of my projects here at Mozilla (and, coincidentally, a past project at Yahoo!) was improving YSlow scores.  YSlow is a utility that measures load time and analyzes page performance, assigning you a final letter grade based on various performance metrics.  It's a neat little Firebug plugin, and I highly suggest that any web developer install it.

Occasionally I like to play with this tool on other big sites, just to see how many of them actually care about such things.  So I went through and ran YSlow on some of the more common Facebook pages.  Here's what I found:

In most areas Facebook does a decent job: with the exception of advertiser scripts and some application-specific code, they use ETags, minify their JS, and set long Expires headers.  What amazed me was the number of JS and CSS files on each page, all listed one after another in the header:

Page                  | JS Files | CSS Files | CSS BG Images
Homepage (logged out) |        5 |         5 |            14
home.php              |       23 |        24 |            32
profile.php           |       26 |        21 |            36
photo_search.php      |       11 |         7 |            23
photo.php             |       18 |         8 |            26
friends               |       15 |        13 |            34

And for those curious, here are the counts for the new Facebook design (summary: significantly worse):

Page        | JS Files | CSS Files | CSS BG Images
home.php    |       27 |        24 |            60
profile.php |       45 |        13 |            67
photo.php   |       25 |         9 |            46
friends     |       26 |        19 |            57

Really, I can't think of any context in which 47 external files on a single page would be necessary! I understand breaking files up by purpose to make coding and revision management easier, but I wonder whether anyone at any point considered the speed impact. I'm fortunate enough to almost always have a broadband connection, but the experience for their dial-up users is probably deplorable. Especially considering that they now localize the site and are pushing to expand overseas, you would think this would be a much higher priority. And don't even get me started on the lack of spriting!

Here's how I set up Mozilla's JS/CSS concatenation (see the Build Process); a rough sketch follows the list:

  1. Add a configuration setting for site state (example: a flag set to “production” on production servers, “dev” on everything else)
  2. All CSS/JS calls check this flag to decide whether they point to the concatenated files or the individual development files
  3. Create a build script that generates the concatenated files (profile.js, photo.js, etc), run before pushing to production
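To make that concrete, here's a minimal sketch of the idea in Python. The file names, bundle layout, and SITE_STATE flag are all hypothetical, not Mozilla's (or Facebook's) actual build code:

```python
# Illustrative sketch only: paths, bundle names, and the SITE_STATE flag are
# made up; they are not Mozilla's (or Facebook's) real configuration.
import pathlib

SITE_STATE = "dev"  # step 1: "production" on production servers, "dev" elsewhere

# Each bundle maps one concatenated output file to its ordered source files.
BUNDLES = {
    "build/profile.js": ["js/core.js", "js/profile.js", "js/photos.js"],
    "build/profile.css": ["css/base.css", "css/profile.css"],
}

def asset_files(bundle):
    """Step 2: pages ask for a bundle and get back either the single
    concatenated file (production) or the individual development files."""
    if SITE_STATE == "production":
        return [bundle]
    return BUNDLES[bundle]

def build():
    """Step 3: run before pushing to production to generate the bundles."""
    for output, sources in BUNDLES.items():
        out = pathlib.Path(output)
        out.parent.mkdir(parents=True, exist_ok=True)
        # Simple concatenation; minification could be added at this point.
        out.write_text("\n".join(pathlib.Path(s).read_text() for s in sources))

if __name__ == "__main__":
    build()
```

The nice part of this approach is that developers keep editing small, purpose-specific files, while production pages only ever reference one JS and one CSS file per page type.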

Ironically enough, I would bet Facebook already has #1 and #2 set up, since they use Akamai for their production servers and can't use it for development.

Alternatively, they could use the method YUI uses for serving JS files: call a script that returns the concatenated files on the fly.  It's a less elegant solution, and it's heavier on the server, but it's still better than nothing.
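As a very rough sketch of what such a combo-style handler might look like (the endpoint, query parameter, and directory layout are assumptions for illustration, not YUI's or Facebook's actual code):

```python
# Hypothetical combo handler: serves the concatenation of the requested files.
# The /combo?files=... convention and the "static" directory are made up here.
import pathlib
from urllib.parse import parse_qs
from wsgiref.simple_server import make_server

STATIC_ROOT = pathlib.Path("static").resolve()

def combo_app(environ, start_response):
    # e.g. GET /combo?files=js/core.js,js/profile.js
    params = parse_qs(environ.get("QUERY_STRING", ""))
    requested = params.get("files", [""])[0].split(",")

    parts = []
    for name in requested:
        path = (STATIC_ROOT / name).resolve()
        # Skip missing files and anything that escapes the static directory.
        if STATIC_ROOT not in path.parents or not path.is_file():
            continue
        parts.append(path.read_text())

    payload = "\n".join(parts).encode("utf-8")
    start_response("200 OK", [
        ("Content-Type", "application/javascript"),
        ("Content-Length", str(len(payload))),
        # The URL encodes the file list, so the response can be cached hard.
        ("Cache-Control", "public, max-age=31536000"),
    ])
    return [payload]

if __name__ == "__main__":
    make_server("", 8000, combo_app).serve_forever()
```

The trade-off is exactly the one mentioned above: every cache miss costs server CPU to stitch the files together, instead of doing that work once at build time.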

Note that this doesn't only affect dial-up users.  Broadband users have enough bandwidth that raw transfer time usually isn't their bottleneck; for them, the sheer file count is the biggest slowdown, because of the 2-simultaneous-connection limit that most browsers obey.  From RFC 2068:

Clients that use persistent connections SHOULD limit the number of simultaneous connections that they maintain to a given server. A single-user client SHOULD maintain AT MOST 2 connections with any server or proxy.
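To put rough numbers on that limit (the 100ms round-trip time is an assumption for illustration, not a measurement):

```python
import math

def request_waves_latency(num_files, connections, rtt_seconds=0.1):
    """Crude lower bound on request latency: files are fetched in waves of
    `connections` at a time, and each wave costs at least one round trip.
    Ignores bandwidth, pipelining, and parsing time entirely."""
    return math.ceil(num_files / connections) * rtt_seconds

# Old home.php: 23 JS + 24 CSS = 47 external files
print(request_waves_latency(47, connections=2))  # ~2.4s of round trips
print(request_waves_latency(47, connections=6))  # ~0.8s with Firefox 3's limit
print(request_waves_latency(2, connections=2))   # ~0.1s if everything were bundled
```

Even on a fast pipe, those round trips add up; bundling the files removes most of them outright.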

Facebook.com is the 8th most popular site on the web, while Mozilla.org is the 258th (note that this is all of mozilla.org, not just addons.mozilla.org).  They should be able to devote a lot more attention to the tail end of their users, especially considering the residual benefits for their main audience.

Just as bad is their lack of a proper fallback for users with JS disabled. For example, with JS disabled you can still log in fine, but once you're logged in, head back to facebook.com. That's right: they use JS to redirect users from their homepage, with no <noscript> fallback, meaning the average Joe with JS disabled can easily lock himself out of Facebook.  In addition, pretty much every new feature added since poking lacks a non-JS fallback: status updates, "People you may know", dropdowns, the entire "Friends" page, and so on.  All completely useless.

9 Responses to “Facebook: Sea Cow of the Internet”

  1. Jesse Farmer says:

    Conclusion: Issues like page-load time and JavaScript fallback do not matter for most Facebook users.

  2. Oh, the huge manatee!

  3. Jamie MacKinnon says:

    Have you had a chance to test it with the new (www.new.facebook.com) layout yet?

  4. Brian says:

    Jamie: yes, my second table is for the new site…it’s worse!

  5. schrep says:

    Just FYI we upped the max connection limit for Firefox 3 from 2 to 6 (https://bugzilla.mozilla.org/show_bug.cgi?id=423377)

  6. monk.e.boy says:

    Average Joe turns off his javascript ROFL! Have you ever met a 'real' person? They are still using IE6 dude, they aren't turning off JS and expecting Facebook to still work. Oh, man, you are so out of touch with the real world ;-) ivory towers etc…

    BTW my friends fall into 2 camps, those who have myspace and love it and those who put up with facebook because it is seen as more nerdy (and clever)…. none of them install new browsers.

    monk.e.boy

  7. Brian says:

    I know a decent chunk of people who use NoScript and only enable JS on pages when absolutely necessary. I also know people who like to browse full websites (not mobile versions) from cell phones. And let's not forget about the accessibility angle: screen readers don't like JS. Most use cases for disabled JS are fine for Facebook to ignore, but they're basically saying "our site doesn't work with screen readers, sorry blind people".

    I admit that it's a very small tail end of their audience, but it really wouldn't take a whole lot of effort to fix the problem, and accessibility is something that shouldn't be ignored.

  8. So I think a part of the huge number of files for the “new” Facebook has to do with the fact that they aren’t done developing it yet.

    Also, as Jesse pointed out, it's just not important to the majority of Facebook users. Most are going to have JS enabled, and most are going to be on some sort of broadband. For the most part, those files are a one-time hit: they're all on a CDN, so they likely have the headers needed for them to be cached.

    So while I agree that they could do a lot to improve there, I don’t think it’s a super necessity. As you point out, they aren’t trying to please everybody. So the majority of users are happy and the tail-end you talk about doesn’t matter as much to them.

    Do the same analysis on MySpace and see what happens there. I bet the results are similar.

  9. [...] – bookmarked by 5 members originally found by mrlittlebig on 2008-10-07 Planet Mozilla Interns: Brian Krausz: Facebook: Sea Cow of the… http://nerdlife.net/?p=76 – bookmarked by 3 members originally found by thorstenrehm on 2008-10-05 [...]
