

Fat websites

Scales always mean overweight!

I'm not one to make fun of the overweight and obese (being a member of the group and all), but… while I try to avoid mere cosmetic judging, I think pointing out negative consequences of obesity is sometimes OK. So, that's what I'm here today to do: talk about problem obesity… Your website is just too fat.

If you have a website, here's some homework: open up Firefox or Chrome and go to your page's URL. Press Ctrl+Shift+I to show the developer tools, then click on the "Network" tab. Finally, hold down Shift, click the browser's reload button, and be amazed.

How many requests did it make? How many megabytes did it download? How long did it take? Did it ever stop loading stuff? Try reloading the page again and see how long it takes to be able to smoothly scroll down on the page. At any time, did it move, flash, or re-flow the thing you were trying to look at? Would you proudly show that network report to your boss? Or your mother?

Think of someone on an old computer. Think of someone on a slow connection. Think of someone on a data plan. Think of the children!

The first websites were all hand-coded HTML, with sparing use of images, and Javascript and CSS hadn't even been invented yet. So even though the internet was slow as dirt in 1995, the majority of websites loaded in a few seconds.

Flash forward to the present. Random websites now take dozens of seconds to fully load, preventing you from scrolling to the content you want to see, and moving your page elements around until that last useless giant picture, annoying advertisement, or social media tie-in loads completely. And then there's usually more updating and downloading when you scroll or accidentally mouse-over the wrong thing.

An example: If I go to The Mary Sue (a website I visit often so I'll know what my geek-culture friends are talking about) in Chrome, the initial page takes nearly 30 seconds to fully load, and weighs 7MB. That's pretty huge, especially if you consider that coming across on someone's data plan. And while it's updating, I can't scroll down without the page stuttering and rearranging things.

I'm going to use it as my example here, but it's not a uniquely bad example. Most sites are similar. Cnn.com has similar (or worse) stats.

One of my favorite articles on this phenomenon is The Website Obesity Crisis. The best quote from it is:

I want to share with you my simple two-step secret to improving the performance of any website. 1) Make sure that the most important elements of the page download and render first. 2) Stop there.

Really, it does not need to be more complicated than that. I know advertising is what pays for websites, and the ads are often out of the website creator's hands, but even ignoring advertising, there are dozens of seconds and tens of megabytes of improvements to be made.


Images

Obviously images are a part of the modern web, and I feature many images here. But images can be one of the biggest reasons a website is slow to download and render. A few rules will help:

1) Don't use unnecessary images.

Is the image part of the content? Does the image add substantially to the page's meaning? If not, do you really need to add it?

Don't use "hero images" (giant images at the top of your article): no one is impressed, and they just annoy people, who must wait for them to download and then scroll past them to read your content. A few small non-content images to help "theme" your article, or to give Facebook something to display alongside your link, are fine, but those non-content images can and should be small: 50KB tops, and 10KB or less would be better.

2) Resize images to the size that they will be displayed in the browser.

Don't use a 1024x768 image when the browser will end up resizing it to 500px wide. Just resize it to the size you need (or close to it) ahead of time and your images will be much smaller and there'll be rainbows and kittens and peace in all the land.

3) Use JPEG for images that are photographs or drawings.

Use JPEG format for photographs and hand-drawn images. Set the quality to between 75 and 90 and no one will ever notice a difference in quality, but the average photo will be 3 to 10 times smaller than a PNG or GIF.

4) Use PNG for images that are mostly text, graphs, or line drawings.

JPEG's quality loss is most obvious in images with text or line drawings, so use PNG for those. PNG also supports transparency. You can use programs like "optipng" or "pngcrush" to make your PNGs even smaller; if you don't have them, use the highest compression available in your image editor of choice.

Our test site The Mary Sue fails hard here. For example, the images across the top in the "featured posts" section are pretty big (the first one is actually 1184x693px), but are displayed as tiny 140x92px images. That's over 8 times larger in each dimension, which works out to roughly 60 times more pixels (and bytes) than needed. Most of the images are like that, but at least they're mostly using JPEG for photos!
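To put a number on that waste yourself, here's a back-of-the-envelope Python sketch. It assumes compressed file size scales roughly with pixel count, which is only an approximation (real savings vary with content and format):

```python
def oversize_factor(orig_w, orig_h, disp_w, disp_h):
    """How many times more pixels the served image has than the
    browser actually displays. Compressed file size roughly tracks
    pixel count, so this also approximates the wasted bytes."""
    return (orig_w * orig_h) / (disp_w * disp_h)

# The "featured posts" thumbnail: served at 1184x693, shown at 140x92
factor = oversize_factor(1184, 693, 140, 92)
print(f"~{factor:.0f}x more pixels than needed")  # ~64x
```

Run it with your own image dimensions before uploading; anything much above 1x means you should resize the file first.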


Videos

Unless the video is the page's main content, do not enable autoplay. Don't make things on the sidebar autoplay. Don't make things in the header autoplay. It only annoys people and distracts them from your content.

Here's a simple flowchart:

Are you YouTube? → No → Don't make any videos autoplay.


Javascript

Many, if not most, modern sites are Javascript-crazy. In some places it makes sense: Facebook is 99% Javascript, but Facebook is incredibly dynamic and needs to update user content in real time, and they've done years of optimization, so the site actually feels fairly swift. Most sites? Not so much.

To go back to The Mary Sue: it is basically a static blog that adds new articles several times a day. The main page could be regenerated statically whenever a new article is published; it doesn't need to be more dynamic than that. Instead, on loading, the main page keeps downloading things, changing, and moving things around for nearly 30 seconds! That's really crazy! Why does it do that?

Well, a quick check of the source shows that it contains 57 <script> tags. 57? Fifty-motherfucking-seven. Seven of them are direct references to external .js files, and the other 50 are inline scripts. But, of those 50 inline scripts, at least 10 of them end up calling additional external scripts on other websites, so I have no idea how much Javascript there really is. So that's at least 17 or 18 calls to web servers far and wide to bring back pieces of Javascript of which the majority, I guarantee you, are not making your experience any better.

If I remove all the <script> tags and their content from the page source, the page size drops from 107,034 characters to 47,946: 55% smaller. The .html file is more than half script!
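Counting these yourself is easy with Python's standard-library HTML parser. Here's a small sketch that splits a page's script tags into external (with a src attribute) and inline, using a stand-in snippet; point it at a saved copy of any real page's source instead:

```python
from html.parser import HTMLParser

class ScriptCounter(HTMLParser):
    """Count <script> tags, split into external (src=...) and inline."""
    def __init__(self):
        super().__init__()
        self.external = 0
        self.inline = 0

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            if dict(attrs).get("src"):
                self.external += 1
            else:
                self.inline += 1

# A tiny stand-in page; feed it the real page source instead.
page = """<html><head>
<script src="https://example.com/ads.js"></script>
<script>var x = 1;</script>
<script>document.write("hi");</script>
</head></html>"""

counter = ScriptCounter()
counter.feed(page)
print(counter.external, "external,", counter.inline, "inline")  # 1 external, 2 inline
```

Note that this only counts what's in the HTML itself; scripts that load further scripts (as at least 10 of The Mary Sue's inline ones do) won't show up until runtime.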

Though I have not classified them individually, a good percentage of these scripts seem to be related to advertising; the rest deal with WordPress, styling, comments, etc. Again, I get that most of the time the page developer doesn't choose what kind of Javascript cruft the ads use, but someone does, and Jeezy Creezy, it makes for a slow page load and an annoying viewing experience.

A good rule of thumb: Javascript is supposed to make your website better, not worse. Your visitors are there to view your content: your text, your images, and your videos. This is true even if your website is an app. So ask yourself: if you took out all that Javascript, would the user's experience worsen, or improve?

When I disable Javascript on The Mary Sue, I can still read all the content and see the images just fine, but the page loads and renders completely in 5-10 seconds instead of 20 or 30, so maybe all that scripting isn't really necessary? The videos don't work, of course, but if there were a way to enable just the Javascript for the media players, that'd be great. Maybe make a few static advertisements to go on the side and call it a day.


CSS

CSS styling can be pretty cool. It's a nice way to separate style from content, as the hype goes, and it makes it much simpler to restyle a page or add fancy effects. But many a modern website completely abuses CSS: multiple files, full of cruft and questionable decisions.

The source of The Mary Sue shows it references 11 separate stylesheets. Eleven? Eleven. That's not including any styling brought in by Javascript.

That's (at least) 11 more calls to servers to bring in those oh-so-important bits of style. Unlike with Javascript, I don't know of a way to easily disable the loading of CSS files, so I can't say for sure that they're useless, but I'm fairly certain I could get it down to one 20K CSS file in an afternoon or two.
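Most of that afternoon's work is just concatenating the sheets and stripping the junk. Here's a crude Python sketch of the idea (real minifiers like cssnano or csso are far more careful; this one just drops comments and squeezes whitespace):

```python
import re

def naive_minify(css):
    """A crude CSS shrinker: drop comments, collapse whitespace,
    tighten spacing around punctuation. Not safe for every edge
    case -- use a real minifier for production."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # drop /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten around punctuation
    return css.strip()

# Two hypothetical stylesheets standing in for a site's 11 separate files
sheets = [
    "body { color: #333;  margin: 0; }  /* base */",
    "h1   { font-size: 2em; }",
]
combined = naive_minify("\n".join(sheets))
print(combined)  # body{color:#333;margin:0;}h1{font-size:2em;}
```

One combined file also means one HTTP request instead of eleven, which matters as much as the byte count.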

From all over

To view that one page on The Mary Sue, Chrome made over 370 requests, took 25+ seconds, and then kept making more requests every few seconds. Many of those requests were to websites other than www.themarysue.com. Cnn.com took over 500 requests and 20-some seconds before it'd let me scroll down more than a little bit.

Back in 1994 or so, when I made my first website, I wrote all the HTML by hand, because it was fun and there really wasn't any other way yet. I discovered CGI and used it to generate HTML, then wrote my own content-management programs. Eventually I decided to try things like WordPress and Drupal, but they were always so cumbersome and slow on the low-end webservers I was willing to pay for (and most of the slow websites I investigate are WordPress-based). So after a few years of that, I went back to making my own very lightweight designs.

Now, my main page, at this instant, weighs 339KB and finishes loading completely in less than 2 seconds, usually less than 1. If it weren't on a shared server, this would be a consistently sub-one-second site. The main page has "only" 16 images, but even if I added teasers and (optimized) featured-article images all over the place, it'd only get a few fractions of a second slower. Granted, I have no advertising at the moment, but surely this hints at how much room for improvement there is.

About your website… I think it's time for a few lifestyle changes…