Friday, June 22, 2012

Is there a 'Moore's Law' for web pages?

I came across an article I wrote in 1999 for The Guardian entitled Cut your modem's flying time, which mentions that at the time the HTML of The Guardian's home page was 18kB.  Today the home page is more like 250kB.

I was curious about the growth pattern, so using the Internet Archive I downloaded the home page of The Guardian for every year available from 1996 to 2011 (plus the current page) and compared the sizes of the HTML of the front page.  Here's the raw data (a sketch of how the collection could be automated follows the table):

  Year  Bytes
  ----  -----
  1996   5381
  1997  11140
  1998  10435
  1999  39013
  2000  97746
  2001  70933
  2002  92995
  2003  81833
  2004  92637
  2005  92078
  2006 108445
  2007 118300
  2008 186670
  2009 184271
  2010 181221
  2011 192592
  2012 253748
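
As an aside, collecting those numbers can mostly be automated.  Here's a rough sketch (in Python) using the Wayback Machine's availability API; the domain and the snapshots it happens to pick are assumptions on my part, and the Wayback Machine injects some of its own markup into archived pages, so the sizes it reports won't exactly match the hand-gathered table above.

  # Sketch: for each year, ask the Wayback Machine for the snapshot of the
  # Guardian home page closest to January 1st and report the size of the
  # raw HTML it serves.  Archived pages include markup injected by the
  # Wayback Machine itself, so the numbers will run a little high.
  import json
  import urllib.request

  def snapshot_size(url, year):
      api = ("https://archive.org/wayback/available?url=%s&timestamp=%d0101"
             % (url, year))
      with urllib.request.urlopen(api) as r:
          info = json.load(r)
      closest = info.get("archived_snapshots", {}).get("closest")
      if not closest:
          return None
      with urllib.request.urlopen(closest["url"]) as page:
          return len(page.read())

  for year in range(1996, 2013):
      print(year, snapshot_size("guardian.co.uk", year))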

This excludes anything other than the raw HTML of / on The Guardian.  Clearly it's grown a lot, but curious about the pattern I decided to see what curve would fit using Wolfram Alpha.  Linear, quadratic and cubic fits all had an R^2 of about 0.90, but an exponential fit had an R^2 of 0.97.

The exponential that generates that curve is 28985.6 * 1.134292^x (where x is the year, counting 1996 as 0).  For comparison, Moore's Law is n * 1.414213^x, which doubles every two years (I don't have an estimate for n).
For that exponential, doubling takes a bit more than five years (log 2 / log 1.134292 ≈ 5.5).
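
For anyone who wants to check the fits without Wolfram Alpha, here's a rough sketch in Python using the byte counts from the table above.  The fitted constants and R^2 values should come out close to, though not necessarily identical to, the figures quoted (a different least-squares routine won't match Wolfram Alpha exactly).

  # Sketch: compare polynomial and exponential fits to the byte counts above.
  # R^2 is computed as 1 - SS_residual / SS_total for each model.
  import numpy as np
  from scipy.optimize import curve_fit

  years = np.arange(1996, 2013)
  sizes = np.array([5381, 11140, 10435, 39013, 97746, 70933, 92995, 81833,
                    92637, 92078, 108445, 118300, 186670, 184271, 181221,
                    192592, 253748], dtype=float)
  x = years - 1996  # count 1996 as year 0, as in the text

  def r_squared(y, fitted):
      ss_res = np.sum((y - fitted) ** 2)
      ss_tot = np.sum((y - y.mean()) ** 2)
      return 1 - ss_res / ss_tot

  # Linear, quadratic and cubic least-squares fits.
  for degree in (1, 2, 3):
      coeffs = np.polyfit(x, sizes, degree)
      print("degree %d: R^2 = %.3f"
            % (degree, r_squared(sizes, np.polyval(coeffs, x))))

  # Exponential fit of the form a * b^x.
  def exponential(x, a, b):
      return a * b ** x

  (a, b), _ = curve_fit(exponential, x, sizes, p0=(30000.0, 1.1))
  print("exponential: %.1f * %.6f^x, R^2 = %.3f"
        % (a, b, r_squared(sizes, exponential(x, a, b))))
  print("doubling time: %.1f years" % (np.log(2) / np.log(b)))
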
I wonder if there's a 'Moore's Law' for web sites.  Are we seeing exponential growth in the HTML used?  And what happens if we take into account the other assets?  And what's the relationship between this and bandwidth consumed on the Internet?


doranchak said...

Interesting! Your plot seems to agree with Figure 1 in this study that includes the top 1000 web sites from 1995 to 2007:

Average Web Page Size Septuples Since 2003

Manuel Strehl said...

The trend is towards doing more declaratively: Microdata, the data-* attributes, the video element, new input types and so on are clear signs.

I assume that with more and more HTML5 features coming into play, the HTML for a piece of content will naturally increase.

An important metric, however, would be whether, and by how much, the net content of the page has increased. Are there more teasers, more copy text, and so on than in 1996?

doranchak said...

Too many people design web sites that look like this.

Chris Smith said...

Considering that most pages have at best 1-10KiB of worthy textual content, I don't know how they've managed to produce such a crapfest.

Liam Clancy (metafeather) said...

You can get a lot more historical data from the HTTP Archive Project, which is doing similar analysis:

Kevin Day said...

You're clearly overfitting the data with the exponential fit. Just because it has a higher R^2 doesn't prove anything. Linear is probably the only one that would make sense for that data set.

Matt Doar (CustomWare) said...

I would expect to see the byte count decrease in the future as such sites move towards client-side content. For example, is a single small page that is populated by client-side JavaScript calls. Measuring the size of an HTML file may not reflect the growth in complexity.