Molly Holzschlag has posted an interesting comparison of web standards as used by popular search engines. Of particular interest were the comments regarding Google, whose site does not validate, uses tables for layout, and generally violates the sensibilities of hard-core standards proponents. The following comment on Molly’s post, by "Frank," says it all:
Google’s search results page has been retooled using semantic markup and CSS by at least two dozen people as it is, but they just don’t care. It’s annoying, it’s sad, and it’s also pointless.
There have been some retooling jobs that saved an awful lot of markup and would thus, as a result, save Google ridiculous amounts of bandwidth, but did they show any interest? Nope.
This flies in the face of what I’ve heard about Google: namely, that they make every effort to optimize their output, not for file size but for speed. Changes that increase their output time are rolled back. (I imagine gzipping their output is an exception to that, but given their traffic volume, the savings are likely significant.) Take a look at their source code and several things are worth noting. First, it’s quite compact; there is little extraneous white space, and shortened versions of tags are used frequently. Second, style information is embedded in each page rather than linked separately. Why? I suspect they found it’s faster to simply output their (brief) style sheet than to include a link and have to process an additional web request, which adds load on their servers (TCP is not exactly a lightweight protocol) and delays the browser’s ability to display content.
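For the curious, here’s a rough sketch of that trade-off. This is my own illustrative markup, not Google’s actual code; the rules and class names are invented for the example:

    <!-- Linked stylesheet: the browser has to make a second HTTP
         request and wait for the response before it can apply styles. -->
    <link rel="stylesheet" type="text/css" href="/style.css">

    <!-- Embedded stylesheet: the rules arrive with the page itself,
         so there is no extra round trip. -->
    <style type="text/css">
      body{font-family:arial,sans-serif}
      .result{margin:1em 0}
    </style>

For a short stylesheet, the embedded version costs only a few hundred extra bytes per page, while the linked version costs an entire request/response cycle on first view.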
I made a similar standards argument during my recent work with a major internet retailer. Not only did they use table-based layout (frequently wrapping single-cell tables around block-level elements), they unabashedly used spacer gifs. I just about screamed when I saw it, but I wasn’t there as a web programmer, so there wasn’t much I could do about it. I did, however, write a brief paper outlining the performance and bandwidth savings. While processor performance likely wouldn’t be affected much (the limiting factor being the network connection), I estimated bandwidth savings of at least 15%, just from tweaking the worst 20% of the page! The idea was tossed around and gained some interest among a handful of middle managers, but it was never implemented. Strategic projects, those focused on making money rather than saving it (and bandwidth is sort of a sunk cost anyway), were given priority, and developer resources were limited. Additionally, all changes to the web code were costly, as the entire site was handled by a monolithic CGI (which should be apparent to any technical user who bothered to study the URLs).
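To give a sense of what the paper argued, here’s a simplified before-and-after sketch. The file names and dimensions are invented for illustration; the retailer’s actual markup was considerably messier:

    <!-- Spacer-gif approach: a single-cell table plus a transparent
         image used purely to force vertical spacing. -->
    <table cellpadding="0" cellspacing="0" border="0">
      <tr>
        <td><img src="/images/spacer.gif" width="1" height="10" alt=""></td>
      </tr>
      <tr>
        <td>Product description goes here.</td>
      </tr>
    </table>

    <!-- Equivalent block element with a CSS margin: far fewer bytes,
         and one less image request. -->
    <div style="margin-top:10px">Product description goes here.</div>

Multiply that difference by every spacer and wrapper table on a busy catalog page, and the byte count adds up quickly.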
Web standards are a great ideal and can yield real cost savings, but they can also get in the way of performance. Implementing them properly requires trained web developers, who nearly always cost more than college students who learned on their own. Few companies are willing to pay a premium for an end result that looks the same to 80%+ of their customers.
While I firmly believe standards should have broader adherence, I think standards advocates sometimes forget about the bottom line.