Browser maker Opera recently conducted a study to see how much of the web is standards-compliant. Using a specialized web crawler dubbed “MAMA” (for “Metadata Analysis and Mining Application”), which analyzed around 3.5 million pages, the company determined that a mere 4.13% of the web is standards-compliant.
Of course, one wonders about the accuracy of this study. There are certainly more than 3.5 million pages on the internet. Perhaps they were only searching a portion of the web that happened to have fewer valid pages? And does a site with 100 non-compliant pages count as 100 invalid pages? How many of those sites are invalid because they try to comply with Microsoft’s bogus standard (a.k.a. the “does it look alright in IE?” standard) at the same time?
I can understand the small figure, and maybe it is realistic. After all, many a website almost validates: Reddit.com, for instance, has one lone (and minor) error stopping it from validating. And heck, even Google and Amazon are validity-challenged; Amazon shows “1445 Errors, 135 warning(s)” on its front page.
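If you want to check your own pages the way these error counts were gathered, here’s a minimal sketch that asks the W3C Nu HTML Checker for a JSON report and counts the errors. To be clear, this isn’t what Opera’s MAMA crawler actually ran; the checker’s `doc` and `out=json` query parameters are assumptions based on its public interface, so treat this as an illustration rather than the study’s methodology.

```python
import json
import urllib.parse
import urllib.request

def count_validation_errors(page_url: str) -> int:
    """Count validation errors for a page, assuming the W3C Nu HTML
    Checker's JSON interface (hypothetical usage, not MAMA's tooling)."""
    query = urllib.parse.urlencode({"doc": page_url, "out": "json"})
    request = urllib.request.Request(
        f"https://validator.w3.org/nu/?{query}",
        headers={"User-Agent": "validity-check-sketch/0.1"},
    )
    with urllib.request.urlopen(request) as response:
        report = json.load(response)
    # Each message carries a "type"; "error" entries are what stop a
    # page from validating (warnings come back as "info").
    return sum(1 for msg in report.get("messages", [])
               if msg.get("type") == "error")

if __name__ == "__main__":
    print(count_validation_errors("https://www.example.com/"))
```

Run against a big-name front page, a sketch like this makes it easy to see how quickly the error counts pile up.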
Many monolithic sites that you’d think would validate don’t, though they look fine in most browsers anyway. This brings up an interesting question: Does it matter whether you meet the standard to the letter, or is it okay as long as the page looks fine in all of the standards-compliant browsers? What’s your opinion?
News article: Opera study: only 4.13% of the web is standards-compliant
Interesting Reddit Discussion: http://www.reddit.com/r/programming/comments/77grk/