CNN: “100 million sites on the web”. Oh really, CNN?

CNN, via reporter Marsha Walton, just published a truly disturbing article claiming there are 100 million sites on the web today, worldwide. First of all, there are maybe 100 million domains on the WWW, not websites. That’s a HUGE mistake, and CNN should be more careful about delivering such information to millions of readers.

CNN’s story and coverage is based on Prof. Rebecca Grinter (Georgia Tech College of Computing) as well as Netcraft’s Rich Miller:

There are now 100 million Web sites with domain names and content on them

100 million domains

CNN video coverage with Miller (click play):

And since the guy mentioned blogs: how can there be only 100 million websites on the World Wide Web, if Technorati was already tracking 50 million blogs in August 2006 (report and graphs from David Sifry, Technorati CEO)?

Technorati - 50 million blogs

Since David Sifry estimated about 100 million blogs in February 2007, I simply don’t understand how CNN can publish their article with such a big misinterpretation in it.

Once again, those are domains, CNN. For anyone who would like to know the number of domains (and NOT websites :) ) registered in their country, here is a list.

Published by

Cristian Mezei

I am myself.

9 thoughts on “CNN: “100 million sites on the web”. Oh really, CNN?”

  1. According to my test right after BigDaddy, Google had 25 billion pages indexed. This is the visible web; beyond that, there is a huge number of sites that Google has not yet found.

    25B divided by 100M is 250. Does the average site have 250 web pages? Hmm. Interesting.

  2. Hmm… something went wrong in the previous post. So:

    They are right, Cristian. They say, as you quoted: “There are now 100 million Web sites with ‘domain names’ and content on them”

    Of course, more specific is:

    “There are now 100 million Web sites with ‘different domain names’ and content on them”…

  3. Mihai: = web sites with a domain name and content on them, right?

    So are the 100 million blogs, which in turn are websites.

    So you are actually wrong (a bit). The correct statement is:

    “There are now 100 million Web sites with a TOP LEVEL Unique DOMAIN and content on them”…

  4. A domain can have many sites — for example, the Geocities crap.
    (Web) Site = a collection of web pages that form a bigger entity.
    (Web) Page = a single page of content.
    The Google count is of documents / web pages.

  5. Ummm… 1 really big assumption… That news is about accuracy… Not anymore – News is just entertainment… Journalism has been sliding for a long time.

    And even what constitutes a page is iffy. For example, ALTools runs on DotNetNuke – a CMS. It has 1 page, “Default.aspx”, and everything runs off of that “page”. Yet there are several hundred pages in there. Add in the forums at many sites, and then what is a page, or even a document? These things are blurry at best.

    Do query strings count? When a page displays the current date, is it a new page on the next day?

    I’ve had data driven sites that had millions of “pages” all off of a few scripts.

    The idea of a web application is more useful now than ‘web site’. It’s easier to classify things that way. You simply say that they are just 2 web applications, which is true, and you don’t have to debate about it. There are many subdomain spam sites out there. Do they count as individual sites for each subdomain? Sounds silly. But if you think of them as web applications, things become a bit clearer.

Comments are closed.