Although Google explicitly denies it and Yahoo! obviously does it, it has been suggested many times in the search research world that search engines manually approve and rank sites for the top 2,000–5,000 queries. These queries account for roughly 20–35% of all queries on a commercial search engine, and by manually ranking the top 10 results for them, search engines can shave a significant burden off their server load. It's no surprise, too, that search researchers believe these manual rankings would also improve both the quality and the perceived quality of results for common searches.
For obvious reasons, SEOs are fearful of this shift, but it is, in fact, a boon to the industry as a whole over the long term. Imagine being manually ranked in the top 10 for an exceptionally popular search term. The only way to lose that ranking is if the quality of your site/result deteriorates in comparison to the competition. Instead of link-building (which I personally find boring & distasteful), our jobs would be primarily about building the best, most unique content available on the subject. That sounds like a switch I'd be happy to make.
If you need further convincing, take this quote from none other than Ricardo Baeza-Yates, in a paper titled "Web Usage Mining in Search Engines":
"...precomputed (manually approved) answers could have better quality for example with manual intervention (e.g. main links related to a query as in Yahoo!). If we have precomputed answers for the most frequent queries, answer time is improved as we do not have to access the index... with only 2000 precomputed answers we can resolve 20% of the queries."