Serious Games challenging us to play better searches
Via: Google Webmaster Central Blog – Finding More High-Quality Sites in Search + High-Quality Sites Algorithm Goes Global, Incorporates User Feedback
Over a month ago, Google introduced an algorithmic improvement designed to help people find more high-quality sites in search.
Yesterday they rolled out this improvement globally to all English-language Google users, and they also incorporated new user-feedback signals to help people find better search results.
In some high-confidence situations, they are beginning to incorporate data about the sites that users block into their algorithms. In addition, this change also goes deeper into the “long tail” of low-quality websites to return higher-quality results where the algorithm might not have been able to make an assessment before.
The impact of these new signals is smaller in scope than the original change: about 2% of U.S. queries are affected by a reasonable amount, compared with almost 12% of U.S. queries for the original change.
While Google is not making any manual exceptions, they will consider your feedback as they continue to refine their algorithms. They will also continue testing and refining the change before expanding it to additional languages.
Over a month after first introducing their search-algorithm improvements, Google has now rolled out the changes to all English-language Google users. The changes are intended to reduce the influence of ‘link farms’, while relevant websites are expected to rise through the search results. The update’s code name is Panda: Google named it internally after one of its engineers, whose surname is Panda.
The update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.
Google depends on the high-quality content created by websites around the world, and feels the company has the responsibility to encourage a healthy web ecosystem. Therefore, it is important for high-quality sites to be rewarded, and that’s exactly what this change does.
It’s worth noting that this update does not rely on the feedback Google received from the Personal Blocklist Chrome extension, which they launched in early February. However, they did compare the Blocklist data they gathered with the sites identified by the new algorithm. According to Google, “If you take the top several dozen or so most-blocked domains from the Chrome extension, then this algorithmic change addresses 84% of them, which is strong independent confirmation of the user benefits.”
In early March, Steven Levy had breakfast at the TED conference with the Google engineers who wrote the blog item announcing the change: the company’s search-quality guru Amit Singhal and Matt Cutts, Google’s top search-spam fighter.
The edited transcript was posted at Wired.com under the title TED 2011: The ‘Panda’ That Hates Farms: A Q&A With Google’s Top Search Engineers.
Here are the excerpts:
Singhal: So we did Caffeine (a major update that improved Google’s indexing process) in late 2009. Our index grew so quickly, and we were just crawling at a much faster speed. When that happened, we basically got a lot of good fresh content, and some not so good. The problem had shifted from random gibberish, which the spam team had nicely taken care of, into somewhat more like written prose. But the content was shallow.
Wired.com: How do you recognize a shallow-content site? Do you have to wind up defining low-quality content?
Singhal: That’s a very, very hard problem that we haven’t solved, and it’s an ongoing evolution how to solve that problem. We wanted to keep it strictly scientific, so we used our standard evaluation system that we’ve developed, where we basically sent out documents to outside testers. Then we asked the raters questions like: “Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?”
Wired.com: But how do you implement that algorithmically?
Singhal: You can imagine in a hyperspace a bunch of points, some points are red, some points are green, and in others there’s some mixture. Your job is to find a plane which says that most things on this side of the plane are red, and most of the things on that side of the plane are the opposite of red.
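The “plane in hyperspace” Singhal describes is, in essence, a linear classifier: sites become points in a feature space, and the algorithm looks for a hyperplane separating the two classes. As a rough illustration only (the two “quality signals”, their values, and the perceptron approach below are stand-ins for this sketch, not Google’s actual features or method):

```python
# Toy version of the separating-plane idea: a perceptron learns weights w
# and bias b so that sign(w.x + b) splits the points into two classes.
# All feature values and labels here are made up for illustration.

def train_perceptron(points, labels, epochs=100, lr=0.1):
    """Learn a hyperplane (w, b) that separates +1 points from -1 points."""
    dim = len(points[0])
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(points, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # misclassified: nudge the plane toward this point
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Two hypothetical per-site signals (say, originality and depth), scaled 0..1.
red = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.3)]   # low-quality sites, label -1
green = [(0.8, 0.9), (0.7, 0.8), (0.9, 0.7)]  # high-quality sites, label +1
points = red + green
labels = [-1] * len(red) + [1] * len(green)

w, b = train_perceptron(points, labels)

def side(x):
    """Which side of the learned plane a new site's point falls on."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
```

For linearly separable data like this toy set, the perceptron converges to a plane that puts each class on its own side; a new point such as `(0.85, 0.85)` lands on the high-quality side. The hard part Singhal alludes to is not drawing the plane but choosing features in which quality actually separates.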
Wired.com: Do you feel that this update has done what you wanted it to do?
Singhal: It’s really doing what we said it would do.
Wired.com: Some people say you should be transparent, to prove that you aren’t making those algorithms to help your advertisers, something I know that you will deny.
Singhal: I can say categorically that money does not impact our decisions.
Wired.com: But people want the proof.
Singhal: There is absolutely no algorithm out there which, when published, would not be gamed.
Wired.com: This does seem to be a period where Google is getting more criticism of its search practices and quality.
Singhal: People expect that we will do a good job, and that’s appropriate. The criticism is a good thing because it means that they really want us to do an even better job, and that’s exactly what we’ll go and do next week.