- The Zagat surveys stand or fall on their central premise: that thousands of separate opinions add up to something like the truth ... [But] the majority can be wrong, and one well-informed opinion is worth more than those of a thousand amateurs.
One common approach is to allow people to rate the reviews. Amazon.com does this for customer reviews, letting people vote on whether a review was helpful. Slashdot takes this a step further: users can not only moderate (rate comments) but also metamoderate (rate the ratings themselves).
Mimi Sheraton would probably criticize this approach as just layering a popularity contest on top of a popularity contest. And it does have problems. For example, positive reviews on Amazon.com seem to get many more "helpful" votes than negative reviews. Slashdot moderators seem to have an adolescent sense of humor and favor ill-informed rants, perhaps seeking entertainment more than information.
So, what else can we do? Another approach is to identify authoritative people and treat all of their reviews or comments as higher quality. This is closer to what Mimi wants: well-informed reviewers counting for more than uninformed ones. The trick is identifying the informed reviewers. Amazon and Slashdot both emphasize active users, I'd guess on the theory that those who bother to put in the effort to be involved probably have something useful to say. Users could rate each other, but that again devolves into a popularity contest.
This does seem like a spot where social networks actually could be useful. Who is an authoritative reviewer? Someone who is considered authoritative by other authoritative users. Yes, it's circular, but identifying a small seed set of authoritative users is enough to bootstrap the process.
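The circular definition can be made concrete with a fixed-point computation, much like personalized PageRank: authority flows along "who vouches for whom" edges, but the restart mass goes only to the hand-picked seeds, so scores originate from trust rather than raw popularity. A minimal sketch, where the graph, seed set, and damping factor are all hypothetical illustrations rather than any site's actual algorithm:

```python
DAMPING = 0.85     # fraction of authority passed along endorsements
ITERATIONS = 50

# endorsements[u] = users that u considers authoritative (hypothetical data)
endorsements = {
    "alice": ["bob", "carol"],
    "bob": ["carol"],
    "carol": ["alice"],
    "dave": ["dave_fanclub"],     # a clique that only endorses itself
    "dave_fanclub": ["dave"],
}

seeds = {"alice"}  # the hand-picked seed set that breaks the circularity

def authority_scores(endorsements, seeds, damping=DAMPING, iters=ITERATIONS):
    users = set(endorsements) | {v for vs in endorsements.values() for v in vs}
    # Restart mass flows only to seeds, so authority must trace back to them.
    restart = {u: (1.0 / len(seeds) if u in seeds else 0.0) for u in users}
    score = dict(restart)
    for _ in range(iters):
        nxt = {u: (1 - damping) * restart[u] for u in users}
        for u, endorsed in endorsements.items():
            if endorsed:
                share = damping * score[u] / len(endorsed)
                for v in endorsed:
                    nxt[v] += share
        score = nxt
    return score

scores = authority_scores(endorsements, seeds)
```

Note what happens to the self-endorsing clique: since no seed-reachable user vouches for "dave" or "dave_fanclub", their scores stay at zero no matter how enthusiastically they endorse each other. That is exactly the property a popularity contest lacks.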
Would this work? Or would it be just another popularity contest?