Thursday, October 04, 2007

Web 2.0 is dead and spammy, long live Web 3.0?

While I am no more a fan of the name Web 3.0 than Web 2.0, Jason Calacanis has an entertaining rant on where the Web is going:
Web 3.0 throttles the "wisdom of the crowds" from turning into the "madness of the mobs" we've seen all too often, by balancing it with a respect of experts. Web 3.0 leaves behind the cowardly anonymous contributors and the selfish blackhat SEOs that have polluted and diminished so many communities.
This reminds me of what Xeni Jardin wrote back in Oct 2005:
Web 2.0 is very open, but all that openness has its downside: When you invite the whole world to your party, inevitably someone pees in the beer.

These days, peed-in beer is everywhere. Blogs begat splogs -- junk diaries filled with keyword-rich text to lure traffic for ad revenue.
See also my previous post, "Growth, crap, and spam", where I said, "There seems to be a repeating pattern with Web 2.0 sites. They start with great buzz and joy from an enthusiastic group of early adopters, then fill with crud and crap as they attract a wider, less idealistic, more mainstream audience."

See also my previous posts, "Community, content, and the lessons of the Web" and "Getting the crap out of user-generated content".

[Calacanis post found via Nick Carr]

5 comments:

DigitalBricklayer said...

Gee Greg, the Calacanis post was a classic piece of link bait which it appears you swallowed whole.

How human-edited websites are a big innovation, I don't know.

I'd be prepared to bet that most of the researchers on Mahalo are no more qualified than the editors at Wikipedia.

Bob Warfield said...

The web in general is subject to punctuated equilibrium (a theory of evolution), which takes it from the sublime to homogenized junk. What facilitates this is low friction. Increase the friction slightly (in my post, by looking where it is hard to find things) and the noise level goes down considerably:

http://smoothspan.wordpress.com/2007/10/01/the-internet-first-breeds-diversity-then-conformity-punctuated-equilibrium/

Greg Linden said...

Hi, Bob. In the post you linked to, you said:

If someone really wants to innovate on search, they'd find a way to identify the off-the-radar results that matter. They'd help people to sail off the edge of the known world. It's the only way to find new worlds.

That is pretty much what Findory attempted to do, learning from your reading and searching to recommend news and information that would have been difficult to find on your own.

If you're interested, here is a small selection of older posts with some parts of Findory's vision that may be relevant to what you are discussing in your post: "Findory to shut down November 1", "Personalized blog search", and "Organizing chaos and information overload".

And, in another post, "Winner takes all, relevancy, and personalized search", there was a discussion that might be of interest on the winner-takes-all effect and how we need to break it.

dan said...

No, Jason really does think that. I get to hear it from him every day.
I think that Findory was closer to what search will look like in the future than many other things out there. To call it machine-based search would ignore the real value in it - harnessing passive participation. Discovery is missing from so many systems out there, yet it seems so obvious once you use them. Napster would help you find what you already know, but never turn you on to new things. In contrast, the recommendation engines of Amazon and to a lesser extent iTunes show you so much more, and that is a big part of their value to users.

I give Greg a lot of credit for launching Findory, and am sorry to see it go. While I was never a user, I was very interested to watch it evolve.
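[Editor's aside: the "passive participation" dan describes, recommending new items from what users have already read or bought, can be sketched with a toy item-to-item co-occurrence recommender. This is purely illustrative; the histories and item names are invented, and this is not Findory's or Amazon's actual implementation.]

```python
from collections import defaultdict
from itertools import combinations

# Passively observed user histories (items read, played, or clicked).
# Illustrative data only.
histories = [
    ["a", "b", "c"],
    ["a", "b", "d"],
    ["b", "c", "e"],
]

# Count how often each pair of items appears together in a history.
cooccur = defaultdict(lambda: defaultdict(int))
for items in histories:
    for x, y in combinations(set(items), 2):
        cooccur[x][y] += 1
        cooccur[y][x] += 1

def recommend(seen, k=2):
    """Score unseen items by how often they co-occur with the user's items."""
    scores = defaultdict(int)
    for item in seen:
        for other, count in cooccur[item].items():
            if other not in seen:
                scores[other] += count
    return [item for item, _ in sorted(scores.items(), key=lambda p: -p[1])[:k]]

# A user who has only seen "a" gets suggestions drawn from items
# that co-occurred with "a" in other users' histories.
print(recommend(["a"]))
```

The key property, matching dan's point, is that the user never rates or tags anything; the signal comes entirely from passive behavior, and the output is discovery of items the user has not encountered.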

Kaila Colbin said...

Hey Greg,

As reinforcement of your title, I submit Hugh's Law (from Hugh MacLeod at gapingvoid):

"All online social networks eventually turn into a swampy mush of spam."

@Bob, I suggest that increased friction isn't the only way to reduce noise. Increased alignment (which is what we work on at VortexDNA) can be far more effective. If you allow less content through (increased friction), you might get less spam by volume, but it could be the same by percentage. Wouldn't it be great if you could look at as much content as you wanted and have it all be relevant?