Mon, Oct 24, 2005

Tim O'Reilly

The Amorality of Web 2.0

In a provocative essay entitled The Amorality of Web 2.0, Nicholas Carr skewers the idealism of folks like me and Kevin Kelly, both of whom have pointed out the potential of Web 2.0 to harness collective intelligence. Carr makes some good points -- even I am getting worried that the Web 2.0 hype is getting out of control -- but he does it in a way that I find disappointing, and increasingly common. His method is what Plato described thousands of years ago as sophism, "making the worse appear the better": not engaging in argument about the substance of what someone else is saying, but framing the discussion with straw men that can easily be demolished, arguments designed to win points rather than elicit truth.
 

For example, Carr focuses his argument against "collective intelligence" almost entirely on Wikipedia, ignoring all of the other examples described in my What is Web 2.0? article. And even in his discussion of Wikipedia, he makes the now-expected attack on the quality of entries with a few cheap shots rather than substantial analysis. While there clearly are problems with many Wikipedia articles, so too are there problems with traditional media. How can we castigate Wikipedia as flawed when our conservative television news services managed to persuade their viewers that weapons of mass destruction were found in Iraq, and that evidence was found linking Saddam Hussein to the Al Qaeda attacks on 9/11?

What's more, Carr's article demonstrates the utility of Web 2.0 even as it denigrates the idea, displaying trackbacks from all over the web, and a rich tapestry of comments from readers. I'm writing this piece here, knowing that it will also be reflected automatically into the feedback on Carr's article. How remarkable would that have been only a few years ago, and how mundane today?

It's too bad that we can't have a real debate about ideas, rather than cynical rhetoric that creates heat without shedding much light. I agree with Carr's fundamental premise, about the amorality of technology, and the need to pour cold water on the idea that somehow new tools will change human nature. However, I wish he'd shown a little less amorality in his own writing.



Comments: 26

  Jason [10.24.05 04:27 PM]

It seems to me that one's position is undermined when a politically charged counterexample is used, particularly when such a counterexample was not absolutely necessary to prove one's point.

  Justin [10.24.05 05:13 PM]

Jason, well said.

  Rektide [10.24.05 05:53 PM]

Article was fairly blah. A large part was that it just lacked direction, and something as infantile as the collaborative collective intelligence of the web is a pretty easy target in the first place (there's your solipsism).

But the Web 2.0 is thoroughly immoral, in the whole, and as a platform. [Not just because the phrase itself is Ugly As Sin, but because] ultimately web 2.0 comes with enormous forfeits of control. Blessing and a curse, but consumers used to maintain their own data; if Adobe packed up shop tomorrow everyone would switch to Gimp with little more than a hiccup. Now we have some pretense of being able to manipulate the applications (where and how the WS's API licenses us to do so), which is royal good fun for Google map hackers and flickr scripters, but the deeper ramification is that consumers no longer own the data OR the application.

The entire stack, all the way down to the data itself, is running on someone else's hardware that they've graced us with an interface to, and maybe an API to access. What happens when companies close their web services, or close down altogether? Developers rely on good faith that their modifications and scripts will continue to work, and consumers can only hope that all functional and interface changes that the future has in store are changes they want. Have we completely forgotten how Google reduced the available API on Google Maps already, how many times they've altered Google Groups against the hue and cry of the many? The web 2.0 has to it an extremely volatile dynamism, one I personally would be extremely wary of heralding as my new messiah. Tim, please forgive me for going completely Omega here, but while we've preached about the network externalities of Open, been lauded with tales about the wonders of Remix and Participation, we've quietly ignored how the system itself has reached an all-time new level of closure and control.

The only reason I can't declare web2.0 to be the very antithesis of Dataware is because so many services provide Atom feeds to let users recapture the data. At least intrepid hackers can rebuild the database to store what once would have been their own data, re-devise algorithms to process the data. In my mind at least, the real revolution is going to require a far better platform, one which is not so intrinsically mutually exclusive to open source. We've gone from having no control over the source, to having no control over the entire application. The new apps aren’t distributable. Web 2.0 preaches new freedoms and open boundaries. But it also comes with an entirely new level of control over everyone using the system.
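Rektide's point about Atom feeds is worth making concrete: if a service exposes your entries as an Atom feed, a few lines of code are enough to pull them back out into a structure you control. Here is a minimal sketch using only Python's standard library; the element names are standard Atom, but the sample feed itself is invented purely for illustration:

```python
import xml.etree.ElementTree as ET

ATOM_NS = "{http://www.w3.org/2005/Atom}"

# A tiny hypothetical feed, standing in for what a real service might export.
SAMPLE_FEED = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>My Photos</title>
  <entry>
    <title>Sunset</title>
    <updated>2005-10-20T12:00:00Z</updated>
    <content>A photo caption I wrote.</content>
  </entry>
  <entry>
    <title>Harbor</title>
    <updated>2005-10-21T09:30:00Z</updated>
    <content>Another caption.</content>
  </entry>
</feed>"""

def recapture_entries(atom_xml):
    """Parse an Atom feed and return the user's entries as plain dicts."""
    root = ET.fromstring(atom_xml)
    entries = []
    for entry in root.findall(ATOM_NS + "entry"):
        entries.append({
            "title": entry.findtext(ATOM_NS + "title"),
            "updated": entry.findtext(ATOM_NS + "updated"),
            "content": entry.findtext(ATOM_NS + "content"),
        })
    return entries

for e in recapture_entries(SAMPLE_FEED):
    print(e["title"], e["updated"])
```

In practice you would fetch the feed over HTTP and page through it, but the point stands: the feed is the escape hatch that keeps "your" data recoverable when the application itself is out of your hands.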

I hope that's good enough for some debate.

  Danny [10.24.05 05:58 PM]

Tim, ok, I felt generally the same way about Carr's article. A striking example of low journalism, no less.

Now a particular aspect of the Web 2.0 idea I thought useful was the way it forced a body to stand back from immediate technologies and take stock of common ground in the bigger picture. An abstraction, if you like (heh, this coming from a Semantic Web enthusiast). But the emphasis the phrase has attracted in the wild, on twiddly bits of UI and the rash of fairly unremarkable startup offerings (apparently built for VC) makes me wonder...not about the phrase or the attitude one might associate with it, but whether there is anything remotely joined-up about the development work that's happening right now. I could be wrong, but to my eyes even calling it a bubble seems to indicate a kind of consistency I'm not sure is out there. I hope I'm wrong on this, because I consider the Web priceless. I'd hope for (loosely-coupled) coordinated forward-looking.

But while I'm out on this limb, please allow me to play the Devil's Advocate: Would it really matter if the phrase "Web 2.0" lost all credibility? Who or what would it actually hurt?

  Alexander Muse [10.24.05 08:21 PM]

The idea that you can indict the Web 2.0 idea on the basis of lousy Wikipedia entries is plain silly. It is like indicting the entire traditional media for the actions of Stephen Glass, Jayson Blair and Dan Rather.

  Matt Asay [10.24.05 09:38 PM]

I agree with the first two comments above, Tim. Going after "conservative television" was both faulty (Last time I checked, it wasn't just Fox or "conservatives" who botched first reactions) and flawed - it didn't help your argument any more than Nick's sophistry helped him win you over.

Regardless, I found Nick's views refreshing. Not because they undermined your basic premises about Web 2.0, but because they undermined the irrational, unthinking wake you've left. It's not your fault that people follow your thinking without thinking, but it's good to have someone (like Nick, in this case) stand up and force people to think.

In this way, I disagree that Nick's tactics were "amoral." I think people needed a bit of a jolt. (And isn't it strange that in a world with several billion people, we can essentially pinpoint one voice going against the tide? That takes courage. Or ego. Or both.)

Again, I didn't see Nick's comments as directed at you, per se. They were directed at the VCs and entrepreneurs thoughtlessly bleating their way into Web 2.0 nirvana, without really understanding what it was that you've been saying.

Just my $.02.

Matt

  Nollind Whachell [10.24.05 09:39 PM]

"I agree with Carr's fundamental premise, about the amorality of technology, and the need to pour cold water on the idea that somehow new tools will change human nature."

Actually I think the amorality of the technology is a necessity if we want to embed our own morality and culture within it. It is not the tools themselves which will change human nature but the way we choose to interact with one another using these tools. The choice always lies within us.

More details in my post entitled Web Squared.

  Tim O'Reilly [10.24.05 11:02 PM]

Jason, Justin, and Matt --

I apologize if I offended with a politically charged example of how mainstream media can be demonstrated to have just as much misinformation as Wikipedia. However, I don't think the point would be different if we substituted some excess or omission of the liberal media, of which there are also many. Please feel free to supply an alternate example. By contrast, Carr's point depends entirely on the examples chosen, since he is using selected examples to discredit the whole, not merely the specific cases shown.

Matt -- I have no problem with skepticism about Web 2.0. Readers of this blog will note that I recently pointed approvingly at Ed Sim's skeptical post about Web 2.0-based VC pitches. I'm skeptical myself about the excesses of people who follow the "irrational unthinking wake [I've] left." But that doesn't change my opinion that Carr picked the battles that he could win, and won them (if he did, in any reader's mind) only by misrepresenting or ignoring the other side. I was disappointed more than anything else, because there's a lot to be genuinely critical about. For example, Rektide's comment above is far more profound and thought-provoking than anything in Carr. (Rektide, I'd love to see you develop out those ideas into a full-on critique, because you have indeed put your finger on "the dark side" of what we're for now calling Web 2.0.)

Back to Matt: As to whether or not the allusion to the misinformation about Iraq provided by conservative pundits is correct, here's coverage of a study that found viewers of Fox were significantly more likely to hold "misperceptions" about this issue than, say, viewers of PBS. Obviously, that's not relevant to the original subject of this post, but given that the example I chose elicited significant criticism, I want to provide concrete backup for that bit of editorializing. If any of you have information debunking the University of Maryland study, I'd be glad to hear it.

  hfb [10.24.05 11:49 PM]

So is the 'collective consciousness' only valid if they agree with you? He has a very valid point on Wikipedia, as it's not just the content that is often flawed or downright wrong; the politicking that goes on behind the scenes among various factions (e.g. the various Tolkien-related pages) goes beyond the pale at times.

The hype and the religious aspects of pimping 'Web 2.0' are rather oppressive, too. Why does everything have to be a 'revolution' which is just a codeword for the same thing delivered with a new marketing pitch?

If you've never read Caleb Carr's "Killing Time", perhaps you should.

  Sergey Schetinin [10.25.05 12:41 AM]

How about this solution? The idea is to create user demand for more complete data availability (via creating richer-than-browser clients).

  Phillip Fayers [10.25.05 03:01 AM]

"even I am getting worried that the Web 2.0 hype is getting out of control"

Don't worry too much about that; it's bound to happen. Web 2.0, like all other IT trends, is following the path of the Hype Cycle, a concept described by Gartner Group analyst Jackie Fenn. It looks like we might already be at the "Peak of Inflated Expectations", ready to head on down to the "Trough of Disillusionment".

  Tim O'Reilly [10.25.05 07:49 AM]

HFB --

Not at all. I'd say the collective consciousness has all the flaws of individual consciousness, all the rights and wrongs. It's Carr who read into comments by Kevin Kelly and myself the idea that somehow this is an unalloyed good. I believe that something significant is happening, that the internet has reached a stage where a new kind of "collective intelligence" application is possible, but despite reference to "the wisdom of crowds" (which argues that aggregated individual opinions can often be more accurate than the selected opinions of experts), there is no premise that the collective intelligence is always right. Heck, spam of all kinds is the result of the same tech trends that give us benefits like PageRank or the rich reader commentary at Amazon. As Cory Doctorow says, "all complex ecosystems have parasites."

  Andrew Lin [10.25.05 07:53 AM]

To say that Web 2.0 is amoral implies that it had a morality to begin with. This morality could only be that of its participants. And to call all of its participants amoral, in my opinion, can only be a generalization drenched in fallacy. That said, I do understand and share Carr's caution about venerating the amateur and distrusting the professional. After all, there is a reason why the pros are pros. I would never want to help propagate false or inaccurate information.

I wonder though, at what point did this shift occur? At what point did the amateur opinion ever come close to trumping the professional’s? My only supposition that I’ll enter into the fray is that of Marshall McLuhan: The medium is indeed the message. Has the medium so radically changed whereby the message of the amateur takes on equal weight to that of the pro? Has the medium brought about a social paradox whereby social intelligence and “the hype” eclipse and overshadow the clout of the seasoned individual? Is there a reversal mechanism in play now, or will there be in some not-too-distant future? More questions than answers here.

I'm writing more about it here.

  Eric Lussier [10.25.05 08:42 AM]

last.fm is a great example of how a collective can produce solid information. Obviously rating music is not nearly as controversial as an entry about Jane Fonda in Wikipedia. Check it out if you haven't, but be forewarned: it's rather addictive.

  Tim O'Reilly [10.25.05 08:49 AM]

Given the direction the commentary on this piece has taken, it behooves me to give a more detailed response to Carr's article.

Let's start with his opening paragraph, a feast of hyperbole with absolutely no evidence presented, and quite contrary to my experience:

From the start, the World Wide Web has been a vessel of quasi-religious longing. And why not? For those seeking to transcend the physical world, the Web presents a readymade Promised Land. On the Internet, we're all bodiless, symbols speaking to symbols in symbols. The early texts of Web metaphysics, many written by thinkers associated with or influenced by the post-60s New Age movement, are rich with a sense of impending spiritual release; they describe the passage into the cyber world as a process of personal and communal unshackling, a journey that frees us from traditional constraints on our intelligence, our communities, our meager physical selves. We become free-floating netizens in a more enlightened, almost angelic, realm.

Kevin Kelly definitely makes "angelic" references (albeit rhetorical, in the sense of "bird's eye view", with no implication of spiritual value) in the Wired article Carr quotes from later, but Carr not only exaggerates but conveniently backdates his exaggeration.

I have no memory of any such rhapsodizing about the early web. We were initially focused on just getting people to pay attention, then, once people got that the web was important, we were focused on figuring out the right business model. There was definitely some starry-eyed idealism about how the web made everyone equal, an idealism that I did my best to skewer. In my 1995 article, Publishing Models for Internet Commerce, I wrote:


Everyone's initial thought is that the net does away with the need for a distribution layer. After all, any site is accessible from anywhere else. This is clearly far from the case. First of all, net bandwidth isn't evenly distributed. Hosting of mirror sites and data warehousing are enormous opportunities. But perhaps more important than the physical distribution layer are the mechanisms for "attention distribution."


You can see this very clearly in the development of Web-based advertising. I believe that GNN was the first Web site to use advertising sponsors; in the two years since, we've seen the market go from incredulity ("You can't do that!") to arrogance ("Anyone can do that!") to a more mature realization that while anyone can put up a Web site, not every Web site will get an equal number of hits.


When there were only a few hundred or even a few thousand sites on the Web, all one had to do was to set up shop and let the visitors come. With tens of thousands, and soon hundreds of thousands of sites, it becomes clear that setting up a Web site is something like setting up a shop on the streetcorner outside your factory. You may get some visits from passers-by, and because distance is no object, you can easily invite in your existing customers, but the rest of the world may never know you exist. Companies try to get "word of mouth" by creating more and more innovative or controversial come-ons, but that will only go so far. As the market matures, you'll see a topography emerge in which certain sites stabilize as the focus or starting point for a certain type of user.


Back to Carr. He goes from his initial unsubstantiated claim that the early web was characterized by religious millennialism to a substantial quote from Kevin Kelly's Wired article. And while Kevin certainly waxes eloquent about this decade as a world-changing moment, akin to the birth of the great religions, or the framing of the American constitution, he never makes any of the claims that Carr makes on his behalf, that "all the things that Web 2.0 represents - participation, collectivism, virtual communities, amateurism - become unarguably good things, things to be nurtured and applauded, emblems of progress toward a more enlightened state."

This is why I called Carr's essay an exercise in sophism. He starts with an outrageous, unsubstantiated claim, then adds a layer of interpretation to what he quotes -- an interpretation that was not in the original -- and then proceeds to demolish his straw man.

Finally, he gets concrete, dissecting a couple of Wikipedia entries as a way of demonstrating the flaws of collective intelligence. But as an argument with Kevin Kelly or me, this is again a straw man, since neither of us would actually subscribe to the premise that Carr puts into our mouths. Wikipedia's accuracy or lack thereof actually has no bearing on my argument that Wikipedia represents "profound change in the dynamics of content creation." I stand by that statement, though not the one that Carr ends up arguing with, namely that "Wikipedia...has to be a beautiful thing if the Web is leading us to a higher consciousness."

As a publisher, I am quite concerned with the "profound change" that I referred to. As Carr argues a bit later in his piece, in one of his few substantial comments:

The Internet is changing the economics of creative work - or, to put it more broadly, the economics of culture - and it's doing it in a way that may well restrict rather than expand our choices. Wikipedia might be a pale shadow of the Britannica, but because it's created by amateurs rather than professionals, it's free. And free trumps quality all the time.

If Carr had begun his piece here, he would have gotten a lot less attention, but made for a much fairer and more substantial argument. I agree with Carr that the prevalence of "free" content is making other kinds of content that were once economically valuable no longer viable. For example, reference books are a much smaller part of my publishing program than they were ten years ago, because the Internet excels at reference, and even though much of what's available is of lower quality than what publishers used to sell, it's good enough for many readers' purposes.

But isn't this just Clayton Christensen's innovator's dilemma? Good enough replaces better, and (if history is any guide) eventually catches up.

But more to the point, as I argue at length in my piece The Open Source Paradigm Shift, we can learn from both the PC revolution (in which "worse" computers replaced better ones) and the open source software revolution, in which "worse" software is driving out polished commercial applications, that, despite the fears of naysayers, value doesn't go away, it just migrates elsewhere, due to what Christensen calls "the law of conservation of attractive profits."

Recorded music made the home piano an anachronism rather than the focal point of family music creation, and we're poorer for that. But what did we gain? Far more widespread access to all the world's music. And even as mastery of the piano faded away as a mark of social accomplishment, the impulse to make our own music burst out again into flower, with the guitar, and rebellion, as its symbols.

Nicholas -- this isn't about rapturous views of a transcendent future. It's about the way the world is becoming different, and how thinking about the nature of that difference helps us to adapt and prosper. Microsoft knew something about the way the PC would change the computer industry that IBM didn't, and profited thereby. Google knows something about the web that AOL didn't, and is profiting thereby.

Web 2.0 is a convenient name for describing some fairly significant technological, social, and economic changes. Understanding those changes is important for anyone who wants to do well in the new era. You do a real disservice to your readers by trivializing the issue, and for rhetorical points, demolishing the straw man that you have constructed.

(P.S. I notice that my response to Carr hasn't yet appeared in the list of trackbacks. Awaiting approval, or failed to pass the test?)

  Hooman Radfar [10.25.05 11:26 AM]

There has been an increasing amount of controversy surrounding the hype associated with Web 2.0. To all those folks that are already gearing to attack--take it easy--Web 2.0 is still new.

Ask your mom and dad if they use any of the popular Web 2.0 services. Ask folks at Fortune 500 companies what their strategy is to react to the evolution of Web 2.0. Ask your friends if they have invested in any Web 2.0 companies' stock, or are excited about the new direction that the web has taken. They will look at you like you are crazy. And the first question they will undoubtedly ask is: what is Web 2.0?

In my humble opinion, we need to do less talking and more coding. There will always be naysayers and Luddites. They will point fingers and cry foul at every turn. Ultimately they will pick on visionaries like Tim. I am always reminded of this quote from our friend Howard Aiken: "Don't worry about people stealing an idea. If it's original, you will have to ram it down their throats."

Later fanboys.

  Rektide [10.25.05 04:01 PM]

@Eric Lussier: last.fm, or, as I much preferred it, Audioscrobbler, is one of the greatest things the internet has done for consumers and artists alike. True long tail magic. Good call, a shining example of moral activity.

  dennis [10.27.05 07:40 AM]

Even to the extent his criticisms are valid, they're premature. I seriously doubt that thirty years from now, the Web is going to look much like it does now. But I do think it's going to continue to evolve in the "2.0" direction, with ever more sophisticated methods of collaboration. Wikipedia has its flaws, but really it's a rather simplistic system. I think we'll do better.

  Eric Lussier [10.27.05 09:39 AM]

I read recently that the founder of Wikipedia is interested in implementing some open source-style QA strategies. Simple enough: sections of text that have been certified by an "authority" are marked as "stable". I guess we'll be seeing entries like "History of the web: Beta". Thanks for the follow-up, Tim.

  Joe Shelby [10.29.05 11:49 AM]

Actually, it's just the baseness of Carr's Wikipedia examples that's been bugging me. The trouble is that he's measuring the quality of an encyclopedia on how it handles contemporary things. People don't use an encyclopedia for things of such dynamism or currency. Would I go to Britannica or even my CD-ROM of Encarta for information on "Bill Gates"? No; I'd go straight to Google.


Had he actually demonstrated that the EB, as Wikipedia's "competition" is better at those two specific entries, he might have at least been able to justify his use of the examples, but it wouldn't have supported his primary gripe.


Encyclopedias are used for science and for history, not for current events. In fact, the strength of Wikipedia is that it CAN be continually updated to support current events (look up the Harriet Miers story), though when dealing with politics there's always the back-and-forth of removing the other side's commentary. The theory is that eventually by weeding out the non-truths from the other side, the truth will emerge. As long as one side doesn't hold to an absolute that the other side won't acknowledge ("China as a nation is never wrong and never lies about its history to its citizens."), the eventual resolution will be complete and as accurate as humanly possible.


Once THAT's over with, someone with decent English skills can come along and clean up the text.


I got into the habit of looking up Wikipedia for WW2 battles and found the information complete, accurate, and, given the ability to link to the various players involved, infinitely more useful than looking for it in a book. Wikipedia as a history source is well on its way to providing the average student with all the power of "Connections" that James Burke has been promoting his entire career. Its key advantage is that the users can add more connections than are already there.


Another place where its limitation is its advantage is in the size of the content. The EB has inherited its work, and its reputation, from its paper-bound edition of the past (does it still maintain that?). From an editorial standpoint, this does give it the advantage of having clear, concise wording looked over by professional editor after editor for (in some entries like British history or zoology/biology) decades. A little national pride hasn't hurt, either.


On the other hand, Wikipedia is able to dedicate space to alternatives, to addressing new evidence and new theories where the EB would wait for years for the scientists or historians to work things out before publishing. Space is a cost to be paid by the EB; space is an opportunity for Wikipedia to be better.

  Jason Coleman [10.31.05 11:20 AM]

I think Tim's point with his example, whether well chosen or not, was that anecdotal examples prove nothing. There are always outliers on any trend. Being able to point them out does not disprove the general trend itself. I believe Tim was suggesting that Mr. Carr would have done better to try and demonstrate the trend rather than just pick out some select cases.

We can attack buzzwords ad nauseam, but in reality, Web 2.0 is just the notation that describes using data to create something that is more than the sum of its parts (are clichés better than buzzwords?). The web is a pile of information that is increasing exponentially, and assigning some form or order to it seems, to me, the essence of Web 2.0. Here is the collective intelligence that Tim O'Reilly speaks of. I obviously don't speak for him, and I can't say I have the same enthusiasm about it that he and so many others do. However, it is a non-discussion to focus on some ill-considered examples rather than the broader notion.

  Tim Norton [11.04.05 08:11 AM]

I think there's no argument that you can find low-quality writing on Wikipedia, but what exactly is quality writing? Something well presented, or the actual substance and truth of what's being written about? I know I'd certainly prefer the latter; there is nothing quite so dissatisfying to see, and destructive on the world, as lies written beautifully.

The problem with media is that the focus has been on dressing information up, and all this has created is a world where substance and facts come second. If it's told properly, people will absorb it. And this has left us with an increasing base of false perception.

I think it's a very positive sign that you can read a page on Wikipedia which is clearly being contended by multiple people; there are many different views working to create an accurate view of what's happening and what's happened. Our history has been written by victors, and while their tales often appear miraculous, if we had lived in those times we would have seen their simple failures. Felt the oppression of them. Yet as common people we would have had no say in the writing of the works that get fed to the common people of tomorrow.

The biggest concern with something like Wikimedia is that the mainstream will try to align the information and views which are being broadcast. But the only sustainable way to counter this is to ensure the platform is open, so people can take their time, driven by their feelings and beliefs, to have their say, and continue the process of clearing out some reality and truth from the fog that seeks to shield it.

Of course the web, the people's web, is not always an answer to attaining enlightenment, but as with any path to enlightenment, the answer is not to look for answers, but for means which let answers flow in.

The people's web is making this ability the top priority: the ability to share views rather than settle on which is right and set it in concrete.

This is the fundamental flaw not just in professional media, but often in professional everything. Whenever we as professionals have a closed circle, we have less progress, we get more self-righteous, less in touch with reality, and more concerned with creating a reality which enables us to prosper.

The most satisfying and liberating effect on my professional life has been open source. One day I look forward to openpharma, openfood, openmedic....

The risks created by allowing non-professionals into areas of the world which rely on knowledge are far outweighed by the opportunity to grow the base of knowledge and, more importantly, the base of people it is distributed amongst. This is the opportunity of the open, people's world, and people's web.

Why do we have millions of people on medication? Because they need it, or because many don't understand their body and the effects of their actions on their health? We shouldn't all have to become doctors to find these things out.

  npdoty [01.04.06 08:10 AM]

You might be careful with your charge of sophism. Socrates was brought on trial in Athens for making the weaker argument seem the stronger (amongst other things, but that was the central complaint, I think) and such claims are often made in place of substantive argument. Now you might be right that Mr. Carr's article is wrong (I'm not sure that's right or that that's what your complaint is) or that it is sensationalist (that does seem to be right and to be the focus of your objection) but preceding your comments with accusations of sophistry makes me suspicious from the beginning.

  Tim O'Reilly [01.04.06 01:28 PM]

npdoty -

I'm curious about your response. I did give concrete examples in counter-argument to Carr, and I continue to think that characterizing what he did as sophism is quite accurate. I don't believe that Socrates was brought up on charges of sophism -- that was his criticism of others. He was brought up on charges of corrupting the youth, and ultimately, for making the powers that be look bad.

  npdoty [01.06.06 01:06 AM]

My apologies if my response came off as an attack, as I had truly meant no such thing. With regards to the philosophical question, you're quite right that it was making the elders look bad that was Socrates's crime -- but it's phrased, at least, as an accusation of sophism or something like it: "Socrates is an evil-doer [...] and he makes the worse appear the better cause" (from the Apology on Project Gutenberg). By convincing the young that the elders and their accepted views are wrong, Socrates's accusers believe that he has been tricking the young rather than showing them the truth. (And Socrates's rather perplexing style of argument makes such a conclusion understandable -- Socrates makes you doubt what you are sure you know.) My point was merely that accusations of sophistry are often just signs of disagreement with an argument that appears successful.



You certainly do give examples in your counter-argument to Carr. But it seems to me that Carr gives reasonable examples as well: writing in Wikipedia often can be bad (those two pieces don't seem all that atypical to me) and the lack of funding for in-depth research in the blogosphere is an understandable worry (though there might be some counterexamples).



It might just be that Mr. Carr misunderstands your position. It seems clear from some of your comments to this post that you accept that there are dangers to collective intelligence as well as individual intelligence, but this might not be obvious from your noting in the earlier article the wisdom of crowds and the selecting for value that the blogosphere provides. Your longer comment here addressing the end of Carr's article is very valuable to all, I think, in distinguishing your view from what Carr characterized incorrectly (on purpose or not) as evangelism. The straw man that Carr set up was not the specific Wikipedia entries that he chose, but the over-idealized misrepresentations of advocates such as yourself.
