by Henry Farrell on September 15, 2005

Tyler Cowen at Marginal Revolution has some interesting things to say about his experiment in allowing comments on his blog:

> 1. Visitor stats rise considerably. But this happens so quickly, I believe it is people hitting “reload” to read additional comments, rather than more readers.
>
> 2. The more that comments are regularly available, the more rapidly the quality of comments falls. The quality of comments stays high when it is periodic, not automatic, and when we request comments specifically.
>
> 3. The quality of comments is highest when the matter under consideration involves particular facts and decentralized knowledge. Posts which mention evolution, free will, or Paul Krugman do not generate the highest quality of comments.
>
> So my current sense (Alex chooses his own course, though I believe he agrees) is to ask for comments periodically rather than always having comments open. The goal is to maximize the real value of comments, rather than the number of comments (or measured visits) per se.

Which of these specific claims can be universalized? Speaking, like Tyler, from personal experience, it seems to me that his observation about visitor stats is probably generally true. The relationship between the general availability of comments and falling comment quality, however, varies considerably from blog to blog. _Making Light_ has been extraordinarily successful in building up a community of commenters with interesting things to say (it has a homier feel than most comment sections; everyone mostly knows each other).

The argument that more commenters = less interesting discussions has a lot of truth to it – there is very clearly a Gresham’s law effect, where bad commenters drive out good ones. Which suggests (and again _Making Light_ illustrates this well) that a vigorous moderation policy can help counteract the negative effects of growth.

Finally, Tyler may be on to something when he talks about specific facts and decentralized knowledge – but there’s another factor there which I think is even more important: the extent to which there is minimal agreement on a shared set of facts in the first place. Where there isn’t – and where there are strongly opposed viewpoints – blog comments sections tend to break down rapidly. For Tyler, it’s Paul Krugman; for us, it’s the Israel-Palestine question (where I no longer allow comments on the rare occasions that I post). But even here, Jonathan Edelstein’s Head Heeb seems to succeed in hosting generally civil discussions – I suspect that this is another example of the community effect: the commenters are a group of people who have come to know each other over time, and have a good sense of the ground rules of debate. But enough rabbiting on; over to you.

Search filters

by Eszter Hargittai on September 15, 2005

A serious problem with content filters – whether add-on software or the “safe” search mode of search engines – is that they often block legitimate content that should not be filtered out. These false positives can include important information that most people would have a hard time defending as harmful. Paul Resnick and colleagues have done some interesting work on this regarding filtered health information.

Now a helpful little tool comes to us (found through ResearchBuzz) that lets you run searches to see what content is blocked in the safe-search modes of Google and Yahoo!. Type in a search term and see which sites would be excluded from the results when running safe mode on the two engines.

Curiously, Google blocks The Breast Cancer Site when you turn on safe mode for a search on “breast cancer”, while Yahoo! doesn’t. (The site does not seem to have objectionable material; its stated mission is to raise funds for free mammograms.)
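What the tool is doing under the hood is conceptually very simple: run the same query with safe mode off and on, then report which results disappear. Here is a minimal sketch of that comparison in Python – the URLs and the `blocked_by_safe_mode` helper are hypothetical placeholders for illustration, not real search-engine output or a real API:

```python
def blocked_by_safe_mode(normal_results, safe_results):
    """Return the URLs that appear in the normal results but are
    missing from the safe-mode results (i.e. filtered out)."""
    safe_set = set(safe_results)
    return [url for url in normal_results if url not in safe_set]

# Hypothetical result lists for the same query, with and without safe mode.
normal = [
    "example-hospital.org/breast-cancer",
    "example-charity.org/free-mammograms",
    "example-forum.net/support-group",
]
safe = [
    "example-hospital.org/breast-cancer",
    "example-forum.net/support-group",
]

print(blocked_by_safe_mode(normal, safe))
# ['example-charity.org/free-mammograms']
```

The interesting cases are exactly the ones this difference surfaces: legitimate sites, like the charity in this toy example, that vanish only because the query tripped a keyword filter.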

By the way, Google’s and Yahoo!’s results can be quite different regardless of what gets filtered. Dogpile has a nifty little tool that visualizes some of the differences. I discussed it here while guest-blogging over at Lifehacker a few weeks ago.