In praise of negativity

by Henry Farrell on July 24, 2020

Andrew Gelman has a post on the benefits of negative criticism, where he talks about the careful methodological demolitions he has done of others’ research that he has found to be slipshod.

if you want to go against the grain you have to work harder to convince people. My point is that this is the exact opposite of Cowen’s claim that following his advice “Avoid criticizing other public intellectuals. In fact, avoid the negative as much as possible” will force you to keep on thinking harder.

I’m in favor of a strong culture of criticism, but for a quite different reason: because serious criticism is probably the most valuable contribution we can make to the cognitive division of labour. There’s a possibly mistaken understanding of a truly excellent social science book behind this argument.

The book, which I’ve mentioned previously, is Hugo Mercier and Dan Sperber’s The Enigma of Reason. I should note that Hugo is a co-author on work in progress, but is absolved of any mistakes I make in interpreting his ideas. I should also note that his book with Sperber is at the least a groundbreaking work. In my opinion, it’s a straightforward classic.

Mercier and Sperber’s basic argument is, as I understand it, as follows. First – that reasoning has not evolved in the ways that we think it has – as a process of ratiocination that is intended independently to figure out the world. Instead, it has evolved as a social capacity – as a means to justify ourselves to others. We want something to be so, and we use our reasoning capacity to figure out plausible-seeming reasons to convince others that it should be so. However (and this is the main topic of a more recent book by Hugo), together with our capacity to generate plausible-sounding rationales, we have a decent capacity to detect when others are bullshitting us. In combination, these mean that we are more likely to be closer to the truth when we are trying to figure out why others may be wrong than when we are trying to figure out why we ourselves are right.

This has important consequences. The problem is that our individual reasoning processes are biased in ways that are really hard for us (individually) to correct. We have a strong tendency to believe our own bullshit. The upside is that if we are far better at detecting bullshit in others than in ourselves, and if we have some minimal good faith commitment to making good criticisms, and entertaining good criticisms when we get them, we can harness our individual cognitive biases through appropriate group processes to produce socially beneficial ends. Our ability to see the motes in others’ eyes while ignoring the beams in our own can be put to good work, when we criticize others and force them to improve their arguments. There are strong benefits to collective institutions that underpin a cognitive division of labor.

This superficially looks to resemble the ‘overcoming bias’/‘less wrong’ approaches to self-improvement that are popular on the Internet. But it ends up going in a very different direction: collective processes of improvement rather than individual efforts to remedy the irremediable. The ideal of the individual seeking to eliminate all sources of bias so that he (it is, usually, a he) can calmly consider everything from a neutral and dispassionate perspective is replaced by a Humean recognition that reason cannot readily be separated from the desires of the reasoner. We need negative criticisms from others, since they lead us to understand weaknesses in our arguments that we are incapable of seeing for ourselves unless they are pointed out to us.

Even if most of the action is going to be at the collective or group level, there are some possible lessons for how we ought to behave individually (some individual dispositions are more likely to contribute to, or benefit from, collective debate).

Most obviously, serious criticism/disagreement is one of the most valuable things that we can do or can get as public intellectuals (for values of public intellectual that mean no more and no less than someone who wants to think and argue in public). On average, our criticism of others is going to be closer to the truth than our own original thoughts. Furthermore, our original thoughts are likely to be valuable just to the extent that they’re responsive to serious criticism from others, and have been modified in response to previous rounds of criticism. More broadly, reasoning well will often be less about reasoning purely than about being reasonable (i.e. being open to others’ reasoning).

When we criticize others, we should try to do so non-pejoratively, but crisply and clearly. Randall Jarrell says that “a good motto for critics might be what the Persians taught their children: to shoot the bow and speak the truth.” He’s completely right – but the critic doesn’t have to be an asshole about it. It’s likely that some people will still find plainly stated criticisms obnoxious (this may not be a good way to build alliances), but they will be more likely to benefit from the criticisms if they are clear rather than circumspect.

Furthermore, while defending our own (inevitably biased) perspectives, we should be open not only to the likelihood that people with other perspectives have important things to say, but that on average they will have a better understanding of the weaknesses of our ideas than we do ourselves. We should look to cultivate good criticisms from others, and in particular people from different perspectives, whose criticisms are more likely to hit on weaknesses in our own reasoning that aren’t visible either to us or those who agree with us.

As a corollary, what may initially seem to us as trolling (and sometimes, what actually is trolling) may contain valuable criticisms that we can benefit from. The tradeoff is that diversity of perspective is typically correlated with diversity of goals – someone who disagrees with how you see the world is also likely to want different things from it. But you should still push towards the margins of diversity as best you can, since it is at those margins that you will get the most unexpected criticisms, even if some of those criticisms are irrelevant, since they presuppose that you should want different things than those that you do want. There are judgment calls as to where you stop – but you should do your best to be open to criticisms that are intelligent, clearly expressed, and plausibly constructive with respect to the goals that you want to achieve, rather than overtly destructive of them.

So what this all points to is something very different than the pursuit of bias-free reason that’s still popular across much of the Internet. It’s not about a radical individual virtuosity, but a radical individual humility. Your most truthful contributions to collective reasoning are unlikely to be your own individual arguments, but your useful criticisms of others’ rationales. Even more pungently, you are on average best able to contribute to collective understanding through your criticisms of those whose perspectives are most different to your own, and hence very likely those you most strongly disagree with. The very best thing that you may do in your life is create a speck of intense irritation for someone whose views you vigorously dispute, around which a pearl of new intelligence may then accrete.

Of course, collective reasoning is not the only desideratum of public debate. Much argument is about politics, persuasion and collective action, where a very different logic applies. The advice in this post is advice for public intellectuals but not for politicians. Weber’s essays on science and politics as vocations are useful here, and in particular his defense of the nobility of the political hack. As a hack, your professional duties are different, and the logic outlined here is at best of questionable benefit.

Furthermore, there is an obvious clash between the collective benefits of reasoning, where one provides most value added through improving the ideas of others, and the individual rewards of being a public intellectual (this time in the sense of actual career) where one does best through polishing one’s own reputation. The counterpoint is that we likely radically underestimate the importance of the invisible and non-individually lucrative contributions that people make to the collective benefit by improving others’ ideas.

One of my favourite passages from anywhere is the closing of Middlemarch, where Eliot says of Dorothea:

Her full nature, like that river of which Cyrus broke the strength, spent itself in channels which had no great name on the earth. But the effect of her being on those around her was incalculably diffusive: for the growing good of the world is partly dependent on unhistoric acts; and that things are not so ill with you and me as they might have been, is half owing to the number who lived faithfully a hidden life, and rest in unvisited tombs.

Striving to be a Dorothea is a noble vocation, and likely the best we can hope for in any event; sooner or later, we will all be forgotten. In the long course of time, all of our arguments and ideas will be broken down and decomposed. At best we may hope, if we are very lucky, that they will contribute in some minute way to a rich humus, from which plants that we will never see or understand might spring.

{ 36 comments }

1

Mike Huben 07.24.20 at 12:54 pm

“It’s not about a radical individual virtuosity, but a radical individual humility.”

That’s very close to something I learned from arguing with fundamentalists. They would claim science was “arrogant”, and I would point out that science shows far more humility than they do. Science is humble, in that it is based on evidence that can be observed and gathered pretty much by anyone, and on reasoning that can be seen and questioned by anyone.

2

marcel proust 07.24.20 at 2:19 pm

Minor objection to your use of “bullshit”, because it does not align with the meaning that Harry Frankfurt’s analysis brought into common parlance. Rather than BS, I think you mean nonsense or wrong-headed reasoning (though admittedly the last is a bit circular, since you are already using the word “reasoning”).

3

Emma 07.24.20 at 2:51 pm

This post is beautiful. Thank you for making it.

4

Anarcissie 07.24.20 at 3:42 pm

@1 — Ideally science is humble. Ideally, it is skeptical, about its own as well as others’ conclusions and views. But people are seldom ideal, including science fans and promoters. I’ve had the experience of raising questions with science types and being instantly dismissed because I was not a credentialed scientist. Years later, I saw the same questions raised by c.s.’s and taken very seriously, which is much more in line with other academic behavior than with the science ideal. (Details on request, but they’re boring.) (This is not a criticism of scientism, by the way, which is another subject.) Anyway, Mike, did you learn anything from the Fundies besides confirmation of the superiority of your way of thinking?

5

DCA 07.24.20 at 3:46 pm

Some relevant (maybe) references about this process: John Ziman’s “Public Knowledge: the Social Dimension of Science” (1968!), a pleasantly brief argument (by a scientist) that science (and other areas of scholarship) are all about convincing others; David Hull’s “Science as a Process”, exploring a specific area in extreme (and at times sordid) detail; and Geoffrey Lloyd’s “The Revolutions of Wisdom”, about the roots of the process in Greek legal argumentation.

Personally, I’d be reluctant to go further (after an initial criticism) unless the recipient is prepared to say “I see your point, but…” or at least “Let me think about that”. The unwillingness to do this is one of the distinguishing features that make trolls so tedious.

6

SusanC 07.24.20 at 4:39 pm

Buddhist philosophers often say that we are attached to our own ignorance (in the sense of clinging to a belief even when we have plenty of evidence that it is false). I’ll believe that as a description of human beings.

We do it collectively, too: entire groups of humans with a shared belief in something that ought to be obviously false. If we have the OP’s postulated mechanism for detecting fallacies, we turn it off when listening to members of our own tribe. I am a bit skeptical about our ability to be skeptical.

7

Sashas 07.24.20 at 5:28 pm

At the risk of doing precisely what you recommend… ;-)

Henry, in your OP you appear to use the words “reason” and “rationalization” interchangeably. These are distinct concepts and I think it’s very important to keep them separate, even as we recognize that people frequently claim to be doing one, while in fact doing the other. You claim that reasoning hasn’t evolved as we think it has, but this evolutionary argument is dramatically weakened the moment you consider rationalization as the dominant of the two concepts (when two traits are evolutionarily linked, the dominant will… dominate), and further weakened when you consider that intellectuals are not a separate species of human (and so the evolutionary argument would really only apply on average across all humans).

I think it does not need justification to claim that one can become a “public intellectual” either by being particularly good at reason or particularly good at rationalization. Humanity as a whole has evolved to become better at rationalization. (I’m actually not sure I buy the evolutionary argument at all, but I’m happy to agree that if it occurred, it was to the optimization of rationalization, not reason.) But when I put these thoughts together, why would a public intellectual who became so through skill with reason be better off using their skill at rationalization?

Up to this point, I’ve looked at producing reason/rationalization, rather than the skill of analyzing it. You claim we have a decent capacity to detect when others are bullshitting us (rationalization skill, but on the analysis side). That gets a “citation needed” from me. (Does Mercier’s book present empirical evidence? Is this claim coming from somewhere else?) I have seen no evidence for this claim, and my lay interpretation of research on misinformation suggests to me that many of us are in fact very bad at bullshit detection.

The rest of your argument seems to rest on these two claims, both of which I contest.

I suspect you may be responding to an argument I can’t see. There are hints throughout suggesting someone claimed all criticism is bad (or all negative criticism?), and you’re responding to that. I think you have overplayed your hand. I agree that negative criticism is valuable within public intellectual discourse, and I even agree that it can be valuable to the recipient. But the reasons you present aren’t why.

8

politicalfootball 07.24.20 at 6:00 pm

Tyler Cowen’s discouragement of criticism is typical of his efforts to dress up a self-serving norm as a higher intellectual principle. In an environment where public intellectuals are subject to rigorous criticism, Cowen wouldn’t get to be one.

9

oldster 07.24.20 at 6:02 pm

I don’t understand the Mercier & Sperber view. Why isn’t the public use of reason posterior to the private, even in the cases they envision?

I try to persuade my fellow-villagers to join with me in digging a well. It will provide us with water, I say. It will allow us to avoid the crocodiles at the river, I say. If we all contribute a few hours of our labor, then the job will not be overwhelming for any one person.

Why think that those public rationales diverge from the very sequence of reasoning — private, internal reasoning — that first gave me the idea that we should all get together to dig a well? How did I even get to M&S’s first stage, of “wanting something to be so,” without doing some reasoning about what I want and how it can be attained? Yes, I want something to be so: I want there to be a well in our village. And I came to want the well by a process that involved quite a healthy amount of private reasoning, both means-end reasoning (I want water; how can I get it?), and reasoning that involved figuring out the world (crocodiles bite; there’s water underground; etc.)

Perhaps M&S would concede that there are cases in which the public reasoning of persuasion exactly matches the private reasoning that led to the initial wanting-to-be-so. But they are rare cases, they’ll say, and transparency of this sort is the exception. More often, there is some skullduggery at work in the well-diggery: my real motives are irrational ones that I do not express, because they would not persuade other people. And the things I say out loud are mere misdirection and subterfuge, like the Red-Headed League asking people to copy prose long-hand, while they are really setting up a bank-heist.

Okay, maybe. But even here, it seems to me that dishonest rationalization works only against a background presupposition of honest, transparent reasoning, in the same way that lying works only against a background presupposition of truth-telling. At least sometimes, we must encounter reasoning that is persuasive and reliable and reliably informs us about the persuader’s true intentions. Otherwise, public reasoning would cease to be persuasive at all.

So I think I still don’t get the priority claim, or how the priority (on their view) is really the opposite of what we normally assume it to be.

Relatedly, how is our bullshit detection supposed to work? If it simply spots lapses in the public reasoning of the would-be persuader, then here too it seems to invoke modules that are non-accidentally designed for private reasoning about how the world is. (“Dig a well? Don’t you remember that we tried to dig a grave last year and found impenetrable bed-rock at 4 feet underground? You’ve got your facts all wrong, and your proposed means will never achieve your announced ends!”). Or is the idea that we approach all public reasoning with suspicion and distrust — not about details, but as a general stance? Is our bullshit detection a general discounting of claims and a general imputation of bad faith? Because then it seems like a very poor means of finding truth or of improving public discourse. There’s no one so easily conned as the man who is always on the lookout for scams. (I think there’s a pithier formulation that is escaping my memory right now, but then again I don’t claim to be able to repeat five nouns.)

So I don’t get that part either.

10

bianca steele 07.24.20 at 6:45 pm

Everybody is going to read their preferred narrative about the Internet and fair or unfair criticism into the OP, but I’m not sure anything in particular follows from it. We don’t get our beliefs by pulling balls out of an urn, and debate isn’t a process of randomly colliding objections against arguments until only the strongest is left standing. Even if debate does happen by other people taking potshots at my beliefs in order to strengthen the group dogma, why doesn’t that imply I should defend beliefs that I personally object to, in order to allow others to weaken them? That’s beyond the point that talking about work is different than doing work. Only one of those happens online and in the press.

I would suggest not forgetting that QAnon is a discussion collective.

11

MisterMr 07.24.20 at 9:29 pm

Perhaps this is a bit off topic, but:

The idea that “reasoning” evolved not so much to understand the world, but to explain to/influence others strikes me as very likely.

If I try to analyze my thought process, generally I first have an idea through an “intuition”, and later I flesh it out through “reasoning” (where by “reasoning” I mean word-thought, whereas by “intuition” I mean more like a gestalt-like, pre-verbal thingie).
If by “reasoning” we mean word-thought, it seems likely that “reasoning” is not the way we reach our conclusions, since it is based on language, which is an interpersonal thing.

I assume that the book uses “reason” also in the sense of word-thought.

However there is a problem, which is that intuition is not sharable without language (and thus word-thought), and also can’t really be tested before being put in words.

But if word-thoughts are only a reflex/explanation/way to convince other people of our intuitions (which anyway are even less definable than word-thought), then criticizing an argument always just works on the surface of it, and anyway it is difficult to define “rationality” in a meaningful way.

Also I often have this doubt about people who speak of rationality and rational arguments: what do they exactly mean by “rationality”? Do they mean logic in the sense of formal logic? For example Umberto Eco once said that people in the middle ages were more rationalist than us, in that they believed that the world could be understood directly from reason + the Bible (whereas today we are empiricists and we trust pure reason much less). If we think in terms of rationalism vs. empiricism this is correct, but then very few people would use the term rationality in this sense.

12

Neil Levy 07.25.20 at 12:39 am

I also love The Enigma of Reason and Hugo’s more recent book. I strongly agree that the perspective has important implications for the division of cognitive labor. But I think the message you take from it (whether intended by Mercier and Sperber or not) is still too individualistic. The benefits of criticism you point to are real and I agree that one of the most important things we can do is engage in it, thereby playing our role in the distribution of cognitive labor. But I don’t think the benefits flow to the person criticised. Rather, they flow to the intellectual community; the person criticised won’t take them on board, see their strength or see us as doing them a favor. I don’t think we are doing them a favor.

Perhaps relatedly, I don’t think the data Mercier and Sperber cite supports quite their official view (which you report accurately). I think a better interpretation is that rather than individual cognition developing to play a social role, it developed to play a role in generating accurate beliefs: accurate beliefs at the community, rather than the individual, level. I’ve defended this a bit in print already with more to come soon.

13

ph 07.25.20 at 2:03 am

I like the post, Henry, and some of the comments. I tell myself that we can learn a lot from any situation if we’re willing to look. You note that if we’re lucky we’ll discover our own biases (almost always) in the process, and that some of these will elude us. Thus, the value of discourse, criticism, and debate with others who hold different views. My own experience is that differences are often based on people choosing different evidence, weighting, and rubrics. Discovering even that makes debate and discussion worthwhile.

It’s perhaps easier if we’re engaging with others with some baseline assumptions, i.e. mutual respect and a willingness to wait, assess, and re-engage for clarification etc. multiple times to get a clear understanding of what others say. Humility translates well in practical terms as being teachable, willing to learn. That’s a valuable talent, even more so if we can get the ego and identity needs out of the process – always an impossibility, but something to keep in mind when we’re trying to determine ‘why’ we don’t accept a particular assertion.

Finally, reason is reason and faith is faith. As a person of faith I accept and embrace the irrational, unprovable, and illogical, which means my personal beliefs in a hereafter, for example, or creation, are about as solid as believing the devil lives in my toaster. Which is pretty much how my favorite Christian dissenting sect ‘the Ranters’ saw things.

Freeing ourselves from the need to live a reason-based life is closer to what Hume would understand, and relieves us of many tedious and pointless exercises such as proving or disproving the existence of God, or free will/predestination. People of faith generally report that they’re happy and fulfilled. Faith in a higher power also provides a direct route to understanding at a core level how little we know, and how meaningless much of what we deem important actually is.

Which I think is the point of much of your excellent post – it’s the journey, not the destination, and how we treat each other along the way, a journey which provides plenty of opportunities to discover the world, others, and ourselves.

14

Kien 07.25.20 at 8:57 am

Hi, thank you for the illuminating post. It helps me appreciate the value of public criticism and scrutiny, and doubt the social value of my own “original thoughts” (ha ha!).

I would add, though, that we ought to distinguish between public criticism of ideas and criticism of people. Much public criticism seems to be aimed at diminishing our rivals, and not necessarily their ideas. So perhaps the exhortation not to be “negative” rightly applies to negativity about people.

15

Alex SL 07.25.20 at 10:35 am

(1) I don’t see how there is anything wrong with aiming to eliminate one’s own biases and become a better dispassionate reasoner, even if we know that it is hard to achieve. The problem is not that the ideal is the wrong goal to pursue; it is rather that too many of those internet guys assume they have successfully achieved a fixed end state (“I am a rational person, so I must be correct about everything, now let’s condescend towards those who are less rational than I am”) when instead what they need to do is to adopt a constant state of self-doubt (“maybe I am wrong – let’s carefully examine the other side’s arguments”).

(2) I have no strong opinion on whether our ability to reason has evolved for rationalisation or to allow us to make good decisions. But even if I tentatively accept the former, that does not mean we therefore have to conclude that we cannot now make use of this ability to arrive at sound conclusions all by ourselves. My higher cognitive functions presumably did not evolve to read and enjoy a novel either, given that there were no novels until very recently, but that won’t keep me from doing so.

(3) I find that it is hardly ever possible to convince people who have publicly argued a case, regardless of how robust or polite the rebuttal is. When we argue publicly, collectively, we do so predominantly for the sake of the people who are officially still on the fence; for nearly everybody who has spoken out finds it difficult to publicly change their mind for fear of being seen as admitting to an error, having weak principles, being a flip-flopper, being an opportunist, etc. The best one can usually hope for is somebody changing their position but claiming to have always held the new one, merely having been misunderstood; but even that is impossible in any case that has become a political shibboleth.

(4) Perhaps a bit off-scope, but as a peer reviewer I have lately found that being too polite and careful about expressing criticism can backfire. I am often worried about sounding too harsh, and then sometimes find that the authors simply do not understand that e.g. a given analysis was done in a completely unacceptable way; and once I have to repeat my feedback in stronger words in the second round of review, they may get annoyed, as they feel that they have now addressed all the feedback as they understood it and don’t want to do something that major at so late a stage in the process. It is a difficult balance to strike.

16

rjk 07.25.20 at 12:09 pm

One persuasive argument in favour of accepting criticism is an approach that inverts the normal sense of “winning” an argument. We normally think of “winning” as forcing one’s opponent to submit, thereby abandoning their prior belief and perhaps adopting our own. But if we agree with Mercier and Sperber, the “losing” party is very unlikely to do this unless they are genuinely persuaded. Therefore if one succeeds in persuading one’s interlocutor to accept one’s position, or at least to abandon their own mistaken one, then it is the interlocutor who has gained from this exchange. They emerge unburdened of a previous error, and perhaps in possession of a new and self-evidently persuasive notion. Considering oneself as the potential recipient of such benefits can change how one approaches criticism.

The internet rationalist’s smugness is therefore necessary, because it allows them to feel good about following the correct procedure rather than necessarily having the correct opinions. The problem that Mercier and Sperber identify (viz. our preference for our own bullshit over other people’s) is overcome by identification with the process of rational debate over the possession of particular opinions.

I’m not sure how many internet rationalists actually manage to do this, of course, but I think this is what they’re aiming for. They are annoying to others because it’s only by identifying strongly and positively with “rationalism” that they’re able to overcome the self-bullshitting bias, and it turns out that people who are willing to entertain all kinds of ideas without feeling a need to identify themselves with anything other than procedure are quite disturbing to everyone else.

17

Mike Huben 07.25.20 at 12:10 pm

Anarcissie @ 4:

Saying science is humble does not exclude skeptical: it ought to be both.

did you learn anything from the Fundies besides confirmation of the superiority of your way of thinking?

No such thing. I realized that I was not being scientifically humble, spouting my own opinions without backing, and that I had to be better if I wanted to give answers that were actually scientific.

You can tell me if humility is superior or not.

18

Kenneth 07.25.20 at 4:00 pm

An entailment: our academic journals should radically shift their balance between ‘original’ articles and ‘critical notes’ (many no longer have any of the latter).

19

bianca steele 07.25.20 at 5:16 pm

I took the opportunity to look up Mercier and Sperber yesterday and it looks interesting, so thanks for the pointer.

A paper that was available online refers to Wason problems, which are often used to prove “people are irrational.” The problem described was to provide guesses as to the rule governing a pattern of integers presented as an incomplete sequence. The definition of rationality desired is to attempt to invalidate hypotheses by efficiently presenting guesses that will, as predicted, be judged incorrect. What the online rationality people do, in effect, is take these kinds of results to heart and transform the way they approach problems to effectively preempt the criticism presented in the Wason exercise. If M and S are correct, I think, they ought to take more seriously the reasons they sense the usual presentation of the exercise needs more added to it.

This won’t get them tenure in a psychology department, but I wonder if the goal really is to ensure everybody thinks like a professor of all the departments simultaneously?

20

Kiwanda 07.25.20 at 6:48 pm

What I find puzzling about the OP is that it advocates the pursuit of truth through argument, criticism, debate, and discussion, as though these were not already understood for centuries to be critical to how the legal, legislative, and scientific systems function best.

The OP makes a good case for open debate and discussion, but these are not new ideas: that “the free exchange of information and ideas is the lifeblood of a liberal society”, that “the value of robust and even caustic counter-speech from all quarters” should be upheld, and that “the way to defeat bad ideas is by exposure, argument, and persuasion, not by trying to silence or wish them away” – in short, that what is needed is more persuasion.

21

Jeff 07.25.20 at 8:07 pm

Neil Levy @12: But that would require group selection, assuming we are talking about biological evolution.

22

Ebenezer Scrooge 07.25.20 at 9:57 pm

I’ve been a bureaucrat all of my adult life, so I’ve been on both the sending and receiving end of internal criticism. Criticism–especially of a person of less power or standing–is HARD! The person criticized must end up feeling like they can push back, if they need to. (“I beseech you, by the bowels of Christ ….”) And yet, they must be criticized, often because they don’t know what is needed as much as you do.
Upward criticism is much easier, but of course riskier.

23

m sam 07.25.20 at 10:59 pm

I read Enigma of Reason, and I think you captured the thrust of that book well. But I think your argument is missing many of the same elements as the book did; for instance, wasn’t it Diogenes who went around with a lantern looking for the face of an “honest man”?

After reading that book I was left feeling that evaluating criticism (i.e. skepticism) is just as important as listening to it. Granted, their argument rests on their studies, which purport to show that we naturally see motes in others’ eyes much more clearly than our own, but I think it’s fair to say the issue is far from settled.

And really, as a firm believer in this so-called “negative criticism,” I still think there is a lot of previous work being overlooked in this emerging space, starting with Diogenes.

24

marris 07.25.20 at 11:01 pm

Wise and beautiful post! I would have written something like “down with cancel culture!” but your writing is much better.

25

Barbara 07.26.20 at 1:17 am

@marcel proust(2)
As a so-called visual thinker (i.e., non-verbal, non-linear) I will have to learn to memorize “nonsense” to replace the “bullshit” that appears in my brain in response to a comment, statement, or argument that I know immediately to be wrong in some way but don’t have the words to counter the utterance of the other.

Many thanks

26

John Quiggin 07.26.20 at 3:49 am

“diversity of perspective is typically correlated with diversity of goals – someone who disagrees with how you see the world is also likely to want different things from it. ”

This means, I think, that the most reliably useful criticism of particular ideas comes from people who mostly agree with you. That’s certainly my experience.

27

bad Jim 07.26.20 at 3:50 am

Two snippets from Richard Feynman’s famous lecture, “Cargo Cult Science”:

For example, if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.

Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can—if you know anything at all wrong, or possibly wrong—to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.

In summary, the idea is to try to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgment in one particular direction or another.

The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.

28

Tim Worstall 07.26.20 at 10:59 am

“I’m in favor of a strong culture of criticism, but for a quite different reason: because serious criticism is probably the most valuable contribution we can make to the cognitive division of labour.”

Quite possibly but one of the problems is getting people to believe you even when you’re right in the criticism.

Take, as an example, the idea that we’re imminently going to run out of nice minerals to play with. The problem, starting with Limits to Growth, Blueprint for Survival and so on and running through to today’s insistences upon recycling (no, I’m not against recycling, only the error used in arguing for it), is that everyone, but everyone, is getting the definition of mineral reserve wrong.

The claim is we’ll run out of reserves in 20-30 years (or whatever) and so we’re doomed. But the useful colloquial definition of mineral reserve is the stuff prepared for us all to use in the next 20 to 30 years. So the claim amounts to saying that we’ll use what we’ve prepared, which sounds a little less shocking than the claim actually being made – that when Blue Peter uses the sticky-backed plastic prepared earlier, that’s the end of the show forever.

Sure, OK, no one listens to me anyway and probably rightly. But I’ve a letter in Nature about this (in, not to, that’ll never happen), written a book on it, spread the wisdom across the interwebs. And yet still the mistake is made. Large chunks of environmental politics and policy are based upon this very mistake.

People still insist upon measuring future mineral availability by reference to mineral reserves. That’s simply wrong, wholly and provably so.

There was a bloke who, in a published paper, insisted that the world would run out of hafnium in 2017. That didn’t happen, obviously, but he’s not retracted the paper, edited it, worked out why he was wrong. He’s now on one of those committees that tells us all we’ve got to recycle metals because we’re about to run out.

That original claim about hafnium was one of utmost, absolute stupidity to start with. But trying to get it corrected is, umm, well, difficult.

29

steven t johnson 07.26.20 at 3:43 pm

John Quiggin @26 wrote: “‘diversity of perspective is typically correlated with diversity of goals – someone who disagrees with how you see the world is also likely to want different things from it.’

This means, I think, that the most reliably useful criticism of particular ideas comes from people who mostly agree with you. That’s certainly my experience.”

There is a great deal of truth to that. I would also suggest that those who are criticized or who benefit from criticism of ideas and propositions they mostly agree with also include those who later incorporate the ideas, especially if the convenience of memory has turned the criticisms into their own ideas. Remembering now that that’s what they’ve always thought is ideal.

But as an example of how criticism from a foreign angle is entirely useless, let me observe that “criticism” is not just the refutation, pretended or real, of arguments or facts. Criticism also means the connection to context, current and historical. Criticism is the evaluation (possibly even numerical!) of the significance of the propositions studied. Criticism can be something as basic as noting the genre or the grammar.

In the case of the OP, the genre is a plea for radical humility, more or less as an exemplar of a rational humility. There are several problems here.

First, the notion that the best way to find truth is an adversarial procedure strikes me as far too limited. A lawsuit between two corporations is not the model for reason in my opinion.

Second, the notion that skepticism is the path to knowledge, rather than a defense of reaction moderated by good manners, strikes me as undesirable.

Third, the notion that the ability to reject some propositions as false depends entirely on the ability to affirm some propositions as true is precisely why skepticism is so lethal to reason.

Fourth, the notion that saying something is true is dogmatism forgets that the real claim, as in science, is that correction of erroneous claims and arguments must be made by reliable means.

Fifth, the objection to the fourth point, that there is no a priori method to determine reliable modes of argument is a confession to truth defined as a logical a priori ultimately discoverable by pure reason, whatever that may be. As such, it is a theological objection. You may discuss philosophy of science, for example, but such verbal sketches are like maps: Yes, they are “true” but they are not the journey, which is “the” truth. But this objection wants a map that doesn’t require going outside the door.

I don’t see how any of these criticisms are negative in the sense meant, but nor do I see how they are radically humble. Nor, as per John Quiggin, do I see how they could possibly be useful to the OP.

Being ill-mannered, I will note the invocation of learned cranks Mercier and Sperber is itself a counter-example to radical humility. It is perfectly unclear how trying to teach someone the sweet science, which involves reasoning, somehow allows for deception. It is perfectly unclear how trying to work together to build a dwelling or successfully hunt large and dangerous game somehow invests the emotionally intelligent with opportunities to deceive with facile rationalizations. It is possible I suppose the ability to whisper sweet nonsense to young women will encourage them to have sex… but I really do not think powers of rationalization have been so much more effective in getting children than being rich, even in societies where rich is meat. There is no talking of evolution, not rational talk anyhow, without demonstrating differential reproduction.

30

notberlin 07.26.20 at 8:52 pm

Be critical of others but not of ‘thy’-self? Geez, where have we seen this philosophy before? Because everyone is too selfish by nature to have self-reflective qualities or ‘know thyself’? I for one am perfectly capable of criticizing others as well as being aware of my own faults/tendencies… I consider it a main tenet of being a compassionate being. Without the former I’m just a self-righteous ego pointing a finger, not interpreting or standing up for anything; without the latter I’m not really an empathetic human; knowing my own weaknesses does not excuse me from fighting abuse and injustice from others. You have to do both at the same time. This philosophy of selfishness (i.e. we don’t see our own bullshit but only that of others) serves the ruling classes very well.

31

mark 07.26.20 at 11:06 pm

Good post.

I remember a classic dsquared observation along these lines. Roughly: “It is easier to tear down than to build up” is correct, but usually interpreted as an argument against people making criticism. In fact what it implies is that someone engaging in criticism has a better chance of being correct.

32

Neil Levy 07.27.20 at 12:20 am

Jeff @21:

We’re not talking about genetic evolution. At least I’m not. Genetic evolution gets you some dispositions we need for cultural evolution to get going.

In any case, group selection is no longer the swear word it once was, so I’m not too worried about that.

33

Ryan Baker 07.27.20 at 1:13 am

The principle here, that we’re insufficient to criticize ourselves, makes sense. However, it doesn’t respond to Cowen’s principle of avoiding the negative. The main reason is that the criticism you speak of doesn’t have to happen in the public domain. I’m fairly certain Cowen is referring to something different: the public critique, critique as an avenue to the establishment of your own credibility. That type of endeavor is more political than intellectual. If the intent of the criticism is to help inform the recipient, placing it in the public domain is contrary to that intent, as the recipient, being human, will find it more difficult to accept, and be more likely to react defensively.

34

Tm 07.27.20 at 12:28 pm

When I read book reviews (I’m mostly referring to nonfiction here), I’m often annoyed. Most reviews I have found are not even close to a critical examination of the book under review. I consider such reviews worthless. There is little to be learned from them. As an example, it is very difficult to find critical reviews of Jared Diamond’s books, but the few that exist are very stimulating. When most reviewers agree in heaping praise on a book, I consider that a sign of lazy intellectual conformism. There is no good, relevant book that isn’t in need of some criticism.

35

Metatone 07.27.20 at 1:46 pm

Commenting is more valuable than writing posts – I always knew it.

36

eg 07.28.20 at 2:35 pm

I suppose this might be what Blake meant when he said, “opposition is true friendship”?

Also, to the extent that the progress of ideas is built upon the falsification of prior ideas, I see merit in what Mercier and Sperber are offering.
