Daniel Kahneman, who was, along with Elinor Ostrom, one of the very few non-economists to win the Economics Nobel award, has died aged 90. There are lots of obituaries out there, so I won’t try to summarise his work. Rather, I’ll talk about how it influenced my own academic career.
When I was an undergraduate, in the late 1970s, economic analysis of decisions under uncertainty was dominated by the expected utility (EU) theory of von Neumann and Morgenstern. The mean-variance approach, still popular in finance, was regarded as, at best, a special case of the correct EU theory. Some early theoretical challenges, notably from French theorist Maurice Allais around 1950, had been thoroughly refuted, at least to the satisfaction of most in the field. (A more fundamental critique by Daniel Ellsberg (later famous for leaking the Pentagon Papers) had been shunted into the “too-hard” basket.)
The first big challenge to this consensus came in a 1974 paper by Kahneman and his long-time collaborator Amos Tversky (already a big name in the field of measurement theory), who found that judgements about probabilities were characterised by a variety of systematic biases, based on misleading heuristics. This set off a surge of interest in challenges to EU, including a revival of the criticisms made by Allais.
One of the key ideas here was that, rather than taking a probability-weighted average of the utilities yielded by the different possible outcomes of an uncertain prospect, people might place more weight on low-probability outcomes like winning the lottery or dying in a plane crash. Unfortunately, the obvious approach of transforming probabilities into weights doesn’t work. Think about a choice which yields lots of different outcomes, each with a small probability and a utility close to, but below, 1. Because each small probability is inflated, the weights sum to more than 1, so the weighted-average procedure will yield a value greater than 1, implying that the choice would be preferred to getting 1 with certainty. This is obviously silly (the technical term is a violation of dominance).
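To see how the dominance violation arises, here is a minimal sketch in Python (my own illustration, not from the post): the weighting function w(p) = p^0.5 and all the numbers are assumptions chosen only to show the mechanism.

```python
# Hypothetical example: naive probability weighting can violate dominance.
# The weighting function w and the prospect below are illustrative assumptions.

n = 20                          # a prospect with 20 equally likely outcomes
p = 1 / n                       # each outcome has small probability 0.05
u = 0.95                        # utility of every outcome, just below a sure utility of 1

def w(prob):
    return prob ** 0.5          # overweights small probabilities: w(0.05) is about 0.22

weight_sum = n * w(p)           # about 4.47 -- the inflated weights no longer sum to 1
naive_value = weight_sum * u    # about 4.25 > 1, so the prospect "beats" the sure thing
print(weight_sum, naive_value)  # a dominance violation: every outcome is worse than 1
```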
In 1979, while working on my undergraduate honours thesis, I came up with a solution to this problem. If the transformation is applied to the cumulative probability of getting an outcome less than or equal to some given value, rather than to individual probabilities, only the probabilities of extreme outcomes (like lottery wins and plane crashes) are overweighted and violations of dominance are avoided. This approach is now called rank-dependent utility (RDU) theory.
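A matching sketch of the rank-dependent calculation (again my own illustration; the weighting function q is the same arbitrary assumption as above): because q is applied to the cumulative distribution, the decision weights always sum to one, so a prospect whose every outcome is worse than a sure thing can never be ranked above it.

```python
# Hypothetical sketch of rank-dependent utility (RDU); q and the prospect are
# illustrative assumptions, not taken from the published paper.

def rdu(outcomes, utility, q):
    """outcomes: list of (value, probability) pairs; q transforms the cumulative probability."""
    ranked = sorted(outcomes, key=lambda vp: vp[0])  # order outcomes from worst to best
    total, cum, prev = 0.0, 0.0, 0.0
    for value, prob in ranked:
        cum += prob                       # cumulative probability of this outcome or worse
        weight = q(cum) - q(prev)         # decision weight = increment of the transformed CDF
        total += weight * utility(value)
        prev = cum
    return total

# The same prospect as before: 20 equally likely outcomes, each with utility 0.95.
prospect = [(i, 1 / 20) for i in range(20)]
q = lambda prob: prob ** 0.5             # same illustrative transform
print(rdu(prospect, lambda x: 0.95, q))  # about 0.95, never above 1: dominance respected
```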
In the same year, Kahneman and Tversky published the first version of their own generalization of EU, called prospect theory. Among other changes, Kahneman and Tversky used probability weighting in the problematic form described above. They avoided dominance violations in a rather ad hoc fashion, by “editing” out dominated prospects.
My own idea went through the usual tortuous process to publication, eventually appearing in the (then new) Journal of Economic Behavior and Organization in 1982. It didn’t attract much attention at first, but eventually got noticed by some of the leading figures in the newly developing field of generalized expected utility theory, and even by Allais, who had returned to the topic after an absence of many decades. Finally, in 1992, Kahneman and Tversky incorporated the rank-dependent idea into their cumulative prospect theory, which became the standard version of prospect theory.
To the extent I have any fame as an economic theorist, it’s mostly due to this work. And, if you are going to engage in debate on policy issues, the credibility gained from having a (moderately) big name in economic theory makes it hard for rightwing economists to dismiss you.
So I owe a big debt to Kahneman (as well as Tversky). He will be missed.
Alan White 03.29.24 at 5:32 am
A very interesting tribute that makes a bit of sense to a dolt like me who doesn’t get econ theory. But I did wonder: does your work at all intersect with any problem of ranking Bayesian priors? If that’s stupid then I apologize, but an intriguing post nevertheless.
John Q 03.29.24 at 6:10 am
Thanks, Alan. It’s not a silly question, since RDU is Bayesian (or maybe post-Bayesian) in spirit, and involves ranks. But the ranking here refers to uncertain outcomes of choices, rather than to evaluations of different choices.
engels 03.29.24 at 12:15 pm
Thinking Fast And Slow will live on in the annals of unintentional self-reference.
https://slate.com/technology/2016/12/kahneman-and-tversky-researched-the-science-of-error-and-still-made-errors.html
Rob Chametzky 03.29.24 at 1:37 pm
Off-topic:
Happy Birthday!
John Q 03.29.24 at 6:50 pm
Engels @3,
My research draws on Kahneman’s work with Tversky and not his later solo stuff. Broadly speaking, this is the split between decision theory and behavioral economics, on which I plan to write something if I get time. Decision theory incorporates the K&T critique of Expected Utility, but is a quiet little subfield with conferences that don’t attract much attention. Behavioral economics has made a big splash, and is now suffering from the inevitable adverse reaction.
John Q 03.29.24 at 6:51 pm
Thanks, Rob!
Peter Dorman 03.29.24 at 7:43 pm
I came to Kahneman and Tversky in the course of my dissertation on occupational safety and health in the early 80s. I was intrigued by the potential power of prospect theory if you think of reference points as components of social norms. For instance, there might be a normative acceptance of a given degree or type of risk in the workplace, and the particular risks workers encounter on their own job would be evaluated relative to that reference state. This points to a more precise statement of the organizer’s problem: getting workers to change their reference points. But even though this sort of application is fairly straightforward, you won’t find it in the behavioral econ literature. Perhaps this is because the main interest has been in identifying “errors” in common heuristics, and the investigation of norms comes from a different approach to the nature of rationality in a social context.
I was also struck by the near absence of theorizing cognitive dissonance avoidance (“denial”), which plays such a crucial role in occupational risk perception. Akerlof and Dickens wrote a paper on this, and as far as I know, that’s about it. (There are a couple of pages on it, with an extremely simple model, in my book on occupational safety and health.)
I guess what I’m saying is that there is a large untapped potential in behavioral econ once we get beyond focusing on departures from expected utility maximization.
oldster 03.30.24 at 5:52 am
“…and even byais, who had returned to the topic after an absence of many decades….”
I’m not familiar with the word “byais”. Could that be a typo for “by Allais”?
John Q 03.31.24 at 12:35 am
Oldster: yes, fixed now I hope
CarlD 04.01.24 at 5:18 pm
@Peter Dorman this reminds me of conversations about the bad outcomes of lecture pedagogy as acceptable failure, up to and including enshrining typical failure rates as normative (e.g. The Curve). Lower failure rates of other pedagogies are interpreted as intolerable risk. Getting workers to change their reference points in this case has been the (failed) work of centuries, and of course there is all sorts of ideology ready at hand.