What went wrong with New Keynesian macro? (more bookblogging)

by John Q on October 13, 2009

More bookblogging! It’s all economics here at CT these days, but normal programming will doubtless resume soon.

Most of what I’ve written in the book so far has been pretty easy. I’ve never believed the Efficient Markets Hypothesis or New Classical Macro and it’s easy enough to point out how the occurrence of a massive financial crisis leading to a prolonged macroeconomic crisis discredits them both.

I’m coming now to one of the most challenging sections of my book, where I look at the New Keynesian program (with which I have a lot of sympathy) and ask why New Keynesians (most obviously Ben Bernanke) didn’t, for the most part, see the crisis coming or offer much in response that would have been new to Keynes himself. Within the broad Keynesian camp, the people who foresaw some sort of crisis were the old-fashioned types, most notably Nouriel Roubini (and much less notably, me), who were concerned about trade imbalances, inadequate savings, and hypertrophic growth of the financial sector. Even this group didn’t foresee the way the crisis would actually develop, but that, I think, is asking too much – every crisis is different.

My answer, broadly speaking, is that the New Keynesians had plenty of useful insights but that the conventions of micro-based macroeconomics prevented them from forming the basis of a progressive research program.

Comments will be appreciated even more than usual. I really want to get this right, or as close as possible.




Refuted doctrines


New Keynesian macroeconomics 

In the wake of their intellectual and political defeats in the 1970s, mainstream Keynesian economists conceded both the long-run validity of Friedman’s critique of the Phillips curve, and the need, as argued by Lucas, for rigorous microeconomic foundations. “New Keynesian economics” was their response to the demand, from monetarist and New Classical critics, for a microeconomic foundation for Keynesian macroeconomics.

The research task was seen as one of identifying minimal deviations from the standard microeconomic assumptions which yield Keynesian macroeconomic conclusions, such as the possibility of significant welfare benefits from macroeconomic stabilization. A classic example was the ‘menu costs’ argument produced by George Akerlof, another Nobel Prize winner. Akerlof sought to motivate the wage and price “stickiness” that characterised New Keynesian models by arguing that, under conditions of imperfect competition, firms might gain relatively little from adjusting their prices even though the economy as a whole would benefit substantially.
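To see the logic in numbers, here is a toy calculation (a sketch in the spirit of the menu-cost argument, with made-up figures, not Akerlof’s actual model): because profit is flat at its maximum, the private loss from leaving a price slightly wrong is second-order small, so even a modest adjustment cost makes stickiness privately rational.

```python
# Toy menu-cost arithmetic: near the profit-maximising price p*, profit is
# locally flat, so a small pricing error costs the firm only a second-order
# amount. All numbers below are illustrative.

def profit(p, p_star=10.0, curvature=2.0):
    """Stylised profit function, maximised at p_star."""
    return 100.0 - curvature * (p - p_star) ** 2

p_stuck = 9.5                       # price left unadjusted after a 5% shock
private_loss = profit(10.0) - profit(p_stuck)
menu_cost = 1.0                     # fixed cost of changing the posted price

print(f"private loss from not adjusting: {private_loss:.2f}")   # 0.50
print(f"menu cost:                       {menu_cost:.2f}")      # 1.00
print("firm adjusts?", private_loss > menu_cost)                # False
```

Each firm’s loss is trivial, but when every firm reasons this way the aggregate price level fails to adjust, and the economy-wide cost is first-order.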

The approach was applied, with some success, to a range of problems that had previously not been modelled formally, including many of the phenomena observed in the lead-up to the global financial crisis, such as asset price bubbles and financial instability generated by speculative ‘noise trading’.

A particularly important contribution was the idea of the financial accelerator, a rigorous version of ideas first put forward by Fisher and by Keynesians such as Harrod and Hicks. Fisher had shown how declining prices could increase the real value of debt, making previously profitable enterprises insolvent, and thereby exacerbating initial shocks. The Keynesians showed how a shock to demand would result in declining utilisation, meaning that firms could meet their production requirements without any additional investment. Thus the initial shock to demand would have an amplified effect on the demand for investment goods.

In a 1989 paper, Ben Bernanke and Mark Gertler integrated these ideas with developments in the theory of asymmetric information to produce a rigorous model of the financial accelerator. 
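The amplification logic can be conveyed by a back-of-the-envelope calculation (a stylised sketch, not the Bernanke–Gertler model itself; the feedback coefficient k is purely illustrative): each round of balance-sheet damage tightens credit, cutting investment and income, which damages balance sheets again, though less each time.

```python
# Stylised financial-accelerator arithmetic: an initial loss of borrower net
# worth raises the cost of external finance, cutting investment and income,
# which erodes net worth again. With a feedback fraction k < 1 the rounds
# form a geometric series, amplifying the shock by 1 / (1 - k).

initial_shock = 10.0    # initial loss of borrower net worth
k = 0.4                 # fraction of each loss passed on via tighter credit

total_loss, round_loss = 0.0, initial_shock
for rnd in range(1, 8):
    total_loss += round_loss
    print(f"round {rnd}: loss {round_loss:5.2f}, cumulative {total_loss:5.2f}")
    round_loss *= k

print(f"limiting total loss: {initial_shock / (1 - k):.2f} (= shock / (1 - k))")
```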

It would seem, then, that New Keynesian economists should have been well equipped to challenge the triumphalism that prevailed during the Great Moderation. With the explosion in financial sector activity, the development of massive international and domestic imbalances and the near-miss of the dotcom boom and slump as evidence, New Keynesian analysis should surely have suggested that the global and US economies were in a perilous state.

Yet with few exceptions, New Keynesians went along with the prevailing mood of optimism. Most strikingly, the leading New Keynesian, Ben Bernanke, became the anointed heir of the libertarian Alan Greenspan as Chairman of the US Federal Reserve. And as we have already seen, it was Bernanke who did more than anyone else to popularise the idea of the Great Moderation.

Olivier Blanchard summarises the standard New Keynesian approach (which converged, over time, with the RBC approach) using the following, literally poetic, metaphor:

A macroeconomic article today often follows strict, haiku-like, rules: It starts from a general equilibrium structure, in which individuals maximize the expected present value of utility, firms maximize their value, and markets clear. Then, it introduces a twist, be it an imperfection or the closing of a particular set of markets, and works out the general equilibrium implications. It then performs a numerical simulation, based on calibration, showing that the model performs well. It ends with a welfare assessment.

Blanchard’s description brings out the central role of microeconomic foundations in the New Keynesian framework, and illustrates both the strengths and the weaknesses of the approach. On the one hand, as we have seen, New Keynesians were able to model a wide range of economic phenomena, such as bubbles and …, while remaining within the classical general equilibrium framework. On the other hand, precisely because the analysis remained within the general equilibrium framework, it did not allow for the possibility of a breakdown of classical equilibrium, which was precisely the possibility Keynes had sought to capture in his general theory.

The requirement to stay within a step or two of the standard general equilibrium solution yielded obvious benefits in terms of tractability. Since the properties of general equilibrium solutions have been analysed in detail for decades, modeling “general equilibrium with a twist” is a problem of exactly the right degree of difficulty for academic economists – hard enough to require, and exhibit, the skills valued by the profession, but not so hard as to make the problem insoluble, or soluble only with the abandonment of the underlying framework of individual maximization.

A critical implication of Blanchard’s haiku metaphor is that the New Keynesian program was not truly progressive. A study of some new problem such as the incentive effects of executive pay would typically, as Blanchard indicates, begin with the standard general equilibrium model, disregarding the modifications made to that model in previous work examining other ways in which the real economy deviated from the modelled ideal. A genuinely cumulative approach would imply a model that moved steadily further and further away from the standard GE framework, and therefore became less and less amenable to the standard techniques of analysis associated with that model.

This, I think, is what Paul Krugman had in mind when he suggested in his essay ‘How Did Economists Get It So Wrong?’ that economists, as a group, mistook beauty, clad in impressive-looking mathematics, for truth. The work described by Blanchard was beautiful (at least to economists) and illuminated some aspects of the truth, but beauty came first. An approach based on putting truth first would have incorporated multiple deviations from the standard general equilibrium model and then attempted to work out how they fitted together. In many cases, the only way of doing this would probably be to incorporate ad hoc descriptions of aggregate relationships that fitted observed outcomes, even if they could not be related directly to individual optimization.

New Keynesian macroeconomics, of the kind described by Blanchard, was ideally suited to the theoretical, ideological and policy needs of the Great Moderation. On the one hand, and unlike New Classical theory, it justified a significant role for monetary policy, a conclusion in line with the actual policy practice of the period. On the other hand, by remaining within the general equilibrium framework, the New Keynesian school implicitly supported the central empirical inference drawn from the observed decline in volatility, namely that major macroeconomic fluctuations were a thing of the past.


DSGE

Eventually, the New Keynesian and RBC streams of micro-based macroeconomics began to merge. The repeated empirical failures of standard RBC models led many users of the empirical techniques pioneered by Prescott and Lucas to incorporate non-classical features like monopoly and information asymmetries. These “RBC-lite” economists sought, like the purists, to produce calibrated dynamic models that matched the “stylised facts” of observed business cycles, but quietly abandoned the goal of explaining recessions and depressions as optimal adjustments to (largely hypothetical) technological shocks.
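The flavour of the exercise can be conveyed by a toy version (a sketch only; the persistence and volatility parameters are illustrative, not estimated from any data set): feed an AR(1) technology process through a trivial ‘model’ and report the moments that calibrated papers compare with the stylised facts.

```python
# Toy version of the calibration exercise: simulate an AR(1) technology
# process, treat log output as moving one-for-one with it, and report the
# moments (volatility, persistence) that calibrated models are judged by.

import random

random.seed(0)
rho, sigma, T = 0.95, 0.007, 10_000   # illustrative persistence and shock size

z, output = 0.0, []
for _ in range(T):
    z = rho * z + random.gauss(0.0, sigma)  # log technology follows an AR(1)
    output.append(z)                        # toy model: log output = technology

mean = sum(output) / T
var = sum((y - mean) ** 2 for y in output) / T
autocov = sum((a - mean) * (b - mean)
              for a, b in zip(output, output[1:])) / (T - 1)

print(f"std dev of log output:       {var ** 0.5:.4f}")
print(f"first-order autocorrelation: {autocov / var:.3f}")
```

The calibrator’s claim of success is that moments like these match their counterparts in detrended GDP data; nothing in the exercise distinguishes an optimal response to technology from a recession.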

This stream of RBC literature <a href="http://www.econosseur.com/2009/05/leamer-and-the-state-of-macro.html">converged with New Keynesianism</a>, which also uses non-classical tweaks to standard general equilibrium assumptions with the aim of fitting the macro data.

The resulting merger produced a common approach with the unwieldy title of Dynamic Stochastic General Equilibrium (DSGE) Modelling. Although there are a variety of DSGE models, they share some family features. As the “General Equilibrium” part of the name indicates, they take as their starting point the general equilibrium models developed in the 1950s by Kenneth Arrow and Gerard Debreu, which showed how an equilibrium set of prices could be derived from the interaction of households, rationally optimising their work, leisure and consumption choices, and firms, maximizing their profits in competitive markets. Commonly, though not invariably, it was assumed that everyone in the economy had the same preferences, and the same relative endowments of capital, labour skills and so on, with the implication that it was sufficient to model the decisions of a single ‘representative agent’.

The classic general equilibrium analysis of Arrow and Debreu dealt with the (admittedly unrealistic) case where there existed complete, perfectly competitive markets for every possible asset and commodity, including ‘state-contingent’ financial assets which allow agents to insure against, or bet on, every possible state of the aggregate economy. In such a model, as in the early RBC models, recessions are effectively impossible – any variation in aggregate output and employment is simply an optimal response to changes in technology, preferences or external world markets. DSGE models modified these assumptions by allowing for the possibility that wages and prices might be slow to adjust, by allowing for the possibility of imbalances between supply and demand, and so on, thereby enabling them to reproduce obvious features of the real world, such as recessions.

But, given the requirements for rigorous microeconomic foundations, this process could only be taken a limited distance. It was intellectually challenging, but appropriate within the rules of the game, to model individuals who were not perfectly rational, and markets that were incomplete or imperfectly competitive. The equilibrium conditions derived from these modifications could be compared to those derived from the benchmark case of perfectly competitive general equilibrium.

But such approaches don’t allow us to consider a world where people display multiple and substantial violations of the rationality assumptions of microeconomic theory and where markets depend not only on prices, preferences and profits but on complicated and poorly understood phenomena like trust and perceived fairness. As Akerlof and Shiller observe 

…

It was still possible to discern the intellectual origins of alternative DSGE models in the New Keynesian or RBC schools. Modellers with their roots in the RBC school typically incorporated just enough deviations from competitive optimality to match the characteristics of the macroeconomic data series they were modelling, and preferred to focus on deviations due to government intervention rather than to monopoly power or other forms of market imperfection. New Keynesian modellers focused more attention on imperfect competition, and were keen to stress the potential for the macro-economy to deviate from the optimal level of employment in the short term, and the possibility that an active monetary policy could produce improved outcomes.

Because New Keynesians were (and still are) concentrated in economics departments on the East and West Coasts of the United States (Harvard, …) while their intellectual opponents are most prominent in the lakeside environments of Chicago and Minnesota, the terms ‘saltwater’ and ‘freshwater’ schools have been coined (by Krugman?) to describe the two positions. But such a terminology suggests a deeper divide between competing schools of thought than actually prevailed during the false calm of the Great Moderation. The differences between the two groups were less prominent, in public at least, than their points of agreement. The freshwater school had backed away from extreme New Classical views after the failures of the early 1980s, while the distance from traditional Keynesian views to the New Keynesian position was summed up by Lawrence Summers’s observation that ‘We are now all Friedmanites’. And even these limited differences were tending to blur over time, with many macroeconomists, particularly those involved in formulating and implementing policy, shifting to an in-between position that might best be described as ‘brackish’.

However, the similarities outweighed the differences. Whether New Keynesian or RBC in their origins, DSGE models incorporated the assumption, derived from Friedman, that there is no long-run trade-off between unemployment and inflation, that is, that the long-run Phillips curve is vertical. And nearly all allowed for some trade-off in the short run, and therefore for some potential role for macroeconomic policy.
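In textbook notation (a standard statement of the shared assumption, not a quotation from any particular model), the expectations-augmented Phillips curve is

$$\pi_t = \pi_t^{e} + \alpha\,(u^{*} - u_t), \qquad \alpha > 0,$$

where $\pi_t$ is inflation, $\pi_t^{e}$ expected inflation and $u^{*}$ the natural rate of unemployment. In the short run, with $\pi_t^{e}$ given, lower unemployment can be bought with higher inflation; in the long run expectations catch up, so that $\pi_t = \pi_t^{e}$, forcing $u_t = u^{*}$ whatever rate of inflation policy settles on – a vertical long-run curve.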

The differences between saltwater and freshwater DSGE models may be discussed in terms of the venerable Keynesian idea of the multiplier, that is, the ratio of the final change in output arising from a fiscal stimulus to the size of the initial stimulus. Old Keynesians had argued that the multiplier (as the name suggests) was greater than one, since the beneficiaries of government expenditure would increase their consumption of goods and services, leading to more workers being hired, who in turn would increase their own consumption, and so on. The ‘policy ineffectiveness’ proposition of the New Classical school implied that the multiplier should be zero or even negative, because of the incentive-sapping effects of government spending and the taxes required to finance it. The DSGE modellers tended to split the difference.
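In its simplest textbook form (a sketch, with $c$ the marginal propensity to consume; the value 0.6 below is purely illustrative), the old Keynesian argument is just a geometric series:

$$\Delta Y = \Delta G\,(1 + c + c^{2} + \cdots) = \frac{\Delta G}{1 - c},$$

so with $c = 0.6$ each dollar of stimulus ultimately raises output by $2.50. The New Classical position amounts to asserting that offsetting effects drive this ratio down to zero or below.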

Although the issue was rarely discussed explicitly, the DSGE models favored by the New Keynesian school typically implied values for the multiplier that were close to 1, while those derived from RBC approaches suggested values that were positive, but closer to zero. Given the mild volatility of the Great Moderation, such models yielded no justification for active use of fiscal policy, and good reasons for governments to maintain budget balance as far as possible. New Keynesians also typically rejected active use of fiscal policy, relying exclusively on monetary policy to manage the economy. But, compared to their freshwater colleagues, they had a more positive view of the ‘automatic stabilisers’. Since tax revenues tend to fall, and welfare expenditures to rise, during recessions, a government that maintains a balanced budget on average will tend to run deficits during recessions and surpluses during booms. On a Keynesian analysis, the fact that government spending net of taxes is countercyclical (moves in the opposite direction to fluctuations in the rate of economic growth) tends to stabilise the economy. Vast numbers of journal pages were devoted to refining these different viewpoints, and to defending one or the other. But in practical policy terms, the differences were marginal.
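A minimal sketch of the stabiliser arithmetic (all parameter values illustrative): with a proportional tax, only a fraction (1 − t) of each extra dollar of income is available for re-spending, which shrinks the multiplier, and the budget balance automatically moves against the cycle.

```python
# Automatic stabilisers in miniature: proportional taxes shrink the spending
# multiplier from 1/(1 - c) to 1/(1 - c(1 - t)), and make the budget balance
# countercyclical. The mpc and tax rate below are illustrative.

mpc, tax_rate = 0.6, 0.3

print(f"multiplier without taxes: {1 / (1 - mpc):.2f}")                   # 2.50
print(f"multiplier with taxes:    {1 / (1 - mpc * (1 - tax_rate)):.2f}")  # 1.72

# Revenue falls with output while spending stays fixed, so the budget
# swings toward deficit in a recession without any policy decision.
for gdp in (1000.0, 900.0):           # normal year vs recession year
    balance = tax_rate * gdp - 300.0  # revenue minus (fixed) spending
    print(f"GDP {gdp:6.1f}: budget balance {balance:+6.1f}")
```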

Reflecting their origins in the 1990s, most analysis using DSGE models assumed that macroeconomic management was the province of central banks using interest rate policy (typically the setting of the rate at which the central bank would lend to commercial banks) as their sole management instrument. The central bank was modelled as following either an inflation target (the announced policy of most central banks) or a “Taylor rule”, in which the aim is to stabilise both GDP growth and inflation.
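As a concrete illustration (a minimal sketch of the kind of rule assumed in these models, using Taylor’s original illustrative coefficients of 0.5; the 2 per cent neutral rate and inflation target are assumptions for the example):

```python
# A minimal Taylor rule of the kind assumed in DSGE policy blocks: the
# nominal policy rate rises more than one-for-one with inflation and leans
# against the output gap. Coefficients follow Taylor's 1993 illustration.

def taylor_rule(inflation, output_gap, neutral_real_rate=2.0,
                inflation_target=2.0, phi_pi=0.5, phi_y=0.5):
    """Policy interest rate; all arguments in per cent."""
    return (neutral_real_rate + inflation
            + phi_pi * (inflation - inflation_target)
            + phi_y * output_gap)

print(taylor_rule(inflation=2.0, output_gap=0.0))   # 4.0: on target, neutral
print(taylor_rule(inflation=4.0, output_gap=1.0))   # 7.5: overheating, tighten
print(taylor_rule(inflation=1.0, output_gap=-2.0))  # 1.5: slump, cut
```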

On the whole, while central banks showed some interest in DSGE models, and invoked their findings to provide a theoretical basis for their operations, they made little use of them in the actual business of economic management. For practical purposes, most central banks continued to rely on older-style macroeconomic models, with less appealing theoretical characteristics but better predictive performance. However, neither DSGE models nor their older counterparts proved to be of much use in predicting the crisis that overwhelmed the global economy in 2008, or in guiding the debate about how to respond.

{ 32 comments }

1

dsquared 10.13.09 at 7:25 am

This chapter absolutely nails it. The big defence of neoclassical economics contra the heterodox is that of course they do cover all these important things like executive pay and financing and investment uncertainty and relative wealth and haven’t you read Plunkitt (1986) in the American Economic Review for God’s sake?!!?

All of which might be true, but all these little anomalies were quarantined in their separate boxes; there was never an attempt made to figure out the interactions and produce a unified model with all the impurities in it at once. In general, the excuse is “well this would be too difficult”, but of course it’s only too difficult to achieve within a particular formal framework. And the profession just wasn’t prepared to move outside that formal framework.

2

stostosto 10.13.09 at 7:59 am

Mmmm. Impure models. Sounds way better than Dynamic Stochastic General Equilibrium.

3

goldie 10.13.09 at 8:18 am

It was Robert Hall who coined the terms “freshwater” and “saltwater” economics, not Krugman. I’m sure I’ve got the reference somewhere.

4

Alex 10.13.09 at 11:56 am

Hence the Road to Freakonomics; the search for the weirdest possible data and problem cases that were also amenable to analysis using the model of the firm in perfect competition and all…that…jazz.

It seems to me, at least, that the economists who did predict the Big Crunch all have in common that they started by observing a departure from equilibrium, or more subtly, the existence of a pathological equilibrium, and then considered how the system would eventually unwind it. Specifically, three groups:

Accounting – as in the paper you linked to not long ago, the ones who were concerned about the flow-of-funds were aware that funds were flowing into odd places. (Caricature: Roubini, and in a sense, the Calculated Risk crew)

Intermacro – the international macro guys were very well aware of the global imbalances, and because current account imbalances are intermediated through the banking sector, some of them went on to fear that the unwinding would be an epic financial crisis. (Caricature: Setser)

Distribution – out on the left, there was concern that a similar situation to that of 1929 (in the left’s account of the Depression) was building up. Real wages were going nowhere, and as the rich have a lower marginal propensity to consume, aggregate demand depended on a continuing investment boom, which was financed through a huge run up in financial markets. When sentiment swung, investment would collapse, and so would pretty much everything else. (Caricature: Doug Henwood or John Ross)

Those who did not predict the crisis started by observing that the system ought to be in equilibrium, noting that certain departures from it existed, and seeking reasons why it hadn’t blown up yet.

This is actually a valid research project; Johan Galtung observed about international relations theorists that they spent a great deal of time on the causes of wars, but very little on the causes of peace. But the reverse is the case in economics; where Keynes (and some others like Minsky) differed was that they were interested in the excursions from stability, not how the economy works on a normal Tuesday. So were the three groups I cited.

5

Kevin Donoghue 10.13.09 at 1:10 pm

Wikipedia agrees with goldie. The freshwater-saltwater classification seems to have appeared first in Robert Hall’s unpublished 1976 paper. He elaborates on the “spectrum of salinity” as follows:

To take a few examples, Sargent corresponds to distilled water, Lucas to Lake Michigan, Feldstein to the Charles River above the dam, Modigliani to the Charles below the dam and Okun to the Salton Sea.

6

Chris 10.13.09 at 2:43 pm

But, given the requirements for rigorous microeconomic foundations, this process could only be taken a limited distance.

It’s not just that this process was chained to micro, but that it was chained to *false* micro. It was known even then that rational actors were a fantasy, kept only because it was mathematically convenient.

If your micro is strongly empirically verified, then insisting on consistency with it is limiting, but in some sense justifiable: the macroeconomic phenomena that actually occur in economies *do* result from the actions of individual actors in those economies. But if your micro is completely non-empirical and adopted only because it makes the math more convenient, tying yourself to it will only limit your ability to describe the effects of real-world decisionmaking.

It starts from a general equilibrium structure

This is the part I find most difficult to understand. It’s economics as a sitcom: no matter what happens during the episode, everything is back where it always was at the beginning of the next episode. Every economics paper refutes perpetual equilibrium between rational actors, but like South Park’s Kenny, it’s back at the beginning of the next episode as if nothing had happened.

That amounts to an outright refusal to attempt to build on previous achievements. How can anyone who operated like this possibly have believed that they were doing science? How can you write your second paper in which you ignore the results of your first paper and go back to the convenient fiction of rational actors in perfect equilibrium?

7

Ken Houghton 10.13.09 at 6:06 pm

“My answer, broadly speaking, is that the New Keynesians had plenty of useful insights but that the conventions of micro-based macroeconomics prevented them from forming the basis of a progressive research program.”

I do not believe the word “progressive” is necessary. GE is, after all, merely a translation of Dr. Pangloss’s “this is the best of all possible worlds” into what passes for economic research.

Also What Chris Said. (“It was known even then that rational actors were a fantasy that were only kept because it was mathematically convenient.”) A beautiful model that tells us nothing other than that this is the best of all possible worlds and will continue to be so is significantly less useful than locating Piltdown man on the evolutionary scale.

8

Yarrow 10.13.09 at 6:11 pm

Ken: ‘I do not believe the word “progressive” is necessary.’

And it’s ambiguous — after thinking a bit I realized you meant progressive in the sense of the progressive accumulation of knowledge, but on first reading I took it to mean progressive in the sense of progressive politics.

9

Thorfinn 10.13.09 at 7:20 pm

http://faculty-web.at.northwestern.edu/economics/gordon/GRU_Combined_090909.pdf

Seems like a good step forward. You have auction-like prices in some parts of the economy (oil) and sticky prices elsewhere.

An obvious policy implication is wage subsidies, which would in a recession keep employment and labor income high while allowing the market price of wages to fluctuate. Countries like Singapore and France have tried these with some success.

10

PGD 10.13.09 at 7:58 pm

One thing with this chapter, and actually the other chapters of the book I’ve seen, is that it simply isn’t detailed enough for someone who isn’t already familiar with the twists and turns of economics. For the CT readership it works, because many of us already are, not to mention being presold on both the significance and truth of Quiggin’s points. But I think to make this compelling for an interested outsider you need to unpack things more. Some of Krugman’s gift for taking apart the key assumptions of a model in English would be helpful, and I think some simple algebraic examples of clearing markets would help too.

For this chapter, I think you need to go more deeply into the complexities of what happens when the assumptions of a GE model are relaxed. Give examples of the complexities that arise when a single assumption is relaxed, then the separate complexities that occur when another single assumption is relaxed, then show how those exponentiate when both are relaxed at once. That will give the reader a much clearer sense of why it was so hard for economists to simultaneously question the assumptions of the neoclassical market-clearing macro model while staying within it as the baseline workhorse set of assumptions, which is what the New Keynesians tried to do.

11

pushmedia1 10.13.09 at 10:04 pm

“For practical purposes, most central banks continued to rely on older-style macroeconomic models, with less appealing theoretical characteristics, but better predictive performance.”

I don’t think this is true. It was true when Mankiw wrote his engineers v scientists paper a couple of years ago, but now many European central banks use DSGE models and the Fed has introduced its EDO model. Also, the Fed’s Rochelle Edge reports that their DSGE model does a better job at forecasting than Fed staff opinions, old-style macroeconometric models and basic time-series methods.

12

Concerned Economist 10.14.09 at 3:13 am

As with your earlier post, this one seems at best misleading. I am struck that perhaps I am missing the intent of the book. It doesn’t really address the real areas where change is required but instead places blame on features of the subject which often arouse suspicion but are of relatively minor importance to the current state of the subject.

As I see it, the primary thesis in this post is that the reason that modern macroeconomics has been unable to anticipate and address the financial crisis is because researchers insist on studying models with only one imperfection at a time. Instead of allowing for a rich interplay which could arise from multiple market imperfections, economists isolate these features. Because the rest of “the analysis remained within the general equilibrium framework, it did not allow for the possibility of a breakdown of classical equilibrium, which was precisely the possibility Keynes had sought to capture in his general theory.” You write “An approach based on putting truth first would have incorporated multiple deviations from the standard general equilibrium model […].” Instead of allowing for views of the macroeconomy that would move “steadily further and further away from the standard GE framework,” economists insisted on anchoring their world views in the safe harbors of standard microeconomics.

And what reasons would aspiring researchers have to follow this research program? Tractability. Going further down the rabbit hole was simply too difficult for modern economists. “[G]iven the requirements for rigorous microeconomic foundations, this process could only be taken a limited distance.” You also seem to suggest that the theories were constrained by the social norms of the profession. The one-at-a-time approach was “intellectually challenging, but appropriate within the rules of the game.” This, you conclude, is the real cost of micro-foundations. Insisting on theories that are grounded in microeconomics is simply too limiting.

As someone who works in this field, I probably have a somewhat distorted perspective on this discussion. It is appropriate that I disclose this bias at the outset and the reader should keep this in mind below.

The reason that many economists consider one imperfection at a time in their theoretical work is to isolate the subject they want to analyze. Anything that is not essential to the mechanism can be cut away. In the DSGE tradition, the mechanism being analyzed is considered against some fairly standard backdrop. Thus, if one is studying credit market imperfections, he or she might embed a model of credit market failure into an otherwise standard RBC or New Keynesian model. In fact, even these substructures sometimes get in the way. My own personal style is to remove even this substructure and instead make some even more drastic simplifying assumption to shut out the rest of the world (e.g., assume that all other prices are fixed). This is not because we don’t want to consider the interaction of imperfections. In fact, finding interesting and, better yet, dramatic interactions is a good strategy for getting published and recognized. The reason you don’t see more of it is because typically combining frictions gives very predictable results. You take imperfection A and combine it with friction B and get AB. This wouldn’t be much of a contribution.

Also, just because we study things one at a time doesn’t mean we don’t believe that many frictions exist simultaneously in the real world. Policy recommendations routinely address multiple failures. For an example of a model that does consider multiple imperfections see Smets and Wouters [2003] or Christiano, Eichenbaum and Evans [2004]. These models feature (simultaneously) wage rigidity, price rigidity, imperfect competition in both the labor and goods markets, habit formation preferences, variable capital utilization, limited financial market participation, investment adjustment costs and shocks to preferences, productivity, money demand, labor supply, and taxes.

The insistence on micro-foundations has more to do with a desire to be specific. It is really just economists pushing back against “hand waving.” They want researchers to be specific. This type of research opens itself up for empirical measurement and testing. When the models fail to match the data (which is inevitable), the models can be refined or scrapped, etc. Being less precise doesn’t save you. (Of course, if we can’t be precise about some feature, it is acceptable to assume it away as long as there is a clear disclosure that a big piece of the puzzle is missing. “If you grant me ABC which I can’t explain, then XYZ follows.” Most researchers don’t like putting themselves in this position but, at least in my view, this is fine.)

You are still describing the literature on financial market imperfections in the past 30 years as though it was just a formalization of earlier ideas. This is simply wrong. Debt-deflation has little to do with the financial accelerator and essentially nothing to do with the current financial crisis. (Debt deflation is about the real effects of variations in inflation; debt deflation/inflation arises because of contracts which are nominally denominated.) True, earlier thinkers were aware that finance was important and that bank failures were a bad thing. They just didn’t know why and didn’t have a coherent framework for analyzing these phenomena. The necessary framework was the microeconomic theory of asymmetric information, which wasn’t fully developed until the 1980s. A related branch of microeconomics dealing with “principal-agent” problems is about to become more prominent as the regulatory discussion turns to the structure of compensation for bankers and traders – particularly when government funds are seen as necessary to insulate systemic losses. This work, which deals directly with the financial crisis, is inherently microeconomic in nature. Going back to using ad hoc fitted empirical relationships will not help us deal with problems like the ones we face now.

Finally, the post does not identify areas where macro does need to change. First, while the work on financial market imperfections was on point and is being used now to inform policy, this work largely stopped at the turn of the century. I’m not really sure why this happened but it is clearly going to start up again in a big way.

Second, there are clearly aspects of current macroeconomics which will fall out of favor because of the financial crisis. The RBC model isn’t one of them since the role of productivity fluctuations in business cycles had been set aside many years ago (primarily because of improvements in our ability to measure productivity at the firm and sectoral level).

My guess is that the main casualty of the financial crisis will be the emphasis on price rigidity as an important part of macroeconomics. Price and wage rigidity is the central feature of New Keynesian economics. Keynes himself laid the primary blame for short-run failures at the foot of wage rigidity. While there is no doubt that many prices adjust infrequently, price rigidity played no role at all in the current downturn. Inflation was (more or less) stable and so according to the New Keynesian model, we should be near the natural rate of employment. This seems hopelessly off track. The New Keynesian model almost always features price rigidity and almost never features credit market imperfections. I would guess that this is about to be reversed. I also guess that the models of credit market failure that will be developed in the coming years will have micro-foundations. And this is a good thing.

13

KPL 10.14.09 at 5:14 am

Willem Buiter addressed the failures of DSGE modelling in March of this year.

http://blogs.ft.com/maverecon/2009/03/the-unfortunate-uselessness-of-most-state-of-the-art-academic-monetary-economics/

The failing is not the specification of this or that DSGE model, with this or that imperfection thrown in for inspection, but the methodology of optimization over an infinite horizon, with the necessary (for DSGE modeling), but harmful for understanding, transversality conditions and linearization with additive shocks.

Some of the money quotes:

“The friendly auctioneer at the end of time, who ensures that the right terminal boundary conditions are imposed to preclude, for instance, rational speculative bubbles, is none other than the omniscient, omnipotent and benevolent central planner. No wonder modern macroeconomics is in such bad shape. The EMH is surely the most notable empirical fatality of the financial crisis.”

“Even during the seventies, eighties, nineties and noughties before 2007, the manifest failure of the EMH in many key asset markets was obvious to virtually all those whose cognitive abilities had not been warped by a modern Anglo-American Ph.D. education.”

“The typical graduate macroeconomics and monetary economics training received at Anglo-American universities during the past 30 years or so, may have set back by decades serious investigations of aggregate economic behaviour and economic policy-relevant understanding.”

“Confusing the equilibrium of a decentralised market economy, competitive or otherwise, with the outcome of a mathematical programming exercise should no longer be acceptable.”

“The Bank of England in 2007 faced the onset of the credit crunch with too much Robert Lucas, Michael Woodford and Robert Merton in its intellectual cupboard. I believe that the Bank has by now shed the conventional wisdom of the typical macroeconomics training of the past few decades. In its place is an intellectual potpourri of factoids, partial theories, empirical regularities without firm theoretical foundations, hunches, intuitions and half-developed insights. It is not much, but knowing that you know nothing is the beginning of wisdom.”

14

lemuel pitkin 10.14.09 at 5:31 am

I can’t help feeling like this whole project is basically answering the question, Why isn’t the moon made of green cheese?

The volatile components of the cheese would boil off in the vacuum of space.
The pictures sent back by lunar missions don’t look like cheese (altho of course the underlying cheese could be covered up by a layer of dust.)
Cheese is an organic product and the relevant organisms are not known to exist in Earth orbit.
Cheese lacks the compressive strength to support itself against the gravity of a moon-sized body.

All good answers but why are you asking the question?

Someone like John Q. is quite capable of seeing the flaws in conventional models but the moment his attention turns elsewhere his background assumptions revert to pure Chicago. And like one of those cases Oliver Sacks writes about, his field of vision extends only to the right; there are all kinds of heterodox approaches in economics but the only time you hear about them on CT (apart from the odd Dsquared post) is when John Q. washes his hands of them. Could you learn something about the crisis from the Post-Keynesian or — god forbid — Marxian traditions? Quite possibly; but not here, where, as usual, it’s better to fail conventionally than to be right but marginal.

Sorry, John, I know this isn’t helpful. But would it really kill you to acknowledge that there are economists outside the mainstream?

15

John Quiggin 10.14.09 at 6:31 am

LP, I know I drank too deeply at the well of Chicago. As you say, the conclusion of the post you link that

The more that we see the network economy as the paradigmatic form of organization, the less reason there is to regard the distribution of income thrown up by the market as having any claim to be normative

is one that could only be written by a disciple of Milton Friedman.

Seriously, I think I’ve made it clear in numerous posts and comments already that I’ll be talking quite a bit about post-Keynesian ideas including Minsky on finance and questions of fundamental uncertainty. The Austrians will also get a run, though with criticism as well as praise. I’m not planning to say much about Marxian economics, because I haven’t seen much to interest me in that tradition for quite a long time. But, feel free to point me in the right direction.

16

dsquared 10.14.09 at 6:40 am

Debt-deflation has little to do with the financial accelerator and essentially nothing to do with the current financial crisis

I disagree with this totally, and also don’t think that Joe Stiglitz makes the very extreme claims for the asymmetric-information literature that you’re making here.

17

John Quiggin 10.14.09 at 6:42 am

@Concerned Economist

Some quick replies. I agree with you on tractability and say so in the draft above. My point is that, if macro problems are intractable using the DSGE approach, and we need macro policies, we have to find some other approach, such as the incorporation of aggregate relationships without rigorous micro foundations.

I also agree that the New Classical and RBC approaches were discredited well before the crisis. I hint at this above and will spell it out in the next section to come. In the section after that, I’ll give some ideas on how macro needs to change.

I’m surprised by what you say about price rigidity. The current crisis seems very like the Great Depression in having its origin in financial markets, but that doesn’t affect the relevance of the question: why don’t wages and prices adjust to clear the market? Even non-Keynesians like Bryan Caplan see this as a problem, and the Keynesian answer as relevant.

18

Robert 10.14.09 at 7:18 am

12: “Keynes himself laid the primary blame for short-run failures at the foot of wage rigidity.”

No, he did not.

19

citoyen 10.14.09 at 10:12 am

I am a lawyer turned student of economics for just one year and a half, so do not take my mistakes too severely.

First, while I have little experience practicing economics, I should say that I find all this Kuhnian rhetoric (“everything has changed with the crisis”) either exaggerated or misleading. I don’t think that current economic theory, interpreted in a suitably broad and plural way, is discredited by the crisis. The so-called “New Institutional Finance” (both models of credit rationing and the financial accelerator) as described by Knoop in this book (http://www.amazon.com/Modern-Financial-Macroeconomics-Panics-Crashes/dp/1405161817) fits rather well in the picture of the current crisis. Balance sheet effects of financial crises are nothing new; they are the main element of the New Institutional Finance and were one of the pillars of the third generation of currency crisis models as developed by… Krugman! (http://sapir.tau.ac.il/papers/sapir_conferences/Krugman.pdf). So, like Barry Eichengreen (http://www.nationalinterest.org/Article.aspx?id=21274), I think one must separate the failure of economists to predict the crisis, which might be traceable to sociological dynamics and feedbacks of optimism, from the suitability of the conceptual framework. The elements of the reasoning, I believe, were just there, even though it is true that most economists (though not all – Shiller and Kenneth Rogoff are examples) failed to predict it.

Secondly, again, my impression might be flawed, but I do not think that New Keynesians failed to integrate imperfections into a whole framework. The point is, however, that many imperfections have analogous effects – either staggered wages or the efficiency wages hypothesis, in its many formulations, produce the very same effect of wage rigidity – so the general framework does not need to be so particularised. Another point is that, as Concerned Economist pointed out, when you introduce a and b, you get a combination of both. Finally, I believe there were efforts in that very direction. This paper of Stiglitz and Greenwald http://www.bad-sports.com/~rajith/Teaching/MAFE502/Bruce(23-26).pdf for instance proposed, as early as the beginning of the nineties, a model with the basic ingredients of wage and price rigidity and credit market rationing. A standard textbook, such as Carlin and Soskice (“Macroeconomics: Imperfections, Institutions and Policies”), proposes a similar approach with its three-equation model, which does not rely on DSGE.

To sum up, I believe that the post fails to address the pluralism of the new keynesian approach and makes a caricature of it.

20

Chris 10.14.09 at 2:47 pm

“Confusing the equilibrium of a decentralised market economy, competitive or otherwise, with the outcome of a mathematical programming exercise should no longer be acceptable.”

Hear, hear. Some economists need to make up their mind whether they are scientifically studying the behavior of actual economies of human beings, or doing a branch of mathematics about the behavior of idealized theoretical entities (something like game theory with a lot of players). The results of the latter have proved to be of very, very limited fruitfulness as tools for doing the former.

21

Christian 10.14.09 at 3:53 pm

It seems that one intent of this chapter is to trace the development of macro between ~1970 and now. To that end, it would be useful to follow the arc of macro through specific papers and authors. For example, Edward Prescott, Finn Kydland and Robert Lucas come to mind in developing the DSGE model in a number of papers: Kydland and Prescott (1982), “Time to Build and Aggregate Fluctuations”, Econometrica, and Lucas (1972), “Expectations and the Neutrality of Money”, JET.

There are lots of examples like this, which I’m sure you’re aware of, but I do think it would be helpful to the chapter (and book). I recognize this is a preliminary draft and you do make mention of specific articles, but at least for me, using specific references expands and supports the argument. Regardless, the chapter reads well and is an excellent overview of the arc of macro and some of its blind spots. Will you be mentioning any heterodox economists who are entirely out of the fresh or salt water?

22

Christian 10.14.09 at 3:57 pm

Sorry, I see you develop this more in your book wiki.

23

PGD 10.14.09 at 3:58 pm

Debt-deflation has little to do with the financial accelerator and essentially nothing to do with the current financial crisis.

Debt deflation definitely has something to do with the financial accelerator — if nominal debts immediately readjusted for changes in the price level, then economic shocks wouldn’t have as negative an effect on balance sheets and therefore creditworthiness. They may not trigger the initial bank run, but they prolong it.

24

Ken 10.14.09 at 4:41 pm

KPL quotes Willem Buiter: “The friendly auctioneer at the end of time, who ensures that the right terminal boundary conditions are imposed to preclude, for instance, rational speculative bubbles, is none other than the omniscient, omnipotent and benevolent central planner.”

Modern economics is much more in favor of central planning than one might think. Since we are all rational actors with complete information seeking to optimize the same utility function, a central planner will be making exactly the same decision in each case as would any system of distributed actors. A similar argument can be made from corporate governance – one never sees calls to break up large companies, even ones whose revenues exceed the GNP of many nations, on the grounds that their internal control structures (almost universally top-down with central control) are inefficient.

25

Concerned Economist 10.14.09 at 7:39 pm

To Prof. Quiggin:

Let me clarify – I do not think tractability is the reason that many researchers consider one friction at a time. Models with multiple frictions, multiple sectors, multiple shocks and so on are essentially just as tractable as those with one friction, one sector, etc. That is, solving and simulating multiple-friction models and computing optimal policy responses is straightforward. The reason most researchers consider one friction at a time is for clarity and focus only. Frameworks with multiple frictions have not resulted in dramatic interactions like the complete “breakdown of classical equilibrium” you anticipate.

Regarding price rigidity – it would appear that the relevant prices have moved to clear these markets. We just don’t like where they have cleared. There is no doubt that many prices move sluggishly and that these movements have consequences. The question is whether sluggish price adjustment has important macroeconomic consequences. Ask yourself this – would the current situation be greatly improved if more prices were adjusted? If not, then you probably don’t put much weight on the importance of nominal rigidity for the current crisis.

To dsquared:

Fisher, like most observers of the time, was aware (at an intuitive level) of the dangers of excessive amounts of debt. Debt-deflation deals with the effects of inflation or deflation on real indebtedness. When deflation occurs, the real value of debt, or real indebtedness, rises, which could exacerbate the problems associated with debt – assuming that debt has bad effects to begin with.

The financial accelerator (and similar contributions) aims at explaining why excessive debt and other financial frictions hinder investment in the first place, with the hope of developing policies and institutions to address these frictions. Fisher’s debt deflation theory does not really contribute to this understanding but instead takes credit market frictions and their associated maladies as an assumption. Regarding the current situation, there hasn’t been sharp deflation (or inflation) recently, so debt-deflation has probably not played a role.

26

KPL 10.14.09 at 10:01 pm

Pushmedia 1 @11

The paper by Edge stops its analysis at 2004, which kind of misses the point. And that is: while DSGE models might be quite good at forecasting the macroeconomy when it is on a steady-state growth path, the test is whether they can forecast, or even just explain after the fact, abrupt movements in the business cycle.

I hope the Fed’s economists will soon release a paper on how their DSGE models, calibrated to data up to mid 2007, have gone in forecasting the past two years.

27

Disgruntled Subeditor 10.15.09 at 3:38 am

What went wrong? What went wrong? That’s very easy to explain.
Keynes forgot to take into account the effect of a strategic resource and the monopoly of a few overprivileged economies in holding it. If everything operated on the same levels and at the same degrees of endowment then there would have been no problems, but this is clearly not the case.

28

JoB 10.15.09 at 1:50 pm

Hey John Q, I have no clue whatsoever whether you’re going to read this (to be honest in the matter: it wouldn’t hurt to get one) but leaving the economics aside: nope, this is not your best contribution. Somebody said, someplace here, not long ago: ‘that you’re better on the attack than on the defense’, or something of that order.

But I beg to differ: when you’re defending something new (see last one), you rock; and when you’re attacking some historical stuff, you don’t. IMHO, of course – ‘all acronym explanation’ chapters are good for conversations with economists I guess but that’s as exciting as discussions amongst SW engineers (for lack of a funnier analogy).

29

John Quiggin 10.15.09 at 9:24 pm

More work needed, I guess, JoB. I will at least kill the acronyms.

30

dsquared 10.15.09 at 9:32 pm

Fisher, like most observers of the time, was aware (at an intuitive level) of the dangers of excessive amounts of debt

this really isn’t true – Fisher had a very well developed analytical theory of the reasons why excessive debt impeded investment and, as I say, Bernanke and Stiglitz recognised this and do not make the same claims of originality for the financial accelerator literature that you do.

31

PGD 10.16.09 at 4:10 am

Regarding the current situation, there hasn’t been sharp deflation (or inflation) recently so debt-deflation has probably not played a role.

You have got to be kidding. There’s been enormous deflation. Ah, economists…


Comments on this entry are closed.