“Tim Lee”:http://www.cato-unbound.org/2010/09/24/tim-lee/of-hayek-and-rubber-tomatoes/ takes exception to my “post of a couple of weeks ago on James Scott and Friedrich von Hayek”:http://www.cato-unbound.org/2010/09/24/tim-lee/of-hayek-and-rubber-tomatoes/, suggesting that I construct a ‘curious straw-man’ of Hayek’s views. Unfortunately, he completely misreads the post in question. Nor – on serious investigation – do his own claims actually stand up.
First – how Lee misreads the post. His accusation of straw man building is as follows:
bq. In attributing to Hayek the view that “markets are superior,” Farrell conspicuously fails to mention: superior to what? This omission allows Farrell to construct a curious straw man of Hayek’s views, suggesting that Hayek championed large-scale commodity markets over smaller-scale markets that employed more local knowledge. Although it’s possible Hayek staked out this position somewhere in his voluminous writings, it’s certainly nowhere to be found in the famous essay Farrell linked to.
The ‘superior to what’ question is easily resolved by looking to the “essay under debate”:http://www.econlib.org/library/Essays/hykKnw1.html, where the answer would seem to be ‘superior to one and a half straw men of Hayek’s own making.’ I quote:
bq. The answer to this question is closely connected with that other question which arises here, that of who is to do the planning. It is about this question that all the dispute about “economic planning” centers. This is not a dispute about whether planning is to be done or not. It is a dispute as to whether planning is to be done centrally, by one authority for the whole economic system, or is to be divided among many individuals. Planning in the specific sense in which the term is used in contemporary controversy necessarily means central planning—direction of the whole economic system according to one unified plan. Competition, on the other hand, means decentralized planning by many separate persons. The halfway house between the two, about which many people talk but which few like when they see it, is the delegation of planning to organized industries, or, in other words, monopoly.
In other words, Hayek is claiming that markets are superior to planning (which must necessarily be done through a single centralized planner) or organized monopoly (which no-one actually likes). One could perhaps defend Hayek by arguing that there were indeed many people who argued for centralized planning at the time he wrote this essay. But one would then have to contend with the notorious contortions he got into about how “Labour’s efforts to introduce planning into the economy would ineluctably lead to totalitarianism”:https://crookedtimber.org/2010/04/12/the-road-to-where/. The straw man is baked into the theoretical cake (which as a result tastes pretty godawful).
As for the ‘Hayek never championed large scale markets’ bit – this is a plain and simple misreading. My post stated:
bq. I imagine Scott’s counterblast going something like this: It is all very fine to say that markets provide a means to communicate tacit knowledge, and it is even true of many markets, especially small scale ones with participants who know each other, know the product and so on. But global markets do not rely on tacit knowledge. They rely on standardization – the homogenization of products so that they can be lumped under the appropriate heading within a set of standard codified categories. Far from communicating tacit knowledge, the price system (and the codified standards that underlie it) destroys it systematically.
In other words, the distinction between large scale and small scale markets is implied by _Scott_ rather than Hayek. When Scott says:
bq. It seems to me that large-scale exchange and trade in any commodities at all require a certain level of standardization.
and later refers to Polanyi’s _The Great Transformation_ as ‘the most important book I’ve ever read,’ he seems to me to be building on Polanyi’s crucial distinction between small scale and large scale markets. Roughly speaking, Polanyi sees local markets as fine, because they are ‘embedded’ in society, and hence do not threaten it. But when long distance trade first becomes unmoored from societies, and then begins to restructure them along its own principles of order, we start to get into trouble.
Scott’s specific twist on Polanyi’s thought is to identify standardization as a key facet of this unmooring. Local markets are embedded in local structures of knowledge, and rely upon them extensively – people know that Giovanni’s tomatoes usually taste better than Luigi’s, even if they are less regular and have more surface blemishes. International markets require technical standards – in order to buy and sell tomatoes, one has to be able to categorize them as Grade II (with certain defined attributes in terms of color, size, consistency etc), Grade III etc. Scott argues that there is a real and fundamental _loss of knowledge_ that occurs in the standardization process, regardless of whether the standards are imposed by states or by markets. Indeed, he sees the two as going hand-in-hand. Brad DeLong (like Lee) made much of Scott’s critique of the state, but “failed to observe”:https://crookedtimber.org/2007/10/31/delong-scott-and-hayek/ how intricately this was linked to a critique of the market. The homogenizing tendencies of German state forestry are a product of market forces as well as fiscal needs. Quoting Scott:
bq. forest as a habitat disappears and is replaced by the forest as an economic resource to be managed efficiently and profitably. Here, _fiscal and commercial logics coincide; they are both resolutely fixed on the bottom line._ (my italics).
Hence – when Lee suggests that:
bq. Although Scott specifically declines to endorse Hayek’s policy agenda, I think Seeing Like a State is squarely within the Hayekian intellectual tradition.
he is simply wrong, unless (contrary to what Lee and I both believe) Hayek has written somewhere within his voluminous opus on the problematic tradeoffs that markets face when they become unmoored from local society. This distinction is, I am pretty sure, an inherent part of Scott’s intellectual project.
Lee makes a positive Hayekian case on behalf of standards – this too, it seems to me, is wrong, but for different reasons. He makes the standard Hayekian case for decentralized rules that individuals comply with without understanding, and argues that standards can be subsumed under these rules. I think that this is a tougher case to make than he realizes – but no matter. The key problem is in his (again standard Hayekian) defense of the processes through which certain rules come to dominate.
bq. What makes decentralized economic institutions powerful isn’t standardization but the possibility for competition among alternative standardization schemes. Rubber tomatoes create an entrepreneurial opportunity for firms to establish a more exacting tomato standard and deliver tastier tomatoes to their customers. In real markets, you see competition not only among individual firms but among groups of firms using alternative standards. Markets gradually converge on the standards that are best at transmitting relevant information and discarding irrelevant information. In contrast, when standards are set by the state, or by private firms who have been granted de facto standard-setting authority by government regulations, there is no opportunity for this kind of decentralized experimentation.
The problem with this claim is that it relies (as Hayek relies) on quite heroic assumptions about the underlying conditions under which competition occurs. There is an excellent discussion of this in my sometime co-author Jack Knight’s “Institutions and Social Conflict”:http://www.amazon.com/gp/product/0521421896?ie=UTF8&tag=henryfarrell-20&linkCode=as2&camp=1789&creative=390957&creativeASIN=0521421896. (pp.94-97)
bq. Hayek offers the most influential [evolutionary explanation of how social institutions evolve to meet a society’s functional needs]. … At one level, the “spontaneous formation” of the rules of a given society’s conduct is what Hayek labeled “the results of human action, but not of human design.” … At a more fundamental level, the evolutionary process of natural selection produces the overall social order for a society: “The natural _selection_ of rules will operate on the basis of the greater or lesser efficiency of the resulting _order of the group_.” … Thus, the social orders are selected by a process of competition for productivity benefits for the group as a whole.
bq. For this kind of general evolutionary account to succeed, each of the following components must be satisfied: (1) the group as the unit of selection, (2) natural selection through the struggle of survival, and (3) a comparative metric of the fitness value of the social order. … conceptual difficulties of the group as the unit of selection … It should be clear that the empirical requirements for satisfying such a general theory are unattainable. Hayek himself admits that we are unable to trace historically such an evolutionary process.
This is part of what I mean when I say that Hayek is quite incurious about the underlying conditions of markets. He has a theory, but it is never tested properly and doesn’t seem to work in practice. Nor do we have any empirical evidence that Hayekian processes apply in the field of technical standards. To the contrary. In the “only large-scale statistical study of the political economy of standard-setting”:http://www.duke.edu/~buthe/downloads/MattliButhe_WPv56n1.pdf that I am aware of, the authors find that market-based standardization processes systematically lose out to more politically embedded forms of standard setting in an internationalized economy. More specifically, they lose out _not_ because they are crushed by state power, but because political institutions better underpin flows of information, make it easier to coordinate on standards, and finally comport better with international standard setting processes in ISO and elsewhere. Perhaps they also lead inevitably to centralized totalitarianism, but I’m personally prepared to take that risk.
Perhaps the _very particular_ standards that Lee points to are different. He claims:
bq. The web browsers we all used to retrieve this article conform to a variety of technical standards, including TCP/IP, HTTP, and HTML. … This suite of now-dominant protocols emerged from an intense process of inter-standard competition during the 1980s and 1990s. This competitive standardization process is not a market process—accessing a web page is not a financial transaction—but it is very much a Hayekian one.
Or again, perhaps not, if “notorious libertarian-hater Dan Drezner”:http://www.danieldrezner.com/research/egovernance.pdf is to be believed. In a detailed study of the adoption of TCP/IP, Drezner finds that state power was the key factor determining which standard won out:
bq. Although Defense Department and ARPANET constituents favored the TCP/IP protocol, other networks did not rely on it. The actors behind these alternative networks had different motivations. Companies with investments in computer networks preferred developing their own proprietary standards, so as to reap the pecuniary rewards of managing their own networks. By the mid-seventies, Xerox was pushing Xerox Network Systems (XNS), Digital was marketing Digital Equipment Corporation’s Digital Network Architecture (DECNET), and IBM was promoting its System Network Architecture (SNA) to its government buyers … The major economic powers feared the prospect of being held hostage to a firm’s ownership of the dominant network protocol. This was particularly true for states with government monopolies of the telecommunications sector … The second and more significant initiative was the push by the United States, the UK, France, Canada, and Japan to have the International Organization for Standardization (ISO)—an NGO of technical standard setters—develop compatible network standards for both private and public uses. … This initiative resulted in the 1978 creation of the Open Systems Interconnection (OSI) model. … because OSI stressed openness and accessibility, the TCP/IP code fit more seamlessly with the OSI framework … The actual outcome reflected the preferences of governments.
I think that there is a plausible counter-argument – but this argument would be based on “Habermasian theory rather than Hayekian decentralized order”:http://papers.ssrn.com/sol3/papers.cfm?abstract_id=363840. The two rely on “very different social logics”:http://www.springerlink.com/content/b8167107l4662l47/.
But perhaps, in the absence of government power, we might expect Lee’s arguments to work? Almost certainly not. Hayek-style evolutionary arguments really only work when actors are indifferent between coordination outcomes (i.e. they want to coordinate, but do not care what outcome they coordinate upon). See further Jack Knight again (his chapter in “this volume”:http://www.amazon.com/gp/product/047208576X?ie=UTF8&tag=henryfarrell-20&linkCode=as2&camp=1789&creative=390957&creativeASIN=047208576X provides the basic intellectual framework; for a short summary see pp.543-544 of this “article that we co-wrote”:http://www.henryfarrell.net/farrellknight.pdf ). And over a wide variety of market standards – including those that Lee is most interested in – actors are most emphatically _not_ indifferent over which standard applies. Different standards will benefit different firms – and firms which manage to gain effective control of a widely used standard will have very considerable power to set the rules of the market by so doing.
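To make Knight’s point concrete, here is a minimal toy sketch (the payoff numbers are invented for the illustration, and come from neither Knight nor the standard-setting literature): two firms each gain if the market settles on a single standard, but each prefers its own. A few lines of Python are enough to show that both standards are stable once established, and that nothing about relative ‘fitness’ picks between them.

bc.. # Toy 2x2 standards game ("battle of the sexes" structure): purely illustrative payoffs.
# payoffs[(firm1_choice, firm2_choice)] = (payoff to Firm 1, payoff to Firm 2)
S1, S2 = "Firm 1's standard", "Firm 2's standard"
payoffs = {
    (S1, S1): (3, 1),  # both adopt Firm 1's standard: Firm 1 does better
    (S2, S2): (1, 3),  # both adopt Firm 2's standard: Firm 2 does better
    (S1, S2): (0, 0),  # no common standard: both lose
    (S2, S1): (0, 0),
}
strategies = [S1, S2]

def pure_nash_equilibria(payoffs, strategies):
    """Return strategy profiles where neither firm gains by switching standards."""
    stable = []
    for a in strategies:
        for b in strategies:
            u1, u2 = payoffs[(a, b)]
            firm1_ok = all(payoffs[(alt, b)][0] <= u1 for alt in strategies)
            firm2_ok = all(payoffs[(a, alt)][1] <= u2 for alt in strategies)
            if firm1_ok and firm2_ok:
                stable.append((a, b))
    return stable

print(pure_nash_equilibria(payoffs, strategies))
# [("Firm 1's standard", "Firm 1's standard"), ("Firm 2's standard", "Firm 2's standard")]

p. Both coordinated outcomes are equilibria, and neither is more ‘efficient’ than the other in any sense that a selection story could pick up on. Which of them actually obtains is a matter of who can move first, or force the other side’s hand – that is, of bargaining power.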
This applies in spades to Lee’s other example of Hayek in action – the HTML standard. Anyone who has had to optimize web sites for various browsers will be familiar with the fact that different browsers have different ways of interpreting what is purportedly the same standard. And anyone who bothers to look into the politics behind this will be quite aware that Microsoft’s infamous policy of ’embrace, extend, extinguish’ plays a significant role in this story. Microsoft Word’s .doc standard is an even purer example of how standard-setting processes play out in markets where businesses are competing to set the rules of the game. And if anyone wants to argue that Microsoft Word was an efficient outcome of a Hayekian process of evolutionary competition, all I can do is post this image:
!http://upload.wikimedia.org/wikipedia/en/d/db/Clippy-letter.PNG!
and say: Sir. I refute you thus.
In short then: (1) Lee completely misreads my post. (2) James Scott is not a Hayekian under any reasonable definition of the term Hayekian. (3) Hayekian arguments about evolutionary competition are both implausible in general and a demonstrably bad explanation of how technical standards evolve. (4) And, finally, rubber tomatoes suck. I think that covers everything.
{ 36 comments }
Evan 09.30.10 at 5:57 pm
I’d agree with you on the standards front, but not for the reason you list. It has less to do with malice aforethought from Microsoft, or even with the inherent difficulty of engineering software to a sometimes ambiguous spec, than with the changes in the process from the early days of HTML to the spec formulation of HTML5.
Prior versions of the spec were largely canonizations of previously existing bodies of work. They did OK because everyone was under tremendous commercial pressure to get something working and to do it quickly. They had a lot of examples to generalize from, as well.
The standards process for HTML5 worked at it from the other direction: people got together to define extensions and revisions to the language. This process has been a giant boondoggle because it was largely speculative at the time it was started, and no one had the time or the inclination to participate other than big companies who had other interests and ulterior motives in play.
So imagining it as a single process is wrong. Each standardization effort is unique and has its own challenges and rewards. Embrace, extend, etc is only a partial explanation, and I think that people trying to push their corporate interests into the future has a lot more to do with the failures of recent standardization efforts, since everyone realizes how much is on the line, now.
That said, this doesn’t invalidate your larger point. A clued-in government representative being there might have given the corporate actors someone to blame for their negotiation failures, if nothing else, and may have made things run a little bit more smoothly.
Aulus Gellius 09.30.10 at 7:06 pm
To start things off with a stupid irrelevant question:
“the push by the United States, the UK, France, Canada, and Japan to have the International Organization for Standardization (ISO)”
How on earth does the acronym come to be ISO? That can’t be the word order in French, can it (I would think OIS)? And you give the English name, which should become IOS. And obviously, I don’t see how ISO could abbreviate a Japanese phrase. This is the great question raised by your post!
David 09.30.10 at 7:07 pm
Wouldn’t this be a great moment for Scott to appear, like Marshall McLuhan in Annie Hall, and tell us all what he really meant? (Or Hayek, if he’s reading this …)
Henry 09.30.10 at 7:18 pm
Anyone who knows him (Scott – not Hayek) is encouraged to bring this to his attention, if only so that he can correct me if I’m wrong.
bianca steele 09.30.10 at 7:22 pm
Dan Drezner remarks that TCP/IP fits an OSI framework better than other protocols like X.25. It’s a little more complicated than that. OSI, as a government procurement protocol, mandated ISO’s IP (which is different from the IETF’s IP, but performs the same functions) at layer 3 in place of X.25 (which performs the same functions as the two IP’s, more or less). The vendors went out and implemented TP/IP, as it was called, and got it certified, but hardly anybody bought it (the market, of course, included much more than the US and overseas government markets). By contrast, they mostly already had at least a cheapo version of TCP/IP and it cost very little to validate. I could probably list close to half a dozen more “problems” with OSI off the top of my head, most of them probably no longer corporate secrets, but most of them possibly only relevant to my former employer and a handful of people I’ve interacted with.
These things always seem to miss the interesting narrative from my point of view, and besides that, it’s hard to convey the technical information (and people get in the habit of dumbing it down, sometimes probably too much). I would say there are interesting questions why ISO’s IP didn’t take off (in the US, my understanding was that European and Asian telecoms were using it for quite some time), and why X.25 was perceived as inadequate, but those are two difficult protocols, and I’m a little puzzled as to why they seem to have been conflated. (There are also interesting comparisons to be made between IETF, ISO/ITU, and W3 (not to mention the minor consortiums that popped up before corporate hegemony had been achieved) in terms of how they make decisions. The IETF guys are very outspoken, and I wonder whether they are easier to study.)
I haven’t read the recent responses to James Scott’s post (and it’s been one of those days), but I thought Tim Lee’s first response made a lot of sense.
bianca steele 09.30.10 at 7:23 pm
“two different protocols,” not “difficult”
JanieM 09.30.10 at 7:26 pm
For Aulus Gellius:
“The organization’s logos in two of its official languages, English and French, include the word ISO, and it is usually referred to by this short-form name. ISO is not an acronym or initialism for the organization’s full name in either official language. Rather, the organization adopted ISO based on the Greek word isos (ἴσος), meaning equal. Recognizing that the organization’s initials would be different in different languages, the organization’s founders chose ISO as the universal short form of its name. This, in itself, reflects the aim of the organization: to equalize and standardize across cultures.”
From Wikipedia.
bianca steele 09.30.10 at 7:28 pm
Also: to purchase a single copy of an ISO standard through the US standards body, ANSI, in 1991 or so, cost $100-200, and could only be done by an organization that belonged to ANSI. I was told that you could buy the same book through the UN bookstore as a CCITT (now ITU) publication, at a lower price. The TCP/IP standard was free for the asking.
Kevin Donoghue 09.30.10 at 7:52 pm
Thanks, JanieM. So much for my theory about ISO’s reason for not wanting to be called IOS.
Tim Worstall 09.30.10 at 8:50 pm
Forgive me for repeating, in part, an argument I made on another thread.
“he seems to me to be building on Polanyi’s crucial distinction between small scale and large scale markets. Roughly speaking, Polanyi sees local markets as fine, because they are ‘embedded’ in society, and hence do not threaten it. But when long distance trade first becomes unmoored from societies, and then begins to restructure them along its own principles of order, we start to get into trouble.”
I see a terrible confusion here between “small scale markets” and “local markets”.
I work in a very small market, one in which somewhere between 20 and 200 people globally are involved, ranging from active participants to interested observers. Consumers of the end product are those several billion who enjoy streetlights. The 20 to 200 participants in the direct market are not local in any geographical sense. I deal with people in Japan, Taiwan, Kazakhstan, Germany, Austria, China, none of whom I have ever met, and with those in the US, UK etc, some of whom I have met.
Polanyi, at least as I understand him (please do correct me), argues that the web of inter-personal actions makes small scale market transactions (i.e., transactions in small markets) somehow more fulfilling, for there is a sense of human contact and obligation to them. And it’s certainly true that in my experience, dealing with a small number of people over a number of years (at the very least for the trust issues) does indeed make markets easier.
But that isn’t the same as “local” at all. Unless we are saying that phone calls and email make people local.
Have I actually understood this correctly? That Polanyi is arguing that geographically close markets are somehow better? I’m willing to believe that small markets are, where human contact aids in that Hayekian local knowledge, but seriously unwilling to believe that geography, these days, has much to do with it.
Local doesn’t mean “near to me” about information after all, but “known to me”.
Sebastian 09.30.10 at 11:42 pm
“Or again, perhaps not, if notorious libertarian-hater Dan Drezner is to be believed. In a detailed study of the adoption of TCP/IP, Drezner finds that state power was the key factor determining which standard won out:”
This is certainly an accurate historical description. State power was used to create a standard that won. But I think you are implying that no standard would have won out without state power deciding and forcing the issue. That is a much larger supposition, and I’m not sure it is so easily defended.
Is that what you mean?
ChrisB 10.01.10 at 12:33 am
I’ll be damned. You can include pictures in Crooked Timber. Who knew?
marcel 10.01.10 at 2:11 am
or organized monopoly (which no-one actually likes).
Didn’t Schumpeter put in a good word or 2 for monopoly?
Myles SG 10.01.10 at 2:36 am
Isn’t one fault of this the conflation of long-distance trade with the loss of specialist, un-centralised knowledge/product? Methinks it could be plausible that there be long-distance (i.e. “globalised”) commerce and specialised knowledge simultaneously, for example, say, in sports cars. Arguably lots of sports cars could not exist absent a worldwide market, i.e. were the market to be nationally bounded. In this case globalisation makes increased variety (contra rubber tomatoes) more rather than less likely.
For example, Lotus cars. Or to make my case a bit less strenuous, suppose that there are international agents whose whole livelihoods derive from specialist and internationalized, non-standard knowledge. They clearly exist in real life.
Henry 10.01.10 at 3:11 am
Tim – I think that this would be a reasonable criticism of both Polanyi and Scott. Neither is especially interested in the ways that thick bonds can be reconstituted across large distances by new forms of communication.
Sebastian – the point I am making is that Lee makes a specific empirical claim that the dominance of TCP/IP came about through a Hayekian process of group selection. If Dan is correct (Bianca argues that the story is more complicated, but hasn’t presented her alternative story which I would like to see – it makes me feel warm and happy that CT readers seem able to weigh in on any topic under the sun), then this empirical story is false. And of course standards can emerge in the absence of state control (see the .doc story, where I do not think states did more than act as somewhat important consumers), but I believe, as in the .doc standard story, that the processes through which they will emerge will usually be _political_, having a lot more to do with differences in the interests and bargaining power of the firms in question. Microsoft had an interest in continued dominance, and considerable bargaining power thanks to network effects, which it leveraged to make .doc into the standard format for documents being shared for editing etc. Perhaps this will break down now that we are moving to cloud computing – but I see no reason to expect that the next round will be any less about power relationships than this one.
Jim Rose 10.01.10 at 3:31 am
to add to marcel,
Schumpeter-style monopolies or serial competition can be common and pro-consumer, because the rewards for introducing new and better goods lead to the creation of these goods in the first place.
Google is the twentieth or so search engine, one that came along displacing previous giants by being a better search engine. Serial and actual competition still constrains Google.
Patents and copyrights are to some government-sanctioned monopolies and to others they are property rights. Those that oppose patents because they are a monopoly seem to think that a Schumpeter-style advantage of being first is enough of a reward to encourage innovation.
There are gains and losses from industrial concentration. Hayek suggested that competition is a discovery procedure that will find out, through market rivalry and trial and error, which methods of production and channels of distribution serve consumers best.
Sebastian 10.01.10 at 6:59 am
“And of course standards can emerge in the absence of state control (see the .doc story, where I do not think states did more than act as somewhat important consumers), but I believe, as in the .doc standard story, that the processes through which they will emerge will usually be political, having a lot more to do with differences in the interests and bargaining power of the firms in question. Microsoft had an interest in continued dominance, and considerable bargaining power thanks to network effects, which it leveraged to make .doc into the standard format for documents being shared for editing etc. Perhaps this will break down now that we are moving to cloud computing – but I see no reason to expect that the next round will be any less about power relationships than this one.”
Now I’m totally confused. Do you think that Hayek didn’t believe that power relationships were important in the market? I guess I don’t understand what special function you think governments have in the emergence of standardization that markets either cannot or do not fulfill. You point to cases where the government has chosen to exercise its rather considerable power by forcing a standard, and you seem OK when it does so, as opposed to, say, when Microsoft exercises its rather considerable power. What I don’t see is an explanation of why the distinction is so crucial to you.
Many of the times the government acts, it does so because one of the players has more political clout than the other players. (See for example how the DMCA shaped the Digital Media field). It isn’t as if these decisions are made in a way that penetrates the public mind–so most of the politics around them ends up being deeply insider-influenced.
The unique thing *I think* governments could bring to the table is a focus on externalities, usually in the form of safety. (Though Underwriters Laboratories shows that even this doesn’t have to be a strict government function.) But you seem to be suggesting that serious government intervention is needed in a host of other standardization areas.
Am I reading you correctly?
Henry 10.01.10 at 11:15 am
Sebastian – Hayek had a theory of institutional change which is all about evolutionary selection for functionally efficient institutions. Read up on it if you do not believe me. That is what Lee is invoking here. And it does not explain the evolution of standards. Non-functionally efficient standards are rife when they further the ends of powerful actors. This is true both when government is involved in setting these standards, and true when government plays no role. The specific examples that Lee invokes of standards that were purportedly selected through Hayekian processes were not so selected. And when there are countervailing forces, they seem to have more to do with quasi-Habermasian processes of debate and the dislike of engineers for cruddy solutions even when they maximize their companies’ profits, rather than the broad Hayekian forces that Lee claims. Finally – none of this is at all surprising given the extremely unlikely circumstances under which a Hayekian selection process would obtain (basically – he is importing bad functionalist theory from the evolutionary biology of his day). That’s the problem. I am not advancing a specific positive theory of standards and how they come about, but instead showing that Lee’s theory is wrong on a variety of levels.
Henry 10.01.10 at 11:17 am
And there are no ‘government shoulds’ or ‘governments can bring to the table’ here. I’m not making a normative argument about the appropriate role of government (though Scott is in a backhanded way, as is Hayek). I’m treating this as an empirical exercise in understanding the processes of institutional change (something that is central to my own academic research). And I am finding that the proposed theoretical account is obviously inadequate to explain the empirical outcomes.
Pete 10.01.10 at 12:43 pm
patents and copyrights are to some government-sanctioned monopolies and to others they are property rights
Seems only fair to point out that both Lee and Hayek are sceptical of patents and copyrights in exactly this way: they think that markets are information processing mechanisms, and so they want information to be free wherever possible. This means they can credibly claim that Clippy the paperclip is one of the many evils that result from government interference. That’s pretty clearly what Lee’s getting at when he talks about “private firms who have been granted de facto standard-setting authority by government.”
It’s also pretty clear that by Lee’s lights a Habermasian conversation is just one variety of “broad Hayekian force” – a bunch of decentralised actors exchanging information and converging on a best solution. I don’t know enough about Hayek or Habermas to know whether or not that’s entirely coherent, but it’s definitely a much easier position to defend if you think that IP rights are an interference with the conversation.
Sebastian 10.01.10 at 3:23 pm
“Sebastian – Hayek had a theory of institutional change which is all about evolutionary selection for functionally efficient institutions. Read up on it if you do not believe me. ”
Jesus Christ. Was that truly necessary? Especially when you are misstating it?
Hayek had a theory of institutional change which is largely about evolutionary selection for functionally efficient processes WHEN THEY ARE ALLOWED TO COMPETE. I know that because I’ve read up on it, ummm quite some time ago, thanks. I also know that one of his biggest critiques of institutions (largely government, but also private monopolies) is that they use political power to stifle that competition and thus perpetuate lower quality processes/decisions.
You haven’t simplified it in a useful way, you’ve simplified in a way that gets rid of all the key insights. So it should come as no surprise that you find Hayek unhelpful (probably in more areas than just standard making) compared to other people who seem to find him very helpful (say DeLong and Lee).
And then you say things like “More specifically, they lose out not because they are crushed by state power, but because political institutions better underpin flows of information, make it easier to coordinate on standards, and finally comport better with international standard setting processes in ISO and elsewhere. ”
I don’t know how you think that “comport better with international standard setting processes” and “crushed by state power” are such separate things that the second should be considered a negation of the first, but it seems neither obvious nor demonstrated. For many values of “forced to comport with the international standard setting processes” you are in fact describing “crushed by state power”.
“International markets require technical standards – in order to buy and sell tomatoes, one has to be able to categorize them as Grade II (with certain defined attributes in terms of color, size, consistency etc), Grade III etc. Scott argues that there is a real and fundamental loss of knowledge that occurs in the standardization process, regardless of whether the standards are imposed by states or by markets.”
Scott argues that there is a real and fundamental loss of knowledge that occurs in the standardization process. And he is right. However, that does not imply that the losses in the government standardization process and the market standardization process are equal in magnitude. Government standardization tends to get locked into one dimension of analysis. Market standardization isn’t. If the market players misidentify something crucial in their analysis, it is open to attack from someone else. You casually dismiss the rubber tomatoes example, but fail to note that the market has actually responded in that one. The entire Whole Foods supermarket is built on the distinction, as are a number of not-as-large health food stores, or the like. We wouldn’t have enormous ethanol subsidies a decade or more after it became obvious that they were horrible for the environment if the market was in charge of the decision.
“I’m treating this as an empirical exercise in understanding the processes of institutional change (something that is central to my own academic research). And I am finding that the proposed theoretical account is obviously inadequate to explain the empirical outcomes.”
Here again, I’m confused by why you’ve invoked Hayek in the way that you have. Hayek certainly doesn’t deny the government’s ability to create and maintain standards. His whole problem with central planning isn’t that it can’t exist. He suggests that the decisions made in that mode aren’t typically as good as if they are forced to compete. And as bianca steele suggests, that might be the case even within good versions of mostly government ordered standardization systems.
Tim Lee 10.01.10 at 3:43 pm
Thanks for the thorough response. I’m still not sure I understand your critique of Hayek. Hayek’s argument is that markets (and other decentralized institutions) are better than central planning. You respond by pointing out that small-scale markets are better than large-scale markets. Even if this is true, it’s not a refutation of what Hayek wrote. If your point is that Hayek failed to acknowledge that small-scale markets are superior to large-scale markets, that seems like a peculiar thing to fault him for, since that wasn’t the point of his essay and for all we know he would have readily accepted the point as consistent with his broader argument.
On the success of TCP/IP, I think there’s a lot wrong with Drezner’s version of the story. The OSI model was indeed a “metastandard,” which is another way of saying it wasn’t really a standard at all. It was a conceptual framework for thinking about networking standards. It was very trendy to talk about the OSI, but my sense is that the “adoption” of OSI didn’t actually have much of an effect on which protocol emerged victorious, since almost any protocol could be shoehorned into the OSI model. From quickly skimming Drezner’s paper, he doesn’t seem to offer any actual evidence to the contrary. For one thing, if national telco monopolies were in the driver’s seat, we’d all be using X.25 rather than TCP/IP.
Your point is even less clear to me in the case of HTML. My primary point was that the victory of HTML over competing standards like Gopher was a Hayekian process with little if any central planning. The evolution of HTML since it became the dominant standard in ~1995 is a complex story, but I don’t see anything in that story that contradicts my point. Obviously, large companies like Microsoft have some influence over the process, but even when Microsoft had its peak 90%+ browser market share, it still wasn’t able to control the evolution of the web platform. “Embrace and extend” mostly failed.
bianca steele 10.01.10 at 7:30 pm
Henry:
I think the interesting part of the story Drezner tells has to do with the US government’s support for the ISO process by preference to the IETF process. This was most likely a huge mistake, but he is correct that it’s difficult to call it a preference for government involvement over non-government involvement, given the DoD’s involvement in the origins of the IETF. (I assume European PTT’s count as government actors, incidentally, which makes for an asymmetry with the US situation, especially given the breakup of AT&T around the same time.)
I only meant to point out that, as the choice was not between the ISO-supported X.25 and an actually more ISO-compatible TCP/IP, that one sentence seems questionable to me. Also, yesterday I thought I saw something that suggested either Drezner or you had relied on reports from people associated with European PTT’s, who I think may turn out to have a different point of view than others. A lot of this seems still to be at the level of oral histories. Even if I could always tell the difference between a process driven by Hayekian-style price signals and one driven by what he would consider irrelevant factors, I don’t see the information needed to make the decision as already being out there.
bianca steele 10.01.10 at 7:40 pm
Incidentally, the very small part of the ongoing work of the IETF that I’ve had contact with in the past ten or fifteen years has impressed me with its capacity for getting workable agreements between people from different organizations in an appropriate space of time. I’m sure I would be driven out of my mind, no different from anybody else, by the hassles, if I actually went to the meetings myself. At the beginning of my career, I worked with the ISO version of the same thing, and if it had ever gotten to the same level of use, it would have been chaos at a fifth the tempo.
Henry 10.01.10 at 8:18 pm
Tim – the point here is that the small markets vs. large markets distinction may seem like small beer to you, and presumably was small beer to Hayek – but is not to Scott. He is clearly situated in a lineage descending from Polanyi (see especially Polanyi’s “The Economy as an Instituted Process” essay collected in the Mark Granovetter and Richard Swedberg volume on the sociology of economic life). And he buys into Polanyi’s arguments about the pernicious consequences of large scale markets, which seem to me to be directly in contradiction to Hayek’s arguments about the benefits of the price mechanism. To put it another way – you could, if you wanted to, assimilate some of Scott’s arguments about the problems of seeing like a state into a broad Hayekian framework. But you cannot assimilate Scott himself into the Hayekian tradition without tossing aside his claims (which are understated in SLAS, but nonetheless a crucial part of his general argument) about how large scale markets, just as much as states, can have dire and pernicious consequences. NB that I do not agree with large chunks of Scott’s argument myself – but this is why I think that your effort to assimilate him to Hayek doesn’t work. I don’t know if you have read Polanyi, but once you do, it becomes quite clear why Scott is a Polanyian rather than a Hayekian.
On HTML – I will grant that the initial HTML vs. Gopher etc standards fight (which I remember using back in the day) involved little in the way of power struggle. But that was in large part because the stakes were low-to-nonexistent given the absence of any commercial activity on the Internet. When it became clear that there were real stakes involved, we saw exactly the kinds of direct struggle with asymmetric power between actors pushing competing standards that you might have expected, resulting in all sorts of painful inefficiencies thanks to competing standards, and the actor with the most bargaining power (Microsoft) _winning_, driving its major competitor out of business, and then not bothering to update its browser for a number of years because it didn’t need to any more. The same problems occurred in a dramatically more marked form with .doc. If (a) standards can be used to create, extend or maintain specific firms’ market dominance, (b) firms have asymmetric abilities to push for standards that benefit them to the possible disadvantage of other firms, and (c) firms want to maximize their profits, then a Hayekian story of group selection for functionally efficient outcomes is simply not going to explain the development of standards. Self-interested firms are going to push for standards that benefit them. And those with most bargaining power, through e.g. control of a broader platform, will usually win out.
Tim Lee 10.01.10 at 11:00 pm
Again, I think this is an incorrect, or at least myopic, reading of history. The story of the last decade has been Microsoft repeatedly trying and failing to use its dominance of the OS and browser markets to control the evolution of web standards. Microsoft has pushed for the adoption of Microsoft-friendly products and standards like IIS, ActiveX controls, Passport, and SilverLight. They have largely been ignored by the broader web ecosystem. Despite Microsoft’s dominance of the browser market, the web has continued to evolve toward open standards and free software not controlled by Microsoft (or anyone else in most cases). And notice that IE is now rapidly losing market share to free-software-based alternatives.
Do corporate incumbents have some influence over the evolution of technical standards on the web? Sure, it would be silly to claim otherwise. And sometimes, as in the word processing market, dominance in one market can be leveraged into dominance of adjacent markets. As a Hayekian (and Scottian) I think it’s unfortunate when that happens!
But again, I don’t understand why you think any of this contradicts anything Hayek wrote. You seem to be imputing to Hayek a preference for corporate-controlled standard-setting. But AFAIK he never expressed such a preference, and from where I sit, the continued success of bottom-up institutions like the IETF and W3C are illustrations of Hayek’s argument that decentralized, competitive standardization processes are superior to centralized ones. You can fault Hayek for not focusing more on this question, but that’s hardly a refutation of the arguments he actually made.
Sebastian 10.01.10 at 11:51 pm
“When it became clear that there were real stakes involved, we saw exactly the kinds of direct struggle with asymmetric power between actors pushing competing standards that you might have expected, resulting in all sorts of painful inefficiencies thanks to competing standards, and the actor with the most bargaining power (Microsoft) winning, driving its major competitor out of business, and then not bothering to update its browser for a number of years because it didn’t need to any more. ”
The bolded part is a Hayekian statement. The fact that you don’t recognize it as such is not Hayek’s fault. The thumbnail sketch you keep using is wrong.
And your example is specifically wrong too. Microsoft has repeatedly tried to dominate the web standards, and generally it has failed. Its more recent successes are only because they’ve finally accepted the input and innovations from their rivals.
Is silverlight the standard? No.
Did ActiveX kill the .NET framework even though Microsoft favored the first product over the second so they could try to get more royalties off their patents? (yes I know they made both). Not even hardly.
And when did browsers gain the most ground in quality? In the years before and after Microsoft had a hammerlock on the market. Why? Competition. That’s a Hayekian insight, and correct so far as I can tell. Hayek believes in quality improvement through competitive pressure. He would have no trouble at all with the idea that if a company like Microsoft (often using government power through patent bullying) gains enough power to squelch competition, that it would exhibit many of the same deficiencies as exhibited by governments.
Antoni Jaume 10.02.10 at 11:22 pm
As far as I remember, being only an interested observer and neither an insider nor a practitioner at the time, the main reason TCP/IP prevailed over OSI is that OSI solved a different set of requirements, akin to what would be a non-neutral internet, and was much too heavy for the needs and processing power of the early internet. X.25 was basically for phone companies, allowing for virtual circuits at a low level so phone calls could be done in cost-efficient ways. Data communication was still a minor use, and the main companies had proprietary protocols that did not mingle easily. TCP/IP on the other side was non-proprietary and allowed small players to mix their wares, which was of interest to the early internet, mostly universities.
Antoni Jaume 10.02.10 at 11:26 pm
“Microsoft has repeatedly tried to dominate the web standards, and generally it has failed.”
Microsoft has failed where the decisions were made by people with technical competence, and triumphed where the decision was one of commercial expediency.
Henry 10.04.10 at 3:03 pm
Tim – I simply don’t buy your general argument here. My claim is not that Microsoft is the root of all evil, but instead that standards are a weapon used to shape markets, and that the outcomes that you get are not usually efficient ones. E.g. the reason that Silverlight has failed on the market is that Flash has _not_ failed and has become an effective and rather crappy monopolistic standard (which Apple in turn is trying to break for its own commercial reasons). Also – .doc. So the difference between us is a factual one. When you say that:
bq. Despite Microsoft’s dominance of the browser market, the web has continued to evolve toward open standards and free software not controlled by Microsoft (or anyone else in most cases). And notice that IE is now rapidly losing market share to free-software-based alternatives.
you are suggesting – if I am not mistaken – that markets and other decentralized processes are indeed pushing us towards open standards, free software and all of these good and wonderful things that we both can agree, from our different perspectives, _are_ good and wonderful. But when I look at the world of the Internet, I see a very different story happening. I see more and more lockdown, and more and more standard setting through control of the end user experience (the success of the iPhone/iPad etc), through backroom deals between oligopolistic companies (Verizon and Google playing kissy-kissy over gutting net neutrality) and so on. This – rather than sliced freedom and openness – seems to me to be what unconstrained decentralized processes are leading to – and this is what the (in my view, more realistic) theories of institutional emergence and change emerging from non-cooperative game theory _would_ predict.
Sebastian:
bq. The bolded part is a Hayekian statement. The fact that you don’t recognize it as such is not Hayek’s fault. The thumbnail sketch you keep using is wrong. … And when did browsers gain the most ground in quality? In the years before and after Microsoft had a hammerlock on the market. Why? Competition. That’s a Hayekian insight, and correct so far as I can tell. Hayek believes in quality improvement through competitive pressure. He would have no trouble at all with the idea that if a company like Microsoft (often using government power through patent bullying) gains enough power to squelch competition, that it would exhibit many of the same deficiencies as exhibited by governments.
You are not understanding what the argument is about. As I noted above, this is not a generic argument about market competition (something which lefties like me can quite reasonably recognize as sometimes being a good thing, without becoming Hayekians). It is about the underlying conditions for institutional change, and whether (if left to themselves) self-organizing processes will lead to efficient outcomes. Hayek correctly seems to recognize (in contrast to some public choice work) that there is no market as such for institutions. Hence, he makes an evolutionary argument instead, in which ‘competition’ is the kind of Darwinian competition that you get between different species (the flexibility of the term ‘competition’ is the source of much intellectual confusion among libertarians imo). Unfortunately, he buys into a particularly _bad_ version of evolutionary theory, rife with explicitly functionalist claims, and arguments from group selection (which is not quite as disparaged as it used to be among evolutionary theorists, but still only occurs, if it does occur, under highly limited conditions). I do think that Hayek has genuine (and perhaps even fundamental) insights – but his theory of institutional change is not one of those moments of insight.
As a side-point, fwiw, his theory of monopoly, as best as I can remember, is a very squishy one (he sort-of-acknowledges that it can be a problem, but is clearly loath to suggest government intervention) that doesn’t (afaicr) refer to the dynamic consequences of monopoly for innovation etc at all. See also Schumpeter on monopoly’s putative benefits for innovation.
Tim Wilkinson 10.04.10 at 4:03 pm
Aye, ‘competition’ may indeed be a good thing, but the free market types always yoke it – entirely arbitrarily from a theoretical point of view (though not from a polemical one) – to the profit motive.
Note also – evolution is pretty horrid in its mode of operation, grotesquely slow in producing change, and not directed in any way toward any independently specifiable outcome, such as might be regarded as good or efficient in any meaningful way.
Sebastian 10.04.10 at 4:16 pm
Maybe the confusion is in another area? My understanding of Hayek is that he believed that competitive pressures would lead to *more* efficiency than government planning. You seem to be arguing that the competitive pressure outcomes are not particularly efficient when compared against some sort of ideal.
But as much as these outcomes may swing back and forth between freer standards and more oligopolistic ones (though perhaps you should be heartened by the changes Apple has already begun to make in response to the Droid), I’m not sure how you can argue that international government intervention does better than that without at least considering enormous counter-examples like the entire dreadful set of rules surrounding the DMCA (Digital Millennium Copyright Act) and the treaties surrounding the World Intellectual Property Organization, which the DMCA relates to.
Now I’ll admit that it is difficult to tell how committed you are to the idea that international government organizations make better rules. You seem to swing from condemnation of the competitive pressure idea to apparent support for governmental standards but sometimes seem to backtrack into purely descriptive claims.
The reason I question your attempt to sometimes couch this as purely descriptive is that I have no idea where you think you have an argument with Hayek or anyone else you have been quoting if the only claim you are making is something along the lines of: sometimes standards show up because of competitive pressure, sometimes they get enacted through government power. As a descriptive matter, who argues against that? Certainly not Hayek. So far as I can tell, not Lee. And even if you are saying something like: standards have been increasingly crafted by governments; I’m not sure who you think you are arguing with.
Hayek didn’t argue that governments are incapable of forcing standards. If anything he was supremely afraid of the fact that he knew the government had the *power* to control things.
So if your *only* argument is that competitive market forces are often not the way standards come about, I don’t know who you think you are arguing with.
If your argument is that the government standards tend to be better, yes you have an argument with Hayek, and well a lot of people. But if that is your argument, the evidence you have marshaled doesn’t seem to be on point, and seems to ignore enormously awful counter examples like the DMCA.
Henry 10.04.10 at 6:09 pm
bq. If your argument is that the government standards tend to be better, yes you have an argument with Hayek, and well a lot of people. But if that is your argument, the evidence you have marshaled doesn’t seem to be on point, and seems to ignore enormously awful counter examples like the DMCA.
You’ve kept on trying to come back to the suggestion that I am making this claim – if you could point to the place where I make an argument that is even vaguely analogous to it, I’d like to see it. Nor do I make any attempt to “argue that international government intervention does better,” or anything like it. While I think that government does a lot of things ‘better,’ I am not sure that this is true of technical standards (although there are the Buethe and Mattli arguments that some degree of institutional support provides for better communication, this is a much more technical argument, which has nothing much to do with a positive assessment of standard quality). Instead, when I point to Drezner, it is to an _alternative argument_ about _what explains outcomes._ If you think that Dan is committed to the proposition that government does things better, you don’t know his work.
Again. What is at issue (in this particular sub-debate) is a set of questions about the forces driving decentralized institutional change. Hayek makes a series of arguments about these forces which (I think this is true: if I am wrong, Tim has yet to correct me) Tim buys into. I suggest that these arguments are in fact highly implausible and don’t explain what is going on here. What I am arguing about here is what _explains_ this or that social outcome, rather than whether _this social outcome_ is better than _that social outcome_. Explanation can link to normative preference (it clearly does for both Scott and Hayek, and presumably me – that I prefer explanations of social outcomes which focus on power asymmetries says some things about the kinds of social arrangements that I find better or worse). But it does not _reduce_ to such preferences. And as long as you have it in your head that I am making some sort of ‘government is better than markets’ claim, you are going to go on misunderstanding what is at stake in this debate in crucial ways. Not every intellectual debate in political economy reduces to fights over government vs. market. I’d really recommend – if you are interested in understanding this – reading for starters Jack Knight’s work referred to above, Doug North’s book on institutions (there is a co-written chapter by Knight and North which gives some interesting insights into the relationship between their arguments), Avner Greif’s work on the mediaeval economy and institutional change, as well, indeed, as Hayek and Scott (I would heartily recommend his new book, _The Art of Not Being Governed_, which I think is wrong in some ways, but is a major work of social science and social theory, and beautifully written to boot). My positive argument (as opposed to the exegesis of Scott) is a disagreement about what kinds of explanatory arguments of institutional change are useful both in general and in this particular case.
Jason Treit 10.05.10 at 8:48 am
It’s been fun watching this thread from the sidelines. Once again, god bless Crooked Timber.
My sense of decentralized standards is that they make for mediocre patchworks, and at the far edge promise still worse instability followed by clumsy reconciliation. Yet it’s this mediocrity I celebrate. In his 1996 sketch of the web as an evolvable system, Clay Shirky showed how weak a protocol the web was (and continues to be). Weak in conception, weak in implementation. “The problem with that list of deficiencies,” he went on, “is that it is also a list of necessities.” Anil Dash advanced a similar view in “The web way vs. the Wave way,” where he correctly told the fortune of Google’s sweeping, technically advanced, insularly designed Wave protocol.
So these are two reads on how large-scale markets converge on technical standards. We’ll disagree on how representative they are, but as proofs of concept I find them telling. The nut of the argument is that a truly networked standard trades away short-run efficiency for scale. Behaviour takes a long time to regularize; any instance of behaviour may break that regularity in a way that’s productive or beautiful or ugly or conflicted; others fight over which is which; and occasionally a change moves from margin to centre, nonstandard to standard, likely in a degraded form. When watching decentralized processes it is tempting to will either states or firms to find a shortcut. To the extent that either is capable of using power to this end, I’d like to see them held in check – not because they lack sound ideas, but because, as ever, the parts know less than the whole.
Sebastian 10.06.10 at 4:26 pm
Argh.
The reason I have trouble with it is that, if that is all you are saying, it isn’t a problem with Hayek. So couching it as this enormous problem you have with Hayek, when you don’t, makes it seem like a political advocacy game. But I apologize for misinterpreting. So let me try again.
“In other words, Hayek is claiming that markets are superior to planning (which must necessarily be done through a single centralized planner) or organized monopoly (which no-one actually likes).”
This part you get right, at least. Though if you read the paragraphs you wrote immediately afterward, you might see why I think you’re advocating government planning.
But you go immediately astray after that.
“I imagine Scott’s counterblast going something like this: It is all very fine to say that markets provide a means to communicate tacit knowledge, and it is even true of many markets, especially small scale ones with participants who know each other, know the product and so on. But global markets do not rely on tacit knowledge. They rely on standardization – the homogenization of products so that they can be lumped under the appropriate heading within a set of standard codified categories. Far from communicating tacit knowledge, the price system (and the codified standards that underlie it) destroys it systematically.”
It looks like you get lost among various types of ‘standardization’. First, you should realize that you are going directly counter to one of the current major critiques of capitalism: that it provides too many choices along too many dimensions. This isn’t just a disagreement that is difficult to resolve on a theoretical basis; it is a disagreement about the facts. On your formulation there shouldn’t be so many choices to talk about in the first place – never mind whether those choices ought to be considered a positive or negative development. In your formulation, they shouldn’t exist.
Quinoa, and corn, and wheat, and barley, and rice, and oats (to stay just with the cereal and near-cereal grains) have rarely all been available to the same people at the same time. The reason they are all available in your local supermarket is that market pricing has encouraged distributors and growers to ‘standardize’ across very different dimensions, to tease out market share (which is to say, price signal) from various different people who want various different things in their staple cereal.
And even within the same foodstuff – say, tomatoes – it is easy to observe that consumers can and do discriminate on quality, which is to say that they are willing to pay different prices for different dimensions of tomato-ness (some strangely prefer color, others sweetness, others various textures).
Now you could very well point out that the oats General Mills puts in Cheerios tend to be very standardized and low quality. But that is fine, because General Mills is going to toast them and put them in plastic bags for months anyway – a context where durability for the consumer, ease of use, and ease of toasting are the important quality dimensions. Someone wedded to the deep taste of ‘oateyness’ will buy other products, which are in fact available and can be discriminated among by direct price and by market share (indirect price).
So yes, in more localized versions of Scotland you might have been able to get a larger variety of oats, but you certainly wouldn’t have had access to a larger variety of cereals in that past, more localized hypothetical. And except for the most dedicated oat aficionado, that is a good thing – and even the dedicated oat aficionado can probably use the price signal to get most of what he wants.
Now how do these wide varieties of standardization come about? They come about because various companies struggle for market share, either by competing directly (on the quality of the directly compared product along one or more dimensions) or by competing for market share among substitutes (appealing to the consumer to switch on that basis). So yes, there is some loss of local knowledge, but there is an enormous gain from various competing local knowledges about close-but-not-exact substitutes, leading to a huge variety of choice – so much so that various leftist critiques are based on the enormous proliferation of choices.
So I’m not sure the standardization critique of Hayek makes any sense as an empirical matter when restricted to a very narrow definition of product (say, ONLY tomatoes), and it certainly seems to fail when the definition of product is expanded at all (cereals, tasty fruits, tasty vegetables). [I also note that the critique of fruit and vegetable quality is strongest when they are out of season in your hemisphere and have to be shipped from the other side of the world. When they are in season near you, they are almost always better. But that is a strange thing to complain about, because out of season they would otherwise be totally absent.]
The Hayekian insight is that market standardization is likely to take place along many different possible dimensions, which get played out in the market through direct pricing on many possible dimensions of quality, and through market share (an indirect pricing signal) across substitutes. It tends to do better than central planners or other actors who try to evade the price signals through monopoly or other means (which is why Lee asks “better than what?”).
Now it may be that you want to restrict the critique to technical standards. (If so, you probably shouldn’t have mentioned the tomato thing.) Technical standards play out somewhat differently, but they also exhibit less information loss under either model, because you often have the technical experts who actually use the stuff trying to make the standards. But even they respond well to Hayekian evolution across different dimensions. (See, for example, the evolution of torrent standards, or of read-only document displays.)
See also WalMart supply chain standardization, an endlessly fascinating interplay between planning and local knowledge. WalMart’s genius in that respect has been to harness the price signal to drive all sorts of supply chain efficiency improvements. It spurs supply chain innovation through direct threats of price pressure to its suppliers, and a huge number of the improvements that result get shared with its other suppliers, helping all of them become more efficient. (Though as someone who has worked for a supplier, I’ll admit WalMart can be annoyingly ruthless about it.) But its form of standardization can be very helpful (see, for example, its Green Packaging initiatives, where it forced suppliers to use dramatically less wasteful packaging by scoring them on a large variety of space-to-product ratios).
In fact, if you are really interested in how price mechanisms can work through large planning institutions to promote efficiency, WalMart is probably the best place to study it.
Henry 10.06.10 at 6:35 pm
bq. The reason I have trouble with it is that, if that is all you are saying, it isn’t a problem with Hayek. So couching it as this enormous problem you have with Hayek, when you don’t, makes it seem like a political advocacy game.
But it _is_ a problem with Hayek – just not with the bits of Hayek that you are perhaps familiar with. What I (as distinct from, e.g., Scott) have a problem with is his theory of institutional change, which is, as far as I can see, not of any great worth.
On the topics of standardization, large markets versus local markets, exotic imports, and all of that, I think (perhaps I am mistaken) that your primary argument is with Scott rather than with me. To “quote myself from a couple of years ago”:https://crookedtimber.org/2007/10/31/delong-scott-and-hayek/
bq. Thus, there are trade-offs. Italian firms in small-firm districts are excellent at gradual innovation and refinement of knowledge – in part because of their reliance on metis. They are not so good at producing profound, industry-changing forms of innovation. They also tend to stick closer to home than their equivalents in other countries (somewhat ironically, they replicate the logic of Avner Greif’s mediaeval Maghribi merchants far more than the behaviour of his Genoese traders).
bq. To return to the more homely example of food, Florence has an excellent restaurant culture, where you can eat out cheaply and incredibly well if you avoid the tourist traps.1 But it systematically emphasizes local cuisine, along with a few imports from the South (pizza and pasta) and the north (some Bolognese and Milanese dishes). Chinese food in Florence is (or was when I was there) terrible, and Indian food was relatively very expensive and no better than mediocre in quality. In contrast, most US cities of my experience have a lower overall standard of food, but a much greater variety of restaurants producing different cuisines, sometimes at a quite high standard of quality (if rarely as high as in the cuisine’s home countries or regions). US cities are far more open to different kinds of food than Italian cities. I suspect that much of this can be attributed to the dominance of particular forms of local knowledge in Italy, which on the one hand preserve certain traditions of quality that would be infeasible to preserve in the US, but on the other hand make people less likely to branch out into new forms of production and consumption that don’t fit with their prior experience.
bq. This allows me to come back to the roots of my disagreement with Brad. Brad is a fan of markets, and believes that they contribute in very important ways to human freedom. I agree with him on this. But I think that Brad sometimes underemphasizes the real trade-offs that markets may involve, and overstates his criticisms of people who are concerned with these trade-offs. Sometimes, perhaps often, these trade-offs are relatively slight – as Brad says, many forms of redundant local knowledge can be discarded without compunction. Sometimes, these trade-offs are real, but still worthwhile – while we should acknowledge the costs of markets, we should acknowledge that the benefits of introducing them are higher. And sometimes they are not worth paying – there are areas of social life where marketization has more downsides than advantages. (the question of which areas of social life fall under which category is obviously important, but this post is much too long already).
bq. 1 I seem to remember (although I can’t find the post) Brad rudely disagreeing a couple of years ago with someone who suggested that delicious cheap food was available in European cities in a way that it wasn’t in the US, and claiming that this was an illusion of the upper middle classes who could afford to eat well anywhere, or words to that effect (my memory could be flawed, in which case I apologize in advance). For what it’s worth, as a grad student with a relatively meagre stipend in Florence, I could afford to eat out three nights a week in good restaurants.
It’s perhaps worth noting that “Brad has gone back”:http://delong.typepad.com/sdj/2010/09/henry-farrell-continues-his-war-on-rubber-tomatoes.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+BradDelongsSemi-dailyJournal+%28Brad+DeLong%27s+Semi-Daily+Journal%29 to his claim that
bq. you have to either live in the countryside or live in the city and be really rich to say that rubber tomatoes suck. For those humans who live in the city and are not really rich, rubber tomatoes provide a welcome and tasty and affordable simulacrum of the tomato-eating experience.
A claim that I believe I have refuted through achieving the triplet of having (a) lived in a city, (b) not been at all wealthy (monthly student stipends are very good things to have – but they do not tend to put you in the higher income brackets), and (c) been able to find and eat delicious tomatoes without any problem.