The latest round in the Republican War on Science is a report prepared for US Representative Joe Barton aimed at discrediting the ‘hockey stick’ analysis of global temperatures first undertaken by Mann, Bradley, and Hughes, and subsequently supported by many other studies. For reasons that aren’t entirely clear, this peripheral issue in the analysis of climate change has attracted disproportionate attention from denialists, most notably Ross McKitrick and Steve McIntyre. One result was that the US National Academy of Sciences recently reviewed the work, reaching conclusions broadly supportive of MBH.
The report for Barton was prepared by three statisticians, Edward Wegman, David Scott and Yasmin Said, and its only novel contribution is a social network analysis, which is meant to show that the various independent studies aren’t really independent and that peer review has broken down, since the same group of interlinked academics is reviewing each other’s papers.
Kieran and Eszter are the CT experts on this stuff, and I’ll be interested to see what they have to say. But in the meantime, I have a couple of observations (feel free to correct errors in my interpretation).
Note: A reader (who identifies as TCO in the thread below) asked for this in another thread, but I couldn’t find the request again when I posted.
Two network analyses are presented, of which most weight is placed on the first, consisting of a database of 43 individuals. The conclusions reported by Wegman, Scott and Said are as follows:
The block (cluster) structure is very clear. Michael Mann is a co-author with every one of the other 42. The black squares on the diagonal indicate that the investigators work closely within their group, but not so extensively outside of their group. The occasional off diagonal boxes indicate that some investigators have joint papers with investigators outside of their immediate group. The order of the authors on the vertical and horizontal axes is the same. Unfortunately, there is overprinting on the horizontal so that individual authors are not readable. However, it is immediately clear that the Mann, Rutherford, Jones, Osborn, Briffa, Bradley and Hughes form a clique, each interacting with all of the others. A clique is a fully connected subgraph, meaning everyone in the clique interacts with every one else in the clique.
The group of 43 is described as follows:
The first specifically focusing on Dr. Mann was developed by first considering all of his co-authors and then examining the abstracts produced by the co-authors. We focus on Dr. Mann because he is the lead author of MBH98/99 and because he is extremely influential in this area as can be seen by his high degree of centrality.
In other words, if I understand things correctly, the first key finding is that (drumroll) Mann has co-authored a paper with every one of his co-authors. This obviously demonstrates his “centrality” to the group consisting of his co-authors.
The finding that “Mann, Rutherford, Jones, Osborn, Briffa, Bradley and Hughes form a clique, each interacting with all of the others” can be verified using Google. All those listed were among the authors of:
Mann, M.E., Ammann, C.M., Bradley, R.S., Briffa, K.R., Crowley, T.J., Hughes, M.K., Jones, P.D., Oppenheimer, M., Osborn, T.J., Overpeck, J.T., Rutherford, S., Trenberth, K.E., Wigley, T.M.L., On Past Temperatures and Anomalous Late 20th Century Warmth, Eos, 84, 256–258, 2003.
This automatically qualifies them as a “clique”. So the second finding can be rephrased as (another drumroll): some of Mann’s papers have lots of co-authors. BTW, it appears that Wegman, Scott and Said didn’t catch all the co-authors.
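To make the circularity concrete, here is a minimal sketch (my own toy code, not anything from the Wegman report; the papers listed are hypothetical and networkx is assumed) of how a co-authorship network seeded from one author’s co-author list guarantees that author maximal degree centrality by construction:

```python
# Toy illustration: seed a co-authorship graph from papers that all include
# one author ("Mann" here), then measure centrality. The paper list is
# hypothetical; networkx is assumed to be available.
import networkx as nx

papers = [
    {"Mann", "Bradley", "Hughes"},
    {"Mann", "Rutherford", "Jones"},
    {"Mann", "Osborn", "Briffa", "Jones"},
]

G = nx.Graph()
for authors in papers:
    for a in authors:
        for b in authors:
            if a < b:           # one undirected edge per co-author pair
                G.add_edge(a, b)

# Every seed paper includes Mann, so Mann is adjacent to every other node:
# maximal degree centrality is built in, not discovered.
print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]))
```

Any author chosen as the seed would come out equally “central” in a network built this way.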
The second analysis uses the 75 most published authors in the field (a much more reasonable choice) and comes to the following conclusion:
There are some interesting features. Although Michael Mann remains an author with high centrality, Tett, Briffa and Cook emerge as belonging to their own cluster and they also exhibit high centrality. Schweingruber and Collins also appear to have relatively high centrality. One interesting observation is that although Tett is fairly central, he has no direct linkage to Mann. Similarly the Gareth Jones-Allen-Parker-Davies-Stott clique also has no direct linkage to Mann. There are two Joneses. Gareth Jones is not the same person as the person previously labeled as Jones.
My summary (no drumroll this time). There are several leading research groups in this field. Some of them are fairly closely linked to Mann and his group and others are not.
joel turnipseed 07.15.06 at 1:30 am
Well, the other thing to be said about Mann is that he spent decades working his way to the top… only to be, perhaps, outdrawn by Gore’s “Inconvenient Truth.”
fnook 07.15.06 at 1:55 am
I blame the whole thing on the seductive and sweetly descriptive sound of the phrase “hockey stick” when used to describe the graphic results of some study industry disapproves of. Talk about catch phrases that can be understood by a wide audience. Where I come from, hockey stick results are a tell-tale sign of market dominance, but I’m not up to speed on climate science.
w 07.15.06 at 2:11 am
The network analysis used a symmetric matrix. I’d like to point out there are two kinds of symmetric matrix, and sometimes they make a big difference in network centralities.
The first type of symmetric matrix is made by multiplying an incidence matrix (e.g. an actor-membership matrix) by its transpose. Suppose the first author invites the other two for a 3-author paper. In this type of matrix, all three appear to have a relationship to one another, and their centralities are equal. This seems to be the case in the Wegman et al. report.
The second type of symmetric matrix is made by summing up a directional-asymmetric (adjacency) matrix and its transpose. In this type, the first author has a relationship with the other two, but the second and third authors are disconnected from each other. Accordingly, the most central is the first author.
In my experience, Bonacich centralities of the two types of matrix are generally correlated at a level of about .8. But the second type is generally preferred because it retains much more information than the first.
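A minimal numerical sketch of this distinction (my own, assuming numpy and using leading-eigenvector centrality as a stand-in for Bonacich centrality), for a single 3-author paper in which author 0 recruits authors 1 and 2:

```python
import numpy as np

# Type 1: incidence (author x paper) matrix times its transpose.
# All three authors end up mutually connected and equally central.
B = np.array([[1.0], [1.0], [1.0]])   # one paper, three authors
M1 = B @ B.T
np.fill_diagonal(M1, 0)

# Type 2: directed adjacency (0 -> 1, 0 -> 2) plus its transpose.
# Authors 1 and 2 stay disconnected, so author 0 is the most central.
A = np.array([[0, 1, 1],
              [0, 0, 0],
              [0, 0, 0]], dtype=float)
M2 = A + A.T

def eigen_centrality(M):
    """Leading-eigenvector centrality, normalised to sum to 1."""
    vals, vecs = np.linalg.eigh(M)
    v = np.abs(vecs[:, np.argmax(vals)])
    return v / v.sum()

print("type 1:", eigen_centrality(M1))  # roughly equal scores
print("type 2:", eigen_centrality(M2))  # author 0 dominates
```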
bad Jim 07.15.06 at 3:45 am
What part of e^t does anyone not understand?
James Wimberley 07.15.06 at 3:54 am
Peer review is review by people recognised as peers. So any such system is bound to look like a clique to the outsiders it rejects. But it’s still the best bet for sorting wheat from chaff. Where it has failed in the past is in rejecting psychologically disturbing observations (Semmelweis) or revolutionary new theories (Wegener). Neither of these factors applies to the climate change deniers. They’re the chaff.
bad Jim 07.15.06 at 3:55 am
Sorry. Preview seemed to understand a superscripted t.
One would expect general comprehension that the curve described by an exponential function curves sharply upward. (One would also expect others to anticipate the consequences of their actions, to accommodate the coexistence of other similarly-sized primates in their immediate proximity, and to use their turn signals, and would be perpetually disappointed).
bad Jim 07.15.06 at 4:10 am
Actually, this, from DeLong, may have been what I had in mind. He excerpts Hansen:
Acknowledging that this isn’t necessarily a global warming thread, I’ll note that while the gods may or may not play dice with the universe, humans certainly do with their only planet.
bi 07.15.06 at 8:04 am
The million-dollar question then is this: what do the social networks around Wegman, Scott, and Said look like?
John 07.15.06 at 9:04 am
When one doesn’t allow full access to one’s data by others, peer review is really rather pointless, isn’t it?
Barry 07.15.06 at 9:07 am
I’d wager that they include some energy-company money, even if laundered through a ‘think tank’ or two.
Randolph Fritz 07.15.06 at 9:18 am
Well, yes, science is a social project. Duh.
What’s impressive about the work on global climate change is that almost everyone knowledgeable who can be persuaded by it has been; at this point that is hundreds of experts. There is, if you like, a “clique” of the most knowledgeable, but outsiders like the NAS experts who study their work are usually persuaded by them.
A social analysis of the Republican denialists would be a really interesting document, btw.
ob 07.15.06 at 9:30 am
a commenter–not myself–on Ezra Klein’s blog had this to say the other day:
“A short answer for the logic of the anti- global warming movement: “we can’t speak to the science, so we will speak to what we know.” What they know is character assassination and blowing smoke up people’s ass. 5 years from now- they will be arguing something different- with the same tune.
I think these are all trial balloons to see which load of shit they throw at the wall will stick. akaison
http://ezraklein.typepad.com/blog/2006/07/the_antiantiglo.html#comments
I think he’s right–they know they’ve lost on science, but if they can turn science into a matter of personal authority and he-said/she-said, or even better if they can turn it into our-team versus their-team, then they’ll at least be able to slow down their defeat.
And since they have bankrupted the country, destroyed the army, and failed to capture Bin Laden, pretty much all they’re trying to do is slow down their defeat, escape accountability and line their pockets before they’re driven out of power.
Barry 07.15.06 at 9:35 am
D*mn, I’m good. I swear that I hadn’t read Deltoid when I posted that those guys have got to have denialist funding.
Are there any honest, unpaid for scientists who are denying global warming?
derek 07.15.06 at 11:51 am
Delto-what? Knowing that’s a name for a muscle, I wasn’t optimistic that a Google search on the word would explain Barry’s remark.
I was wrong. A blog called “Deltoid” turns out to be the top hit: the “I’m Feeling Lucky” button would even have worked. Well done that blog.
JohnLopresti 07.15.06 at 12:03 pm
Recently I took interest in a discussion of how statistical representations need to take care to eliminate noise from baseline data; also, peer review is an easy way to refine conclusions, finding the noise before it confounds the ostensible discovery. I post the web addresses here for your browsing. They all relate to the protectors of climate denialists in the statistical model area. Sometimes skimming such articles yields more immediate value and insight than a laborious first reading, saving the study phase for the re-read, if you have the time to devote to such detailed examination. The authors who wrote the introductory articles all write lucidly, but their respective threads run the full gamut from the mundane to the helpful.
—-
LINKS:
Sunstein on Cheney 1% new paradigm of civil unrest vs. 6% probability of global warming: http://uchicagolaw.typepad.com/faculty/2006/06/the_one_percent.html
climate change– http://www.realclimate.org/index.php/archives/2006/03/climate-sensitivity-plus-a-change/
http://www.realclimate.org/index.php/archives/2006/04/a-correction-with-repercussions/
http://www.realclimate.org/index.php/archives/2006/05/how-red-are-my-proxies/
Functional 07.15.06 at 1:52 pm
The report for Barton was prepared by three statisticians, Edward Wegman, David Scott and Yasmin Said , and its only novel contribution is a social network analysis
Are you taking it for granted that their other findings have all been proven true already, such that they are no longer novel?
E.g.:
“Overall, our committee believes that Mann’s assessments that the decade of the 1990s was the hottest decade of the millennium and that 1998 was the hottest year of the millennium cannot be supported by his analysis.”
Or perhaps the chart on page 34. Or the more detailed findings on pages 48-49.
TCO 07.15.06 at 2:14 pm
1. In all seriousness, thank you for addressing this. I want to hear your take.
2. I find it a bit strange that you give me no cite as I brought this up to you, but merely delete my posts from the places where I asked for your review (the deletion is fine, lack of cite not.)
More to follow.
TCO 07.15.06 at 2:49 pm
A. The post seems to say that Wegman thinks it notable that the Mann co-author group all share coauthorship with him. If you read the report, it is clear that Wegman’s remarks are explanatory of a general feature of the network and are not meant to be showing some special discovery. More like explaining why you have a row of ones down the diagonal of a comparison matrix. This reminds me of the people who cackled on this site about Ross not knowing the difference between radians and degrees when the error was clearly one of input (he did not know that the program needed him to convert conventional latitude to radians before inputting). I think everyone knows that degrees and radians are different. They also know that a network selected to be co-authors will all be co-authors! Sheesh.
B. Some examples: “as mentioned before, Mann is his own group since he has co-authored with all of the 42”, “The first database is Mann-centric with the idea of investigating relationships among his closest associates”. (There are a couple others, but I’m too lazy to retype from a pdf.) Anyhoo…it’s clear that this guy who has published networks is being explanatory to a general audience of a feature. Not citing a discovery.
TCO 07.15.06 at 3:07 pm
I agree that the social network is the novel thing about the report. It is NOT the only noteworthy thing about the report. What you have also is
A. A very, very good statistician (better than either Mann or McIntyre) looking at the Mann work and the McI critique, and finding for McI. (One has to wonder: do you find any flaw with that part of the report? Did you already agree with McI about how the off-centering mines for hockey sticks? Did the rest of the field agree?)
B. That same experienced statistician saying that Mann’s descriptions of method were vague and opaque in his papers, that he used non-quantitative language to describe his method’s skill, and that the data and code was insufficient to allow verification.
C. Some minor things (some new, some repeats) like that Mann confuses r and R.
D. Also new: that the method will mine for shapes IN GENERAL from white noise. Not just hockey sticks. Any low-frequency sample mixed with white noise will be magnified. See Fig 4.7. (This is actually interesting from a purely intellectual stance.) That the Mannian method will extract a shape from 1 out of 70 noise samples, whereas conventional PCA would give a noise-looking PC1.
E. That Mann’s method cannot technically be PCA (because of the transform when he off-centers). It’s not actually a PRINCIPAL component.
TCO 07.15.06 at 3:09 pm
I agree that the second database is more interesting.
TCO 07.15.06 at 3:13 pm
I would be interested in how this compares with other fields/networks. Obviously Wegman did not address this, but perhaps you, who have seen a lot of networks, can. My impression from work (not a network study) is that this is a very different sort of field from solid state chemistry for instance. In terms of the amount of interconnections and the groupings.
John Quiggin 07.15.06 at 3:51 pm
TCO, I saw a request in one of our threads somewhere but when I went to find it for a cite I couldn’t see it. My informal comparison with my own subfield is that any differences are accounted for by the larger size of research groups.
Functional, as I’m sure you’re aware, I mean that everything else in the report consists of issues that have been addressed by the NAS report, which is obviously more credible.
The fact that the main new contribution (the 43-author network analysis) is transparent nonsense tells me what weight to place on questions where Wegman and NAS disagree.
mcd 07.15.06 at 3:53 pm
This report uses a misleading ambiguity, by listing the conclusions at the end as those of “the committee” and hoping you’ll think that means Wegman, Scott & Said. But we don’t know what they think about the conclusions. Those reflect the “views” of Republican chairmen.
Mann also seems to be accused of being involved in too many different research papers, and ALSO of being too isolated from the field (which field is left unclear).
TCO 07.15.06 at 4:30 pm
Thanks for the cite. It was not a big deal.
So you agree with MM, NAS and Wegman on the points where they all say Mann was in error? Why won’t he address what the off-centering was? Was it a mistake? If it’s a minor error, fine–but what does it show about a man that he won’t be pinned down? That is not scientific. It’s debate style.
I think in hard core statistics, Wegman has a superior background to anyone on the NAS panel. Look at all the awards and pubs on his CV. He is also head of the NAS Applied Stats Panel.
The scopes of the Wegman report and the NAS report are different. NAS broadened their topic a great deal and then failed to deal with the initial particulars. In its discussion of Mann’s practices, the Wegman report, though short, has more detail and at least one new finding.
You haven’t shown how the social network was nonsense. It may be unremarkable. But not nonsense. And even if it were, why would that indict the stats discussion? Surely you can noodle through the logic of something written at that level. No need to base your feeling about one part of a document (the first part you read, btw) on a very different part. Or an argument on the arguer?
Thank you also for your response on the similarity of Mann’s network to yours.
Steve Bloom 07.15.06 at 5:29 pm
TCO, assuming for the sake of argument that the criticisms of MBH 98/99 are correct, I’m curious why you think Mann and his co-authors should be compelled to admit that in public. Can you list other examples of this? Isn’t it the standard practice to allow such papers to be corrected by new work? Why should the science community respond positively to such a novel request for an “outing” if those demanding it are largely global warming denialists with no real interest in the science?
John Quiggin 07.15.06 at 5:41 pm
TCO, you asked specifically about the network analysis, so it’s a bit rich to complain that I’m focusing on it. I haven’t got time to read a fraction of what I should, so if the first thing I read in a report contains unsupported conclusions explainable only by incompetence or partisan bias, I usually don’t read the rest.
TCO 07.15.06 at 7:55 pm
23. I don’t think that the report is saying that Mann is isolated. It is saying that the group is heavily connected and relatively isolated from a larger group. The report also makes the point that there is no connectedness to hard-core statistics (when what is being brought forward are novel statistical methods and general statistical analyses).
TCO 07.15.06 at 7:57 pm
25. Steve, yes I do think that errors should be admitted, clarified, etc. There may be things that are right in Mann’s work (or in any flawed paper). It is very important to show what is right and wrong and be very selfless about a pointed out error. To deny it or to evade it is childish. I say the same to Steve, btw. This is a basic philosophy of science. If Mann won’t admit an error, he confuses things. He should WANT to do so. Because he should WANT truth to be forwarded.
TCO 07.15.06 at 7:59 pm
26. Quig, I meant it that I appreciated your review of the network. I remember reading the Eurosong discussion a long time ago. I accept that you have not read the other parts of the report and can not discuss those at length.
TCO 07.15.06 at 8:03 pm
25. I think part of the problem is that Mann sees this as outing (or that you do). The right thing is to be completely candid and revealing about any error. And “moving on” is not acceptable. If people dwell on it after it is resolved (like the radian thing, which Ross DID admit to and correct, or the Christy error, which he DID admit to and correct) then sure, that is silly. But bottom line, this should be treated just like in school. If you are a teacher and a student points out a mistake in a derivation or something, just admit it! You don’t want to confuse the class. And damn well don’t allow some of your followers to think that it WAS NOT AN ERROR if it was. He needs to clarify. This is basic. It is like freshman chemistry.
Michael 07.15.06 at 8:21 pm
I must admit I don’t understand why this discussion has taken the course it has.
I’ve been vaguely following the Mann issues for a while, and been rather confused about the ‘he said; she said’ nature of it.
So I was very happy to see this report. The math is very clear, simple and unambiguous. It’s not a matter of appeal to authority, or the background or various players, or any such thing. It’s just the math.
I don’t understand why the comments about the background of Mann or the report’s authors or anyone else are relevant.
If someone tells me that 2+2 != 5 under the normal rules of arithmetic, I don’t need to search their CV for hidden bias; I can simply see for myself that that’s right. If it were an assertion about simply connected homologies, I could understand it (I don’t follow that stuff at all :), but basic matrix algebra is not exactly rocket science.
In this case, it’s really clear that the PCA is useless unless you fully center the data. If Mann didn’t fully center the data, then his analysis is useless and no conclusions can be drawn from it. This is really very basic math and logic. Why would anything else be relevant?
Frankly, I don’t understand why the distraction of the network stuff is even in the report. That’s all very subjective and handwavy, particularly compared to the PCA stuff.
Of course, taking my word for it would be pointless. Read the math in the report regarding the PCA.
TCO 07.15.06 at 8:53 pm
Quig, could you audit some of Wegman’s other papers with social network stuff in them? Are they decent?
Eli Rabett 07.15.06 at 8:55 pm
To beat a dead horse. Said was Wegman’s student at George Mason, there are lots of rightwing policy types at George Mason which houses Fred Singer’s SEPP among other things. Wegman and Scott are not unacquainted with each other.
The point that shows that WSS were simply blowing smoke is that MBH joined together three communities, long term instrumental trends (Bradley), dendrology (Hughes) and reconstructions (Mann). MBH did not represent a single group.
Mark Shapiro 07.15.06 at 9:28 pm
Glancing at the CVs for Wegman and Scott, both are life long, extremely well accomplished statisticians, having worked in academe, industry, and government. Both have worked on computational statistics – working with huge datasets, so they are familiar with institutions with money and power. Both worked in the Office of Naval Research, Wegman on SDI, (Star Wars program), and Scott has worked several times for the National Security Agency. Both have worked for the Interface of Computing and Statistics Foundation, which is partly funded by the NSA.
So I would not argue with them about the statistics, and in a perfect world, they (or folks like them) could be useful to climatologists looking for analytical firepower. In fact, pausing to look at some puny dendrochronology datasets could be a comedown for them.
What motivates them?
Charlie (Colorado) 07.15.06 at 10:37 pm
I would be interested in how this compares with other fields/networks. Obviously Wegman did not address this, but perhaps you, who have seen a lot of networks, can.
Although I didn’t do it as formal network analysis (I’m not sure social network analysis had even been invented yet) I ran into a very similar phenomenon when looking at James Fetzer’s paper “Program Verification: The Very Idea”. Jim had come to some conclusions about the overall status of program verification which, when examined, turned out to arise because he’d started with a few famous people (EW Dijkstra, CAR Hoare) and followed references in their papers. It turned out that Dijkstra, Hoare, et al were central figures in what might be called the “Oxford group” in verification; because he wasn’t aware of the general literature, this led him to think the “Oxford group” was the only group.
The effect was to make it appear that there was a global consensus on a particular point, when in fact there was no such consensus.
Charlie (Colorado) 07.15.06 at 10:38 pm
I’d wager that they include some energy-company money, even if laundered through a ‘think tank’ or two.
Can’t you even wait to have some evidence before you start the ad hominem circumstantials?
John Quiggin 07.16.06 at 2:36 am
As far as I can tell from a quick scan of his resume and research interests, Wegman has never worked on social networks before. Googling Wegman+”social network” produces Climate Audit (McIntyre’s site) as top hit.
BTW, the CA posters and commenters seem very impressed by this aspect, even though, as I think is clear from my summary, it shows nothing beyond tautologies like “Mann writes with his co-authors” and “all the authors of a paper are co-authors”.
mikep 07.16.06 at 6:18 am
A lot of ad hominem attacks here which are best ignored. Apart from the social network analysis there is another new point in the Wegman report. This is that the Mann method (decentred “PCA”) does not just mine for hockey sticks (I take it that we all agree that this has now been conclusively demonstrated, since you say this point is not new?), but will give such weight to any arbitrary shape included in a group of other series (however numerous) which are white noise that the shaped series dominates the so-called first principal component.
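A rough illustration of this decentring point (my own toy code, not the report’s analysis; the sizes, the uptick, and the “short-centring” window are arbitrary assumptions): subtracting only the mean of the final segment of each series, rather than the full-period mean, lets a single shaped series dominate PC1 of otherwise white-noise data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_series, recent = 500, 70, 80        # toy sizes, not MBH's

X = rng.normal(size=(n_years, n_series))       # white-noise "proxies"
X[-recent:, 0] += 1.5                          # one series with a late uptick

def pc1_loadings(data, short_centre=False):
    if short_centre:
        centred = data - data[-recent:].mean(axis=0)   # "short-centred"
    else:
        centred = data - data.mean(axis=0)             # conventional centring
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return np.abs(vt[0])                               # PC1 loadings per series

print("short-centred loading on the shaped series:",
      round(pc1_loadings(X, short_centre=True)[0], 2))
print("fully centred loading on the shaped series:",
      round(pc1_loadings(X, short_centre=False)[0], 2))
# Under short-centring the shaped series carries most of PC1's weight;
# under full centring its loading does not stand out nearly as much.
```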
bi 07.16.06 at 6:39 am
mikep: I thought the “social network analysis” part was itself just a more elaborate form of ad hominem attack. If Wegman et al. just want to attack the papers’ arguments rather than their persons behind them, why include a social network analysis at all?
TCO 07.16.06 at 7:04 am
John:
1. I appreciated your taking a look at that part of the report.
2. I don’t think that it only showed tautologies. As discussed before, I think those remarks were explanatory, not “ahas”. I think you are mistaken on that reading. I really don’t think that he is so stupid to think it notable that a group of co-authors are all co-authors. But I understand that this is your reading.
3. There is more to it than that: comments on various nodes, the second database, the commonality of the proxies.
4. What I sense is missing (and perhaps this is your point) is some numerical expression of results in the network. I’m not an aficionado of the field, but I thought there were ways to quantify connectedness, etc.
TCO 07.16.06 at 7:12 am
Eli (and John):
To me it is not a smoking gun or an indictment of Mann’s work that he is part of a club. If there are people at CA who think that this is evil or implicates Mann of mistakes, well that is wrong. (I haven’t seen so much of that, but if it occurs it’s wrong.)
Why, it would be just as wrong as not trusting Wegman’s results because he worked with former students! Or not trusting scientists who are conservative politically or have grants from Exxon.
No, to me, the work is EXPLORATORY. It says GIVEN that MBH made a bunch of mistakes (off-centering, writing papers that did not describe method, writing papers with vague/non-quantitative descriptions of statistical efficacy, bristlecones, etc.), GIVEN THAT, what can we learn by looking at the field, the network. So Wegman is not saying that the network proves that Mann made mistakes. He’s already shown that, MM has shown it, NAS has shown it. People are already starting to say that they accept that he made mistakes (only Mann remains truly truculent). No, what the network does (and to me it is just exploratory, just playing) is say: look at this field. Look at how small it is. How interconnected. And WHERE ARE THE CONNECTIONS to main-line stats for people who are doing stats work, are developing “new methods”, are not qualifying the new methods theoretically in the stat literature, and hence are screwing up the stats and not discovering it until outsiders vet their work.
TCO 07.16.06 at 7:19 am
By the way, I’m not just kissing your butt when I say that I appreciate your looking at the network. I know (well, I would Bayesian-bet on it!) that we don’t agree on politics, but I am intrigued by the concept of networks and by your field.
Also, to be clear: my interest is casual. I have no training in it. I have maybe looked at a few papers (some of them in places like HBR or SA or AS), a long time ago. Oh… plus I read the Euro-song kerfuffle here. Was really bummed that the comments were closed! The idea of social scientists butt-slapping physicists over math stuff was too delicious…
mikep 07.16.06 at 8:38 am
Re 35: presumably to explain why papers which were so vulnerable to the criticisms that M&M made got through the peer review process. As they state:
“if a given discipline area is small and the authors in the area are tightly coupled, then this process is likely to turn up very sympathetic referees”.
The analysis indeed shows that the discipline area is small and tightly coupled (and moreover reuses many of the same proxies).
bi 07.16.06 at 8:54 am
mikep: Then it’s a very silly and half-baked explanation. Have Wegman et al. compared Mann et al.’s social networks against their own social networks? Have they considered that, even if there were mistakes in Mann et al.’s work, the referees might have accepted them due to oversight rather than ideology? See e.g. the Bogdanov affair — Baez says, “I have obtained the referees’ reports on three of the Bogdanoff’s papers, which confirms that indeed, some referees were more interested in correcting minor typos than checking the logic of the papers.”
Surely, Wegman et al.’s work looks pretty vulnerable as well.
TCO 07.16.06 at 9:16 am
I agree with the need for comparison to other networks to understand whether this one is remarkable. Sure, Wegman might have a tight network. But he doesn’t MAKE STATS ERRORS the way that Mann does. (So the network comparison might refute the peer-review-fault hypothesis, but it won’t hurt Wegman’s reputation.) He’s not trying to say Mann is a dork for the network he’s in. He’s saying Mann is a dork for his stats, and how did he stay alive? Clearly, by being in a field with people who couldn’t really challenge the stats effectively. By publishing “new methods of stats” in a backwater of mathematics rather than by vetting them in front of real statisticians.
TCO 07.16.06 at 9:26 am
Quig:
Thanks for your take on his other social network papers. I thought he had some, but now that I look back over his CV, while there are some network-statistics papers, there don’t seem to be any hardcore network papers.
http://www.galaxy.gmu.edu/stats/faculty/wegman.html
http://www.galaxy.gmu.edu/stats/faculty/wegman.resume2.htm
Off-topic: what is the best general overview of the field of social networks?
TCO 07.16.06 at 9:56 am
I’d be interested in the shape of other people’s networks in other fields. Is paleoclimatology different in connectedness from solid state physics? Or mainline statistics? Anyone have a good source for networks like those in the Wegman report, but for other fields? Hmm… will start Google Scholaring.
TCO 07.16.06 at 9:56 am
Heck, let’s see Wegman’s network. I’m all about learning.
bi 07.16.06 at 10:28 am
TCO: maybe Wegman et al. don’t make statistical errors, but if they just pick one hypothesis and try to prove that without considering any competing hypothesis, that’s worse than a “mistake” — it’s not even science.
Who _are_ the referees of Mann et al.’s papers, anyway? This bit’s not mentioned in the report.
Hank Roberts 07.16.06 at 10:47 am
“There is a quotation by George Box that is well known among statisticians.
All models are wrong. Some models are useful.
– George Box
“Most phenomena can be modeled in many ways. There is a danger of a statistician unfamiliar with the subject matter of a study choosing a model for its mathematical tractability or familiarity rather than because it aids understanding. A statistician working on paleoclimate data needs to have an understanding of climate processes in order to create truly useful models. I’m not sure Wegman et al. realize this.”
Mentioned here:
http://www.realclimate.org/index.php/archives/2006/07/the-discovery-of-global-warming-update/#comment-15693
Assistant Village Idiot 07.16.06 at 1:53 pm
I will admit at the outset that much of this is over my head, and I may be missing something basic. But perhaps because it is over my head, the errors in thinking that trickle down to my level should be all the more worrisome.
Mann, et al, made several claims. Others disputed one key claim. Accusations went up that the “others” must have bad motives. Lots of kicking and screaming ensues.
Wegman, et al, conclude that the key claim was indeed wrong. They give evidence how it might have gone wrong, or how its wrongness was missed. Their method suggests that there may be an idea-reinforcing nature to Mann’s professional network which reduces objectivity.
This seems unremarkable to me. Perhaps there is enormous worry that warming denialists will over-interpret the result and say “Mann is just a bunch of hooey, neener neener.” That may be a legitimate problem but it is not Wegman’s problem. It certainly doesn’t invite the sort of sneering attack of the “Well, Wegman’s worse” variety I see in the comments here. There seems to be much that is NOT claimed by Wegman that remains important.
Even with the hockey stick graph in disrepute, much other evidence for warming may hold up. Wegman has taken a swipe at an important bit, but not the whole idea.
Mann’s network, if it were ten times narrower, would be an embarrassment, and if ten times more expansive would be impossibly comprehensive. Wegman’s suggestion is that the network may be a bit too small and self-reinforcing, but not that it is without value, or even that it is particularly unusual. I would imagine that most areas of study look like this, and Wegman may be merely showing one example of the limitations of the networks in any controversial field.
OT: I generally think that the data, as well as I understand it, supports the notion of warming. But the sneering anger and childish insults that are commonly directed at the GW denialists cause me to doubt the objectivity of the accusers. You aren’t helping yourselves make your points.
John Quiggin 07.16.06 at 7:23 pm
A good general reference on networks (recommended to me by Eszter) is:
Wasserman, Stanley, and Katherine Faust. 1994. Social Network Analysis: Methods and Applications. Cambridge: Cambridge University Press. Pp. 1–424, 461–599.
The broader issues in this debate have been covered by the NAS report. Again, to restate the obvious, a partisan document like the Wegman report is redundant where it agrees with the NAS study and unreliable where it does not.
bi 07.17.06 at 5:26 am
Assistant Village Idiot: interesting you’re so much against “sneering”, considering that you’ve spoken out in the past against Political Correctness. What’s wrong with some sneering if those who sneer are in the right?
And why should one pull any punches in pointing out that Wegman et al. is worse, if indeed it _is_ worse? If you start out with only one hypothesis and try to prove that, then it’s not science, it’s begging the question.
Functional 07.17.06 at 10:18 am
Functional, as I’m sure you’re aware, I mean that everything else in the report consists of issues that have been addressed by the NAS report, which is obviously more credible.
The fact that the main new contribution (the 43-author network analysis) is transparent nonsense tells me what weight to place on questions where Wegman and NAS disagree.
If you think it’s transparent nonsense, then I’m not sure what to say.
The first part of Wegman’s analysis is to show this: How many people Mann is connected to, and how strong the connections are. It’s not useful to poke fun at this first step as if it merely showed that Mann co-authored papers with his co-authors.
The second part of Wegman’s analysis is to ask: How many people publish in this field, and to what extent do they overlap with the network of people connected with Mann? Wegman finds that there is a large degree of overlap, which makes it less likely that Mann’s “peers” are really able to review his work independently. Wegman also finds that Mann’s “peers” often “corroborate” his work by using the exact same datasets — which makes the fact of corroboration not very surprising.
Do you have any legitimate reasons to think that this analysis is worthless? Let alone to think that the rest of Wegman’s conclusions are worthless as well?
JohnLopresti 07.17.06 at 10:46 am
JQ, I thought of a similar problem to Mann’s, discussed in 2001 in a series of 3 issues of the periodical Infinite Energy; here is a link to one of the 2001 IE issues. Perhaps you are acquainted with the early experiments in physics of relativity; search the linked IE document for the “Michelson” experiment.
Replication of experiment results, and peer review are essential. But, in the rarified atmosphere of some new trancepts of science, occasionally the peer groups are diminutive and isolated. I once heard a lecture by a volcanologist that reminded me of this from that perspective. However, in Mann’s field there is the IGU’s yearly meeting, a gathering of thousands. I have found the guidance of one of the contributors, above, HR, a sure beacon through the diverse field of interests in climate change. In picturesque fashion, by comparison, the reading of the Correa article about the Michelson experiment in Einstein’s era reveals some of the same pitfalls of peerage in a time when communications were by penned correspondence before the dawn of the typewriter.
In politics we simply call it the two party system, a natural antipodal configuration of a healthy public governance dialog.
In science, the results of ‘schools’ of thought upon the process of peer review can have a more deleterious effect upon the pace of progress and the usefulness of insight which it yields. Some of these shortcomings, I suppose, are part of your topological analysis at the outset of your article above.
bi 07.17.06 at 11:20 am
Functional: there isn’t even a large degree of overlap. The report itself says that “Tett, Briffa and Cook emerge as belonging to their own cluster and they also exhibit high centrality” (emphasis mine).
Bryan McRoberts 07.17.06 at 1:29 pm
Sorry – but how exactly is getting a statistics group to analyze a group of reports so easily lumped into the ‘war on science’? It seems to me that they’re using science and providing their facts and why they lead to a different conclusion than the conclusions in the opposing reports. Isn’t this what science and scientists have done all along? If it were a war on science, they would fall back on non-scientific rationale… i.e. “because I say so”, or “because everyone knows this”. It appears that Mr. Quiggin thinks that the war on science is anything that attempts to debunk anything he accepts as scientific fact. It was accepted fact that the earth was flat and the center of the universe. We’ve come a long way since then, despite those who resisted challenging the accepted science of the day. The networking part of the report is interesting and would, at best, be an interesting reason for why the opposing group of reports could all be viewed as being produced by a single entity. In the end you still need to go through and look at each report critically, no matter what the connections are between the authors of those reports.
John Quiggin 07.17.06 at 5:27 pm
“How many people Mann is connected to, and how strong the connections are. It’s not useful to poke fun at this first step as if it merely showed that Mann co-authored papers with his co-authors.”
Since “connected” = “co-authored with”, this is precisely what the first graph shows, and all it can show.
The second analysis might be interesting if it showed anything significant, but as bi restates, it doesn’t.
John Quiggin 07.17.06 at 5:28 pm
I did a count and, as it happens, I also have 42 co-authors. Can I now claim to be a central figure in economics?
fulldroolcup 07.17.06 at 6:33 pm
Is the “al” in et al. our peer-reviewed global climate expert Al Gore?
Just askin’, is all….. ;)
John Quiggin 07.17.06 at 8:07 pm
If you’ve been following the debate at all, bryan, you’ll be aware that Mann’s work has been repeatedly attacked by right-wing think tanks and people associated with them (such as McKitrick). These groups don’t have any interest in Principal Components Analysis or related issues, and would take entirely the opposite line if the policy conclusions were reversed. Barton’s committee was part of that process, as I mentioned in the intro.
On your “flat earth” point, I’d be interested in the names of the scientists who advocated the flat earth theory, and even more interested if you could point to right-wing think tanks (or their historical equivalent) who took the opposite view.
JohnLopresti 07.17.06 at 9:46 pm
JQ, Next I am going to be referring your method to a noted scholar who published a book on the history of presidential signing statements; though, I think his difficulty requires a neural networks equation. The problem as I understand it is Boston Globe’s Charlie Savage has counted about 780 Bush statements since 2001 which “purport” to negate statute; but the OLC person who represented the administration in Senate Judiciary Committee hearings approximately last week cited only 112 GWBush constitutional blue pencillings; that is quite a disparity of estimates. The scholar to whom I refer likes to examine each memorandum’s cites by statutory provision; I think the OLC person counts as one statement a memo that enumerates, perhaps a dozen or more displeasing statutes. Then there is the permutation factor if one takes the scholar’s approach: negation of ten provisions in the current memorandum multiplied by the number of other laws which rely upon the negated statute’s individually blue pencilled provisions for validity. Sociologic statistics seems to approach neural networks, though. I will send the scholar person to glance at this nice thread you have launched, and let him take it from there, should he be interested in exploring the math of signing statements further.
chrisl 07.18.06 at 3:31 am
One man’s “peer review” is another man’s “war on science”
And a REPUBLICAN war on science
(Boo Hiss)
Spinning nicely Mr Quiggin
John Quiggin 07.18.06 at 4:18 am
If your “peer review” is the kind undertaken by Barton’s Committee and the Marshall Institute then it is indeed my “war on science”.
chrisl 07.18.06 at 5:22 am
Again, to restate the obvious, a partisan document like the Wegman report is redundant where it agrees with the NAS study and unreliable where it does not.
La la la I’m not listening
T J Olson 07.18.06 at 6:12 am
Hurricane prediction expert William Gray at Colorado State University has joined “the denialists.” http://www.westword.com/Issues/2006-06-29/news/feature.html
Actually, according to this survey of 530 climatologists a few years ago, “the denialists” have lots of company.
http://www.sepp.org/NewSEPP/Bray.htm
While these scientists definitely lean toward anthropogenic global warming, there is definitely no “consensus.”
Far from being accurately characterized as “right wing”, as John Quiggin believes, neither Steve McIntyre nor William Gray (see article) is sensibly described as such. The matter of AGW has become a religious issue for many people – an article of faith beyond falsifiability, as evident here on this thread.
That’s unfortunate. Considering that billions of dollars annually are spent on such research, one would think that the people had a right to know whether such consequential work could be corroborated or not. Apparently, the journal “Nature” believed that even the existence of dissent among climatologists was not important enough to publish.
John Quiggin 07.18.06 at 6:26 am
Linking to SEPP is not exactly the best way to convince anyone that opposition to AGW is apolitical. And you’ve apparently not read the Achenbach interview in which Gray describes AGW theory as a plot to bring in world government. As for McIntyre & McKitrick, they are about as deeply enmeshed in rightwing political networks as it is possible to be.
John Quiggin 07.18.06 at 6:28 am
“La la la I’m not listening”
Indeed you’re not, but you don’t have to announce the fact.
chrisl 07.18.06 at 7:50 am
Hmmm I was referring to you
Tim Lambert 07.18.06 at 9:23 am
TJ Olson, Bray’s survey of climatologists was rubbish. See here.