|Lies from The Lancet about Iraq|
Oct. 25th, 2007 at 10:33 pm
A controversial October 2004 report from The Lancet compared pre-invasion and post-invasion mortality rates in Iraq. Discussion of this report surfaced in comments to Dan Ellsworth’s excellent post on the Iraq War here and here (by Mark IV and Naismith respectively). As usual, I agree with Mark IV: The report by The Lancet deserves as much scorn as we can heap upon it.
The report tries to compare mortality rates in the 18 months preceding the invasion with those in the 18 months following it, estimating an increase of 100,000 to 150,000 deaths.
This is a very tenuous claim, and it results from a methodology that was far from rigorous. Moreover, The Lancet promoted the report using language that tended to deceive casual readers.
This article will first show how The Lancet used deceitful hype to promote the report, and then discuss some of the shortcomings of the report's results.
The Lancet’s press release was deceptively entitled “100,000 Excess Civilian Deaths after Iraq Invasion.” This was also the title of the link to the report that appeared on the front page of The Lancet’s web site when it promoted the report in late 2004. In like manner, the opening paragraph of the press release boldly announces,
The sensationalistic title and the wording of this press release led many to conclude that The Lancet accused America and its allies of killing 100,000 Iraqis. Only later in the article do we read:
Many readers will be surprised to learn that the actual title of the report was, “Mortality before and after the 2003 invasion of Iraq: cluster sample survey.” This accurately reflects what the report intended to measure and compare; viz., pre-invasion and post-invasion mortality rates, including (for example) infant mortality. The promotional material seems designed to imply that this prestigious, peer-reviewed journal made far bolder claims than the report actually makes.
But the report’s claims, modest though they may be, do not withstand even a cursory examination.
The report sought to measure pre-invasion (Jan 2002 – Mar 2003) and post-invasion (Mar 2003 – Sep 2004) mortality rates using largely unverified reports collected from households in Sep 2004. Respondents were simply asked to recall how many people in their household had died and when. Seriously, that's all. This is not an accepted method for determining mortality rates, and it's appalling to see such work put forth in a prestigious, peer-reviewed journal. All we have is recall data, and people tend to underestimate the passage of time since a death once it has receded into the indefinite past; The Lancet is relying on exactly the kind of data that inflates post-invasion mortality rates.
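To make concrete what a headline figure like "100,000 excess deaths" arithmetically rests on, here is a minimal sketch of the crude-mortality-rate extrapolation that household surveys of this kind use. Every number below is invented for illustration; none of them comes from the report itself.

```python
# Hypothetical sketch of the excess-deaths arithmetic behind a
# household recall survey. All survey totals and the population
# figure are invented for illustration, NOT the study's data.

def crude_mortality_rate(deaths, people, months):
    """Deaths per 1,000 people per year."""
    person_years = people * (months / 12)
    return deaths / person_years * 1000

# Invented survey totals: people covered and deaths they recalled.
pre_cmr = crude_mortality_rate(deaths=40, people=7800, months=15)   # pre-invasion window
post_cmr = crude_mortality_rate(deaths=90, people=7800, months=18)  # post-invasion window

# Extrapolate the rate difference to an assumed national population
# over the post-invasion period.
population = 24_000_000
years_post = 18 / 12
excess = (post_cmr - pre_cmr) / 1000 * population * years_post

print(f"pre: {pre_cmr:.1f}, post: {post_cmr:.1f} deaths per 1,000/yr")
print(f"implied excess deaths: {excess:,.0f}")
```

Note how the whole six-figure estimate hinges on a small handful of recalled deaths per thousand person-years: shift the reported timing of a few dozen deaths across the invasion boundary and the extrapolated total swings by tens of thousands.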
The report itself acknowledges the following shortcomings:
This is not the type of report that warrants a claim of 100,000 to 150,000 additional deaths.
Human bias makes us less motivated to critically evaluate opinions we are sympathetic to, and more motivated to pick apart opinions we are not. I’m happy to admit that my own biases motivate my criticism of the report. I just wish that those who embrace it could be equally candid about why they tout its peer-reviewed status.