unsurprisingly, contained data that contradicted the popular hypothesis. Crucially, no other papers refuted or critiqued this contradictory data. Instead, those publications were simply ignored.
Using the interlocking web of citations, you can see how this happened. A small number of review papers funnelled large amounts of traffic through the network, with 63 per cent of all citation paths flowing through one review paper, and 95 per cent of all citation paths flowing through just four review papers by the same research group. These papers acted like a lens, collecting and focusing citations – and scientists’ attention – on the papers supporting the hypothesis, a testament to the power of a well-received review paper.
But Greenberg went beyond just documenting bias in what research was referenced in each review paper. By studying the network, in which review papers are themselves cited by future research papers, he showed how these reviews exerted influence beyond their own individual readerships, and distorted the subsequent discourse, by setting a frame around only some papers.
And by studying the citations in detail, he went further again. Some papers did cite research that contradicted the popular hypothesis, for example, but distorted it. One laboratory paper reported no β amyloid in three of five patients with IBM, and its presence in only a ‘few fibres’ in the remaining two patients; but three subsequent papers cited these data, saying that they ‘confirmed’ the hypothesis. This is an exaggeration at best, but the power of the social network theory approach is to show what happened next: over the following ten years, these three supportive citations were the root of 7,848 supportive citation paths, producing chains of false claims in the network and amplifying the distortion.
Similarly, many papers presented aspects of the β amyloid hypothesis as a theory – but gradually, through incremental mis-statement along a chain of references, these papers came to be cited as if they had proved the hypothesis as fact, with experimental evidence, which they had not.
This is the story of how myths and misapprehensions arise. Greenberg might have found a mess, but instead he found a web of systematic and self-reinforcing distortion, resulting in the creation of a myth, ultimately retarding our understanding of a disease, and so harming patients. That’s why systematic reviews are important, that’s why incremental mis-statement matters, and that’s why ghost writing should be stopped.
Publish or Be Damned
Guardian, 4 August 2005
I have a very long memory. So often with ‘science by press release’, newspapers will cover a story even though the scientific paper doesn’t exist, assuming it’s around the corner. In February 2004 the Daily Mail was saying that cod liver oil is ‘nature’s superdrug’. The Independent wrote: ‘They’re not yet saying it can enable you to stop a bullet or leap tall buildings, but it’s not far short of that.’ These glowing stories were based on a press release from Cardiff University, describing a study looking at the effect of cod liver oil on some enzymes – no idea which – that have something to do with cartilage – no idea what. I had no way of knowing whether the study was significant, valid or reliable. Nobody did, because it wasn’t published. No methods, results, conclusions to appraise. Nothing.
In 1998 Dr Arpad Pusztai announced through the telly that genetically modified potatoes ‘caused toxicity to rats’. Everyone was extremely interested in this research. So what had he done in his lab? What were the rats fed? What had he measured? A year later the paper was published, and it was significantly flawed. In the intervening year, nobody had been able to replicate his data or verify the supposed danger of GM, because we hadn’t seen the write-up – the academic paper. How could anyone examine, let alone have a chance to rebut, Pusztai’s claims? Peer review is just the start;