In 2009, health authorities around the world were stockpiling neuraminidase inhibitors such as oseltamivir (Tamiflu) in anticipation of a feared influenza pandemic.
Billions of dollars were spent on the belief that the drugs would help prevent and treat the disease, providing a windfall for the pharmaceutical industry, particularly Tamiflu manufacturer Roche.
I’ve written about the Tamiflu saga before.
The original stockpiling decision was based on a 2000 Cochrane Review of eight trials that concluded neuraminidase inhibitors were effective for prevention and treatment of influenza.
But that conclusion has since been undermined by a number of researchers, including Bond University’s Professor Chris Del Mar, in a turnabout that raises questions about some of the foundations of evidence-based medicine.
The most recent Cochrane Review on the subject — of which Professor Del Mar is one of the authors — concludes the only benefit of oseltamivir for which there is clear evidence is a slight reduction in duration of symptoms (by around 21 hours).
Not really billions of dollars worth of effect there.
So why the different conclusions?
Essentially, it all comes down to the data researchers look at when conducting these kinds of reviews.
For decades, the systematic review of published randomised controlled trials has been considered the gold standard in medical research, and this was what the original Cochrane reviewers did.
By combining data from all published trials on a particular subject, researchers are able to see effects in much larger numbers of people than would typically be included in a single trial, in theory making their conclusions more powerful.
In theory. The problem in reality is a small thing called publication bias.
Some trials are simply more likely to be published than others, potentially skewing the results of this kind of meta-analysis.
It’s not just that industry sponsors are obviously going to be keener to publish trials that show benefits rather than harms. Researchers tend to be more enthusiastic about publishing positive results, and journals are more likely to accept those papers too.
So any systematic review of published evidence runs the risk of including a disproportionate number of trials that are positive about the particular drug being studied, while omitting many of those that found no benefit or were inconclusive.
Kind of makes you question the whole enterprise, really.
And that is precisely what Professor Del Mar and colleagues did in revising the Cochrane Review.
They abandoned the published trial reports and went instead to the unpublished clinical study reports (CSRs) the industry relies on when seeking approval for new drugs from regulators — when they could get them, that is (the industry is not always willing to share).
CSRs are huge, lumbering beasts of things — one study found they had a median length of 1000 pages — and they contain far more data than can be found in the public record, including results of unpublished trials and unreported findings from published ones.
It’s interesting to wonder how many other pieces of accepted medical wisdom might be overturned if systematic reviews were routinely based on the data in these documents rather than on published trials.
A group of German researchers last week called for CSRs to be made publicly available, based on their comparison of 101 of them with published trial results for the same drugs.
They found a substantial amount of information on patient-relevant outcomes was missing from the public record, potentially undermining the basis for clinical decision making and for evaluation of comparative effectiveness of rival drugs.
In fact, the CSRs contained more than twice as much information on patient-relevant outcomes — benefits and harms — as could be found in the public domain (86% of outcomes were fully reported in the CSRs, compared with 39% in public sources).
The pharmaceutical industry appears unwilling to share this information without a fight but, if we’re really committed to evidence-based medicine, maybe it should be a condition of licensing that they do.
Jane McCredie is a Sydney-based science and medicine writer.