It has been a good while since I reported on drugs like epoetin, which are used to stimulate the body to make more red blood cells to counteract the anemia associated with end-stage kidney disease and cancer. My earlier posts (http://brodyhooked.blogspot.com/search?q=dialysis) commented mostly on the payment system that tempted dialysis and cancer docs to prescribe more of these drugs than was necessary or safe. But new information now suggests that the truth about the harms and lack of benefit of these drugs could have been known much sooner, and many patients saved from bad outcomes. Sadly, this is not the first such story in the history of the medicine-pharmaceutical industry relationship.
Dr. Daniel W. Coyne, a kidney specialist at Washington University-St. Louis, has written both a detailed paper and a briefer commentary about his research.
I'm going to go into some detail about Dr. Coyne's findings as they illustrate several points, both about how scientific studies can be fudged to create a positive drug marketing message, and also how our bureaucracy works (or doesn't) to protect patient safety.
Before more recent research in the mid-2000s showed quite definitively that higher doses of epoetin-type drugs produced more strokes and other bad outcomes, the main research trial informing kidney guideline writers was the Normal Hematocrit Trial, conducted in 1996 and published in the New England Journal in 1998 (http://www.nejm.org/doi/full/10.1056/NEJM199808273390903). This study was stopped early because of concerns that the higher doses of epoetin were causing more adverse reactions (more on that later). The journal article reported that on careful statistical analysis, no serious safety issues were found, but that by contrast, quality of life improved in the groups receiving the higher doses of epoetin (that is, those whose red blood cells achieved higher levels, which usually requires higher doses of the drug).
What Dr. Coyne did was simple. The drug company, Amgen, had to submit its own report of the study data to the FDA. As is usually the case, the FDA is under no obligation to release these data publicly, and is indeed usually prevented from doing so by the proprietary nature of a company-sponsored study. Dr. Coyne requested the data under the Freedom of Information Act, and obtained them after being kept waiting a mere 3-1/2 years (more on that later too). He then sat down and compared in detail the study data reported to the FDA with the same study as reported in the medical journal in 1998.
First, the safety data. According to the New England Journal, the primary study endpoint, death or non-fatal heart attack, showed no difference between the high- and lower-dose groups. But the reason the authors concluded this was that the data monitoring board that stopped the study early insisted on a tighter threshold for statistical significance, reportedly to correct for the fact that the same data had undergone multiple prior statistical analyses. The usual threshold is P = 0.05, but against the stricter level of P = 0.008, there was no statistically significant difference. When Coyne looked at the data the drug company submitted to the FDA, he saw that with the unadjusted significance test, there were more adverse events in the higher-dose group, with P = 0.0119, which would normally be interpreted as quite a significant result.
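To make the arithmetic concrete, here is a minimal sketch, using only the three figures quoted above (the variable and function names are mine, not the trial's), of how the very same observed result flips between "significant" and "not significant" depending on which threshold is applied:

```python
# Illustrative only: the P values are those quoted in this post,
# not a reanalysis of the trial's data.

observed_p = 0.0119          # unadjusted P for excess adverse events (FDA filing)
conventional_alpha = 0.05    # the usual significance threshold
adjusted_alpha = 0.008       # the stricter threshold the monitoring board demanded

def significant(p, alpha):
    """A result is declared significant when p falls below alpha."""
    return p < alpha

print(significant(observed_p, conventional_alpha))  # True: harm signal is significant
print(significant(observed_p, adjusted_alpha))      # False: harm signal vanishes
```

Nothing about the data changes between the two lines; only the declared cutoff does.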
Time out for a sidebar on early stoppage of clinical trials. As I previously blogged (for example, http://brodyhooked.blogspot.com/2011/03/how-honest-reports-of-research-can.html), when drug trials are stopped early because a drug seems superior to the control, it is commonly found with later research that the result is spurious, and that had the trial been continued to its scheduled completion, there would have been no difference between drug and placebo (or other comparator). What we see in the case of the Normal Hematocrit Trial appears to suggest a double standard for industry-sponsored trials. If the trial is stopped early because the company's drug looks good, then that result is trumpeted as the truth, even if more study would cast doubt on the conclusion. If on the other hand a study is stopped early because the drug causes people to die, then the investigators get to move the goalposts and fudge the statistics, so that it turns out that those people did not really die after all. Without going into all the issues about whether data monitoring boards are truly independent of the study sponsors, and how early stopping can lead to misleading results even if the boards are totally kosher, it would seem a valid take-home lesson that we should automatically be very skeptical whenever a drug trial is stopped early.
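The multiple-looks problem that the monitoring board's correction was nominally meant to address is real and easy to demonstrate. Below is a minimal Monte Carlo sketch of my own (the parameters are arbitrary illustrations, not modeled on the trial's design): if a trial with no true effect is tested at the nominal 5% level at each of several interim looks, the chance of a false "significant" finding at some look climbs well above 5%. For five equally spaced looks, standard theory puts the inflated rate at roughly 14%:

```python
import math
import random

def trial_with_peeks(n_per_look=50, looks=5, z_crit=1.96):
    """Simulate one trial with NO true effect, analyzed at several
    interim looks. Returns True if any look crosses the nominal
    two-sided 5% z threshold (i.e., a false positive occurs)."""
    total, count = 0.0, 0
    for _ in range(looks):
        for _ in range(n_per_look):
            total += random.gauss(0, 1)  # null data: pure standard-normal noise
            count += 1
        z = total / math.sqrt(count)     # z statistic for the running sum
        if abs(z) > z_crit:
            return True
    return False

random.seed(0)
runs = 20000
rate = sum(trial_with_peeks() for _ in range(runs)) / runs
print(round(rate, 3))  # well above the nominal 0.05
```

The legitimate fix is to pre-specify a stricter per-look threshold before the trial starts; the complaint here is not with such corrections in principle, but with how and when they get invoked.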
Now, back to the main story and the question of benefits. Dr. Coyne found that the study as reported in the New England Journal indicated benefit in quality of life for patients who had achieved higher red blood cell counts. Reportedly these findings were statistically significant at P = 0.03. When he reviewed the same data as reported to the FDA, he could find no evidence of any statistically significant improvement, with a single exception--it was indeed true that patients getting higher epoetin doses ended up requiring fewer blood transfusions.
So the bottom line--in 1998 the kidney dialysis community was told that higher epoetin doses, leading to higher red cell counts, posed no significant risk of harm and improved patients' quality of life. Based mostly on that one study, the kidney gurus issued several practice guidelines calling for higher levels of red cells, which in turn required docs to prescribe higher doses of epoetin. Around 2006-8, new data emerged suggesting that this was unsafe. In hindsight we now realize that the data from 1996-98 actually show the same thing, and indeed demonstrate lack of any benefit to boot; so between 1998 and 2008, who knows how many dialysis patients were exposed to serious risks of harm, including death, with no corresponding benefit. Along the way, says Dr. Coyne, Amgen profited to the tune of $37B in total sales of epoetin.
Now for the bureaucracy piece. Dr. Coyne filed his FOIA request for the FDA data in January 2008. He finally received the data in July 2011. Two weeks before he received the data, the FDA issued new labeling for epoetin, calling for lower red blood cell levels. In its warning, the FDA accepted the statistical tests of the data on file, meaning that the FDA now belatedly rejected the statistical fixes that had been published in the New England Journal. Dr. Coyne allows us to read whatever we want into the timing of these events.
In his opinion article, Dr. Coyne also reports having contacted some of the academic authors of the New England Journal version of the Normal Hematocrit Trial. They claimed to him that they had tried to insert the information that he later discovered into the published paper, but that the editors at the journal rejected those amendments. Dr. Coyne admits to skepticism, since these same authors published several later papers and also served on the kidney guideline committees, but never made any attempt to alter the impression given by the original publication.
In HOOKED, I mention a couple of other instances where patients suffered due to a delay in revealing the truth about the benefits and harms associated with a drug--Vioxx being the poster child, having caused an estimated 144,000 excess heart attacks during the years when its dangers should have been known. We now have to add epoetin to this dishonor roll.
Many thanks to Dr. Barbara Roberts, author of The Truth About Statins, for calling my attention to this work.