Why is there so much crappy medical research?
The latest example, complete with press release, was just published in the British Medical Journal (BMJ). It’s entitled Early death after discharge from emergency departments: analysis of national US insurance claims data.
The authors found:
Among discharged patients, 0.12% (12 375/10 093 678, in the 20% sample over 2007-12) died within seven days, or 10 093 per year nationally. Mean age at death was 69. Leading causes of death on death certificates were atherosclerotic heart disease (13.6%), myocardial infarction (10.3%), and chronic obstructive pulmonary disease …
Every year, a substantial number of Medicare beneficiaries die soon after discharge from emergency departments, despite no diagnosis of a life limiting illness recorded in their claims. Further research is needed to explore whether these deaths were preventable.
The press release is hardly judicious:
These early deaths were concentrated in hospitals that admitted few patients to the hospital from the ED, hospitals that are often viewed as models by policy makers because of their low costs. By contrast, deaths were far less frequent in large, university-affiliated EDs with higher admission rates and higher costs, even though the population served by these EDs was generally less healthy when they walked in the front door of the ED.
The lead author elaborates in The Boston Globe STAT section:
The study’s lead author said that while the data reflect a fraction of Medicare patient deaths, the finding raises questions about the adequacy of hospital resources in rural and underserved areas and whether the US government’s quest to cut costs — and reduce inpatient admissions from ERs — is also cutting out essential care.
“There’s no doubt there’s a lot of unnecessary hospital admissions, but this study suggests there’s also avoidable harm from sending people home that shouldn’t go home,” said Dr. Ziad Obermeyer, an emergency medicine physician and professor at Harvard Medical School.
The implication is that people are dying preventable deaths because they were discharged from the emergency room instead of being admitted to the hospital.
Is that what the data shows? There’s no way to be sure because the single most important piece of information necessary to reach that conclusion is MISSING from the paper. How many Medicare patients die in a typical week? Quite a few, it turns out.
That’s not surprising. Medicare patients are age 65 and older. They died because everyone dies. Does the rate of death after being discharged from the ER exceed the background rate of death? The authors don’t tell us; indeed they don’t appear to have bothered to check, an inexcusable omission in a paper of this type.
Approximately 4.5% of Medicare patients die each year, for a baseline death rate of roughly 0.09% per week (4.5% divided by 52 weeks). The study patients represent a subset of Medicare patients [those aged ≥ 90, receiving palliative or hospice care, or with a diagnosis of a life limiting illness, either during emergency department visits (for example, myocardial infarction) or in the year before (for example, malignancy) were excluded]. Nonetheless, the baseline Medicare death rate in the group being studied represents a substantial proportion of the death rate reported in the week after discharge from the ER.
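The arithmetic here is simple enough to sketch explicitly. The following back-of-the-envelope calculation is my own, not the paper’s, and it assumes the roughly 4.5% annual Medicare mortality figure applies to the studied subset (it doesn’t exactly, since the study excluded the oldest and sickest patients, which would lower the true baseline):

```python
# Rough comparison of the observed post-discharge death rate
# against an assumed baseline weekly death rate for Medicare patients.
# The 4.5% annual figure is an approximation, not taken from the paper.

annual_medicare_death_rate = 0.045            # ~4.5% of Medicare patients die per year
baseline_weekly_rate = annual_medicare_death_rate / 52

observed_deaths = 12_375                      # deaths within 7 days of discharge (20% sample, 2007-12)
discharged = 10_093_678                       # discharged patients in the sample
observed_weekly_rate = observed_deaths / discharged

print(f"baseline weekly rate:       {baseline_weekly_rate:.4%}")
print(f"observed weekly rate:       {observed_weekly_rate:.4%}")
print(f"baseline share of observed: {baseline_weekly_rate / observed_weekly_rate:.0%}")
```

Under that assumption, the background rate accounts for on the order of 70% of the observed post-discharge death rate, which is why the omission of a baseline comparison matters so much.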
Therefore, the implication that 10,000 patients die preventable deaths each year as a result of being discharged from the emergency room is flat out false. The majority of those patients almost certainly would have died anyway.
The authors do tell us how the admitted patients fared, although they do so in a misleading manner.
This chart compares the death rates of patients admitted from the ER compared to those discharged from the ER, divided into quintiles based on the admission rate.
There’s a glaringly obvious problem. The scale for admitted patients is different from that for discharged patients, making it look as if the death rate among discharged patients is higher than among admitted patients when it is actually far lower. The death rate for admitted patients was generally 20X higher than for discharged patients! The only exception is the lowest quintile, which admitted the fewest patients from the emergency room; in that quintile, the death rate of admitted patients was only double that of discharged patients.
It’s hardly unexpected that getting admitted was associated with a massively increased risk of dying. These patients were sicker. But it also suggests that getting admitted did not necessarily prevent death. We’ve already seen that the majority of the purported 10,000 people who die in the week after ER discharge would have died anyway. Now we can see that admitting them to the hospital would not necessarily have prevented their deaths, either.
The authors know, or should know, this. Indeed, they admit in the abstract that they have no idea whether the deaths they observed were preventable at all, then proceed to imply the exact opposite.
What does this paper tell us? NOTHING!
It’s just another crappy paper that spins a fairy tale from an observation stripped of context. For all we know, every single one of those 10,000 people who died would have died regardless. The authors certainly haven’t demonstrated otherwise.