You Need a Ouija Board


A recent study in JAMA Internal Medicine documents that, in many cases in which a patient has died, the electronic medical record – the baseline truth for Big Healthcare Data – still lists them as alive. Does the phrase "garbage in, garbage out" ring a bell?

I am not a big fan of electronic health records (EHRs), at least as they are presently designed. They are billing platforms more than clinical data repositories readily accessible to clinicians. With that bias in mind, this research letter from UCLA caught my eye.

In a review of roughly 12,000 EHR records of "seriously ill" primary care patients, 25% were recorded as deceased in their clinical records. Another 676 patients, 5.8%, were deceased according to the records of the State of California but alive in the EHR. Could this be an example of "the ghost in the machine"?

The California Department of Public Health maintains a Public Use Death File listing all individuals who have died within the state; other states and the federal government maintain similar files. [1] The researchers point out that the EHR death counts would be correct if this information were available to the EHR. The problem, of course, is interoperability and data exchange between the systems – problems that have plagued EHRs since they were hustled onto the scene and sold to health system administrators. California's database, for instance, is available to law enforcement but not to health facilities – go figure.
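The reconciliation the researchers performed by hand is, at its core, a simple cross-check that the systems never run against each other. A minimal sketch of that check, assuming hypothetical, simplified record layouts (real matching would rely on richer identifiers and fuzzier linkage than a single ID field):

```python
from dataclasses import dataclass

# Hypothetical, simplified patient record; real EHR and death-file
# layouts use many more identifiers (name, DOB, address, etc.).
@dataclass(frozen=True)
class Patient:
    patient_id: str
    name: str
    deceased_in_ehr: bool

def flag_ghost_records(ehr_patients, death_file_ids):
    """Return patients listed as alive in the EHR but present in the death file."""
    return [p for p in ehr_patients
            if not p.deceased_in_ehr and p.patient_id in death_file_ids]

ehr = [
    Patient("p1", "A. Alive", deceased_in_ehr=False),
    Patient("p2", "B. Ghost", deceased_in_ehr=False),   # dead per the state, "alive" in the EHR
    Patient("p3", "C. Known", deceased_in_ehr=True),    # correctly recorded
]
death_file = {"p2", "p3"}  # stand-in for the state's Public Use Death File

ghosts = flag_ghost_records(ehr, death_file)
print([p.name for p in ghosts])  # → ['B. Ghost']
```

The check itself is trivial; what is missing in practice is the data exchange that would let a health system run it at all.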

The researchers report that among those 676 patients – deceased in state records but "alive" in the EHR –

  • 80% had pending care appointments
  • 221 had received follow-up telephone calls
  • 338 had received follow-up messages on their portals
  • 221 had received 920 letters regarding unmet preventative care needs
  • 184 had orders placed for vaccinations or other clinical care

“Not knowing who is dead hinders efficient health management, billing, advanced illness interventions, and measurement. It impedes the health system’s ability to learn from adverse outcomes, to implement quality improvement, and to provide support for families.”

Much of the excess attention provided to the newly deceased is a feature, not a bug, of EHRs. Letters, messaging, preventative care reminders, and standard orders are automated, saving clinicians time but not, in this case, embarrassment. Even phone calls are automated, as anyone with an upcoming doctor's appointment can readily attest.

The letter to JAMA Internal Medicine might be read as tongue-in-cheek, but it points to a far deeper problem with EHRs – the data is not always validated. While we may well trust laboratory findings, much of the subjective narrative rests on freely defined terms. My estimate of the size of an abnormal collection of blood, a hematoma, can shift from minor and inconsequential when it results from my interventions to large and concerning when it results from another physician's. And before you suggest measuring the hematoma, let me assure you that our measuring devices can be equally suspect. The same can be said of much of the objective data collected – patting your abdomen does not constitute a thorough physical examination. EHRs, as demonstrated here, do not always capture even the simplest of vital signs – or, in this case, their complete absence.

AI algorithms may train on datasets that have been cleansed of errors and validated, but they are then set loose in a world where the data is not so clean or clear. AI algorithms are brittle; they work best when the data most resembles their training set (think teaching to the test). There is increasing evidence that the well-behaved algorithm of the laboratory makes mistakes in the real world, and in medicine those mistakes can mean injury and death. There is a reason medicine is slower to adapt to change; the Silicon Valley mantra of "move fast and break things" – and apologize later – has little value in our health care.
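The brittleness point can be made with a toy experiment of my own construction (not from the study): fit the simplest possible classifier on clean, well-separated data, then evaluate it on the same data after a systematic measurement drift of the kind real-world records exhibit.

```python
import random

random.seed(0)

def train_threshold(xs, ys):
    """Pick the cutoff that best separates the two classes on training data."""
    best_t, best_acc = 0.0, 0.0
    for t in xs:
        acc = sum((x > t) == y for x, y in zip(xs, ys)) / len(xs)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(t, xs, ys):
    return sum((x > t) == y for x, y in zip(xs, ys)) / len(xs)

# Clean "laboratory" data: two well-separated classes.
train_x = [random.gauss(0, 1) for _ in range(200)] + [random.gauss(4, 1) for _ in range(200)]
train_y = [False] * 200 + [True] * 200
t = train_threshold(train_x, train_y)

# "Real world" data: same underlying labels, but every measurement
# drifted upward by 2 units (a simple stand-in for distribution shift).
shift_x = [x + 2.0 for x in train_x]

acc_train = accuracy(t, train_x, train_y)
acc_shift = accuracy(t, shift_x, train_y)
print(f"lab accuracy:   {acc_train:.2f}")   # high on data like the training set
print(f"drift accuracy: {acc_shift:.2f}")   # degrades once the data shifts
```

Nothing about the model changed between the two evaluations; only the data did – which is precisely the teaching-to-the-test problem.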


[1] The Feds maintain a Social Security Death Index as well as a National Death Index.

Source: Consequences of a Health System Not Knowing Which Patients Are Deceased, JAMA Internal Medicine. DOI: 10.1001/jamainternmed.2023.6428