It’s never easy to hear bad news about a family member in the hospital.
For the family of Ernest Quintana, hearing it from a robot video device that rolled into his room made it worse.
“This was horrible for me and him,” Quintana’s granddaughter, Annalisia Wilharm, said in a Facebook post that included a picture of the machine, which features a live doctor on a video screen. Quintana, 79, died Tuesday, two days after the device delivered the news his lungs were failing.
Friends of the grieving Fremont family expressed outrage.
“This is not the way to show value and compassion to a patient,” family friend Julianne Spangler wrote Wednesday. “Shame on you Kaiser!!”
Michelle Gaskill-Hames, senior vice president and area manager for Kaiser Permanente Greater Southern Alameda County, said in a statement that “we offer our sincere condolences” and that “we take this very seriously and have reached out to the family to discuss their concerns.”
“This is a highly unusual circumstance,” Gaskill-Hames said. “We regret falling short in meeting the patient’s and family’s expectations in this situation and we will use this as an opportunity to review how to improve patient experience with tele-video capabilities.”
“This secure video technology is a live conversation with a physician using tele-video technology, and always with a nurse or other physician in the room to explain the purpose and function of the technology,” Gaskill-Hames said. “It does not, and did not, replace ongoing in-person evaluations and conversations with a patient and family members.”
Wilharm said in an interview Friday that her grandfather had been rushed to the emergency room Sunday. She was alone with him, her mother having gone home to shower, when a nurse said a doctor would come by to deliver test results.
Then the video device wheeled itself in. Another machine delivering oxygen through a mask to her grandfather was so noisy that she had to repeat the doctor’s words to him. She struggled to keep her composure as she realized the gravity of the situation when the doctor on the video told her, “I don’t know if he’s going to get home,” and suggested giving him morphine “to make sure you’re comfortable.” She recorded the encounter, fearing she would forget what was said.
“When that robot said that to him, he looked over at me and said, ‘Well, I guess I’m going to go quickly,’ and put his head down,” recalled Wilharm, 33. “It was pitiful.”
Her mother, Cathie Quintana, said the hospital didn’t inform other family members about the grim prognosis, which they had to learn through her daughter.
“It was handled with no compassion at all by this robot, there was no bedside manner, no nothing,” Cathie Quintana said. “It needed to be a person, for God’s sake. My mom and myself should have been there. We want to never have this happen to anyone again.”
Gaskill-Hames bristled at the characterization of the video device as a “robot,” calling it “inaccurate and inappropriate” and insisting that “in every aspect of our care, and especially when communicating difficult information, we do so with compassion in a personal manner.” InTouch Health calls its device an iRobot.
“The technology allows a small hospital to have additional specialists such as a board-certified critical care physician available 24/7,” Gaskill-Hames said, “enhancing the care provided and bringing additional consultative expertise to the bedside.”
Arthur L. Caplan, head of the Division of Medical Ethics at NYU School of Medicine in New York City, said he’d never heard of something like that happening before. There’s no hard rule in the medical field that says doctors must deliver bad news in person, he said, and with growing automation, such interactions may well become standard in the future.
But Caplan and other experts like David Magnus, director of the Stanford Center for Biomedical Ethics, said such life-and-death matters must be handled sensitively, and families shouldn’t be surprised.
“With dying individuals, care is as hands on as can be,” Caplan said. “Having remote discussions is something that would undermine the expectation. It’s not that it’s bad to do it. It’s new, and what I’d like to see is that it be introduced in a sensitive way, with consent and a push to let people know to expect it, and no surprises.
“You get very vulnerable people when they’re sick and their families are stressed,” Caplan added. “The mere fact that this family was upset tells me we’ve got to do better.”