
EHR voice assistant revolution arrives at Vanderbilt University Medical Center

Chris Nerney, Contributing Writer

Two healthcare experts writing recently in the Harvard Business Review are calling for a “revolution” in the usability of electronic health records (EHRs), primarily through a transition toward voice and gesture-based data input and retrieval.
One healthcare organization at the forefront of that revolution is Vanderbilt University Medical Center (VUMC), whose Department of Biomedical Informatics and Health Information Technology Innovations recently developed a voice assistant that lets VUMC clinicians interact with the hospital's Epic EHR.
Using natural language processing (NLP) and artificial intelligence (AI), V-EVA (Vanderbilt EHR Voice Assistant) is designed to understand and fulfill verbal requests from clinicians and other hospital staff. V-EVA is still a prototype, however, and is being tested by only a small group of users at the hospital to assess its impact on workflows.
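The article does not describe V-EVA's internals, but voice assistants of this kind generally chain speech-to-text, intent parsing, and an EHR query. Below is a minimal, hypothetical Python sketch of that flow; every name in it (parse_intent, fetch_latest_result, the regex rule, the sample data) is an illustrative assumption, not V-EVA's or Epic's actual API.

```python
import re

def parse_intent(utterance: str) -> dict:
    """Toy rule-based intent parser; a real system would use a trained
    NLP model rather than a single regex."""
    match = re.search(r"(?:last|latest|most recent)\s+(\w+)", utterance.lower())
    if match:
        return {"intent": "get_latest_result", "concept": match.group(1)}
    return {"intent": "unknown"}

def fetch_latest_result(patient_id: str, concept: str) -> str:
    """Stand-in for an EHR lookup (e.g., a FHIR Observation search).
    Hard-coded, made-up data keeps the sketch self-contained."""
    fake_chart = {"a1c": "7.2% (2018-05-01)"}
    return fake_chart.get(concept, "no result on file")

def handle_request(patient_id: str, utterance: str) -> str:
    """Route a transcribed utterance to the right query and compose
    a spoken-style answer."""
    intent = parse_intent(utterance)
    if intent["intent"] == "get_latest_result":
        concept = intent["concept"]
        return f"The latest {concept} is {fetch_latest_result(patient_id, concept)}."
    return "Sorry, I didn't understand that request."

# The transcription step (audio -> text) is assumed to happen upstream.
print(handle_request("12345", "What was the last A1C?"))
```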
"The idea to develop an in-house voice assistant came from the general frustration we heard from users about the difficulty navigating the EHR to find relevant information," V-ETA project leader Yaa Kumah-Crystal, MD, tells Healthcare IT News (HITN) Managing Editor Bill Siwicki. “There is a lot of information foraging that occurs in the EHR.”
This interface inefficiency can no longer be tolerated, argue Robert Wachter, MD, head of the Department of Medicine at the University of California, San Francisco, and Jeff Goldsmith, associate professor of public health sciences at the University of Virginia, in their Harvard Business Review piece:
“The electronic systems hospitals have adopted at huge expense are fronted by user interfaces out of the mid-1990s: Windows 95-style screens and dropdown menus, data input by typing and navigation by point and click,” they write. “These antiquated user interfaces are astonishingly difficult to navigate. Clinical information vital for care decisions is sometimes entombed dozens of clicks beneath the user-facing pages of the patient’s chart.”
Wachter and Goldsmith conclude that “voice and gesture-based interfaces must replace the unsanitary and clunky keyboard and mouse as the method of building and interacting with the record.”
Kumah-Crystal, who is also an assistant professor of biomedical informatics and of pediatric endocrinology at VUMC and Monroe Carell Jr. Children's Hospital at Vanderbilt, says advances in machine learning and NLP make voice interaction with EHRs “an idea whose time has come.”
"We think the incorporation of voice assistants in the provider workflow can enhance the delivery of care," Kumah-Crystal tells HITN. "One of our testers described the platform like a helpful intern always ready with an answer.”
The key, though, is knowing how best to communicate with that intern. In that regard, voice user interfaces still present a challenge: they must be taught to understand spoken requests within a broader context. Rather than simply answering a question with only the information requested, an ideal voice interface interprets the query in light of the user's intent and the larger problem or situation being addressed, as sketched below.
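As a hypothetical illustration of that distinction, the sketch below contrasts a literal answer with a context-aware one: the same question about a lab value either returns the bare number or, anticipating the clinician's likely intent, volunteers the trend. The data and function names are invented for illustration and do not reflect V-EVA's design.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    date: str
    value: float

def bare_answer(history: list) -> str:
    """Answers only the literal question: the most recent value."""
    latest = history[-1]
    return f"The latest A1C is {latest.value}% ({latest.date})."

def contextual_answer(history: list) -> str:
    """Anticipates the likely intent behind the question (is the
    value improving?) and volunteers the trend."""
    prior, latest = history[-2], history[-1]
    direction = "down" if latest.value < prior.value else "up"
    return (f"The latest A1C is {latest.value}% ({latest.date}), "
            f"{direction} from {prior.value}% on {prior.date}.")

# Made-up history for a single patient.
a1c = [Observation("2018-01-10", 8.1), Observation("2018-05-01", 7.2)]
print(bare_answer(a1c))        # literal response
print(contextual_answer(a1c))  # response framed by inferred intent
```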
VUMC will be carefully collecting and analyzing data to measure the effectiveness of V-EVA against the current interface workflow.
“User satisfaction and the perception of ease of use will also be essential metrics,” Kumah-Crystal says. “People are not very tolerant of failure when they are busy trying to get work done.”