High-tech health

How digital medicine is improving patient care

Illustration by Bryan Christie

As digital technology has become more portable, affordable and easier to use, it has begun to capture the minds of medical researchers. From new imaging tools to mobile devices, Stanford researchers are exploring how technologically advanced tools can fill gaps in patient care. And it’s beginning to make a difference.

The following four stories show how researchers at Stanford are exploring new technologies to solve old problems. An emergency room physician uses tablet computers to train community health care workers in underserved rural areas of Haiti and India. Radiologists project holograms onto patients to guide breast tumor removal. Heart doctors are capitalizing on society’s obsession with smartphones to try “pinging” people off the couch and onto their feet. And researchers are using Google Glass to provide at-home therapy for children with autism.

“There is a revolution in health care that is in large part driven by technology,” said Michael Halaas, associate dean for industry relations and digital health. “There are a lot of great ideas emerging about how to transform health care that are digitally driven, but they need to be validated and thoughtfully introduced. We remain focused on developing digital health tools that can improve health while keeping the human element that is vital to care delivery.”


Years ago, Ayesha Khan’s grandfather was hit by a semitruck as he rode a bicycle along a road in rural Pakistan. With no emergency response system in place — no 911, no ambulance service — he lay severely injured by the side of the road until someone eventually drove by and delivered him to the hospital. After 30 more minutes, he died in the waiting room without receiving care.

Now Khan, MD, a Stanford emergency medicine physician, uses digital devices to address these kinds of voids in care in the developing world. Working first in Haiti and now in rural India, Khan and her team have developed an app-based curriculum to train community members in basic health care delivery — from half a world away at Stanford.

“My grandfather, he died in this sort of unceremonious way,” said Khan, Stanford Medicine clinical assistant professor of emergency medicine, who immigrated to the United States from Pakistan when she was 3. “Where my family comes from has somewhat inspired me. I’m passionate about health equity.”


By studying animated, spoken lesson plans on digital tablets, completing a workbook and passing tests delivered on the tablet, health care workers with limited formal education have been successfully trained in first-line treatment for acute complaints. These providers now control bleeding, clear obstructed airways and manage seizures.

They diagnose and treat urinary tract infections, sexually transmitted diseases, broken limbs, skin infections, fever, upper respiratory infections, diarrhea and high blood pressure. They care for severe wounds. And they triage patients toward more intense levels of care when necessary.

“My desire was that the program we created not rely heavily on people flying back and forth; that’s just not sustainable,” Khan said. “We developed an e-curriculum so that the program was not dependent on live trainers.”

Currently, five health care workers ages 19 to 21 who grew up as orphans are providing first-line care for patients within their community of 28,000 in Haiti. In rural India, 54 women from the states of Uttar Pradesh and Bihar provide care to patients in the 54 villages where they live and now work. Four local facilitators in India used the app to train the women, who are considered past childbearing age and are seen as a burden in their communities.

“This project has a twofold advantage,” Khan said. “It provides health care to communities without it, and it employs people marginalized within their own communities.” The workers are paid through the community where they work, and a Stanford grant helped fund their training. “Now that we see the program working, I’m so eager for the chance to grow,” said Khan, adding that she’s exploring opportunities to expand into Kenya. “There is so much scope for it to help around the world.”

Reading minds

Nine-year-old Alex, who has a high-functioning form of autism, has always had difficulty making eye contact and understanding social cues, traits that are typical of someone with his disorder. Making friends has been a challenge, particularly on the playground.

“In preschool, he was hit with a mallet and kicked in the face by children. They were upset with him, and he couldn’t see it coming,” said his mother, Donji Cullenbine. “Children were very scary for him.”

About a year and a half ago, Stanford researchers hooked Alex up with a Google Glass visual headset, which he thought was really cool. It helped teach him how to read other people’s emotions through their facial expressions.

The new form of behavioral therapy uses a Stanford-designed app paired with Google Glass to help children distinguish among eight core facial expressions: happiness, sadness, anger, disgust, surprise, fear, contempt and neutral. The wearable computer links to the smartphone app through the local wireless network. The device has a glasses-like frame with a camera to record the wearer’s view, a small screen and a speaker for verbal cues.


Researchers designed three different formats to help engage kids. The first is “free play,” which gives auditory cues about the emotions of others. The other two are games: “Guess My Emotion,” in which parents act out emotions for the child to identify, and “Capture the Smile,” in which the child tries to elicit a certain emotion from the parent or another caregiver. Alex particularly liked the “Guess My Emotion” game and free play. The app seemed, to his mother, to make a difference.

By the end of a study using Google Glass and apps that helped him read facial expressions through visual and verbal cues, Alex, 9, told his mother, Donji Cullenbine, ‘Mommy, I can read minds!’

“Within a couple of weeks, he started to flick glances at me,” said Cullenbine, who agreed to have Alex participate in a clinical trial in 2017 to test the new home-based therapy. “I had tried for years to get him to engage with my face, but he never stayed for more than a second.”

The clinical trial included 14 families, each with a child who had been clinically diagnosed with autism. The children used the Google Glass setup over a 10-week period, according to the study, which was published in Digital Medicine in August.

One-on-one treatment with a trained therapist has been shown to be effective in treating autism, but a shortage of therapists means many children aren’t being treated early enough, said Dennis Wall, PhD, the study’s senior author and Stanford Medicine associate professor of pediatrics and of biomedical data science.

A window of opportunity is being missed, and that’s where Wall hopes this new digital health-based therapy can step in.

“The only way to break through the problem is to create reliable, home-based treatment systems,” he said. “It’s a really important unmet need.”

Results from early clinical trials have been overwhelmingly positive, Wall said.

“We’re seeing improved eye contact, emotional awareness, an ability to understand and appreciate emotions,” he said. And comments from parents have reflected this early success. “Parents said things like, ‘A switch has been flipped; my child is looking at me.’ Or, ‘Suddenly the teacher is telling me that my child is engaging in the classroom.’”

By the trial’s end, Alex recognized emotions so well in others that one day at home he exclaimed: “Mommy, I can read minds!”

“I thought, ‘He got it!’” his mother said. “He understands there is information on people’s faces that he can interpret.”

When Alex used the app, the Google Glass viewer displayed an emoji that indicated whether he correctly identified a human emotion.

Couch potatoes

Doctors know exercise helps prevent heart disease, but trying to motivate people to get off the couch is no easy task. MyHeart Counts, an iPhone app developed by Stanford researchers, not only collects massive amounts of research data from smartphone users to study cardiovascular health, it also pings them when it’s time to stand up.

“We are giving them customized prompts to encourage them to exercise,” said Anna Shcherbina, a graduate student in biomedical informatics on the MyHeart Counts team. “We’re trying to determine which prompts work the best to encourage exercise.”

If a user sits for more than an hour, for example, the Stanford MyHeart Counts app sends a reminder to get up even if just for a moment. Users who set daily goals of 10,000 steps will get a friendly prompt on the days they fall short, such as, “You are at 115 steps now, and you need 9,885 more to reach your goal. Walking to your next appointment will help you reach your step goal.”

The app also presents users with graphs that show how they compare with other users in terms of daily step counts, how happy they are, how much they sleep and even how many vegetables they are eating.

“Consumer adoption of smartphones really has opened up this whole new world,” said Steve Hershman, PhD, a member of the MyHeart Counts team and director of mHealth in cardiovascular medicine at Stanford Medicine. “It’s amazing the volume of information researchers can get from these apps. And they’re also just sort of fun to use. They help make research more human.”

The app, which now collects such data as daily activity levels, blood pressure, cholesterol and cardiovascular health from 50,000 users in the United States, Hong Kong and the United Kingdom, was designed in 2015. It was one of the inaugural mobile health apps launched on Apple’s ResearchKit platform.


Researchers published their first study based on data collected from 49,000 MyHeart Counts app users in JAMA Cardiology in December 2016. The study found that use of apps for collecting large amounts of health care data could transform cardiovascular research. Results also showed that among groups of subjects with similar activity levels, those who were active throughout the day, rather than in a single, relatively short interval, reported better levels of cardiovascular health with lower rates of chest pain, heart attacks and atrial fibrillation. The next research study is expected to be ready for publication soon, Hershman said.

Today’s 2.0 version of the app also includes an added consent module that allows users who have a 23andMe account to securely share their genetic information with Stanford researchers. “At first it was just a way to collect data for medical research,” Hershman said. “Now we’re really hoping to change people’s health.”

Holograms in surgery

Looking to increase precision during the surgical removal of breast tumors, a Stanford research team developed a technique that brings holographic images into the operating room.

Surgeons refer to MRI images on computer displays to help guide their incisions, but there is still quite a bit of guesswork because tumors come in various three-dimensional shapes and sizes.

As a result, either too much tissue gets removed or too little, said Bruce Daniel, MD, professor of radiology and director of IMMERS, the incubator for medical mixed and extended reality at Stanford.

“The surgeon can’t always tell what’s what,” Daniel said.

The team developed a mixed-reality system that uses Microsoft’s HoloLens headsets to project a three-dimensional image of a patient’s tumor, based on MRI scans, directly onto the diseased breast. The surgeon looks through the headset, which includes a holographic computer, and aligns a floating holographic image of the tumor with the surgical site.

The goal is to increase the precision of tumor removal while leaving as much healthy breast tissue intact as possible, said Brian Hargreaves, PhD, Stanford Medicine professor of radiology and of electrical engineering and co-director of IMMERS.

“It gives me X-ray vision,” said Amanda Wheeler, MD, clinical associate professor of surgery who is participating in a pilot clinical research study of 10 patients that uses the new system. Prior to surgery, Wheeler puts on the headset, then uses markers to sketch the reflection of the hologram onto the patient’s breast. “It helps me plan the surgical site, making sure I’m getting as much accuracy as possible. I love it.”

About half of the 300,000 women diagnosed with breast cancer each year are eligible for a lumpectomy, which removes the tumor and leaves the remainder of the breast intact, followed by radiation, the American Cancer Society reported. But deciding between a lumpectomy and a mastectomy, or total breast removal, is often difficult. The decision is further complicated by the fact that 20 percent of women who have lumpectomies require a second surgery because the surgeon didn’t remove all the cancerous tissue the first time.

“Because this new method helps surgeons determine exactly where to cut out the cancerous breast tumor, it should reduce the number of second surgeries,” Daniel said.


Tracie White

Tracie White is a science writer in the Office of Communications. Email her at tracie.white@stanford.edu.
