The right moves
Tracking surgeons’ motions to understand success
Deep in the halls of San Francisco’s Moscone convention center, surgeons gathered near the far corner of a conference’s exhibition space, attracted by a somewhat unusual scene: 10 identical tables, each topped by a tray of surgical tools, several bundles of sutures, a stopwatch and a section of pig intestine.
An ice chest sat on the floor, its contents labeled in large lettering: OLD BOWEL.
Just outside the waist-high barriers of the display, Carla Pugh, MD, PhD, a Stanford professor of surgery, mingled with curious passersby. “Would you like to know more about our study?” she asked.
Pugh was at the world’s largest surgical conference, the American College of Surgeons Clinical Congress, which was held in late October, and she was taking advantage of the foot traffic.
“We’re recruiting surgeons for our research, which uses several types of sensors to measure a surgeon’s movement during an operation, their decision-making and their brain waves,” she told a group of interested participants.
The more adventurous of them stepped forward and were presented with a scenario: A “patient” — a chunk of pig bowel — was in need of a small surgery. Each piece of tissue had been lacerated in two places. The participants were charged with finding the damage and repairing it — all under the monitoring of several data-collecting sensors, while being timed.
The surgeons palpated and gently pulled at the edges of the pink, crinkled bowel to locate the tears. With their choice of suture and stitching technique, they mended the intestine to their satisfaction. When they were finished, they called for their time and stepped back to let the study’s assistants evaluate the work.
The goal of the exercise was to collect metrics on a basic surgical procedure — repairing a tear with sutures — and use the data to understand how specific motions, decisions and approaches correlated with the quality of the work. The event was one of the first showcases of a vision Pugh has developed over decades: applying data, for the first time, to understand what surgeons do and how they do it.
“In the field of surgery, there are no metrics to back up what it is that we do, or the range of tactics we employ to get positive surgical outcomes,” said Pugh, who is the director of Stanford Medicine’s Technology Enabled Clinical Improvement Center. “We walk around with more detailed data about our bank accounts than how we perform clinical procedures, which are 10 times more complex. But I’m hoping to change that.”
Gearing up for data collection
Pugh’s research, part of a new multi-institutional collaboration called the Surgical Metrics Project, which she leads, harvests data from audio and video recordings of surgeons, as well as from wearable sensors that measure motion, brain waves and tactile pressure. She’s one of the first researchers to study surgical data analytics, a subspecialty that she guesses comprises only a few dozen experts.
When Pugh began pursuing this line of research, surgical wearables did not exist; in fact, the rise of everyday wearables like the smartwatch was still 10 years away. Her interest in the subject stemmed from an insight that struck her as a young scientist.
“In medical school, I saw that technology had huge potential to facilitate medical education and training,” Pugh said. “And so I took a bit of a detour through my own training and ended up getting a PhD in education and technology from Stanford’s Graduate School of Education.”
One of her graduate classes focused on human-computer interactions and sensor technologies. She began to understand how technology could enhance clinical performance and became convinced that data-collecting sensors were the key to teaching hands-on skills in a way that a textbook, video or lecture never could. Some of Pugh’s earlier work in wearables grew out of her experiences in cancer detection — for instance, a bra that collects data during a breast examination through force sensors built into the fabric.
Fast-forward to today: Pugh’s suite of operation-friendly wearables is gaining attention in the surgical community. In the past few months alone, Pugh and her team have collected data from hundreds of surgeons — the bulk of which came from conference-goers eager to donate their time and test their skills.
Andrew Wright, MD, a surgeon at the University of Washington, was one of the attendees keen to try Pugh’s technology. “I’m highly involved in surgical education and training, and I’ve known Dr. Pugh for a number of years through our shared interests,” Wright said. “This sort of technology could be used not just for training students and residents but also for helping practicing physicians maintain their skills and learn new surgical techniques.”
There are two key elements to successful integration of the technology, said Pugh: One is sleek, user-friendly wearables; the other is integrated data streams. The trick is to collect as much information as possible without disrupting the natural rhythm of a surgeon’s workflow.
As part of the study, each surgeon first undergoes a baseline electroencephalogram, which measures brain waves through wires encased in a brown, translucent sensor strip that sticks to the surgeon’s forehead. The strip measures brain activity while the participant performs a handful of mundane mental tasks: about 10 minutes of listening to music, meditating and recalling certain melodies.
Then, the surgeons suit up for data collection, each donning a special lab coat that holds a variety of wired sensors. Three motion sensors — so fine they fit under surgical gloves — poke out of the sleeves and are secured to the thumbs, index fingers and wrists with a piece of tape.
Finally, Pugh sets up audio and video recordings, which run as surgeons operate. The integrated approach to data collection shows not only how the surgeons’ hands move, but also how they talk through tricky parts of a procedure, and how their brain waves spike or dip.
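In code, that kind of stream integration might look something like the minimal sketch below; the file names, columns and sampling rates are illustrative assumptions, not the Surgical Metrics Project’s actual pipeline. The idea is simply to resample motion, EEG and annotation streams onto a shared clock so they can be read side by side.

```python
# Minimal sketch (hypothetical): aligning independently timestamped sensor
# streams on one clock. File names, columns and rates are assumptions.
import pandas as pd

def load_stream(path, cols):
    """Read a CSV with a 'timestamp' column and index it by time."""
    df = pd.read_csv(path, parse_dates=["timestamp"])
    return df.set_index("timestamp")[cols]

motion = load_stream("motion.csv", ["x", "y", "z"])     # e.g. ~100 Hz wrist sensor
eeg    = load_stream("eeg.csv", ["workload"])           # e.g. 1 Hz summary value
notes  = load_stream("events.csv", ["annotation"])      # markers from audio/video review

# Resample everything onto a shared 10 Hz grid: average the fast motion
# stream, carry the slower streams forward, then join on the common index.
grid = "100ms"
combined = (
    motion.resample(grid).mean()
          .join(eeg.resample(grid).ffill())
          .join(notes.resample(grid).ffill())
)
print(combined.head())
```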
The proof is in the data pattern
So far, the idea of surgical wearables has been met with mixed reactions, Pugh said. Mostly, there’s a sense of excitement and an eagerness to participate. But there’s concern, too — namely, that the data could be used to unfairly judge a surgeon’s skills during a difficult procedure. It’s true that the wearables could one day be used to test surgical skill or to review a case that went awry, but to Pugh, it would be a mistake to limit the data to that purpose.
“To me, collecting surgical data is less about evaluating the skill of a surgeon and far more about quantifying what it took to take care of a specific patient,” Pugh said.
She gives an example: Patients in intensive care units often need a central line, a type of IV that can withdraw fluid or deliver medicine. But inserting a central line into the vein of a frail 90-year-old patient is extremely different from doing so in a morbidly obese patient, or in a patient who has had multiple lines placed in the past, which can cause scar tissue and change how a vein is accessed.
“We all know the difference as practiced physicians, but there’s no data to show it,” Pugh said.
Pugh and her team are still just out of the starting blocks, but the data they’ve collected — through recent, resident-fueled pilot studies and at a handful of medical and surgical conferences — have already begun to yield intriguing patterns.
Instead of analyzing every data point from a surgery, researchers look for trends. The motion-tracking sensors feed visual data back to a computer, revealing movement patterns of a surgeon’s hands, including where they pause and where they spend more time.
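One such trend, where the hands slow down and linger, could be pulled from a motion trace with something like the sketch below; the speed cutoff, minimum duration and column names are assumptions for illustration, not the team’s actual analysis.

```python
# Minimal sketch (hypothetical): flagging pauses in a hand-motion trace.
# The cutoff, minimum duration and columns are illustrative assumptions.
import numpy as np
import pandas as pd

def find_pauses(track, speed_cutoff=0.01, min_pause="0.5s"):
    """Return (start, end) intervals where hand speed stays below the cutoff."""
    dt = track.index.to_series().diff().dt.total_seconds()
    dist = np.sqrt(track[["x", "y", "z"]].diff().pow(2).sum(axis=1))
    speed = dist / dt
    slow = speed < speed_cutoff
    runs = (slow != slow.shift()).cumsum()   # label consecutive runs of slow samples
    pauses = []
    for _, chunk in speed[slow].groupby(runs[slow]):
        start, end = chunk.index[0], chunk.index[-1]
        if end - start >= pd.Timedelta(min_pause):
            pauses.append((start, end))
    return pauses

# Usage with the combined table from the earlier sketch:
# for start, end in find_pauses(combined): print(start, "->", end)
```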
“People would ask me, ‘Why would you want to measure surgical technique? Everyone operates so differently.’ But our data essentially shows the opposite,” Pugh said. “Whether surgeons use different instruments or add their own finesses to a procedure doesn’t really matter.”
So long as there aren’t complications — such as abnormal patient anatomy or the rare surgical error — the overall movement patterns that emerge are very similar.
Such data patterns can show where surgeries hit a snag. Take, for instance, a procedure whose movement pattern, when performed successfully, looks roughly like the body and wings of a butterfly. Surgeons who perform it without complications will trace that same butterfly-like pattern. Those who don’t might produce a pattern with lopsided wings, or one with two bodies.
“The motion sensors that track that surgeon’s fingers and hands produce a very visual result,” Pugh said. “And what’s even more interesting to see is that there doesn’t seem to be a correlation with instrument choice or whether the surgeon switched step 5 for step 6 — it’s the patient’s anatomy that most accurately correlates to the end pattern.”
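To picture that kind of result, the toy example below overlays one case’s hand path on a reference trace from uncomplicated repairs; the files and columns are hypothetical stand-ins for the sort of data such sensors might produce.

```python
# Minimal sketch (hypothetical): plotting a hand path against a reference
# pattern from uncomplicated cases. Files and columns are stand-ins.
import pandas as pd
import matplotlib.pyplot as plt

case = pd.read_csv("case_motion.csv")            # columns: x, y (projected hand path)
reference = pd.read_csv("reference_motion.csv")  # typical path from uncomplicated repairs

fig, ax = plt.subplots(figsize=(6, 6))
ax.plot(reference["x"], reference["y"], color="lightgray", lw=3, label="typical pattern")
ax.plot(case["x"], case["y"], color="crimson", lw=1, label="this case")
ax.set_title("Hand path during bowel repair")
ax.set_aspect("equal")
ax.legend()
plt.show()
```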
Big (data) dreams
The intertwining data streams from the various wearables on a surgeon’s body can reveal quite a bit about the procedure and the patient on the table. But more than that, Pugh and her colleagues see the technology as the basis of a data-first approach to teaching, learning and improvement.
“The innovative research led by Dr. Pugh’s team will provide incredible data-informed insights into surgeon efficiency of motion, tactile pressure and cognitive load while performing a variety of medical and surgical tasks,” said Mary Hawn, MD, who chairs the department of surgery and is the Emile Holman Professor of Surgery. “These types of data could be used to identify when a surgeon has mastered a procedure and when there may be a deficit.”
Some of the wearable applications are still a ways off from use in the operating room, Pugh said, as the technology is now used only for procedures on manikins and tissue bits. But there is one wearable Pugh has tested in patient surgeries: the electroencephalogram, or EEG, sensor.
During two surgeries, a gallbladder removal and an appendectomy, Pugh volunteered to wear the brain-wave-reading sensor on her own forehead. “First, we just need to verify that it works in the OR and that the data comes in successfully,” Pugh said. So far, it does. Through the EEG data, Pugh’s team could see that peaks in her brain activity while operating corresponded with the most demanding moments of the surgery, while lower-level activity coincided with routine surgical tasks, like suturing.
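The comparison itself can be simple: summarize the EEG signal within each annotated phase of a case and see which phases rise to the top. The sketch below does that with hypothetical file and column names, not the team’s actual tooling.

```python
# Minimal sketch (hypothetical): averaging an EEG summary signal within
# annotated phases of a case. File and column names are assumptions.
import pandas as pd

eeg = pd.read_csv("or_eeg.csv", parse_dates=["timestamp"]).set_index("timestamp")
phases = pd.read_csv("or_phases.csv", parse_dates=["start", "end"])  # phase, start, end

rows = []
for _, p in phases.iterrows():
    window = eeg.loc[p["start"]:p["end"], "workload"]
    rows.append({"phase": p["phase"], "mean_workload": window.mean()})

summary = pd.DataFrame(rows).sort_values("mean_workload", ascending=False)
print(summary)  # demanding phases at the top, routine ones like suturing lower down
```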
After a successful surgery, Pugh closed the patient and left the operating room, forgetting to remove the long strip on her forehead. “My colleagues who are aware of my research saw the EEG sensor and immediately knew what I had been doing,” she said. Now, Pugh’s often peppered with the same question: When can others test out the technology?
“This is an entirely new data endeavor; we’re learning in real time how best to propel this work, analyze the data and fast-track it in a safe way so that other surgeons can begin to use it in their ORs, too,” Pugh said.
“Right now, it’s just me who’s tested it during surgery, but my big dream is to have this be routine. I can’t tell you all the ways the data will be used, but it will definitely improve the care we provide.”