When surgeons are trained, they usually need the supervision of more experienced doctors who can mentor them on their technique. That may be changing due to a new artificial intelligence system developed by Caltech researchers and Keck Medicine of USC urologists that aims to provide valuable feedback to surgeons on the quality of their work.
The goal of the new Surgical AI System (SAIS) is to provide surgeons with objective performance evaluations that can improve their work and, by extension, the outcomes of their patients. When provided with a video of a surgical procedure, SAIS can identify which type of surgery is being performed and how skillfully the surgeon executed it.
The system was introduced through a series of articles in the journals Nature Biomedical Engineering, npj Digital Medicine, and Communications Medicine, which were published concurrently at the end of March 2023.
"In high stakes environments such as robotic surgery, it is not realistic for AI to replace human surgeons in the short term," says Anima Anandkumar, Bren Professor of Computing and Mathematical Sciences and senior author of the studies. "Instead, we asked how AI can safely improve surgical outcomes for the patients, and hence, our focus on making human surgeons better and more effective through AI."
SAIS was trained using a large volume of video data that was annotated by medical professionals. Surgeons' performances were assessed down to the level of individual discrete motions, such as holding a needle, driving it through tissue, and withdrawing it from tissue. After training, SAIS was tasked with reviewing and evaluating surgeons' performance across a wide range of procedures, using video drawn from a variety of hospitals.
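For readers curious what this kind of pipeline looks like in code, the sketch below trains a small clip-level classifier on per-frame video features paired with skill labels. The model, feature dimensions, labels, and random stand-in data are assumptions for illustration only; they are not the published SAIS architecture or training code.

```python
# Minimal sketch (illustrative, not the authors' code): train a clip-level
# skill classifier on per-frame video features with surgeon-provided labels.
import torch
import torch.nn as nn

class ClipSkillClassifier(nn.Module):
    """Scores a short clip of per-frame features as low- or high-skill."""
    def __init__(self, feat_dim=512, num_classes=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=feat_dim, nhead=8,
                                           batch_first=True)
        self.temporal_encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, frame_feats):            # (batch, frames, feat_dim)
        encoded = self.temporal_encoder(frame_feats)
        return self.head(encoded.mean(dim=1))  # pool over time, then classify

model = ClipSkillClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

# Random "frame features" stand in for real annotated surgical video; a real
# pipeline would extract features with a pretrained vision backbone.
frame_feats = torch.randn(4, 30, 512)   # 4 clips, 30 frames each
labels = torch.randint(0, 2, (4,))      # 0 = needs improvement, 1 = proficient

loss = loss_fn(model(frame_feats), labels)
loss.backward()
optimizer.step()
```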
"SAIS has the potential to provide surgeon feedback that is accurate, consistent, and scalable," says Dani Kiyasseh, lead author of the studies, a former postdoctoral researcher at Caltech and now a senior AI engineer at Vicarious Surgical. The hope, according to the researchers, is for SAIS to provide surgeons with guidance on what skill sets need to be improved.
To make the tool more valuable for surgeons, the team developed the AI's ability to justify its skill assessments. The AI can now inform surgeons of its assessment of their skill level and explain the rationale behind that assessment by pointing to specific video clips as evidence.
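One common way a model can point back to the frames behind its judgment is attention-style pooling, sketched below under assumed names and shapes: the weight the model places on each frame can be read back as the clips it cites as evidence. This is illustrative and not necessarily how SAIS produces its explanations.

```python
# Sketch of attention pooling whose weights double as frame-level evidence.
import torch
import torch.nn as nn

class AttentionPooling(nn.Module):
    def __init__(self, feat_dim=512):
        super().__init__()
        self.scorer = nn.Linear(feat_dim, 1)

    def forward(self, frame_feats):                  # (batch, frames, feat_dim)
        weights = torch.softmax(self.scorer(frame_feats), dim=1)
        pooled = (weights * frame_feats).sum(dim=1)  # clip-level summary
        return pooled, weights.squeeze(-1)           # per-frame importance

pool = AttentionPooling()
feats = torch.randn(1, 30, 512)
summary, frame_importance = pool(feats)
top_frames = frame_importance.topk(3).indices  # frames the model would "cite"
```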
"We were able to show that such AI-based explanations often align with explanations that surgeons would have otherwise provided," Kiyasseh says. "Reliable AI-based explanations can pave the way for providing feedback when peer surgeons are not immediately available."
Early on, the researchers testing SAIS noticed an unintended bias: the AI sometimes rated surgeons as more or less skilled than their experience would suggest, basing its assessment solely on their overall movements. To address this, the researchers guided the system to focus exclusively on the pertinent portions of the surgical video. Narrowing the focus mitigated, though did not eliminate, the bias, which the researchers are continuing to address.
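One way such "guidance" can be implemented, sketched below as an assumption rather than the method reported in the papers, is an auxiliary loss that nudges the model's frame-importance weights toward frames that humans marked as relevant to the skill being assessed.

```python
# Sketch of attention supervision: penalize the model when its frame-importance
# weights drift from human-marked relevant frames. Loss form, mask format, and
# weighting are illustrative assumptions.
import torch
import torch.nn.functional as F

def attention_guidance_loss(frame_importance, human_mask):
    """frame_importance: (batch, frames) softmax weights from the model.
    human_mask: (batch, frames) float 0/1 mask of frames flagged as pertinent."""
    target = human_mask / human_mask.sum(dim=1, keepdim=True).clamp(min=1.0)
    return F.kl_div(frame_importance.clamp(min=1e-8).log(), target,
                    reduction="batchmean")

# Combined objective (hypothetical weighting):
# total_loss = classification_loss + 0.1 * attention_guidance_loss(weights, mask)
```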
"Human-derived surgical feedback is not presently objective nor scalable," says Andrew Hung, a urologist with Keck Medicine of USC and associate professor of urology at Keck School of Medicine of USC. "AI-derived feedback, such as what our system delivers, presents a major opportunity to provide surgeons actionable feedback."
The studies are titled "A vision transformer for decoding surgeon activity from surgical videos," "Human visual explanations mitigate bias in AI-based assessment of surgeon skills," and "A multi-institutional study using artificial intelligence to provide reliable and fair feedback to surgeons." This research was funded by the National Cancer Institute.