Intuitive Surgical, an American medical device company, introduced the DaVinci surgical robots in the late 1990s, and they became groundbreaking teleoperation equipment. Expert surgeons could operate on patients remotely, manipulating the robotic arms and their surgical tools based on a video feed from the DaVinci's built-in cameras and endoscopes. Now, Johns Hopkins University researchers have put a ChatGPT-like AI in charge of a DaVinci robot and taught it to perform gallbladder-removal surgery.

Kuka surgeries

The idea of putting a computer behind the wheel of a surgical robot is not entirely new, but earlier systems mostly relied on pre-programmed actions. "The program told the robot exactly how to move and what to do. It worked like in these Kuka robotic arms, welding cars on factory floors," says Ji Woong Kim, a robotics researcher who led the study on autonomous surgery.

To improve on that, a team led by Axel Krieger, an assistant professor of mechanical engineering at Johns Hopkins University, built STAR: the Smart Tissue Autonomous Robot. In 2022, it successfully performed surgery on a live pig. But even STAR couldn't do it without specially marked tissues and a predetermined plan. STAR's key difference was that its AI could adjust that plan based on the feed from its cameras.

The new robot can do considerably more. "Our current work is much more flexible," Kim says. "It is an AI that learns from demonstrations." The new system is called SRT-H (Surgical Robot Transformer) and was developed by Kim and his colleagues, Krieger adds.

The first change they made was to the hardware. Instead of a custom robot like STAR, the new work relied on the DaVinci robot, which has become a de facto industry standard in teleoperated surgery, with over 10,000 units already deployed in hospitals worldwide. The second change was the software driving the system, which relied on two transformer models, the same architecture that powers ChatGPT.
One was a high-level policy module responsible for task planning and ensuring the procedure went smoothly. The other, a low-level module, executed the tasks issued by the high-level policy, translating its instructions into specific trajectories for the robotic arms.
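The division of labor between the two modules can be pictured as a simple control loop: the planner looks at the current state and emits a natural-language sub-task, and the executor turns each sub-task into motion. The sketch below is purely illustrative, not the actual SRT-H code; the class names, the sub-task list, and the toy waypoint trajectories are all invented for this example, and the real system uses trained transformer models in place of the hard-coded logic here.

```python
# Hypothetical sketch of a hierarchical two-policy loop, loosely modeled
# on the high-level/low-level split described above. Nothing here is
# real SRT-H code; the sub-tasks and trajectories are stand-ins.

from dataclasses import dataclass


@dataclass
class Observation:
    """Stand-in for what the robot perceives (camera images, arm state)."""
    step: int


class HighLevelPolicy:
    """Task planner: decides which sub-task comes next (invented list)."""
    SUBTASKS = ["grasp gallbladder", "clip duct", "cut duct"]

    def next_instruction(self, obs: Observation):
        if obs.step >= len(self.SUBTASKS):
            return None  # procedure complete
        return self.SUBTASKS[obs.step]


class LowLevelPolicy:
    """Executor: maps an instruction to a trajectory for the arms.

    In the real system a transformer would generate these motions; here
    we fabricate a fixed three-waypoint path just to show the interface.
    """

    def trajectory(self, instruction: str):
        base = float(len(instruction))  # arbitrary, instruction-dependent
        return [(base, 0.0, 0.0), (base, 1.0, 0.0), (base, 1.0, 1.0)]


def run_procedure():
    """Loop: plan a sub-task, execute it, repeat until the plan is done."""
    high, low = HighLevelPolicy(), LowLevelPolicy()
    executed = []
    step = 0
    while (instr := high.next_instruction(Observation(step))) is not None:
        waypoints = low.trajectory(instr)
        executed.append(f"{instr}: {len(waypoints)} waypoints")
        step += 1
    return executed
```

The key design point the sketch tries to capture is the interface between the two levels: the high-level policy never talks to the motors directly, and the low-level policy never decides what to do next, which is what lets each model be trained from demonstrations of its own part of the job.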