The robot's manufacturer, of course, if the surgical robot was acting autonomously, i.e., unsupervised, as the title suggests. Whether it's AI-trained or hard-coded doesn't matter. It's the manufacturer's responsibility to ensure their robots are safe or, when they cannot guarantee safety, to abort and ask for help. That doesn't mean the surgery's success rate has to be 100% -- that's a different matter. But in the course of its operation it shouldn't cause undue harm.
Who is to blame when a treatment that's mostly helpful has a negative side effect?

Assuming the robot is prescribed because it's more effective than a human, and everybody involved understood the risks, I don't see an issue.

If it was done out of cheapness, it's the cheapskate's fault.