The fundamental problem with self-driving cars in cities is that they are expected to be 100% safe. They could be 10 times as safe as regular cars, but that wouldn't save them from headlines like "robot car kills an innocent person". And given how chaotic and unpredictable the situation on the roads can be in a densely populated city without designated spaces for cyclists, 100% safety is impossible. Even if the AI itself is perfect, there's the physics of mass, inertia, velocity, visibility, etc., which will sometimes produce situations that lead to collisions, unless self-driving cars are either physically segregated or limited to ridiculously low speeds at which they become useless.<p>Solution 1: build separate bike paths. This totally works in many cities, but likely can't happen in SF for any number of reasons. Solution 2: ban self-driving cars. Solution 3: pay megabucks to a PR company to create some kind of narrative in which safety is much less important than the benefits of self-driving cars, and if you say otherwise, you are a bad person and should be cancelled. Not sure if that's possible, but it's probably less impossible than making a 100% safe self-driving car.<p>A sane solution would be to figure out how much safety we actually expect, make the effort to get as close to 100% as possible while accepting it will never be 100%, and rationally investigate each failure, while recognizing that some of them are inevitable. But that would be a totally outdated and unusual pattern of behavior, so I don't expect it to happen in practice.