This scares me. Self-driving vehicles should remain controllable by a human driver (and to date they are). Removing that presents a couple of issues:<p>1. The car is now under the complete control of a computer, not a human driver. Passengers have only a limited number of ways to stop the vehicle or improve their situation in the event of an incident.<p>2. It _requires_ a computer to make complex moral decisions about the safety of the passengers, dictated by the preferences/views/biases of the manufacturer/regulators.<p>It seems we have crossed the line between "should technology solve this issue" and "could technology solve this issue".