Self-driving cars are the future, as long as they're all rock-solid on cooperation. Think of a normal intersection with an inefficiently timed stop light, where half of each cycle is idle (every car on the green side has already gone through) because traffic patterns have shifted since the light was programmed. Cars that all cooperate could instead flow through the intersection pretty well just by taking turns.<p>About thirty percent of traffic in SF is people cruising around the block looking for parking [1]. Imagine telling your car, "go park somewhere, I don't care where", and having it come back when you need it. Heck, someone else could use it to get from A to B instead of letting it sit idle in a parking spot.<p>There's a lot of potential awesome here, but it all springs from rock-solid cooperation routines. If people manage to write aggressive, betraying self-driving algorithms, a lot of those traffic-light scenarios could easily get worse, not better. Trust issues are real: should the car's driving routines be black-boxed? If there were FOSS car brains, would we trust people not to mod their cars to do illegal stuff? Traditional traffic cops could probably handle much of what they handle now; the standard autodrive package would just need to expect, and account for, other cars being driven in poor, uncooperative ways.<p>[1]: <a href="http://shoup.bol.ucla.edu/CruisingForParkingAccess.pdf" rel="nofollow">http://shoup.bol.ucla.edu/CruisingForParkingAccess.pdf</a>
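<p>For what it's worth, the turn-taking idea can be sketched as a first-come-first-served slot reservation at the intersection. This is just a toy model (the function and parameter names are made up for illustration, not from any real autodrive system): each arriving car requests the earliest free crossing slot, so no "green time" is ever wasted on an empty approach the way it is with a fixed light cycle.

```python
def schedule_crossings(arrivals, crossing_time=2):
    """Hypothetical FCFS intersection manager.

    arrivals: list of (arrival_time, car_id) tuples from all approaches.
    Each car gets the earliest slot at or after its arrival; unlike a
    fixed light cycle, no time is reserved for an empty approach.
    Returns a dict mapping car_id -> assigned slot start time.
    """
    slots = {}
    next_free = 0
    # Serve cars strictly in arrival order, regardless of which road
    # they come from -- this is the "taking turns" part.
    for arrival, car in sorted(arrivals):
        start = max(next_free, arrival)
        slots[car] = start
        next_free = start + crossing_time
    return slots

# Cars from both roads interleave instead of waiting out an idle green phase.
print(schedule_crossings([(0, "north_1"), (1, "east_1"), (3, "north_2")]))
```

Of course, this assumes every car honestly reports its arrival and accepts its slot, which is exactly the cooperation problem: one defector that barges through out of turn breaks the whole schedule.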