The level-2 driving that Tesla is pushing seems like a worst-case scenario to me. Requiring the driver to be awake and alert while not requiring them to actually <i>do</i> anything for long stretches of time is a recipe for disaster.<p>Neither the driver nor the car manufacturer will have clear responsibility when there is an accident. The driver will blame the system for failing, and the manufacturer will blame the driver for not paying sufficient attention. It's lose-lose for everyone: the company, the drivers, the insurance companies, and other people on the road.
Tesla's system doesn't have enough sensors. Musk forced his engineers to try to do this almost entirely with vision processing, and that was a terrible decision. Vision processing isn't that good yet. Everybody else uses LIDAR.<p>I've been saying for years that the right approach was to take the technology from Advanced Scientific Concepts' flash LIDAR and get the cost down. I first saw it demonstrated in 2004, on an optical bench in Santa Monica. It became an expensive product, mostly sold to DoD. It's expensive because the units require custom InGaAs sensor chips (not ordinary silicon) and aren't made in quantity. SpaceX uses one of their LIDAR units to dock the Dragon spacecraft with the space station.<p>Last year, Continental, the big century-old German auto parts maker, bought the technology from Advanced Scientific Concepts and started getting the cost down.[1] Volume production is planned for 2020, and interim LIDAR products are already shipping in volume. Continental is quietly making all the parts needed for self-driving: LIDAR, radar, computers, actuators, cameras, and software for integrating the sensors into an "environment model". They design and make all the parts, and provide some of the system integration.<p>Apple and Google were trying to avoid becoming mere low-margin Tier I auto parts suppliers. Continental, though, is quite successful as a Tier I auto parts supplier: revenue of €40 billion in 2016, earnings of about €2.8 billion, and a dividend of €850 million. They can make money on low-margin parts.<p>Continental may end up quietly ruling automatic driving.<p>[1] <a href="https://www.continental-automotive.com/en-gl/Passenger-Cars/Chassis-Safety/Advanced-Driver-Assistance-Systems/Lidars/High-Resolution-3D-Flash-Lidar" rel="nofollow">https://www.continental-automotive.com/en-gl/Passenger-Cars/...</a>
The only industry to have produced truly driverless public transportation systems is the rail industry. Not aeronautics. Rail systems happen to be my business, and what I read here makes me very worried.<p>I don't think the majority understands what safety means in mass transportation. It's not about running miles and miles without accidents and basically saying "see?". It's about demonstrating /by design/ that the /complete/ system over its /complete/ lifetime will not kill anyone. In terms of probability of failure, that translates into demonstrated hazard rates below 1E-9 per hour, /including the control systems/. This takes very special techniques, and if it could have been done using only vehicle sensors, we would have adopted it long ago. I am also sorry to report that doubling cameras and sensor fusion will not get you to an acceptable safety level. We've tried that too, rookies.<p>Is it "fair", to use Elon's argument? After all, isn't additional safety enough compared to the existing situation? Ah, but we have been there too! For driver assistance it is indeed better. Similar systems were deployed during the second half of the 20th century (e.g. KVB, ASFA, etc.). But the limit is clear: it only /improves/ the driver's failure rate. It does not substitute for the driver. If you substitute, you have to do much, much, much better. Nobody will ride a driverless vehicle on the strength of the explanation that it is, you know, "already an improvement over a typical driver". Is it fair? Maybe not, but that's the whole point of entrusting lives to a machine.
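To put that 1E-9-per-hour figure in perspective, here's a back-of-the-envelope calculation. This is just a sketch: the fleet size and usage numbers are my own illustrative assumptions, not anyone's actual data.<p><pre><code># Back-of-the-envelope: what a 1E-9/h tolerable hazard rate means at fleet
# scale. FLEET_SIZE and HOURS_PER_VEHICLE_YEAR are illustrative assumptions.

FLEET_SIZE = 1_000_000        # assumed vehicles in service
HOURS_PER_VEHICLE_YEAR = 400  # assumed driving hours per vehicle per year

fleet_hours = FLEET_SIZE * HOURS_PER_VEHICLE_YEAR  # 4e8 hours/year

for label, hazard_rate in [("rail-style 1E-9/h target", 1e-9),
                           ("hypothetical 1E-6/h system", 1e-6)]:
    # Expected dangerous failures across the whole fleet in one year.
    print(f"{label}: ~{hazard_rate * fleet_hours:.1f} dangerous failures/year")

# rail-style 1E-9/h target: ~0.4 dangerous failures/year
# hypothetical 1E-6/h system: ~400.0 dangerous failures/year
</code></pre><p>That's the point: a system that is merely "better than a typical driver" can still produce hundreds of dangerous failures per year once deployed at fleet scale, which is why rail demands the 1E-9 class of evidence.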
What befuddles me is that in all these discussions about self-driving cars, seemingly no one refers to the massive body of knowledge in this area that comes from the aviation world.<p>I've posted variants of this same comment several times and I'm starting to feel like a broken record.<p>Look at studies of efforts to make planes safer by removing the human element. While automation like autopilot has made things safer, there comes a point where more automation reduces safety: pilots are no longer alert, or don't trust the instruments, or can't fully manually override the automation.<p>Call it the uncanny valley of automation safety.<p>Bridging the last few percent for true automation (i.e. where vehicles aren't designed to have drivers or pilots at all) is going to be _incredibly_ difficult, to the point where I'm not convinced it won't require something resembling a general AI.<p>All of this is why I think driverless cars are going to take much longer than many expect.
The biggest news is buried at the end: several engineers have quit since October 2016 (including the head of Autopilot), when Tesla started selling "fully autonomous driving" hardware upgrade packages. The engineers don't agree that the hardware is capable of supporting full autonomy, and say it was ultimately a marketing decision.
I just ordered a Model S with Autopilot, and as I've been reading the comments on the various Tesla forums, I'm not sure I'm ever going to use it. Some of the stories are honestly terrifying (sudden deceleration on the highway, swerving into other lanes, etc.).
<i>> In May 2015, Eric Meadows, then a Tesla engineer, engaged Autopilot on a drive in a Model S from San Francisco to Los Angeles. Cruising along Highway 1, the car jerked left toward oncoming traffic. He yelped and steered back on course, according to his account and a video of the incident.</i><p>Is this video online?
A reference to Chris Lattner:<p>"In recent months, the team has lost at least 10 engineers and four top managers—including Mr. Anderson’s successor, who lasted less than six months before leaving in June."
Since we're finally getting some refutations of the self-driving hype, let me drop some quotes here:<p><i>“I tell adult audiences not to expect it in their lifetimes. And I say the same thing to students”<p>"Merely dealing with lighting conditions, weather conditions, and traffic conditions is immensely complicated. The software requirements are extremely daunting. Nobody even has the ability to verify and validate the software. I estimate that the challenge of fully automated cars is 10 orders of magnitude more complicated than [fully automated] commercial aviation."</i><p>- Steve Shladover, transportation researcher at the University of California, Berkeley<p><a href="http://www.automobilemag.com/news/the-hurdles-facing-autonomous-vehicles/" rel="nofollow">http://www.automobilemag.com/news/the-hurdles-facing-autonom...</a><p><i>"With autonomous cars, you see these videos from Google and Uber showing a car driving around, but people have not taken it past 80 percent. It's one of those problems where it's easy to get to the first 80 percent, but it's incredibly difficult to solve the last 20 percent. If you have a good GPS, nicely marked roads like in California, and nice weather without snow or rain, it's actually not that hard. But guess what? To solve the real problem, for you or me to buy a car that can drive autonomously from point A to point B—it's not even close. There are fundamental problems that need to be solved."</i><p>- Herman Herman, director of Carnegie Mellon University's National Robotics Engineering Center (NREC)<p><a href="https://motherboard.vice.com/en_us/article/d7y49y/robotics-lab-uber-gutted-says-driving-cars-are-not-even-close-carnegie-mellon-nrec" rel="nofollow">https://motherboard.vice.com/en_us/article/d7y49y/robotics-l...</a><p><i>"While I enthusiastically support the research, development, and testing of self-driving cars, as human limitations and the propensity for distraction are real threats on the road, I am decidedly less optimistic about what I perceive to be a rush to field systems that are absolutely not ready for widespread deployment, and certainly not ready for humans to be completely taken out of the driver’s seat."</i><p>- Mary Cummings, director of the Humans and Autonomy Laboratory at Duke<p><a href="https://www.commerce.senate.gov/public/_cache/files/c85cb4ef-8d7f-40fb-968c-c476c5220a3c/8BC0CC7E137483CEFD0C928ECB14E74E.cummings-senate-testimony-2016.pdf" rel="nofollow">https://www.commerce.senate.gov/public/_cache/files/c85cb4ef...</a> [pdf]<p>All quotes pulled from this article (which is really quite good and you should read it in full):<p><a href="https://www.nakedcapitalism.com/2016/10/self-driving-cars-how-badly-is-the-technology-hyped.html" rel="nofollow">https://www.nakedcapitalism.com/2016/10/self-driving-cars-ho...</a>
Level 2 still doesn't drive smoothly, as many owners have confirmed on the Tesla forums. It does require you to jiggle the wheel every few minutes to confirm you're alert. There's also a class-action suit over alleged AP2 defects:
<a href="https://www.hbsslaw.com/cases/tesla-autopilot-2-ap2-defect" rel="nofollow">https://www.hbsslaw.com/cases/tesla-autopilot-2-ap2-defect</a>
Honestly, I don't understand why the automobile industry doesn't learn from the airline industry, which worked out years ago how to balance autopilot capabilities against the need for pilots to remain engaged and attentive. Simply implement drive-by-wire, similar to Airbus's fly-by-wire systems: the driver's inputs to the controls would still be required, but the autonomous systems could prevent or limit certain actions (such as accelerating into a stopped vehicle or swerving off the road).
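For what it's worth, here's a minimal sketch of what that kind of envelope protection might look like for a drive-by-wire throttle. Everything here (the Perception fields, the thresholds, the limit_throttle function) is hypothetical, just to illustrate the idea of keeping the driver in command while clamping inputs that would violate a safety envelope:<p><pre><code># Minimal sketch of Airbus-style envelope protection for a drive-by-wire
# throttle. The driver's input is always the starting point; the system
# only clamps commands that would violate the envelope. All names and
# thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Perception:
    obstacle_distance_m: float   # range to nearest obstacle ahead
    obstacle_speed_mps: float    # its speed (0.0 if stopped)
    own_speed_mps: float

MIN_TIME_TO_COLLISION_S = 2.0    # assumed hard-braking threshold

def limit_throttle(driver_throttle: float, p: Perception) -> float:
    """Return the throttle actually sent to the powertrain, in [-1, 1],
    where negative values command braking."""
    closing_speed = p.own_speed_mps - p.obstacle_speed_mps
    if closing_speed <= 0:
        return driver_throttle            # not closing; envelope not at risk
    ttc = p.obstacle_distance_m / closing_speed  # time to collision, seconds
    if ttc < MIN_TIME_TO_COLLISION_S:
        return -1.0                       # brake, regardless of driver input
    if ttc < 2 * MIN_TIME_TO_COLLISION_S:
        return min(driver_throttle, 0.0)  # allow coasting, refuse acceleration
    return driver_throttle                # envelope respected; pass through

# Driver floors it toward a stopped car 20 m ahead while doing 25 m/s:
print(limit_throttle(1.0, Perception(20.0, 0.0, 25.0)))  # -> -1.0 (brakes)
</code></pre><p>The steering case (swerving off the road) would be analogous: pass the driver's wheel input through a limiter that knows the lane geometry.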
Please stop posting paywalled articles, especially from the WSJ. This community represents the future of the internet. I don't know what the answer is for making sure content providers get paid, but the WSJ model isn't it. So let's vote with our attention (or lack thereof) and kill this annoying practice before it makes the internet an even more walled and unpleasant place.