I have spent the last six years working in self-driving and disagree with the article. I am an autonomy engineer at a major self-driving focused company.<p>First: the choice is between “lidar and cameras” and “cameras alone.” I am not aware of any contenders who are using only lidar. That means the only downside to using lidar is cost.<p>Second, the article is incorrect. Lidar is extremely reliable for detecting dogs, pedestrians, and anything else you can think of. For lidars with sufficient intensity sensitivity, you can even read the text on signs.<p>Here’s a list of some tradeoffs for the available sensors.<p>Cost: Sterling Anderson said in a talk at MIT a few years ago that “there is no unobtanium in lidar.” Making lidar cheap is a matter of manufacturing scale, not of new physics. Cameras are still much cheaper and will remain so for some time; this alone might justify choosing cameras for consumer vehicles. The game-changing imaging radars that exist are not cheap either.<p>Long-Tail Events: On a camera-based system without depth sensors, the vehicle must react based on correct identification of obstacles. Consider an image of a pedestrian painted onto the road: a system with depth sensors will see that it is flat and will not need to stop.<p>Depth estimation with multiple cameras leaves a lot to be desired. It is bad for untextured objects, and poor illumination prevents texture from being visible to the cameras. Poor illumination has no effect on lidar or radar.<p>I would not bet my life on an estimated depth from a monocular camera, no matter how many layers the DNN has.<p>Weather: Lidar works fine in rain and snow. Degraded, but fine. Radar works fine in rain and snow. Cameras can be made to work well, especially if placed in self-cleaning enclosures. ATG’s vehicles famously made “whooshing” sounds as their pneumatic lens-cleaners forced water off the camera lenses.<p>Time-of-Day: Visible-light cameras will do poorly.
Every system I have seen has degraded camera performance at night. Some systems include an NIR channel to help, but you cannot bring enough onboard illumination to compete with the Sun. Most lidars choose a wavelength that leaves them largely unaffected by day vs. night; Ouster has different noise characteristics during the day, but not enough to matter.<p>Range: At long distances, no commercial sensor can beat the angular resolution of cameras. This is where they shine most, and it is why you see highway-focused systems emphasize cameras so heavily. Blackmore was a promising path to highway-range lidar capability, but they were bought by Aurora years ago now.<p>It is possible that cameras are completely sufficient. It is possible that Tesla is even ahead. But if so, it won’t be for the reasons this article gives.<p>The company that builds a functional autonomous car will introduce the largest sea change in transportation since the automobile. The value delivered by each car will be massive. An additional $8,000 for a single lidar is not a dealbreaker, and that’s at today’s cost for a nice Hesai; in 2025 it will be smaller still. In 2016, the only good lidar on the market was the Velodyne HDL-64, which cost $80k, and the Pucks failed too often.
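The stereo-depth skepticism above can be made concrete with a back-of-the-envelope calculation. For a stereo pair, depth is Z = f·B/d (focal length in pixels times baseline over disparity), so a fixed disparity-matching error produces a depth error that grows quadratically with range. The focal length, baseline, and matching-error numbers below are illustrative assumptions, not figures from any production system:

```python
# Sketch: stereo depth error vs. range, under assumed parameters.
# Depth from disparity: Z = f * B / d, so for a disparity error delta_d,
# the depth error is approximately delta_Z = (Z**2 / (f * B)) * delta_d.
f = 1000.0     # focal length in pixels (assumed)
B = 0.3        # stereo baseline in meters (assumed)
delta_d = 0.5  # half-pixel disparity matching error (assumed)

for Z in (10.0, 50.0, 100.0):
    delta_Z = (Z ** 2 / (f * B)) * delta_d
    print(f"at {Z:5.0f} m: stereo depth error ~ {delta_Z:.2f} m")
```

Even with these generous assumptions, the error at 100 m is on the order of tens of meters, while a typical lidar's range error stays roughly constant at the centimeter scale regardless of distance. That quadratic blow-up is the substance of the "I would not bet my life on it" point.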