
Why Tesla removed radar and ultrasonic sensors [video]

241 points, posted by shekhar101 over 2 years ago

43 comments

CharlesW, over 2 years ago
I thought it was telling that Andrej immediately "reframed" the question because Lex asked the "wrong question". This is a classic evasion technique one learns from experience and/or media training. Lex's comment immediately after was a clever and gentle dig at Andrej's response.

It seemed like all the "full cost" negatives Andrej mentioned were related to Tesla's ability to execute, not to what would actually produce better results. Tesla would have to be able to reliably procure parts, write reliable firmware, create designs and processes that won't increase unexpected assembly-line stops, etc.

Regarding results, the best Andrej can do is, "In this case, we looked at using it and not using it, and the delta was not massive." In other words, the results are better, but not enough to make up for the fact that Tesla can't support additional sensors without incurring a prohibitive amount of additional risk *to Tesla*. Risk to passengers doesn't appear to be a consideration.
petilon, over 2 years ago
I didn't find his answers particularly convincing. His answer focused mainly on costs and on how "the best part is no part". We have already seen multiple accidents caused by the cameras' limitations [1] which would not have happened if Tesla used lidar.

Cameras have poor dynamic range and can be easily blinded by bright surfaces. While it is true that humans do fine with only eyes, our eyes are significantly better than cameras.

More importantly, expectations are higher when an automated system is driving the car. It is not sufficient if, in aggregate, self-driving cars have fewer accidents. If you lose a loved one in an accident that could have been easily avoided if a human were driving, you're not going to be mollified to hear that, in aggregate, fewer people are being killed by self-driving cars! You'd be outraged to hear such a justification! The expectation, therefore, is that in each individual injury accident a human clearly could not have handled the situation any better. Self-driving cars have to be significantly better than humans to be accepted by society, and that means better-than-human levels of vision (which lidar provides).

[1] https://www.youtube.com/watch?v=X3hrKnv0dPQ
TheLoafOfBread, over 2 years ago
This whole question about vision boils down to "humans don't need it, so cars shouldn't need it either." The problem with this statement is that humans don't have wheels to move around, they have legs, yet wheels are ridiculously simple compared to four legs tapping along a highway at 160 km/h. Same for birds: they don't need jet engines to fly either, but imagine an Airbus A380 flapping its wings, and the kind of complexity you would need to flap through the air at 800 km/h.
woeirua, over 2 years ago
Andrej's argument about more sensors adding entropy strikes me as disingenuous, considering that in the next question he says Tesla's biggest advantage over everyone else is "the fleet", which clearly introduces orders of magnitude more entropy into the system than anything else. Can you imagine the infrastructure required to gather video from "the fleet" anytime a car sees something unexpected? How about diagnosing what went wrong in that specific instance? How many thousands of these cases do they see every day?

Given the progress of the FSD "beta" to date, and the fact that Andrej _left_ Tesla, I'd wager that he knows this approach is a dead end, but he won't say that because he'd get himself in hot water with Elon.
dreamcompiler, over 2 years ago
It's obviously a stupid decision to remove a direct source of range data (radar and ultrasound) in favor of an indirect one (vision).

But on second thought this doesn't bother me that much, because Tesla FSD is absolute garbage even *with* radar (and I don't think Tesla will get away with selling the FSD snake oil for much longer), so if vision-only is good enough for the base-level lane-keeping autopilot functionality and it makes the cars cheaper, maybe that's a good thing.
nelox, over 2 years ago
"The world is designed for human visual consumption" and "[vision] has all the information you need for driving". While vision may be sufficient, I would say that other senses, such as hearing, touch and smell, augment driving very well, especially with regard to situational awareness. E.g. the sirens of emergency vehicles are typically the first indication of their presence, which often can be felt as well. The wail of a tornado siren, similarly. Loud, throbbing motorcycles do much to improve rider safety simply by broadcasting their presence. At railway level crossings, drivers should slow down, look and listen for oncoming trains. The smell of wildfire or bushfire smoke provides early warning of nearby danger. So to say vision is sufficient does not fully take into account the driver experience, especially where safety and situational awareness are concerned.
hbarka, over 2 years ago
Intuition plus other examples tell you that radar and ultrasonic sensors work. Why do we twist ourselves to believe otherwise?

Elon removed the radar and ultrasonics for the simple reason that their supply-chain logjam was screwing up the manufacturing schedule. They also realized that the profit margin can be sustained in an inflationary environment by simply removing these parts [1]. "Oh, we were going to remove them anyway, because humans can see fine with just eyes and no radar, so why can't cars?" Tesla then pulled the AI/vision hype lever once more to toss out another shiny tech object and get buyers to ignore the fact that there is a regression of features in the newer cars going forward.

[1] https://youtu.be/LS3Vk0NPFDE
oxplot, over 2 years ago
Here's the summary (mixed with observations from Munro and past Tesla presentations):

- Costs money: the physical sensors (a dozen of them), wiring them up, assembling them, maintaining inventory, writing the code for them, etc.

- Time spent maintaining and improving the software stack for the non-vision sensors, as well as the effort needed to fuse their data with vision, takes away from focusing on vision alone. It also holds vision back in relevant areas.

- The existing non-vision sensors used by Tesla are orders of magnitude lower fidelity than vision. Historically (as was the case with radar) this led to vision essentially having to override radar, because vision just performed much better (see AI Day 2021).

My take:

As with any new tech, it likely sucks at the start (think HDDs and SSDs, and how a mechanical thing with lots of moving parts was way more reliable than SSDs at the start). However, by moving past the local maximum, you get to innovate better and faster in the future.

In the case of ultrasonic sensors, they are for low-speed cases anyway, and most people are fine without them. The majority of fatalities and injuries happen at higher speeds.
a-dub, over 2 years ago
i'm not sure if i buy his argument that the "delta is not big enough." i have some experience with realtime ai systems and i've noticed something interesting about them.

they have a non-smooth capability curve: they can demonstrate proficiency in activities that, in regular computer programs or in people, would imply a complete and continuous path of capability mastered to achieve the demonstration, but ai systems are weird in that they can do amazing things yet have loads of little holes and failure modes along the way.

for example: gpt-3 can write you a shell script that will emit a c program that prints a poem about people you know, but will sometimes fail at very basic logic.

in light of that, having additional support data like radar or lidar seems like the right move for plugging all those little holes in capability that turn up in real ai systems.

because at the end of the day, when you're driving a car in the real world and lives are at stake, simply interpolating or averaging over uncertainty seems awfully deadly, and the only way to ameliorate that uncertainty seems to be multiple redundant sensory systems that can stand in for each other as conditions change. just like us!
01100011, over 2 years ago
I still suspect it's because they need to preserve compute resources for vision processing. Sensor fusion is likely eating up too much of their current HW and limiting their progress in other areas. I suspect Tesla will have to admit they need to upgrade the current HW before they ever 'solve' FSD.
JaggerJo, over 2 years ago
This sucks for parking. It is simply (physically) not possible for the existing cameras to see the area directly in front of the car.

So how would this work for parking?

A: Add more cameras so there are no dead areas in front of the car.

B: Build a model in vector space when driving towards a parking spot and assume blind spots don't change. (Still sucks.)
yreg, over 2 years ago
They are pretending as if the USS were there only for self-driving.

*I* use them as well!
throwaway4good, over 2 years ago
Let me reframe the answer:

"We removed them because they cost money. And we are trying to make money ... at least right now.

Listen, this pure autonomous self-driving car stuff is never going to work, so who cares if we have these gadgets or not ..."
Animats, over 2 years ago
From my DARPA Grand Challenge days, I used to have an Eaton VORAD automotive radar. This was an early design: 24 GHz, 1 scanning axis. It could see cars, but not bicycles, at least not reliably. For several months, I had one pointed out the window of my house, looking at an intersection. So I had a V-shaped wedge on screen, and could watch the cars go by.

It's a Doppler radar, so you don't get any info from things stationary relative to the radar, but you do get range and range rate. And the quality of that data is independent of distance. We used it mainly as a backup system for the world model built with LIDAR and (to a very limited extent) vision. The VORAD data could lower the speed limit for the rest of the system, and if a collision was about to happen, it would slam on the brakes independently of the world model.

The big problem with coarse automotive radar is that it can detect targets, but doesn't tell you much about them. Cars, trash cans, and metal road debris all look about the same. There's also a lot of trouble from big flat metal surfaces being mirrors for radar. We were willing to accept slowing down for ambiguous cases until the other sensors could get a good look. Drivers hate that if road-oriented systems do it.

Modern units are up around 70-80 GHz and often have 2D scanning, which is a big help. I haven't seen the output from a modern automotive radar. I was expecting that by now, low-cost millimeter microwave systems (200-300 GHz) would be available, providing detailed images somewhat coarser than you can get with light. You get range and range rate, and you can usually steer the beam electronically rather than mechanically. The technology exists to get high-resolution radar images, but is mostly used for scanning people for weapons at checkpoints. It hasn't become cheap yet.
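A quick numeric illustration of the range-rate point above (not from the video; the carrier frequency matches the old 24 GHz units and the Doppler shift is made up): for a monostatic Doppler radar, radial velocity falls straight out of the measured frequency shift, which is why the data quality doesn't degrade with distance.

```python
# Sketch: radial velocity (range rate) from the Doppler shift of a monostatic radar.
# The two-way path gives f_doppler ≈ 2 * v_radial * f_carrier / c, so invert for v.
C = 3.0e8  # speed of light, m/s

def range_rate(f_doppler_hz: float, f_carrier_hz: float) -> float:
    """Radial velocity in m/s; a 0 Hz shift means the target is stationary relative to the radar."""
    return f_doppler_hz * C / (2.0 * f_carrier_hz)

# A 24 GHz unit (like the old VORAD) measuring a 4.4 kHz shift:
print(range_rate(4.4e3, 24e9))  # ≈ 27.5 m/s, roughly 100 km/h of closing speed
```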
dane-pgp, over 2 years ago
I think there's an interesting general optimisation problem here of balancing the accuracy/performance of a software/hardware system against the goal of making that system easier to iterate on and develop.

Presumably this is a matter of working out if you are at a local maximum or not, and thinking about what properties the ideal solution will have. It also matters if you have other competitors that might be racing towards the ideal solution faster than you, potentially patenting their progress along the way.
friend_and_foe, over 2 years ago
I remember watching an interview with George Hotz when Comma.ai was young, where he essentially said this as a critique of Tesla. He's a bit of a showman and likes to invite a little controversy when he says things, but I found myself agreeing with his point. It's not surprising to see a practical company like Tesla face the facts about all these sensors eventually.
60Vhipx7b4JL, over 2 years ago
From an engineering perspective I would ask: can your sensor package understand the environment to the required (low) failure rate?

Radar/lidar/ultrasonic is going to give you information that your camera systems will not give you. It does not matter if the delta of information is little. If this little is required because you can't obtain it otherwise, you still need it.

If you just rely on the fleet, you rely on the things you have seen. What about the objects that you have not yet seen?
post_break, over 2 years ago
I think this is the real reason: https://www.youtube.com/watch?v=LS3Vk0NPFDE

Cost cutting.
nova22033, over 2 years ago
At 2:05: "Suddenly you need a column in your sqlite telling you what type of sensor it is..."

Seriously? This is a major technical challenge?
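For context on why that line draws mockery: if the "sqlite" remark is taken literally, tagging each reading with its sensor type is a one-statement schema change. A hypothetical sketch using Python's built-in sqlite3 module (the database, table, and column names are invented here, not Tesla's):

```python
import sqlite3

conn = sqlite3.connect("sensor_log.db")  # hypothetical logging database
conn.execute("CREATE TABLE IF NOT EXISTS readings (ts REAL, value REAL)")

# The "major technical challenge": a column saying which kind of sensor produced the row.
cols = [row[1] for row in conn.execute("PRAGMA table_info(readings)")]
if "sensor_type" not in cols:
    conn.execute("ALTER TABLE readings ADD COLUMN sensor_type TEXT DEFAULT 'camera'")

conn.execute(
    "INSERT INTO readings (ts, value, sensor_type) VALUES (?, ?, ?)",
    (1667000000.0, 42.7, "radar"),
)
conn.commit()
```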
gnicholas, over 2 years ago
How does it make sense to not even have sensors for parking? If you think they don't help during normal-speed driving, that's one thing. But they obviously help during parking, since (IIRC) they've had to disable certain autonomous features until they get their vision-based systems upgraded to be able to fill in this gap.
eachro, over 2 years ago
So the key question is: how much of an improvement do radar/sensors/etc. give you over just using computer vision?
xnx, over 2 years ago
Sensor fusion seems to be another thing that Tesla is not good at.
EVa5I7bHFq9mnYK, over 2 years ago
From a first-principles point of view, it comes down to radar and ultrasonic having much longer wavelengths than optical, which results in a much lower amount of incoming information, worse resolution, and higher interference when many cars radiate the same signals on a busy street.
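A back-of-the-envelope way to see the resolution point: diffraction limits angular resolution to roughly θ ≈ 1.22·λ/D for a given aperture D, so for the same size sensor a millimetre-wave radar is vastly coarser than an optical camera. A rough sketch (the aperture and frequencies are illustrative assumptions; real automotive radars use antenna arrays and beamforming, so this understates what they achieve in practice):

```python
# Sketch: diffraction-limited angular resolution, theta ≈ 1.22 * wavelength / aperture.
C = 3.0e8        # speed of light, m/s
APERTURE = 0.05  # assume a 5 cm aperture for both sensors (illustrative only)

def angular_resolution_rad(wavelength_m: float, aperture_m: float = APERTURE) -> float:
    return 1.22 * wavelength_m / aperture_m

radar_wavelength = C / 77e9   # 77 GHz automotive radar -> ~3.9 mm
optical_wavelength = 550e-9   # green light -> 550 nm

print(angular_resolution_rad(radar_wavelength))    # ~0.095 rad, about 5.4 degrees
print(angular_resolution_rad(optical_wavelength))  # ~1.3e-5 rad, under a thousandth of a degree
```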
0xfffafaCrash, over 2 years ago
Seems like a very political answer from Andrej. Of course he’s not going to outright say “yeah, we’re prioritizing the profit margin over accuracy and safety considerations” if he wants to keep his job, but that seems to be the short of it. Others may choose to follow, at least in the short term, but it won’t be because “entropy” makes the system worse (you can always build a model without a data source and then refine the results based on the added value of that data source); doing so just because it will save lives doesn’t cut it when the goal is to cut corners and costs to maximize profit. I can believe that some types of sensors aren’t worth the trouble in terms of additional signal-to-noise ratio, but I can’t believe this is one of them.
jakeogh, over 2 years ago
Giving human[1][2] drivers better situational awareness[3] is the future. Specifically open[4]:

a. Windshields that clean the inside as well as the outside.

b. Better eyeglasses[5].

c. User-controllable hi-res HUD thermal IR overlay.

d. Headlights with adaptive notch filters so the oncoming vehicle can pick an empty spectral range... without the source being monochromatic (with required adaptive filters on the receiving end)... and/or really good coronagraphs.

e. Brake control[6].

Any entity capable of driving[7] in a population of humans (including adversarial humans) is sentient[8], and has real skin in the game. It would be unethical to lock one in a car:

[1] https://news.ycombinator.com/item?id=33213860 (analog FPGA)

[2] https://news.ycombinator.com/item?id=21106367 (general AI)

[3] https://news.ycombinator.com/item?id=16646112 (2018)

[4] https://www.tesla.com/blog/all-our-patent-are-belong-you (2014)

[5] https://patents.google.com/patent/US7744217 (2007)

[6] https://news.ycombinator.com/item?id=18013388 (2018)

[7] No human behind the wheel, no human to correct impending mistakes, but (critically) with one or more humans in the car.

[8] The idea that non-biological machines can have 'self' is a window into modern mass transformation. Please check out the analog FPGA experiments linked above.
danbmil99, over 2 years ago
Musk is also famously against using lidar. He doesn't understand/accept that an autonomous vehicle needs any sensors that humans do not possess.
sidcool, over 2 years ago
I feel that was more an operational answer than an engineering one. I still feel that depth perception from vision alone is unreliable.
dncornholio, over 2 years ago
Just remember, folks, we will have full self-driving vehicles by the end of this year!
ra7, over 2 years ago
If I were a Tesla fan/investor/FSD customer, I'd be very concerned that the former (effective) tech lead of FSD doesn't know about sensor fusion, or that it's a solved problem for the majority of companies in this space.
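For readers who haven't run into the term: "sensor fusion" here typically means combining noisy estimates from different sensors into one better estimate. A toy inverse-variance (Kalman-style) sketch, assuming independent Gaussian errors and made-up noise figures, not any particular vendor's stack:

```python
# Toy fusion of two independent range estimates; weighting by inverse variance
# gives the minimum-variance combined estimate under Gaussian assumptions.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # always <= min(var_a, var_b)
    return fused, fused_var

# Camera depth says 42 m with ~3 m sigma; radar says 45 m with ~0.5 m sigma (made up).
print(fuse(42.0, 3.0**2, 45.0, 0.5**2))  # ≈ (44.9 m, 0.24 m^2)
```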
lawrenceyan, over 2 years ago
I can see a path where, with only cameras, Tesla might be able to reach level 4 autonomy in perfect conditions.

But the biggest thing that comes to mind is what happens at night. Are they only going to enable self-driving during the day?
ornel, over 2 years ago
Video summary:

https://www.summarize.tech/www.youtube.com/watch?v=_W1JBAfV4Io
superkuh, over 2 years ago
Humans don't use a radar or ultrasound sense to drive. If we want cars that drive like humans drive, they should use the same senses. For example, in the northern parts of the USA there is snow cover for much of the year, and lanes are emergent from flocking, without any absolute reference to the actual location of the lanes. The reason everyone chooses the same places to drive is that they see the same environment with the same senses. Even if autonomous driving with radar and ultrasound were made to work, if it picks the correct lane position and all the humans pick the wrong new lane position, then the car is wrong, not the humans.
mavili, over 2 years ago
Did anyone else catch Andrej's "sqlite" comment? If that is not just a simple analogy, Tesla may be using sqlite in their cars? :D
sgjohnson, over 2 years ago
Does this mean that now when someone smashes one of their bumpers on a Tesla, the insurance will no longer have to total the entire vehicle?
mongol, over 2 years ago
What is it that makes Lidar so expensive? Is it something intrinsic to the technology that prevents costs from coming down?
bekantan, over 2 years ago
He explains it quite well: all the necessary information is already in pixel space, and adding more sensors slows the team down more than it improves system performance. My understanding is that the major blockers are not in the perception area anyway; it would be great if someone with relevant experience could comment on whether this is indeed the case.
nielsbot, over 2 years ago
All I heard was "cost savings, cost savings, cost savings."
solardev, over 2 years ago
Cuz Muskdaddy wanted mo money. There, mystery solved.
julienreszka, over 2 years ago
Geohot said something similar years ago already
smrtinsert, over 2 years ago
Hm, I'd rather have someone from Twitter audit this decision.
bigtex, over 2 years ago
Did Lex ask him why Teslas love to crash into emergency vehicles?
KVFinn, over 2 years ago
TLDR: Tesla thinks LIDAR hardware is more expensive than the performance improvement it provides.

I didn't like his line of logic about how vision is necessary and sufficient because that's how humans drive. Okay, sure, but if some combination of non-human sensors *could* drive better and/or cheaper than a vision-only driving system, surely he would not argue for sticking with vision only? Maybe adding non-vision sensors lets you save hardware and software resources on the vision part of the system.
justapassenger, over 2 years ago
TL;DW.

Tesla doesn't know how to do change management.