TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

Tesla’s self-driving technology fails to detect children in the road, tests find

75 points | by philk10 | almost 3 years ago

17 comments

labrador | almost 3 years ago
I'm curious why governments don't have a driving test for "self-driving" AIs, just like they have a test for 16-year-old teenagers claiming they are ready to drive.
goethes_kind | almost 3 years ago
Pure madness. Can we stop worshiping software? If these selfish assholes were using their own private infrastructure I would have nothing against it, but they are imposing themselves on the innocent public.
rasz | almost 3 years ago
It's not that it tries and fails; it doesn't even attempt to. The same goes for animals (deer running across, etc.) or debris on the road. FSD is only programmed to detect other cars and road markings. It doesn't even detect physical road boundaries and will happily drive into a ditch or concrete pillar if the road markings tell it to.
lovetocode | almost 3 years ago
I wish Tesla had focused more on using the CV technology to strengthen its collision avoidance and safety systems, which are lacking. There was a lot of emphasis on marketing FSD, but I think FSD would have been a byproduct of a robust and reliable collision avoidance system. I don't care about FSD -- I care about getting my family to where we are going safely! Tesla should warn me if I am about to run a stop light or stop sign. It would also be nice to have an infant protection system that detects children locked in vehicles and takes action to protect them from overheating -- which seems plausible based on the hardware inside my 2021 Model Y.
jsight | almost 3 years ago
The actual report is here: https://dawnproject.com/wp-content/uploads/2022/08/The_Dawn_Project___Tesla_FSD_Test__8_.pdf

It failed to detect a stationary mannequin in one specific scenario. Does this generalize? Does this mean there's a major safety problem? These are assumptions that this test cannot prove.

A comprehensive safety analysis also has to include these scenarios: https://youtu.be/hx7BXih7zx8?t=205
stockerta | almost 3 years ago
I just don't get it. How the hell is this legal? Live beta testing their shit on public roads.
tim333 | almost 3 years ago
It seems the test was faked: Dan O'Dowd, a rival software maker who arranged it, has a long-running grudge against Tesla. https://electrek.co/2022/08/10/tesla-self-driving-smear-campaign-releases-test-fails-fsd-never-engaged/

They said FSD failed, but it wasn't even on.

He also used to go on about the "linux threat" of people using Linux rather than his software: https://www.eetimes.com/odowd-then-and-now-on-linux/
rvz | almost 3 years ago
Here is a comparison of this test with an older video:

https://twitter.com/TaylorOgan/status/1478802681141645322 (January 5th, 2022)

Same author, same test, months later:

https://twitter.com/TaylorOgan/status/1556991029814886404 (August 9th, 2022)

The difference? There is *absolutely* no improvement.

The FSD beta software was expected to get better over time, as promised by Elon, but this latest failure shows that drivers on the road, and even pedestrians, are no better off and no safer with these updates.

It is not early days anymore. Even with these updates, it is still an unsafe contraption that has not only been falsely and deceptively advertised by Tesla, it is nowhere near the claimed robo-taxi level of FSD (Level 4-5). With the number of price increases over the years for a product that doesn't work, it looks more like a scam.

At this point, FSD is essentially Fools Self Driving.
petilon | almost 3 years ago
There is a fundamental difference between Tesla's approach and that of Waymo and Cruise. Waymo and Cruise are starting at the top. They use expensive, bulky, unsightly hardware and only allow their self-driving cars to operate in limited geographic areas. This has a much higher probability of success, and once the tech works, they expand its capabilities. Eventually their tech will become mass-market. They don't kill anyone in the process, and they get very little bad press.

Tesla, on the other hand, is trying to make a mass-market version work right off the bat, even if it means debugging on the streets. Tesla will get a lot of bad press, will kill a few people, and the public will become skeptical about the tech. I think Elon Musk's approach is not just wrong, it is going to make things harder for Waymo and Cruise by making consumers distrust self-driving cars in general.
satisfice | almost 3 years ago
It may be detecting them and intentionally hitting them. Why else would the Teslas stop, reverse, and run over them again?
asdajksah2123 | almost 3 years ago
Has anyone heard about this project before? It seems pretty "out of the blue".
beeboop | almost 3 years ago
Here are the things wrong with this video and the group (person) that made it. While I'm not a fan of Elon, I am a fan of actually verifying information and placing criticism where criticism is due.

* This video was made by The Dawn Project, whose goal is literally to make Tesla's self driving *illegal*. Their current goal isn't to make all unsafe self driving illegal, only Tesla's.

* The Dawn Project is founded and operated by Dan O'Dowd, who is campaigning for Senate. His biggest platform issue is banning Tesla's self driving. He's definitely getting the press coverage he wants as a result of this publicly advertised test.

* Dan O'Dowd is the founder and owner of Green Hills Software, which makes self driving software for car manufacturers and has more than a dozen partners with deep connections to the automotive industry. Dan does not disclose this conflict of interest.

* *Green Hills Software has Ford and Toyota as direct customers. Dan does not disclose this conflict of interest.*

* The Dawn Project explicitly outlined this test as "a small child walking across the road in a crosswalk", and it fails on both counts: the "child" isn't walking and the road isn't marked as a crosswalk.

* There is zero coverage of trials where the Tesla did successfully brake. The test circumstances are clearly set up to make it fail. While it is noteworthy that they were able to find the right conditions, not disclosing the work that went into making the test scenario only further fuels the bias of this test.

* It is unclear to me how they managed to get a Tesla to work in full self driving mode while plowing through clearly marked parking spaces in a parking lot. These are clearly not conditions under which anyone would be using FSD. This is further underlined by the fact that the Tesla braked hard from 40 mph to 25 mph as soon as FSD was enabled, because it knew that going that speed in a parking lot, driving over road markings it shouldn't, was a stupid thing to be doing.

* FSD was enabled only seconds before being introduced to the stationary mannequin.

TLDR: There is a *massive* conflict of interest here that isn't disclosed anywhere. There is a massive incentive to use this as advertising for his Senate campaign. The test doesn't actually put in any baseline effort to replicate what they claim to be testing. They put the Tesla in a situation where no one would ever actually be using FSD.
Tiktaalik | almost 3 years ago
So it's about as good as the typical SUV/pickup truck driver. Impressive job, Tesla!
edhelas | almost 3 years ago
So Elon is indeed working on climate change issues (sorry).
maxdo | almost 3 years ago
The guy is trying to promote lidar tech in every post. He has financial interests. Why is this scam here?
maxdo | almost 3 years ago
That guy is an obvious scam. He ordered ads everywhere possible after that "research", trying to push his tech, and that "research" has been buzzing around artificially for several months. From what I can see, Tesla is rather prone to false positives: my FSD Beta detected a 10 inch marble statue of a kid at an intersection as a pedestrian.
ZeroGravitas | almost 3 years ago
This seems very suss.

> In several tests, a professional test driver found that the software – released in June – failed to detect the child-sized figure at an average speed of 25mph.

The car was doing 40mph when put into self drive mode, so the car was slowing itself down even though the mannequin, which looks like a cone to my eyes, wasn't moving.

Are these people legit but counterproductive to their claimed goals, or entirely bogus?

Ah, prominent mentions of LIDAR suggest it's just an advert disguised as a consumer safety campaign.

> We Demand Software that Never Fails and Can't Be Hacked

Or possibly they're just weird?