
New video of Tesla crash demonstrates the problem of semi-automated driving

297 points · by frxx · over 2 years ago

45 comments

maury91 · over 2 years ago

I'm trying to reason about why the Tesla stopped. In the second video on The Intercept's website, I can see a left exit before the point where the Tesla stopped. Something I've noticed with Google Maps (I don't have a Tesla, and maybe Tesla's navigation system is similar) is that it sometimes thinks tunnels can access the road above them: in a tunnel near my home, Google Maps always tells me "turn right" in the middle of the tunnel, without understanding that I need to get out of the tunnel first and take the ramp. With that in mind, the Tesla's navigation system could have been similarly confused and tried to turn left onto a ghost road, only to discover there was no road and decide to stop.

This is just one possible explanation, and I'm definitely not suggesting that this is what happened.

This is Google Maps saying "go right" in the middle of a tunnel: https://goo.gl/maps/G89cyQT2APUQuu6g6 (the two roundabouts are connected by a tunnel)

PS: If you use Street View you will see how it looked 3 years ago, before the tunnel was built.
somethoughts · over 2 years ago

I think what's interesting here is that it's likely the first instance where Tesla FSD has been involved in an accident that affected other drivers. [Edit] From the video, the Tesla is making a lane change and stopping simultaneously, which means this could be a case of Tesla FSD/the driver making an unsafe lane change. [1]

Most of the time FSD just wrecks the Tesla itself or injures its own driver (i.e. running into trees/dividers, or into much heavier freight trucks).

It will be interesting to see whether Tesla steps in with monetary support to prove the legal case that Tesla FSD is not at fault, or whether the Tesla driver (and his insurance) will be left to fend for themselves.

In the short term I could see Tesla not supporting the driver and absolving themselves via fine print/TOS, etc.

But the long-term effect of not legally supporting drivers in Tesla FSD accidents will be that new customers won't trust this $10,000 upsell product that's highly profitable for Tesla.

I could also see third-party (non-Tesla) insurance companies refusing to sell coverage to Tesla FSD drivers.

It could also make Tesla's first-party insurance untrustworthy to customers, and it could become a huge liability for Tesla.

It seems like a great litmus test of whether Tesla has the guts to stand behind its own product.

[1] The first video shows a potentially unsafe lane change: https://theintercept.com/2023/01/10/tesla-crash-footage-autopilot/
pvg · over 2 years ago

Discussed yesterday, with 190 and 60-ish comments, per:

https://news.ycombinator.com/item?id=34327763

https://news.ycombinator.com/item?id=34333883
Animats · over 2 years ago

Is there enough info yet to know whether the lane change was initiated automatically? That's apparently possible. From Tesla's support site:

"Auto Lane Change

To initiate an automated lane change, you must first enable Auto Lane Changes through the Autopilot Controls menu within the Settings tab. Then, when the car is in Autosteer, a driver must engage the turn signal in the direction that they would like to move. In some markets, depending on local regulations, lane change confirmation can be turned off by accessing Controls > Autopilot > Customize Navigate on Autopilot and toggling 'Lane Change Confirmation' off."
sebastianconcpt · over 2 years ago

This blind trust in technology is very, very wrong. We are part of the system, and things should always be designed to keep us as a powerful fallback/system-degradation resource.
herodotus · over 2 years ago

I have a VW Golf with what it calls "adaptive cruise control". What makes it different from my older (simple) cruise control is that it slows down and speeds up again as necessary without my intervention. For example, if I set the speed to 70 mph and it gets close to a car ahead that is only going 65 mph, it slows down and maintains what the software believes is a safe distance. Similarly, if a car in an adjacent lane changes lanes in front of me, it slows down if necessary.

I do use it, but less than I used the old system: I just don't find it relaxing, because I cannot intuitively grasp when I need to override it. With standard cruise control, it was obvious when I needed to take over. So I am more, rather than less, vigilant than I was with the old system.

I don't want to be too hard on the Golf: it has other safety features I really like, such as lane assist and automatic braking. But I am not a fan of the adaptive control, and I think the article helped me understand why: it's a Level 2 problem!
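The behaviour described above (slow down to hold a following gap, otherwise cruise at the set speed) can be sketched in a few lines. This is a generic illustration of the time-gap idea, not VW's actual control law; the function name, the 2-second default gap, and the simple min() rule are all assumptions:

```python
def acc_target_speed(set_speed_mps, gap_m=None, time_gap_s=2.0):
    """Toy adaptive-cruise-control target speed.

    With no car detected ahead (gap_m is None), cruise at the set
    speed. Otherwise, never go faster than the speed at which the
    current gap equals `time_gap_s` seconds of travel.
    """
    if gap_m is None:
        return set_speed_mps
    gap_limited = gap_m / time_gap_s  # speed that preserves the time gap
    return min(set_speed_mps, gap_limited)
```

With the set speed at 70 mph (about 31.3 m/s) and a car 50 m ahead, this caps the target at 25 m/s (about 56 mph), which is the "slow down and keep distance" behaviour the comment describes.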
nailer · over 2 years ago

FTA:

> the system is called Full Self-Driving

Also FTA, quoting Tesla (https://images-stag.jazelc.com/uploads/theautopian-m2en/repo...):

> It does not turn a Tesla into a self-driving car

Is it self-driving or not?
mschuster91 · over 2 years ago

There is one additional factor in play: the drivers following the Tesla *clearly* didn't keep their distance. Normally, at least here in Germany, drivers are supposed to keep enough distance from the car in front that, even if something like a brake defect forces it to a complete stop, or a truck blows a tire, they do not crash into it.
trabant00 · over 2 years ago

> if the Level 2 paradigm was flipped, where the driver was always in control, but the semi-automated driving system was doing the monitoring, and was ready to take over [...] but would be less sexy in that the act of driving wouldn't feel any different

I think this is the most important point of the article, and it is largely ignored here in the comments, which mostly focus on who was to blame for this specific accident.

We know the strengths and weaknesses of both humans and tech at this point in time. Humans are better decision-makers overall, but aren't 100% focused 100% of the time. Tech gets confused a lot, but is never tired or inattentive. So if your goal is safety, you let the humans drive and have the tech take over in emergencies when the human is not reacting. Which is what most car manufacturers do right now. Letting the tech drive and expecting the human to provide a perfect reaction time every time the tech fails plays to the weaknesses of both. This is focusing on cool marketing at the expense of safety.
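The "flipped" paradigm the quote describes (the human always drives, the machine intervenes only as a last resort) reduces to a very small decision rule. A minimal sketch; the function name and the signal values are hypothetical:

```python
def control_output(human_input, hazard_detected, human_reacting):
    """Guardian-style automation: pass the human's input through
    unchanged, and take over only when a hazard is detected and the
    human shows no reaction."""
    if hazard_detected and not human_reacting:
        return "emergency_brake"  # machine acts only as the fallback
    return human_input            # human stays in control
```

The comment's point is that this division plays to the strengths of both sides: the human's judgment in normal driving, and the machine's tireless attention in emergencies.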
zestyping · over 2 years ago

Tesla is primarily at fault for deceptively naming the function "Full Self Driving". It is indefensible mendacity.

I do not understand why the company has not already been sued into oblivion for an obvious lie that has killed people.
roguecoder · over 2 years ago

Video understanding engineer here.

Tunnels and underpasses are the worst. They are a pain in the ass, because shadows mess with all the edge detection and motion models and anything else visual. Humans compensate by thinking "I'm in a tunnel: things are weird." But without a reasoning model that can take context into account, the computer is stuck.

In the video from behind, you can see the shadow ahead of the car on the floor of the tunnel; the car carefully stops just before it would "hit" it. A person would notice that EVERY OTHER car had driven straight through the thing it thought was an obstacle, but that is also context this car isn't going to take into account.
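The failure mode described here, where a shadow boundary produces the same signal as an object edge, is easy to reproduce with a plain gradient-based edge detector. A deliberately tiny 1-D sketch with synthetic brightness values, not anything from Tesla's actual stack:

```python
import numpy as np

# Brightness profile along the road ahead (synthetic, arbitrary units).
road = np.full(100, 200.0)  # evenly lit asphalt
road[60:] = 60.0            # a tunnel shadow starts at index 60

# A 1-D gradient is the simplest edge detector: big jumps look like edges.
edges = np.abs(np.diff(road))

# The strongest "edge" is the shadow line, indistinguishable here from
# the edge of a physical obstacle.
print(int(edges.argmax()))  # 59: right at the shadow boundary
```

Without context ("I'm in a tunnel"), nothing in the gradient tells the system that this edge is paint-and-light rather than a wall.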
jayd16 · over 2 years ago

An honest question: why was the pile-up so bad if driving conditions were ideal? Were the next 6 drivers not keeping a safe distance or paying attention?
beeforpork · over 2 years ago

A good analysis. The stated vigilance problem is actually well known, but it is ignored, supposedly in the hope that humankind is on a progressive path to fully autonomous self-driving and that we need this phase of experimentation to advance the technology.

Completely autonomous self-driving cars (without any steering wheel, so that even incapacitated or clueless people may 'drive', like someone drunk, in labour, or a child) do indeed seem like a good solution. (Except that we need less individual traffic, for environmental reasons.) Unfortunately, the problem is technologically very hard, and the current interim solutions will stay for a while.
aseerdbnarng · over 2 years ago

I'm always curious how different insurers cover the use of hands-free driving. Would anyone still buy the FSD feature if people thought insurers would reject accidental-damage claims? It feels like it's sitting on sketchy ground.
osigurdson · over 2 years ago
I think the insistence on having data for every possible situation for training purposes is indicative of the problem. Humans only require a small amount of training and can extrapolate this to many situations.
twizod · over 2 years ago
Can someone please explain to me why the system cannot tell you why it stopped? What prevents the program from explaining why it performs certain actions?
MadQ · over 2 years ago

Obviously, the Tesla FSD Autopilot made a big mistake, but why didn't the driver react? If my car slows from 40 to 0 mph in 5 seconds and there is no obstacle in front of me, I most assuredly would step on the accelerator.

Not that FSD Autopilot is what it's marketed as, but this is the responsibility of the driver, not the car.
nextstep · over 2 years ago

This story also highlights just how little Elon Musk actually supports free speech.

Since fighting for the release of this video and publishing the story, Ken Klippenstein has been censored on Twitter through shadow bans and the inability to find his profile through search.

https://twitter.com/stevanzetti/status/1613295292283236358
tinus_hn · over 2 years ago
In other news, no car has ever seized up and stopped at an inconvenient moment before Tesla made this horrible mistake, and keeping a safe distance would not be necessary if it weren’t for those dang automated drivers. Now get off my lawn!
cm2187 · over 2 years ago

From what I can tell, the car behind the Tesla didn't crash into the Tesla but stopped before it. And so did the car behind that one. It is the cars after that that crashed into each other.

While automated cars doing random things is certainly problematic, the cause of the crash here clearly isn't the Tesla; it is the other cars not respecting minimum safety distances and not being able to stop when there is a traffic jam ahead.
KETpXDDzR · over 2 years ago

Is it legal to print out a "stop" sign, attach it to the back of my car, and watch all the Teslas doing full stops on highways?
throwaway892238 · over 2 years ago
I&#x27;m still waiting for autonomous-only lanes. For extremely long and boring highways (usually full of trucks), this just seems like the logical conclusion of all self-driving tech to me. Dedicate one lane to self-driving, have the vehicles all on autonomous mode, and have them keep enough distance so their braking performance prevents them from crashing in an unexpected stop. You should be able to literally just sleep until your exit comes up in 250 miles. For any unexpected problems, have the vehicle turn off onto the shoulder of the highway and stop.
kiratp · over 2 years ago

This looks like the vehicle tried and failed to get the driver to engage, and pulled over as best it could. There is no shoulder there, so...

Let's wait for more info on what the driver was doing and whether they were incapacitated.

The FSD beta is pretty aggressive about making sure the driver is paying attention, via the steering sensors, the in-cabin camera, and the touch screen.
sneak · over 2 years ago

Seems to me that the cars behind the Tesla are the ones at fault?
lucaslee · over 2 years ago

Doesn't look like a phantom brake, which is usually very sudden and involves no lane change. Most likely the driver had their hands off the wheel and was warned multiple times before the car pulled itself over.
dvh · over 2 years ago
There is no video!
bambax · over 2 years ago

> You can't program your way out of the problem with Level 2; in fact, the better the Level 2 system seems to be, the worse the problem gets. That problem is that human beings are simply no good at monitoring systems that do most of the work of a task and remaining ready to take over that task with minimal to no warning.

This is incredibly true and obvious.

It's likely the people monitoring their car aren't doing anything at all: sleeping, reading a book, whatever.

But if anything, actual monitoring is *harder* than doing: it means actively watching what the machine does, understanding it, thinking about what one would have done, doing a diff and analyzing it, all in real time, all the time. It's exhausting.

That a situation like this is even legal is amazing.
leaving · over 2 years ago

This mirrors my experience of driving in America. People do not adjust their speed or following distance to the conditions. They drive as if they have some sort of right to a road free of obstructions and to drive at the speed limit or above, no matter the circumstances.

It's as if they expect the speed limit sign to adapt to the circumstances. The attitude is, "Well, if I need to slow down, why does the sign say I can go 70 mph?"

It could be freezing rain with black ice, or fog with zero visibility, and the overwhelming majority of American drivers would not change their driving one bit. I have seen this in every corner of the country, and it is noticeably different from other countries.

There will always be unexpected obstructions in the roadbed of major highways. Stuff falls off. People lose control, roll, and end up obstructing the lanes.

So while the Tesla and its self-driving mode were the proximate cause of the obstruction that led to the collision in this particular instance, they were unrelated to the actual cause of this entire category of accident.

This attitude affects many other aspects of American life. Notably, gun violence. Exactly the same feeling of entitlement to go full speed ahead and damn the consequences dictates the occasional outcome of both highway driving and eating your lunch quietly in the school cafeteria.

There is little difference between an American in a pickup truck barreling down out of the Alleghenies at 80 mph in fog and freezing rain and an American with a not-meaningfully-regulated gun barreling into a classroom and opening fire.

Both are the direct result of a sense of entitlement, and both regularly lead to mayhem and death.

You can't blame the Tesla for this.
choeger · over 2 years ago

Everyone claiming that the autopilot is at fault for stopping suddenly gets something very important very wrong: stopping a car must be possible at all times. Any automation in a car should always be able to stop the car as the safe default. The same holds for human drivers, of course: consider the possibility that the vehicle in front of the Tesla had just lost some of its load. *Of course* the Tesla would then have to stop suddenly. Considering this behaviour a safety problem is a lazy excuse. At worst, it is a nuisance.

As for the safety distance: this is a valid point. If the Tesla did not keep a safe distance to the following car when changing lanes, there is a problem with the automation. It should always be as defensive as possible. I don't see that in the police report, but the video shows the distance to be rather short, maybe two car lengths. Call it 10 m. The second car doesn't really seem to brake very hard (no emergency braking assistant?), and that crash is definitely on the Tesla, since it cut the distance, unless the speed limit in that tunnel is around 20 km/h or so. But now it gets interesting: the third car *does* stop successfully. The big pileup happens afterwards.

In conclusion: the big pileup happens because the third car suddenly stops, and rightfully so, but the following cars *don't* keep their distance to that third car. The Tesla getting rear-ended is just a lazy excuse for drivers who simply drive recklessly.
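The arithmetic behind this comment's "10 m, unless the limit is around 20 km/h" estimate is ordinary stopping-distance math: reaction travel plus braking travel v^2 / (2a). The 1 s reaction time and 8 m/s^2 deceleration below are generic textbook assumptions, not figures from the article:

```python
def stopping_distance_m(speed_mps, reaction_s=1.0, decel_mps2=8.0):
    """Distance covered before a full stop: distance travelled during
    the reaction time plus the braking distance v^2 / (2 * a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)
```

At 70 km/h (about 19.4 m/s) this gives roughly 43 m, far more than a 10 m gap allows; at 20 km/h (about 5.6 m/s) it gives about 7.5 m, consistent with the point that only a very low speed would make a two-car-length gap defensible.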
pessimist · over 2 years ago

A beta that we are all helping to test. Simply unbelievable that the NHTSA allows this to happen.

The Tesla safety report, which the company boastfully released, wouldn't have counted this incident, since the Tesla itself wasn't harmed.
pensatoio · over 2 years ago

If Autopilot starts braking even slightly when it shouldn't, I immediately take control. For this to happen, the driver clearly was not prepared to take control in any way. It's like driving on cruise control with your legs crossed.

The automated cruise control and lane-keeping on my previous car, an Audi, would occasionally do crazy things too. The hyper-focus on this being Tesla's fault continues to baffle.
Sophistifunk · over 2 years ago

This is a bug in the meat, and it is *not* going away. The problems are well understood, thanks to decades of examples in semi-automated trains, boats, airplanes, and industrial monitoring systems.

Until "self driving" is good enough that you're legally allowed to ride it drunk, it should not be allowed to do anything more complicated than radar cruise and lane-keeping.
DesiLurker · over 2 years ago

Just ignore the name Tesla and think for a minute about how you would feel if this were any Japanese or Korean automaker's car. Would the NHTSA still be as forgiving? We should feel exactly the same in this case.
dusted · over 2 years ago

Now, as much as I want to take the piss out of Tesla, I'm more interested in this American tradition of simply ramming into anything in front of you that's not moving at your current speed or faster.

Like, how do people reason about this? It seems uniquely American to me. These videos of people simply driving into stuff that's in plain sight, right in their lane? What do they do if there's a fallen tree, a tire, a shopping cart? Simply pile into it instead of braking?

I'm wondering because, in my country, when you drive a car, you're supposed to pay attention to the road and always be ready to brake if something gets in your way. But in the USA, the thinking seems to be "I'm not driving too fast, I have the freedom of way here"?

As for the article, all the points are entirely valid! Systems need to either be fully autonomous or require some level of constant engagement. This is analogous to earlier writing about the deskilling of labor and automation. It's an impossible position to put someone in: "this works in every easy case, almost never fails, and when it fails, it might get arbitrarily complex and require exactly the skills that the operator very rarely has any opportunity to practice".
bdcravens · over 2 years ago

Perhaps "Semi-Automated Driving" is a more accurate term than "Full Self Driving", but I'm guessing FSD is a better acronym to use than SAD.
pengaru · over 2 years ago

You know what else is "semi-automated driving"?

A six-year-old behind the wheel, sitting in your lap.

That's also known, plainly, as "illegal".

I have no fucking idea why this is being tolerated on our public roads, nor why the SAE has even classified these child-equivalent levels of autonomy as if they're *acceptable*.
0xAFFFF · over 2 years ago

> You can't program your way out of the problem with Level 2; in fact, the better the Level 2 system seems to be, the worse the problem gets. That problem is that human beings are simply no good at monitoring systems that do most of the work of a task and remaining ready to take over that task with minimal to no warning.

Well, you could have extremely visible/audible signals for the driver to take over when the driving system is failing (interior lighting turning red and blaring alarms), but I guess that wouldn't be very popular, especially with half-assed self-driving systems.
malkia · over 2 years ago

It seems the driver is a litigation lawyer (per the full police report, linked in the article), with a notable portfolio (IANAL, so I don't know, but that's what it looks like). Wonder what's gonna happen...
jmac01 · over 2 years ago

Tesla: we have an experimental self-driving system that we need to test.

Person: lol, how are you going to pay people enough to test something that might endanger them?

Tesla: no, they will pay us for the privilege.
elcomet · over 2 years ago

Articles like this are really annoying; they're just based on feelings and don't really prove anything. I'm not saying this accident isn't terrible or shouldn't be investigated, of course. And of course Tesla should do better and fix these issues.

> then I'll see a crash like this, [...] and I realize that, no, people still don't get it.

That's just one example. You can't draw conclusions from a single example; you need lots of data. Your *feeling* that this is less safe than regular driving is not enough to justify the conclusion that L2 is less safe than regular driving. My feeling is the opposite (I think it reduces the number of accidents), but I don't know whether that is true. Accidents that were avoided by L2 don't make the news; there's a huge selection bias here.

I think this kind of article might be very dangerous, because it makes people more afraid, and if L2 is actually safer on average, then reducing its usage will just increase the number of accidents.
kevin_thibedeau · over 2 years ago

Hopefully Tesla will be paying out 7-digit settlements for their grossly negligent behavior in exposing the public to this broken technology.
giantg2 · over 2 years ago

I'm kind of surprised that the feds allow it, since the vigilance issue is pretty much a given.
leot · over 2 years ago

Cars fail and need to pull over regularly. It's hard to understand why this particular incident is all that notable.
epivosism · over 2 years ago

The question that is never answered in this kind of story is:

1. How many fatalities per mile does Tesla FSD have, compared to its best alternative if we ban it?

2. Focusing on specific failure cases is not relevant, since the political decision to allow it is all or nothing.

3. It would be perfectly logical to accept road-testing FSD, even if it's significantly worse in some areas, *as long as* this is not exploitable AND the net gain from allowing it is still positive overall.

I'd like to hear reasonable disagreements like "Tesla isn't actually net positive all-in" or "Here's why we should judge policy by something more than just net lives lost/saved".

Edit: the discussion fragments into two versions depending on the state of the facts, which isn't clear:

A) Tesla FSD is *not actually* safer per mile. If this is the case, I and most people would probably agree not to allow FSD on public roads. That's not really what this is about.

B) Tesla FSD *actually is, on net,* safer per mile, *but we should not allow it anyway*.

I'm happy to hear other options too, but when replying I'd like to hear which one you hold.
capableweb · over 2 years ago

Besides the fact that you shouldn't just flat-out stop on the highway, how were 8 drivers following closely enough that they couldn't stop in time? Is there no limit on how close people drive to the car in front of them in the US?

Defensive driving was one of the first things I learned, before even getting my driver's license. How come others seem to flat-out ignore things like that, even for their own safety?