> “Over one hundred people lose their lives every day on American roadways, and countless others are badly injured,” Cruise said in a statement sent via email.<p>I’m so tired of this angle being accepted at face value. Current driverless cars are not even capable of safely navigating situations that below-average human drivers rarely struggle with, like avoiding driving into clearly marked wet concrete, or not crashing into an emergency vehicle that has its sirens and lights blazing.<p>Nobody should trust their evidence-free assurances that they will also be better at the less routine, more complex situations that human drivers (who almost universally have better sensors and better cognition) currently fail at.
> The latest Cruise incident occurred Thursday night when a Cruise robotaxi and an emergency vehicle crashed and left a passenger injured. Cruise said in a social media post that one of its self-driving Chevy Bolt EVs entered an intersection on a green traffic light at Polk and Turk streets when it was struck by an emergency vehicle that appeared to be en route to an emergency scene.<p>Noticing and responding to emergency sounds/signals in the road, distant but approaching, is yet another subtly complex aspect of human driving I hadn’t thought of but which seems obvious in retrospect. Seems solvable in theory but also involves a lot of uncertainty and contingencies. Then I suppose the cone-blocking hackers will start playing siren recordings to confuse the robotaxis…
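To make "solvable in theory" a bit more concrete, here is a minimal sketch of what acoustic siren detection might look like, assuming a mono microphone feed and using numpy/scipy. The band edges and thresholds are illustrative guesses, not anything Cruise actually uses:<p><pre><code>import numpy as np
from scipy.signal import spectrogram

def siren_likely(samples: np.ndarray, sample_rate: int) -> bool:
    # Rough sketch only: band edges and thresholds are illustrative, not tuned.
    # Short-time spectrogram of the audio window.
    freqs, times, power = spectrogram(samples, fs=sample_rate, nperseg=2048)

    # US sirens mostly wail between roughly 500 Hz and 1800 Hz.
    band = (freqs >= 500) & (freqs <= 1800)
    band_power = power[band, :]

    # Dominant in-band frequency in each time slice.
    dominant = freqs[band][np.argmax(band_power, axis=0)]

    # A wail sweeps up and down; engine hum sits still, broadband noise jumps around.
    sweeps = dominant.max() - dominant.min() > 300
    loud_in_band = band_power.sum() / (power.sum() + 1e-12) > 0.4
    return sweeps and loud_in_band
</code></pre>
The classifier isn't the hard part; the hard parts are the uncertainty and contingencies mentioned above: figuring out where the siren is, whether it's approaching, what to do about it, and not being fooled by a recording played from the sidewalk.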
From another article:<p>> According to a CPUC press release, Cruise could charge for service between 10 p.m. and 6 a.m. without a safety driver, while Waymo could charge for passenger service at any time but only with a safety driver in the car. Waymo was also able to operate its cars completely autonomously but was unable to charge passengers for those services.<p>Once again, why the <i>fuck</i> was and is GM/Cruise getting special privileges over and above Waymo when Waymo doesn't have this level of critical failures on the road?<p>There's some point where we need to start leaning in, actually fining AV companies for this horseshit, and taking real action. Limiting the fleet is one thing, but actually stripping them of special privileges would be a better idea. It sounds like Cruise needs to go back to safety drivers in every vehicle.<p>On top of this, GM's allowed to market and sell Super Cruise (their Autopilot-esque clone) on public highways. Why isn't the NHTSA looking a lot more closely at GM and scrutinizing their products?<p>@kvogt has some explaining to do.
Everywhere I've lived, it's the responsibility of the emergency vehicle to avoid a wreck when they are crossing an intersection against the signal, whether they're lit up and blasting the siren or not. If the Cruise vehicle really did have a green and the emergency vehicle struck them, then this doesn't actually seem like their fault.
> The latest Cruise incident occurred Thursday night when a Cruise robotaxi and an emergency vehicle crashed and left a passenger injured. Cruise said in a social media post that one of its self-driving Chevy Bolt EVs entered an intersection on a green traffic light at Polk and Turk streets when it was struck by an emergency vehicle that appeared to be en route to an emergency scene.<p>
It sounds serious. But more importantly, it’s a self-driving car and a public emergency vehicle. I expect both to have high-def video of the accident, and it should be released soon for the public benefit. The US is one of the most car-dependent countries, and this is one of those rare cases where it’s about public safety and everyone is an expert.
Lots of top level comments here are making sweeping generalizations about self driving cars from this event, but the truth is that we won't know which vehicle is at fault here until some video comes out of the incident.
The amount of mental gymnastics in the comments here to justify the Cruise vehicle's actions baffles me. Emergency vehicles have right of way, always, in the USA. This is non-negotiable, <i>especially</i> if sirens are blaring. It does not matter if other vehicles have a green light.
This is like cracking down on all e-cigarettes because some people got sick from some crappy e-liquid, while regular cigarettes kill nearly half a million people a year without anyone batting an eye. Classic misdirection while ignoring the real problems.
I am a little confused at how the reduction request is coming from the CA DMV…<p>> The California Department of Motor Vehicles, the agency that regulates the testing and deployment of autonomous vehicles in the state, requested the reduction in operations.<p>… while it was the CPUC that allowed an expansion of their fleet:<p>> The CPUC, the agency that regulates ride-hailing operations including those involving robotaxis, approved Cruise and Waymo on August 10 for final permits that allow the companies to operate 24 hours a day, seven days a week, expand their fleets and charge for rides throughout the city.<p>I guess Cruise is still officially allowed to run the expanded fleet, but the DMV has the ability to say "Stop everything" and so their request has the weight of an order?
“The Cruise AV did identify the risk of a collision and initiated a braking maneuver, reducing its speed, but was ultimately unable to avoid the collision.”<p>Yea, but why? The article doesn’t actually explain the critical issue.
Good idea. Let's extend that to all gasoline vehicles. They can only be driven (or robo-ridden) every other day. Cuts congestion and gas consumption. Makes parking easier.<p>Every driver can still get to all of the things they <i>have</i> to. They just have to learn to schedule things better. Car pool, transit, whatever. Most cars aren't driven many hours a day anyways.
The headline is weird. The New York Times headline conveys more information: "Cruise Agrees to Reduce Driverless Car Fleet in San Francisco After Crash" <a href="https://www.nytimes.com/2023/08/18/technology/cruise-crash-driverless-car-san-francisco.html" rel="nofollow noreferrer">https://www.nytimes.com/2023/08/18/technology/cruise-crash-d...</a>
Absurd: Are we going to call for a 50% reduction in human operators next time there is a crash caused by something with a pulse?<p>We can't have progress if we don't get over this inane sentimentality(?) whereby accidents, deaths, and dismemberments are just 'how things are' so long as they are caused by people, but 'Safety of the traveling public is the California DMV’s top priority'[0] when a robot does it.<p>[0] Editorial comment: it sure as shit isn't the rest of the time.
Pedestrian safety is to self-driving cars what child porn is to encryption. It is the excuse used by government to conceal its true motives. How many millions of dollars of traffic fine revenue will government be deprived of when self-driving cars follow every law governing them?<p>This isn't about safety. It's about money.
I've said it before when a self-driving Uber killed a woman, and I'll say it again:<p>I don't understand why authorities allow self-driving cars to beta test out in public. I thought Waymo was the best, with something like millions of hours on the road, but when they started accepting passengers, it failed to proceed past a traffic cone:<p><a href="https://youtu.be/zdKCQKBvH-A?t=742" rel="nofollow noreferrer">https://youtu.be/zdKCQKBvH-A?t=742</a><p>I wouldn't trust any of the others like Tesla and Uber that think self-driving is as easy as putting it out on the road. And Tesla with their cost-saving no-LIDAR nonsense.<p>These companies should be fined hefty amounts and barred from testing out in public.
Unbelievable these things are on the street and more have been deployed!<p>A bunch of Cruise cars lose internet connection and become disabled, blocking a street. All because the 5G network is being slammed by a music festival 6 miles away. Laughable! Have Verizon, AT&T, etc. solved the issue of too many connections at a Taylor Swift concert, for example?<p>Then a passenger is hurt because the AI can't hear or detect emergency vehicles rushing to save lives, and in the end the EMT vehicle is blocked from saving lives and hurts another life. All in the name of rich tech moguls trying to win the AI robot car market. Progress will be a deadly killer... Uber's car already killed a pedestrian.<p>I have a bit more faith in Waymo, as Google has been at it since 2005 or 2007. Cruise seems like another Uber... do whatever it takes to win the AI robot car race, knowing all along they are not ready for prime time and are learning as they go. What's next, a Cruise car gets hacked and then is used as a weapon?