CA DMV statement [1] has some more information:<p><i>Today’s suspensions are based on the following:</i><p><i>13 CCR §228.20 (b) (6) – Based upon the performance of the vehicles, the Department determines the manufacturer’s vehicles are not safe for the public’s operation.</i><p><i>13 CCR §228.20 (b) (3) – The manufacturer has misrepresented any information related to safety of the autonomous technology of its vehicles.</i><p><i>13 CCR §227.42 (b) (5) – Any act or omission of the manufacturer or one of its agents, employees, contractors, or designees which the department finds makes the conduct of autonomous vehicle testing on public roads by the manufacturer an unreasonable risk to the public.</i><p><i>13 CCR §227.42 (c) – The department shall immediately suspend or revoke the Manufacturer’s Testing Permit or a Manufacturer’s Testing Permit – Driverless Vehicles if a manufacturer is engaging in a practice in such a manner that immediate suspension is required for the safety of persons on a public road.</i><p>Misrepresenting safety seems like a big deal.<p>[1] <a href="https://www.dmv.ca.gov/portal/news-and-media/dmv-statement-on-cruise-llc-suspension/" rel="nofollow noreferrer">https://www.dmv.ca.gov/portal/news-and-media/dmv-statement-o...</a><p>Edit: This Vice article [2] says Cruise tried to withhold pedestrian injury footage from the DMV.<p>> <i>In the Order of Suspension, the California DMV said that the Cruise vehicle initially came to a hard stop and ran over the pedestrian. After coming to a complete stop, it then attempted to do a “pullover maneuver while the pedestrian was underneath the vehicle.” The car crawled along at 7 mph for about 20 feet, then came to a final stop. The pedestrian remained under the car the whole time.</i><p>> <i>The day after the incident, DMV representatives met with Cruise to “discuss the incident.” During that meeting, Cruise only showed footage up to the first complete stop, according to the Order of Suspension. 
No one at Cruise told the officers or showed any footage of the subsequent pullover maneuver and dragging. The DMV only learned of that from “another government agency.” When DMV asked for footage of that part of the incident, Cruise provided it.</i><p>[2] <a href="https://www.vice.com/en/article/4a3ba3/california-dmv-suspends-cruises-self-driving-car-license-after-pedestrian-injury" rel="nofollow noreferrer">https://www.vice.com/en/article/4a3ba3/california-dmv-suspen...</a>
From my perspective as a person who lives in San Francisco and also drives a lot (10-20k miles per year, plus many short drives within the city): Cruise cars do not perform acceptably.<p>They manage to avoid collisions by driving extremely conservatively, but the way they traverse, say, a left turn against traffic is absurd. They slow everyone down, including emergency vehicles and public transit, by performing far below the level of most human drivers.<p>They don't work in the rain, they can't handle construction, and they block garages and driveways.<p>Waymo vehicles are objectively far better. They drive like humans do. There are still some issues with weather and construction, but they work well alongside buses, trucks, and private cars without slowing anyone down.
The incident last month where that woman got dragged and pinned by the Cruise vehicle was pretty disconcerting. The initial collision wasn't the fault of the Cruise vehicle (another vehicle hit her, pushing her into the Cruise vehicle, and fled the scene), but the Cruise vehicle then proceeded to try to "pull over," dragging the woman and rolling over her leg. A weird situation, but definitely one made much, much worse by not having a person in the loop once the incident occurred.
But not Waymo. Which confirms all my personal biases.<p>More seriously, the statement cites four specific clauses that caused the permit to be suspended. The second one seems <i>really</i> interesting as it implies that there was a lie of omission where the others are more or less what I’d expect regarding safety risks:<p>> Today’s suspensions are based on the following:<p>> 13 CCR §228.20 (b) (6) – Based upon the performance of the vehicles, the Department determines the manufacturer’s vehicles are not safe for the public’s operation.<p>> 13 CCR §228.20 (b) (3) – The manufacturer has misrepresented any information related to safety of the autonomous technology of its vehicles.<p>> 13 CCR §227.42 (b) (5) – Any act or omission of the manufacturer or one of its agents, employees, contractors, or designees which the department finds makes the conduct of autonomous vehicle testing on public roads by the manufacturer an unreasonable risk to the public.<p>> 13 CCR §227.42 (c) – The department shall immediately suspend or revoke the Manufacturer’s Testing Permit or a Manufacturer’s Testing Permit – Driverless Vehicles if a manufacturer is engaging in a practice in such a manner that immediate suspension is required for the safety of persons on a public road.
I had an epiphany about self driving recently late at night on the streets of San Francisco. It was a somewhat deserted part of town, and the roads were empty. There was a lone Cruise car chugging along and constantly stopping at red lights. The signals were so badly optimized that the car would be the only one waiting at a completely empty intersection for minutes at a time, only to be stopped again at the next one, and the one after that. A single mile of empty road must have taken this car 15+ minutes to traverse. All the top engineering, sensors and ML algorithms in the world and the car was still at the mercy of a basic city planning failure.<p>We have been conditioned to think of autonomous driving as the answer to the traffic nightmares in every city, but tech will never ever be able to solve a social problem (just like how building a wider road will never ease congestion). Had even a fraction of the hundreds of billions of dollars that have been poured into self driving so far been spent towards, say, improving sidewalk quality, making traffic lights smarter, putting sensors along roads, coming up with a standard communication protocol for these sensors, building out public transit, improving urban planning etc. over the last 15 years, we would all be living better lives today. But a VC would rather set that money on fire to have a small chance at a 100x return.
This is just anecdote, but having taken trips in both Waymo and Cruise in SF, I feel very comfortable in Waymo: it drives carefully but confidently. It alerts me when cyclists are near when I'm getting out. It navigates coned-off work areas. Cruise required a lot of (remote) human intervention during my trips.
Vice has the details. Cruise failed to disclose its video evidence of its vehicle nearly killing a pedestrian.<p><a href="https://www.vice.com/en/article/4a3ba3/california-dmv-suspends-cruises-self-driving-car-license-after-pedestrian-injury" rel="nofollow noreferrer">https://www.vice.com/en/article/4a3ba3/california-dmv-suspen...</a><p>One wonders WTF they were thinking. Were they so sloppy they just forgot to show the video? Were they hoping the DMV would never see it? Either way, it's a terribly stupid and short-sighted mistake. A major problem for Cruise is convincing governments their technology is safe and responsible. Hiding evidence is not going to help that case.
As a pedestrian I'm not a fan of being an unwitting part of a private company's beta testing. Before this devolves into whatboutism vs the safety of human drivers, my opposition is to adding yet another <i>unknown</i> safety layer to walking down the street.<p>Usually when private companies are developing products that could kill, drugs for example, it is on the company to exhaustively prove the safety and effectiveness <i>before launching it to the public</i>. It seems odd that self driving cars have been skipping this step.
It's wild that Cruise intentionally decided to hide from regulators the fact that the car dragged a person 20 feet. And now that the news is out, their PR team says they showed the video to DMV folks, while the DMV denies that. Well, someone is lying, and I bet it's Cruise.<p>Here is Cruise's initial NHTSA filing: <a href="https://static.nhtsa.gov/odi/inv/2023/INOT-PE23018-11585P2.pdf" rel="nofollow noreferrer">https://static.nhtsa.gov/odi/inv/2023/INOT-PE23018-11585P2.p...</a> They mention every detail from start to end but don't mention the fact that they dragged a person under the car.<p>Let me be very clear: if the Cruise car had just collided with a person due to the mistake of another driver, that would be forgivable. But this is something way worse. The Cruise car stopped, and then decided to drive 20 feet dragging the pedestrian underneath, because someone on the Cruise team designed a rule that says "if there is a collision, move away from the spot after stopping to avoid blocking traffic." No human driver would ever do that.
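To make the failure mode concrete: this is a toy sketch of the kind of post-collision rule being described, not Cruise's actual code (the state fields and action names are all hypothetical). The point is that an unconditional "clear the roadway after stopping" rule moves the car even when someone may be trapped beneath it, unless that case is explicitly checked first.

```python
from dataclasses import dataclass

@dataclass
class CollisionState:
    vehicle_stopped: bool
    blocking_traffic: bool
    pedestrian_under_vehicle: bool  # the check the incident suggests was missing

def post_collision_action(state: CollisionState) -> str:
    """Hypothetical post-collision decision logic."""
    # First priority: come to a stop.
    if not state.vehicle_stopped:
        return "brake_to_stop"
    # A trapped pedestrian must override any desire to clear the lane.
    # Without this branch, the "avoid blocking traffic" rule below fires
    # and the car drags the person -- the failure described above.
    if state.pedestrian_under_vehicle:
        return "remain_stopped_and_alert_operator"
    if state.blocking_traffic:
        return "pullover_maneuver"
    return "remain_stopped"
```

With the guard in place, a pedestrian under the vehicle keeps the car stopped; remove it, and the same inputs produce the pullover maneuver.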
TxDMV needs to follow this. Fuck these companies.<p>They are testing their garbage in dense urban areas. Too many times their vehicles just stop in the middle of the road.<p>If it rains, vehicle stops.<p>If the light is out, vehicle panics and stops.<p>Somebody puts a cone on the hood, vehicle panics.<p>If a single variable not accounted for in their algorithms shifts beyond expected range, it fucking panics.<p>So much money dumped/wasted into “autonomous personal vehicles”. How about we invest in making our cities more pedestrian, bike, and public transportation friendly? Let’s move people more efficiently rather than further digging ourselves into this hole.
According to Cruise's X (Twitter) account [0], the suspension seems to be related to an incident in which a hit-and-run human driver propelled a pedestrian into the path of a Cruise AV, which couldn't avoid hitting the person. When that incident was first reported [1], it seemed like a situation where a human driver in the Cruise vehicle's place wouldn't have been found to be at any fault. However, given this suspension, I wonder if there's more to it than that.<p>[0] <a href="https://twitter.com/Cruise/status/1716877217995894934" rel="nofollow noreferrer">https://twitter.com/Cruise/status/1716877217995894934</a><p>[1] <a href="https://www.sfchronicle.com/bayarea/article/woman-run-autonomous-vehicle-san-francisco-18403044.php" rel="nofollow noreferrer">https://www.sfchronicle.com/bayarea/article/woman-run-autono...</a>
if people weren't VC brained they would realize that the way to prevent traffic deaths isn't to build fancy self driving cars, it's to invest in safer, more pedestrian friendly infrastructure and use the boring technology that already exists to get people where they want to go. the future is on buses and trains and bikes and pedestrian friendly 15 minute communities
I'm disappointed to hear this news. So many people are killed by cars every year. We just take that for granted. Self-driving cars have the potential to drastically improve this and save many lives. I worry that these regulators are weighing the costs heavily while ignoring the benefits of innovation.<p>That said, I don't have access to the data and I don't really know if Cruise has been acting inappropriately. I'm glad that Waymo seems to be generating far fewer negative press reports, which seems to be a good proxy for whether regulators are going to clamp down. So I hope Waymo continues to operate successfully (and expands their service to me in particular) and I hope that Cruise improves their system and gets back online.
Link to Cruise review of incident:<p><a href="https://getcruise.com/news/blog/2023/a-detailed-review-of-the-recent-sf-hit-and-run-incident/" rel="nofollow noreferrer">https://getcruise.com/news/blog/2023/a-detailed-review-of-th...</a>
> then attempted to pull over to avoid causing further road safety issues, pulling the individual forward approximately 20 feet.<p>That’s a pretty enormous screw up and not one I’d heard about when this was initially reported. Interesting the statement from the DMV implies they were lying…
Just this morning on my bike ride to work, a Cruise car was sitting in the middle of a road after turning right at an intersection. It was almost blocking the road entirely. The cherry on top was the Waymo that was stuck on the other side of the intersection because of it. People were not happy and had to snake their way around the Waymo.<p>Having ridden in Cruise cars a couple of times, I am glad I was able to ride one while I had the chance. The novelty factor was very high, and while it was hard to hail and very slow, it was a great thing to show visitors to the city. Hopefully these safety issues can be resolved. As a biker, I feel much safer around a Cruise than around your average Prius.
This has got to be a huge relief for Waymo. Cruise's atrocious safety was bad press for Waymo every single time it hit the news, and would be a significant contributor to local sentiment against all AVs. It's like OpenAI; when you're leading in a product category and your name is synonymous with the tech, other people's failures with the technology make you look bad.
I feel safer as a cyclist and pedestrian in SF with Cruise and Waymo vehicles driving around than I do any other vehicle.<p>At least the autonomous vehicles actually follow the rules: they are patient and always watching the bike lane and crosswalks. I've never had one try to cut me off, make an illegal turn, or aggressively turn into me while in the crosswalk. These things are daily occurrences for me with human drivers due to the sheer lack of traffic enforcement.
Cruise started as a fake-it-til-you-make-it operation. See if you can find their first demo video. That mindset apparently continues.<p>Waymo has been very cautious. This turned out to be much harder than expected. Waymo's stats get better each year, and it looks like this is going to work.
I took one ride in a Cruise car in San Francisco. A person on a moped in front of us accidentally dropped their phone on the road. The person stopped their moped in the middle of the road, but the Cruise car insisted on continuing and started jerking around.<p>Later, the car in front of us was going down a hill, and my Cruise car kept accelerating, then braked very hard once its sensors were able to see that something was in front of it.<p>It was an okay experience, but I'll be honest: I don't think I'd be willing to ride in one again, or put my family in one, anytime soon. And honestly, I can't believe that moped driver didn't get hit, or that I didn't get whiplash.
I had a Cruise driverless car here in Austin, TX blow through a crosswalk and get really close to me. I had a broken leg at the time and was using a walker. I almost hit the Cruise car with my walker.<p>Before this, I was really impressed by Cruise and actually bought shares in General Motors because I was so blown away. But yeah, those cars aren't perfect. Then again, maybe a human was controlling the car when it blew through that crosswalk? I wouldn't be surprised if a person did something like that, because people drive horribly all the time, but I would expect better from a computer.<p>Maybe that's part of the problem? We have bad drivers who are ultimately writing the software that controls the cars.<p>Anyway, those are my thoughts. If someone from Cruise happens to read this, try to add some code to prevent these cars from blowing through crosswalks with people in them. (I'm not sure what the law says about this, but my interpretation is that if the crosswalk sign is lit up and there's a pedestrian in the crosswalk, you should stop. Maybe it's okay to go if they haven't made it to your side of the street yet, but you should never get so close to a pedestrian that they can strike the vehicle out of anger with an object.)
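The rule the commenter spells out in parentheses is simple enough to write down. This is a toy sketch of that interpretation only — a real planner works with continuous perception and prediction, and every name here is hypothetical:

```python
def should_stop_for_crosswalk(signal_lit: bool,
                              pedestrian_in_crosswalk: bool,
                              pedestrian_on_our_side: bool) -> bool:
    """The commenter's rule: if the crossing signal is lit and a
    pedestrian is in the crosswalk, stop. A pedestrian on our side
    of the street means stop regardless of the signal."""
    if not pedestrian_in_crosswalk:
        return False
    # Never pass close to a pedestrian on our half of the street.
    if pedestrian_on_our_side:
        return True
    # Pedestrian still on the far half: stop anyway while the
    # crossing signal is lit (the conservative reading of the rule).
    return signal_lit
```

The "maybe it's okay to go" case maps to the one branch that returns `False` with a pedestrian present: far side, signal not lit.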
Well, the problem is that our AI systems don't have a clue.<p>At all.<p>They are classification systems. They have no understanding whatsoever of anything around them. Which means it is likely that the AV in this accident didn't understand that it hit a person, what a person is, how a person could be stuck under the car, how a person could be dragged by a car, how a person could be hurt by a car, how a person is more fragile than a car, why it might be better not to move in any of the above contexts, how a human driver would more likely get out of the car and make sure it was safe to move (and render first aid), etc.<p>We are calling this crap "Artificial Intelligence". It isn't. Not at all. It's a parlor trick devoid of anything that resembles understanding. A dog understands the world far better than the most advanced self-driving car.<p>This lack of understanding carries layers of significance. For example, a human driver understands the consequences of his or her actions. Yes, of course, there are a-holes in every crowd... yet the vast majority of people are good. If that were not the case, society would not function as it does. Lacking the concept and understanding of consequences, just assigning scores to things and path-planning accordingly, removes important decision-making layers and produces suboptimal AV decisions.<p>You can add responsibility, fear, financial consequences, emotion, care, consideration, and a whole host of human-condition realities to the list of important factors completely lacking from AVs today.<p>Driving is fine when reality is reduced to easy-to-classify, monotonically varying experiences. Not so easy when reality becomes, well, real.
I've said this previously when a self-driving Uber killed a woman, and I'll say it again:<p>I don't understand why authorities allow self-driving cars to beta test out in public. I thought Waymo was the best of the lot, with something like millions of hours on the road and no incidents, but when they started accepting passengers, a car failed to proceed past a traffic cone on a sunny day:<p><a href="https://youtu.be/zdKCQKBvH-A?t=742" rel="nofollow noreferrer">https://youtu.be/zdKCQKBvH-A?t=742</a><p>The government should run a comprehensive test of a self-driving car's capabilities. In their current form, I wouldn't trust any of the others, like Tesla and Uber, that think self-driving is as easy as putting it out on the road. And that too with Tesla and their cost-saving no-LIDAR nonsense.<p>These companies should be fined hefty amounts and barred from beta testing out in public.
"Ultimately we develop and deploy autonomous vehicles to save lives." Of all of the self-serving counterfactual horseshit I've seen sneak out of a corporate mouthpiece that right there might take the cake. Y'all deploy AVs to make money, The End. If saving lives was a primary goal GM would have stripped center console infotainment displays out of their vehicles a decade ago, or maybe never put them there to begin with.
43k deaths per year due to human drivers. DMV should stop sticking their heads in the sand. <a href="https://www.iihs.org/topics/fatality-statistics/detail/state-by-state#:~:text=Posted%20May%202023.-,Fatal%20crash%20totals,in%20which%2042%2C939%20deaths%20occurred" rel="nofollow noreferrer">https://www.iihs.org/topics/fatality-statistics/detail/state...</a>.
What's the ratio of humans back at headquarters ready to take over vs the number of cars active on the road? That's a really interesting number that I don't think I've ever seen reported anywhere.<p>It does feel like a lot of these weird/longtail situations should be more quickly resolved with some better combination of letting a human override in person (like when a AV car blocks a firetruck) and/or quickly getting a human to remotely take over, but yet we keep seeing these reports of where the car does something dumb for a long period of time. Is it as simple as they don't have enough employees on-hand to intervene?
Good. These things have been a real danger around my neighborhood. They are not currently safe. They do not understand crosswalks and will drive toward pedestrians who are on them until they move. Additionally, every time I've encountered a safety issue, Cruise has not been accessible to take the report. So I went above and beyond to make sure I reported it. <a href="https://www.dmv.ca.gov/portal/dmv-autonomous-vehicles-feedback-form/" rel="nofollow noreferrer">https://www.dmv.ca.gov/portal/dmv-autonomous-vehicles-feedba...</a>
Their behavior reminds me of Uber's: the frat-bro, do-anything, "killing it" mantra, even though people's lives are at stake here.<p>I've been writing about my distaste for Cruise for the past few months here because of this. Of course I get downvoted by those with skin in the game, but c'mon, you can't act the way Travis Kalanick did when you are pushing technology onto the streets that can kill people! Also, Waymo (Google, of whom I'm no fan) has been working on its robot-car tech since 2007; Cruise hasn't been at it nearly that long.<p>There were reports that they were pushing their tech onto many more city streets. Sorry to say, but hopefully other cities and states follow CA's lead!
IMHO,<p>Accidents... manageable.<p>Hitting a pedestrian... still manageable.<p>Deliberately withholding evidence... burn them at the stake and regulate them to within an inch of their corporate lives.
I saw a Waymo in my neighborhood stopped in the middle of the road. Cars backed up behind it, and the driver quickly took over and accelerated. The car then returned to driving 20 mph in a 25 mph zone.<p>Tesla is going to smoke these companies. Theirs is the only software trained on high quality actual driver behavior (which they can pull from their fleet because they have telemetry in the form of a real-time safety score for every driver).<p>It's like every other company is trying to write an algorithm to construct strings, and Tesla has GPT-4.
Related: robotics pioneer Rodney Brooks just wrote up an incident where a Cruise cab he was riding in stopped suddenly in the middle of an intersection, nearly causing a bad collision: <a href="http://rodneybrooks.com/autonomous-vehicles-2023-part-ii/" rel="nofollow noreferrer">http://rodneybrooks.com/autonomous-vehicles-2023-part-ii/</a>. Consistent with the discussion here of Cruise cars being dangerously timid.
So sad.<p>We, I mean humanity, need an example that L4 self-driving is possible.<p>When SpaceX showed that a private company can access space and build a product in the area, the world changed.<p>Startup founders, investors, and governments changed their opinions, and many space companies were created, trying to do things they would not even have considered before SpaceX: Astra, Rocket Lab, Planet Labs, Firefly, etc.<p>(Great book on the topic, "When the Heavens Went on Sale": <a href="https://amzn.to/46NRx9l" rel="nofollow noreferrer">https://amzn.to/46NRx9l</a>)<p>If we get one example of self-driving as a product on the roads, it may have the same effect. We are heading toward a self-driving winter, and unless Cruise, Waymo, or similar companies do the impossible, the next iteration will happen in 10+ years.<p>It is mentioned in the article that in Q3 GM spent $700M on Cruise.<p>Not that many companies in the world:
1. have a similar level of talent to the engineers at Cruise
2. can spend a similar amount of money
3. are willing to spend it<p>And we need all three.<p>There are so many corner cases that without large-scale deployment, as close to real life as possible...<p>I feel very sad.<p>P.S. Worked for 3 years at a self-driving company, till our division was sold :(
I took a Cruise for the first time a month ago. It arrived and then sat there for almost ten minutes. It kept telling me that my ride was getting ready. Someone from the operations center also called into the car, asked if I was OK, and reassured me the ride would start soon.<p>This was at night with zero traffic. I have no idea why it took so long.<p>When the CPUC was set to vote on expanding Cruise's operations several months ago, SF first responders and other residents spoke out against the expansion. The CPUC ignored all the feedback and concerns and voted to expand. I believe the press pointed out that the CPUC people who voted to expand Cruise's operations would benefit from the expansion (I can't recall if they were former employees or otherwise tied to the driverless industry). But despite emergency responders giving numerous examples of how dangerous the cars were in emergency situations, they were ignored.
I was in SF last month and saw a Cruise car get stuck in an intersection on Divisadero. I have no idea what it was originally trying to do, but it kept going back and forth perpendicular to the oncoming traffic. After about 5-10 minutes of that, it looked like someone took remote control and got it out of the intersection.
Any speculation on how big a deal this is for Cruise's prospects as a competitor in the AV space?<p>As far as I can tell, they were open in 4 metro areas (SF, Austin, Houston, Phoenix). So this is their original testing zone, one of their four current zones, down.<p>Will this impede their ability to retain and expand existing zones?
Safety is certainly a huge concern with this class of vehicles. But by programming these cars to drive extremely conservatively, they cause disproportionate congestion as well. This seems to be underplayed in approval discussions around these vehicles, but is a major cost.
Are there any reliable statistics, something like accidents per mile driven, that compare autonomous vehicles with human drivers?<p>What I found so far is this:<p>- Waymo and Cruise have driven about 8 million miles driverless (mostly in San Francisco) and reported about 100 crashes (mostly minor, like scraping a shopping cart), many of which were the fault of another car (as of September 2023).<p>- Apparently a human drives an average of 100 million miles between fatal accidents. In that case we do not have enough data yet.<p>I'm not sure what to think. Is the rate of accidents a cause for concern? Or is this blown out of proportion, and driverless cars are actually safer? Or (most likely) do we simply not know yet?
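For what it's worth, plugging in the comment's own numbers (rough and unverified) makes the apples-to-oranges problem explicit: the AV figure counts crashes of any severity, while the human baseline counts only fatalities, and 8 million miles is far too few to measure a fatality rate either way.

```python
# Figures quoted in the comment above (rough, unverified, as of Sept 2023)
av_miles = 8_000_000                      # driverless miles, Waymo + Cruise combined
av_crashes = 100                          # reported crashes, mostly minor
human_miles_per_fatality = 100_000_000    # rough human-driver average

# One reported AV crash (of ANY severity) per this many miles:
miles_per_av_crash = av_miles / av_crashes            # 80,000

# How many FATAL crashes we'd expect from human drivers over the same miles:
expected_human_fatalities = av_miles / human_miles_per_fatality  # 0.08

print(f"One reported AV crash (any severity) per {miles_per_av_crash:,.0f} miles")
print(f"Expected human-driver fatalities over the same miles: {expected_human_fatalities:.2f}")
```

Since the expected number of human-driver fatalities over 8 million miles is well below one, the fatal-crash rates genuinely cannot be compared yet; only minor-crash rates could be, and those need a matching human minor-crash baseline.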
Until cars are capable of simulating real-world human decision making (of a smart and considerate driver), it's not going to be enough. It simply won't be possible to eliminate all the teeny tiny liability laden edge cases otherwise.
From Europe and wasn't aware there currently were at least 2 different self-driving cab companies already operating in (at least some parts of) the USA. Feels like I just aged 15 years in 5 minutes
The large opposition to AVs gives these companies zero margin for error, especially in a place like SF. Hopefully they'll be back on the road soon. My experiences in Waymos have felt much safer than the average Uber ride.
As someone who has had FSD for the past 2 months, I would never trust it with left-turn merges into fast lanes, any weird situation (I've seen it deal with construction fine, but once an excavator was blocking the road and it kept trying to inch forward into the smallest of gaps even though it was clear that the only solution was to wait until it was off the road), or extreme turns (>150 degrees).<p>My drive to work is 20 miles, and a really good stretch of that is just one lane where I'm either stuck behind a car or the road is empty. In those cases I have no problem letting FSD handle everything; the road is extremely hilly and bendy, lane lines vanish every once in a while, and it's fine. So I do find it useful at times, though not worth $200 a month by a long stretch. I'll probably cancel the subscription soon.<p>Its problems appear very clearly whenever it needs to accurately predict what other cars on the road will do in order to make its decision. It is honestly horrible at that, and while I tried a few times not to intervene, I realized I was just creating a nuisance and have now stopped allowing FSD to deal with those situations. It is also bad when the lane lines are wrong (this happens at one final stretch of my daily drive).<p>Finally, as someone working in AI, I'm actually very optimistic about their next approach of just learning an end-to-end model to drive the car. I think this is the only scalable way to reach human-level driving. Any rules-based plus optimization-based planner will have those edge cases that ruin the car's capability; in fact, it turns out the long tail is really long. When a human is sober and focused, they are a remarkably good driver. On the other hand, if you just need to capture the edge-case data and add it to your training set, you can capture all the possible driving conditions that come up on the road.
This reminds me of how there were hundreds of failure modes and jailbreaks for ChatGPT early on, while now there are close to none, presumably because they were just added to the training set. If a good human-level self-driving car appears, I'm convinced it will use an end-to-end approach; whether the Tesla team is capable of this, who knows.<p>Also, for those who think OpenAI's scale-based approach only works for language: I have access to ChatGPT Vision, and its capabilities astound me. Scale for sure works.
GM vehicles are generally regarded as not particularly well engineered or reliable, based on what I've seen and heard from mechanics.<p>Why aren't they investing in core engineering, instead of moonshots?
Thanks for digging in and providing sources. The Vice article seems to have been updated to include that Cruise is contesting the narrative where they withheld the full video. It seems like this story is still playing out.<p>> In a statement, Cruise spokesperson Hannah Lindow disputed that Cruise failed to provide the full video during the first meeting with the DMV. “I can confirm that Cruise showed the full video to the DMV on October 3rd, and played it multiple times,” Lindow told Motherboard in a statement.
It's always fun when the Silicon Valley "move fast and break things" ethos meets real world government bureaucracy and regulation.<p>Filling out forms with made-up bullshit just to get an approval works great when you are putting together a SaaS sales presentation, not so much when it's the state Department of Transport on the other end and you are working on matters of public safety.
So much is discussed about self-driving car safety because the implications are really big but also uncomfortable. However, I always get stuck on how little enforcement there is for human drivers who habitually break driving laws and cause dangerous situations. DUIs are one thing sure but honestly most drivers who are bad and dangerous are sober and face no consequences for reckless behavior.<p>Is there something specific that caused this halt on Cruise's testing permits?
I'm glad that it's becoming clear to more people that Cruise and Waymo are at very different levels of maturity. Waymo is so far ahead that it's hard to see anyone catching up. We really need some alternative so there isn't a pure monopoly 10 years from now. I'm not sure there are any mergers that would bring anyone close ("two turkeys don't make an eagle" and all that).
In terms of self driving car technology I do wonder once it matures how new entrants will ever be able to get into the market.<p>Why would you let, by definition, more dangerous cars on the road in order to perfect their software, when perfectly good solutions exist?<p>Perhaps in the end Waymo will be forced to license the base technology somehow.
Good. While people may argue that these things are safer, their success will only lead to the development of more car-centric living spaces. Streets will be wider, harder to walk, less people-centric. IMO it's cheaper in the long run to just have transit and bike lanes.
As long as we mix human and machine control, I think we will continue to have problems. If we take humans out of equation somehow, we may eliminate a lot of edge cases like this. Self driving cars need their own lanes. After that, we could couple them together. Then maybe put them on rails...
Good. Why socialize the cost of these platforms and privatize the profits?<p>Understandably it is expensive. But the profits will similarly be great.<p>They ought to build their own mini cities to test on as part of their own Manhattan Project.
After the CPUC voted to approve the self-driving expansion, the same people who had tried to block self-driving cars there pulled two regulatory levers.<p>First, the California DMV, with the result seen here.<p>The incident that triggered this event happened when a human driver, who is still at large, hit a pedestrian who was thrown into the path of the Cruise vehicle.<p>The car executed a maneuver to pull over for a safety stop, but failed to recognize the woman underneath it, dragging her some distance.<p>Second, they reached out to the federal agencies via Nancy Pelosi's office, the primary focus being the NHTSA, the National Highway Traffic Safety Administration.<p>The request was to gather data, which given the NHTSA's friendly posture at this time will likely result in the opening of an investigation.<p>Both of these moves have some bite. The California DMV can suspend licensure, as happened here.<p>But they're fairly straight-shooter civil service types in the end, and they'll eventually clear the vehicles for use. You can see this in part because the action was against Cruise only, rather than Waymo.<p>Some may attribute this to Waymo's greater political sway in California. But my experience of the products themselves is that Waymo has a higher-quality self-driving system.<p>The NHTSA is a bit trickier to predict.<p>The NHTSA's powers are broad in theory and as granted by statute. They could theoretically issue an order to remove a type of vehicle from the road as an imminent threat to public safety.<p>However, their typical modus operandi is to issue recalls and work with the auto companies. They won't want to test these powers under the current court, so my opinion is that they won't take much action.
I ran a non-profit website in Brazil where schools could look up a question-by-question breakdown of how their students did in the high-school exit exam. I was a bit flummoxed by the process of maintaining a decently sized database with that data, and realized that there were only so many visualizations I'd need for it. So, over a week, I just had an algorithm compute everything I would need for each school for the several million students in the db, and then handily save a file for each institution with that data in an S3 bucket. No databases, no complex logic, just a few pennies of cost for S3... and it made building out the visualization such a breeze!
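A minimal sketch of the precompute-everything approach described above (the record schema, bucket name, and key layout are my assumptions, not details from the original site):

```python
import json
from collections import defaultdict


def aggregate(records):
    """Fold exam answers into per-school, per-question tallies.

    records: iterable of dicts like
      {"school_id": "A", "question": 1, "correct": True}  # assumed schema
    """
    # school_id -> question -> [num_correct, num_answered]
    stats = defaultdict(lambda: defaultdict(lambda: [0, 0]))
    for r in records:
        cell = stats[r["school_id"]][r["question"]]
        cell[0] += int(r["correct"])  # answered correctly
        cell[1] += 1                  # answered at all
    return {
        school: {q: {"correct": c, "answered": n} for q, (c, n) in qs.items()}
        for school, qs in stats.items()
    }


def publish(reports, bucket="exam-reports"):
    """Write one static JSON file per school, so the frontend can fetch
    schools/<id>.json straight from S3 with no database behind it."""
    import boto3  # assumed dependency for the upload step

    s3 = boto3.client("s3")
    for school_id, report in reports.items():
        s3.put_object(
            Bucket=bucket,  # hypothetical bucket name
            Key=f"schools/{school_id}.json",
            Body=json.dumps(report),
            ContentType="application/json",
        )
```

The aggregation pass is a single scan over the raw answers, so even millions of rows reduce to one small file per school, and serving is just static GETs against S3.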
I'll take the downvotes, but none of these should be on the road, especially when a cone on the hood is enough to disable the car.<p>After leaving the state, I'm glad not to have to drive on the same roads as these.
I do not understand why we're letting these companies test autonomous cars without someone clearly on the hook for both financial and criminal liability <i>for each car</i>.
Really disappointed at the suspension. Seems like a knee-jerk reaction, although Cruise should have been more transparent.
At this point we should have 4 or 5 self-driving car companies testing on public roads, not just two (Waymo and Cruise; I don't count Tesla as self-driving, just driver assist), in two cities (Phoenix and San Francisco, although LA, Houston, and Dallas are coming soon).
In 2015 I anticipated that by the early 2020s we would have self-driving taxis in a dozen US metro cities, not just two. What can we do to accelerate the transition towards this fantastic future? Also, Canada is so far behind.
I just started seeing Cruise cars driving around Santa Monica this week. Are they already paused in SF and we're somehow later or is pausing just a SF thing?
I know that the geography of North America and the timing of industrialization basically made this a pipe dream…<p>Yet, trying to create near-peer artificial human driving intelligence—just so cities can continue to be designed around horseless carriages, minivans, and truck nuts—well, that may not have been the best solution to the problem.<p>Ideally, autonomous driving would be for “the edge” of a transportation network. Like a house on a dirt road, or a camping site, or the home of a misguided friend who moved to south Gilbert, AZ ;)<p>You shouldn’t need to be watching for basketballs and red lights on interstate highways. Everyone is going the same direction for long distances between transfers. Cars are often about individualism, which I’m certainly not against.<p>The arrogance of “driving driverless should be easy” probably comes from a lot of people seeing driving as a part of their basic humanity. The car is nearly a cyborg extension of their mobility and physical power.<p>Me? I just wanna get to the concert. Take the humans out of the equation where it makes sense. I’d rather go with light rail; not my friend who offered to drive, then road rages over a merge, then follows some minivan psychopathically for a few minutes (true stories).<p>We shot ourselves in the foot because mass transportation in America has failed to impress us as much as a Jaguar XK or Jeep KJ, or whatever your Soul Car is.
How many additional people will be killed by human drivers today, tomorrow, next year, and next decade because of this suspension, I wonder? No one ever seems to consider them in these decisions[0].<p>[0] I can see myself not agreeing with the conclusions, but to not even see the question broached in these terms is maddening.
This industry is evil, and I can't fathom what kind of a market would exist for driverless cars. KITT is science fiction. In reality, human drivers are almost always going to be safer. I would never trust my life or the lives of those I care about to a ride inside a computer-operated vehicle.<p>Those that do are making a costly mistake. Any robot-assisted car still requires a human driver to be paying attention to the road, but by definition, because the robot does most of the driving, the human failsafe is going to be lulled into a false sense of security, or bored by effectively driving without actually driving. They will slack, and when the critical moment requiring human awareness arrives, they will be caught off guard, unlike the actual human driver who was aware the entire time, because he always had to be.<p>More to the point, even if I had a 100% guarantee of safety, I would never want to have the joys of driving taken away from me in the first place! Driving is liberating. It's power. It's the freedom to go anywhere and do anything. It's, dare I say, cathartic.<p>And for those that are unable to drive, Uber & Lyft exist to fill the gap. They are themselves a relatively new technology, just over a decade old, and you can always find a quick, affordable ride on demand to just about anywhere.<p>What gap in the market would a self-driving car even fill?<p>This is yet another example of the arrogance of big tech, pushing out unsafe and socially deleterious products in the hopes of a get-rich-quick scheme. It's no different from what social media did a decade ago or what "generative AI" is doing now.