At the risk of sounding pedantic, this is not a Level 4 demo. It is a Level 2 demo where the driver didn't have to intervene during the demonstrated route.<p>What's the difference? Apart from the legal requirement to have a driver ready to take over in test vehicles (which necessarily makes it Level 2), the fundamental difference is that you'd have to show <i>a lot more than one demo</i> to establish that you've achieved Level 4. Level 4 systems are supposed to be able to operate without any human intervention within prescribed domains (e.g. downtown city streets). That doesn't mean operating for one trip or one day or one month without a disengagement -- that's still Level 2.<p>I'm super impressed by the demo, but Cruise will have to show more data to back up a Level 4 claim.
Nice. It's annoying that they provide only a 10x sped-up video. Watching this slowed down is helpful.<p>Notes:<p>* There are frequent steering twitches to the left. These may be associated with passing parked cars. There are similar twitches to the right when in the left lane of a one-way street.<p>* Crosswalk behavior when turning needs some work. The vehicle enters the intersection, then stops in the intersection before the crosswalk with people in it. This is a hard problem, because the system needs to recognize people waiting to cross but not yet in the roadway. When the light turns green, both the pedestrians going straight and the turning vehicle can enter the intersection, with the pedestrians having right of way. The pedestrians then block the vehicle, and the vehicle blocks the bike lane.<p>* Left turns onto multi-lane streets are too wide and end up in the wrong lane.<p>* On two occasions, the vehicle is stuck behind a double-parked vehicle engaged in loading. The options are to wait or to cross a double yellow line. There's a delay of several seconds, then forward movement. Suspect manual intervention.
"Level 4" means that no human intervention is required and in case of conditions going from good to bad, the car can autonomously put itself in a safe state.<p>That was nice and clean city driving in the video clips but nothing that distinguishes it from "Level 3" (human intervention may be required within ~15 seconds or so) or even "Level 2" (human intervention may be required within seconds, current state of the art).
Level 0: Automated system has no vehicle control, but may issue warnings.<p>Level 1: Driver must be ready to take control at any time. Automated system may include features such as Adaptive Cruise Control (ACC), Parking Assistance with automated steering, and Lane Keeping Assistance (LKA) Type II in any combination.<p>Level 2: The driver is obliged to detect objects and events and respond if the automated system fails to respond properly. The automated system executes accelerating, braking, and steering. The automated system can deactivate immediately upon takeover by the driver.<p>Level 3: Within known, limited environments (such as freeways), the driver can safely turn their attention away from driving tasks, but must still be prepared to take control when needed.<p>Level 4: The automated system can control the vehicle in all but a few environments such as severe weather. The driver must enable the automated system only when it is safe to do so. When enabled, driver attention is not required.<p>Level 5: Other than setting the destination and starting the system, no human intervention is required. The automated system can drive to any location where it is legal to drive and make its own decisions.<p>From the Society of Automotive Engineers:<p><a href="https://en.wikipedia.org/wiki/Autonomous_car#Classification" rel="nofollow">https://en.wikipedia.org/wiki/Autonomous_car#Classification</a>
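If it helps to see those distinctions at a glance, here's a rough sketch of the same classification as data -- my own encoding, loosely mirroring the who-executes / who-monitors / who-handles-fallback columns of the SAE table, not anything official from SAE or Cruise:<p><pre><code># Illustrative only: a rough encoding of the SAE levels quoted above.
# What separates the levels is who does the driving, who watches the road,
# and who is responsible when something goes wrong.
from dataclasses import dataclass
from enum import Enum

class Actor(Enum):
    HUMAN = "human"
    SYSTEM = "system"

@dataclass(frozen=True)
class SaeLevel:
    steering_and_speed: Actor   # who executes steering / acceleration / braking
    monitoring: Actor           # who monitors the driving environment
    fallback: Actor             # who must take over when the system can't cope
    limited_domain: bool        # restricted to certain environments (freeways, weather)?

LEVELS = {
    0: SaeLevel(Actor.HUMAN,  Actor.HUMAN,  Actor.HUMAN,  False),  # warnings only
    1: SaeLevel(Actor.HUMAN,  Actor.HUMAN,  Actor.HUMAN,  True),   # system assists on one axis (ACC or LKA)
    2: SaeLevel(Actor.SYSTEM, Actor.HUMAN,  Actor.HUMAN,  True),   # driver must supervise continuously
    3: SaeLevel(Actor.SYSTEM, Actor.SYSTEM, Actor.HUMAN,  True),   # driver takes over when asked
    4: SaeLevel(Actor.SYSTEM, Actor.SYSTEM, Actor.SYSTEM, True),   # no human fallback needed within its domain
    5: SaeLevel(Actor.SYSTEM, Actor.SYSTEM, Actor.SYSTEM, False),  # no human fallback needed anywhere
}</code></pre><p>The fallback column is the crux of the thread above: a single flawless drive doesn't tell you whether the system or the safety driver is the fallback, which is exactly the Level 2 vs. Level 4 question.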
Watching this video, what struck me most was how overbuilt almost all of the streets were: they're designed for high speeds and devote essentially all of the space to cars. We know how to get at least a 3x reduction in fatalities, as the Netherlands has done, with engineering changes that make human drivers less likely to make mistakes and make their mistakes less costly.
Very cool. But am I mistaken that they didn't attempt to go through a 4-way stop with massive pedestrian traffic, like in PacHeights or the Marina? I would be curious to see how they would fare with the obnoxious drivers skipping their turn as well as the constant flow of pedestrians.<p>I'm also curious how it would react to going up Mason and California, where there's a traffic light at the top of a steep hill. Last time I had to physically pull myself up via the steering wheel to see anything, and as a seasoned driver I was a bit worried.
Pigeons and humans have a pact: you keep going, and they fly away at the last second. Cars don't need to stop for pigeons.
<a href="https://youtu.be/xPCZtrac-Ss" rel="nofollow">https://youtu.be/xPCZtrac-Ss</a>
Dumb question: how do these systems distinguish a traffic light? Here in New England, at least, there's quite a variety of lights and many different modes (red, yellow, green, flashing red, flashing yellow, no-left-turn red arrow, no-right-turn red arrow), plus different light technologies: old-fashioned style, Fresnel lens, slotted shade.<p>Add to this complexity the weather conditions. Suppose the sun is shining straight at you and you need to squint and shade your eyes just to make out what the light is -- this happens to me frequently -- can the camera see the traffic light and distinguish its color clearly under such conditions?<p>What about when it's raining, misting or drizzling, snowing heavily, etc., and the traffic lights are those fragmented outlines that you, the human, can heuristically distinguish but a machine might not?<p>One last thought: suppose it's right turn on red and the first car in line is a self-driving vehicle. Can it really look left and safely determine there's enough time to beat the cross traffic? If it's highly conservative and just waits until green, there could be ten irate motorists behind it, guaranteed to honk and curse.<p>It's exciting technology, but there are some very difficult problems to solve. I worry that if these machines can't demonstrate 110% of a human's ability to drive, they simply won't be deployed in many places except on some very well-defined, rigid routes that are free of problematic challenges and variations.
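For intuition on the color part of the question, here's the naive version: a pure color-threshold classifier on a crop that's already assumed to contain the light. Everything here (the OpenCV approach, the HSV thresholds, the function name) is my own illustrative assumption, not how any production stack works:<p><pre><code># Toy traffic-light color classifier (illustrative only, not how real stacks do it).
# Assumes `crop_bgr` is a small image crop already localized around the light.
import cv2

def classify_light(crop_bgr):
    hsv = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2HSV)
    # Hypothetical hue/saturation/value ranges; real thresholds vary per camera and lens.
    masks = {
        "red": cv2.inRange(hsv, (0, 120, 120), (10, 255, 255))
               | cv2.inRange(hsv, (170, 120, 120), (180, 255, 255)),
        "yellow": cv2.inRange(hsv, (20, 120, 120), (35, 255, 255)),
        "green": cv2.inRange(hsv, (45, 120, 120), (90, 255, 255)),
    }
    counts = {color: int(cv2.countNonZero(mask)) for color, mask in masks.items()}
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else "unknown"</code></pre><p>Even in this toy form you can see the failure modes the parent comment worries about: direct sun glare washes out the saturation channel so everything comes back "unknown", and rain or snow fragments the mask. That's a big part of why deployed systems are generally described as relying on prior maps of light positions and on trained detectors rather than hand-tuned thresholds like these.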
Random side thought: how do I signal to the car, in an ambiguous situation, who's going first? Like in super-high traffic, when I want to merge in front of them or let them go first. Currently I do that with a wave, which is super effective on a motorcycle: people almost always say "sure, we can squeeze you in here," which lets me get places way faster.
Cruise was a sponsor of ROSCon in 2014, but had no booth or presenters: <a href="http://roscon.ros.org/2014/#program" rel="nofollow">http://roscon.ros.org/2014/#program</a><p>Nor do they have any presence I'm aware of on Github. This is in contrast to BMW, for example, who have made a number of contributions: <a href="https://github.com/bmwcarit" rel="nofollow">https://github.com/bmwcarit</a><p>Anyway, just curious to what extent Cruise used (or still uses) ROS and open source software in their stack.
I said it before and I'll say it again:<p>You might not be able to turn an aircraft carrier on a dime, but when you do, you've got an aircraft carrier.<p>GM (and the other automotive manufacturers, for that matter) decided they wanted in on self-driving and electric vehicles, had a few meetings, wrote a few checks, and a few years later, look at the result. Are Google and Tesla going to reply with similar videos?
What I'd like to see with Level 4 is millions of miles driven by a car with a SOFT outer shell. That way collisions don't hurt other cars or even pedestrians. Make the car's top entirely out of foam or something when it's driving without a driver.
It's interesting how the evolution of autonomous cars is, to a very large extent, a result of the private sector. Companies could have been asking the federal government to install sensors in all stop signs and under the streets to support this evolution. Instead, they look at cities as they are and build something that works with them. Sometimes I wonder if government should take a more active role and ease adoption by adapting cities to autonomous vehicles (rather than the other way around, as it is today).
Each time the Cruise starts from a stop light, it's too slow.<p>Every time, cars pass it and dive into its lane (a typical reaction to slow-moving vehicles).<p>Still, a nice accomplishment.
Example edge case: Waymo cars know that police cars often stop behind other cars, and to expect people to be walking nearby.<p>Thought: once we have 99% self-driving cars, it will be quite easy to convert a portion of roads to pedestrian-only at times when traffic is light: bollards go up, lighting changes, and cars are informed to reroute.
This looks very amateurish: no roundabouts, no give-way intersections. It's really easy to stop at red lights, or when the car in front of you brakes and its brake lights show you another red light: if (light == red) stop(); ...
This video was posted quite a while ago. Are we sure it's not just Nvidia's reference app configured by the Cruise people? Most of this stuff was in the Tesla and other Nvidia Drive demos.
Now that Tesla has started pretending that selling autonomous cars makes sense, the field has gone from research (where researchers honestly report the shortcomings themselves) to the horrendous world of SV startups, where everything should be considered a lie until the day someone can actually buy the thing and test it for themselves.<p>And so you end up with posts like this one, trying to analyse a video frame by frame to assess the reality of the technology, and yet everyone, including the author, tries to guess where the catch is (is the green light really trustworthy? Why is the video sped up? Etc.).