> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.
You're omitting the context provided by the article. This wasn't just a random scenario. Not only was this by an elementary school, but during school drop-off hours, with both children and double-parked cars in the vicinity. If somebody doesn't know what double parking is: it's when cars park alongside cars that are already parallel parked, effectively in the roadway, making it difficult to see what's beyond them.
So you are around young children with visibility significantly impaired because of double parking. I'd love to see video of the incident, because driving 17 mph (27 km/h for metric types) in this context is reckless and not something a human would typically do, because a kid popping out from behind one of those cars is not only unsurprising but completely expected.
Another reason you also slow way down in this scenario is one of those cars suddenly swinging open their door which, again, would not be particularly surprising in this sort of context.
> It's likely that a fully-attentive human driver would have done worse.
We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.
The car reacted quickly once it saw the child. Is that enough?
But most humans would have been aware of the big-picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
If that's the scenario, there is a real probability that a child might appear, so I'm going to slow way down pre-emptively even though I haven't seen anyone, just in case.
The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.
If you drive in Sweden you will occasionally come across a form of speed-reduction strategy that may seem counterintuitive. These measures all work by making driving harder and feel more dangerous, in order to force attention and lower speed.
One is to merge opposing lanes into a single lane, forcing drivers to cooperate and take turns passing through, one car at a time.
For a combined car and pedestrian road (max speed of 7 km/h) near where I live, they intentionally added large obstructing objects on the road that limit visibility and make it harder to navigate. This forces drivers to drive very slowly, even when alone on the road, as they can't see whether a car or person may be behind the next object.
On another road they added several tight S-curves in a row, where if you drive anything faster than 20 km/h you will fail the turns and ride up onto the artificially constructed curbs.
On other roads they put a sign in the middle of two-way roads while drastically limiting the width to the curb, forcing drivers to slow down in order to center the car in the lane and squeeze through.
The common thread in each of these is that a human driver's fear of crashing causes them to pay extra attention and slow down.
Possibly, but Waymos have recently been much more aggressive about blowing through situations where human drivers can (and generally do) slow down. As a motorcyclist, I've had some close calls with Waymos driving on the wrong side of the road recently, and I had a Waymo cut in front of my car at a one-way stop (t intersection) recently when it had been tangled up with a Rivian trying to turn into the narrow street it was coming out of. I had to ABS brake to avoid an accident.
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
I think my problem is that it reacted after seeing the child step out from behind the SUV.
An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.
(It's how I taught my daughters to drive "defensively"—look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag—slow the fuck down.)
Multiple children in my area have died after being hit by distracted drivers near schools. One incident resulted in two children being dragged 60 yards. Here's a snippet from an article about the death I was referencing:
> The woman told police she was “eating yogurt” before she turned onto the road and that she was late for an appointment. She said she handed her phone to her son and asked him to make a call “but could not remember if she had held it so face recognition could … open the phone,” according to the probable cause statement.
> The police investigation found that she was traveling 50 mph in a 40 mph zone when she hit the boys. She told police she didn’t realize she had hit anything until she saw the boys in her rearview mirror.
The Waymo report is being generous in comparing to a fully-attentive driver. I'm a bit annoyed at the headline choice here (from OP and the original journalist) as it is fully burying the lede.
I usually take extra care when going through a school zone, especially when I see some obstruction ('behind a tall SUV', was the waymo overtaking?), and overtaking is something I would probably never do (and should be banned in school zones by road signs).
This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.
This exact scenario happened with my dad 50 years ago when a little girl ran out to the street from between some parked cars. It's an extremely difficult scenario to avoid an accident in.
Is there footage available? A fully attentive human could've read cues from the environment and maybe avoid the accident or at least increased the error margin for that emergency stop.
If that wasn't an option, I guess a fast-reacting driver would have done a bit worse, and an inattentive driver might even have run over the kid.
>It's likely that a fully-attentive human driver would have done worse.
Maybe. Depends on the position of the sun and shadows. I'm teaching my kids how to drive now and showing them that shadows can reveal human activity that is otherwise hidden by vehicles. I wonder if Waymo or other self-driving systems pick up on that.
A human driver in a school zone during morning drop off would be scanning the sidewalks and paying attention to children that disappear behind a double parked suv or car in the first place, no?
As described in the NHTSA brief:
"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"
The "that there were other children, a crossing guard, and several double-parked vehicles in the vicinity" means that waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
It's possible, but likely is a heavy assertion. It's also possible a human driver would have been more aware of children being present on the sidewalk and would have approached more cautiously given obstructed views.
Please, please remember that any data from Waymo will inherently support their position and cannot be taken at face value. They have a significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.
I wonder if that is a "fully attentive human driver who drove exactly the same as the Waymo up until the point the child appeared"?
Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.
> It's likely that a fully-attentive human driver would have done worse.
> a huge portion of human drivers
What are you basing any of these blind assertions on? They are not at all borne out by the massive amounts of data we have surrounding driving in the US. Of course Waymo is going to sell you a self-serving line, but here on Hacker News you should absolutely challenge that, particularly because it's very far out of line with real-world data provided by the government.
Waymo is intentionally leaving out the following details:
- Their "peer-reviewed model" compares Waymo vehicles against only "Level 0" vehicles. However even my decade-old vehicle is considered "Level 1" because it has an automated emergency braking system. No doubt my Subaru's camera-based EBS performs worse than Waymo's, still it's not being included in their "peer-reviewed model." That comparison is intentionally comparing Waymo performance against the oldest vehicles on the road -- not the majority of cars sold currently.
- This incident happened during school dropoff. There was a double-parked SUV that occluded the view of the student. This crash was the fault of that double-parked driver. But why was the uncrewed Waymo driving at 17 mph to begin with? Do they not have enough situational awareness to slow the f*ck down around dropoff time immediately near an elementary school?
Automotive sensor/control packages are very useful and will be even more useful over time -- but Waymo is intentionally making their current offering look comparatively better than it actually is.
It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.
The UK driving theory test has a part called Hazard Perception: not reacting to children milling around would be considered a fail.
Exactly. That’s why I’ve always said that driving is truly an AGI-requiring activity. It’s not just about sensors and speed limits and feedback loops. It’s about having a true understanding of everything that’s happening around you:
Having an understanding of the density and makeup of an obstacle that blew in front of you, because it was just a cardboard box. Seeing how it tumbles lightly through the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment, though large, will do no damage and doesn’t justify a swerve.
Getting in the mind of a car in front of you, by seeing subtle hints of where the driver is looking down, and recognizing that they’re not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they’re not quite there yet.
Or in this case, perhaps hearing the sounds of children playing, recognizing that it’s 3:20 PM, and that school is out, other cars, double parked as you mentioned, all screaming instantly to a human driver to be extremely cautious and kids could be jumping out from anywhere.
> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast
Hey, I'd agree with this-- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1MPH slower would likely have resulted in no contact in this scenario.
But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.
(My back of napkin math says an attentive human driver going at 12MPH would hit the pedestrian at the same speed if what we've been told is accurate).
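The kinematics behind these back-of-napkin figures can be sketched in a few lines. Under a constant-deceleration model (an assumption; the real braking profile isn't public), the reported 17 mph → under 6 mph stop fixes a "braking budget" of roughly 17² − 5² in mph², and holding that budget constant lets you ask what happens from other starting speeds:

```python
# Constant-deceleration sketch of the 17^2 - 5^2 > 16^2 argument above.
# Assumed: the Waymo shed v0^2 - v^2 = 17^2 - 5^2 (mph^2) of squared speed
# over the distance available before contact; we hold that budget fixed.

def impact_speed_mph(v0_mph: float, budget_mph2: float = 17**2 - 5**2) -> float:
    """Impact speed given the same braking distance and deceleration."""
    remaining = v0_mph**2 - budget_mph2
    return remaining**0.5 if remaining > 0 else 0.0  # 0 -> stopped before contact

print(impact_speed_mph(17))  # → 5.0 (matches the reported "under 6 mph")
print(impact_speed_mph(16))  # → 0.0 (one mph slower: no contact)
```

This matches the inequality in the comment above: 289 − 25 = 264 > 256 = 16², so at 16 mph the car stops just short. It ignores reaction/latency distance, which is why the attentive-human comparison needs extra assumptions.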
I bet we'll look back on the SUV mania in the future as something crazy, like smoking on a plane or using leaded gasoline. Irrationally large cars that people buy because everyone is afraid of another SUV hitting them in a sedan. The tragedy of the commons.
The best reaction from Waymo would have been to start to lobby against letting those monster-trucks park on streets near schools. They are killing so many children, I'm flabbergasted they are still allowed outside of worksites.
What I find a bit confusing is that no one is putting any blame on the kid. I did the same thing as a kid, except it was a school bus instead of SUV, and that was a fucking stupid thing to do (I remember starting to run over the street, and the next thing is that I am in the hospital bed), even though I had been told to always cross the street from behind the bus, not in front of it.
AV’s with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits and when there are kids about people may go even slower. At the same time, the general rule in CA for school zone is 25 mph. Clearly the car had some level of caution which is good.
It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents to see how "most people" would have acted in the minutes leading up to the event as well as the accident itself
>Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.
Sure, but also throw in whether that driver is staring at their phone, distracted by something else, etc. I have been a skeptic of all this stuff for a while, but riding in a Waymo in heavy fog changed my mind when I questioned how well I or another driver would've done at that time of day and in those conditions.
For me it would be interesting to know if 17 mi/h was a reasonable speed to be driving in this environment under these conditions to begin with. In my school zones that's already close to the maximum speed allowed. What was the weather, were there cars parked which would make a defensive driver slow down even more?
The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)
This is the fault of the software and company implementing it.
Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.
So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.
And why they can be ignored and just fed some slop to feel better.
I could lie but that’s the cold truth.
Edit: I'm not sure if the repliers are being dense (highly likely), or you just skipped over context (you can click the "context" link if you're new here)
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
> I honestly cannot imagine a better outcome or handling of the situation.
It's the "best outcome" if you're trying to go as fast as possible without breaking any laws or ending up liable for any damage.
German perspective, but if I told people I've been going 30km/h next to a school with poor visibility as children are dropped off around me, I would be met with contempt for that kind of behavior. I'd also at least face some partial civil liability if I hit anyone.
There's certainly better handling of the situation possible, it's just that US traffic laws and attitudes around driving do not encourage it.
I suspect many human drivers would've driven slower, law or no law.
I look for shadows underneath stationary vehicles. I might also notice pedestrians "vanishing". I have a rather larger "context" than any robot effort.
However, I am just one example of human. My experience of never managing to run someone over is just an anecdote ... so far. The population of humans as a whole manages to run each other over rather regularly.
A pretty cheap instant human sensor might be Bluetooth/BLE noting phones/devices in near range. Pop a sensor in each wing mirror and on the top and bottom. The thing would need some processing power but probably nothing that the built in Android dash screen couldn't handle.
There are lots more sensors that car manufacturers are trying to avoid for cost reasons, that would make a car way better at understanding the context of the world around it.
I gather that Tesla insist on optical (cameras) only and won't do LIDAR. My EV has four cameras and I find it quite hard to see what is going on when it is pissing down with rain, in the same way I do if I don't clean my specs.
Waymo driver? The vehicles are autonomous. I otherwise applaud Waymo's response, and I hope they are as cooperative as they say they will be. However, referring to the autonomous vehicle as having a driver is a dangerous way to phrase it. It's not passive voice, per se, but it has the same effect of obscuring responsibility. Waymo should say we, Waymo LLC, subsidiary of Alphabet, Inc., braked hard...
Importantly, Waymo takes full ownership for something they write positively: Our technology immediately detected the individual.... But Waymo weasels out of taking responsibility for something they write about negatively.
the "Waymo Driver" is how they refer to the self-driving platform (hardware and software). They've been pretty consistent with that branding, so it's not surprising that they used it here.
> Importantly, Waymo takes full ownership for something they write positively [...] But Waymo weasels out of taking responsibility for something they write about negatively
Pretty standard for corporate Public Relations writing, unfortunately.
I'll just remind anyone reading: they're under no obligation to tell the unvarnished truth on their blog.
Even if the NHTSA eventually points out significant failures, getting this report out now has painted a picture of Waymo only having an accident a human would have handled worse.
It would be wise to wait and see if the NHTSA agrees. Would a driver have driven at 17 mph in this sort of traffic, or would they have viewed it as a situation where hidden child pedestrians are likely to step out?
I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars, especially those huge SUVs or pickup trucks that have big covers on the back. You can't see anything coming unless you stick your head out.
I'm picturing a 10 second clip showing a child with a green box drawn around them, and position of gas and brake, updating with superhuman reactions.
That would be the best possible marketing that any of these self driving companies could hope for, and Waymo probably now has such a video sitting somewhere.
When I was a boy, I ran into the street from between two parked cars. I did not notice the car coming, but he noticed me popping out from nowhere, and screeched to a stop.
I saw a girl dart out between two parked cars on a stroad. She was less lucky. The driver did slam on their brakes; I have no idea what speed the car was ultimately going when it hit her. It wasn't enough to send her flying, but it was enough to knock her over hard. Her dad was sitting in his front yard and had her up and in his car and, I'm guessing, rushed her to the hospital.
Those kinds of neighborhoods, where the outer houses face fast, large roads, are I think less common now, but lots of them are left over from 50+ years ago.
In fact I would call that “superhuman” behavior across the board.
The vast vast vast majority of human drivers would not have been able to accomplish that braking procedure that quickly, and then would not have been able to manage the follow up so quickly.
I have watched other parent drivers in the car pick-up line at public schools for the last 16 years, and people are absolutely trash at navigating that whole process; parents drive so poorly it's absurd. At least half the parents I see are on their phones while literally feet away from hitting some kid.
How do you know how quickly the software braked? A blog post by a company selling a product is not credible material. We need independent sources.
> The vast vast vast majority of human drivers ... would not have been able to manage the follow up so quickly
You are saying the "vast vast vast majority of human drivers" wouldn't pull over after hitting a child?
I remember similar blind faith in and unlimited advocacy for anything Tesla and Musk said, and look how that has turned out. These are serious issues for the people in our communities, not a sporting event with sides.
Human drivers are smart enough to slow down around a school where kids are being dropped off. This piece of software wasn't. Clearly not superhuman.
If the person got up and walked away I'm not sure what damage you'd be doing by reasonably removing your car from blocking others while waiting for police.
Yeah. I'm a stickler for accountability falling on drivers, but this really can be an impossible scenario to avoid. I've hit someone on my bike in the exact same circumstance - I was in the bike lane between the parked cars and moving traffic, and someone stepped out between parked vehicles without looking. I had nowhere to swerve, so squeezed my brakes, but could not come to a complete stop. Fortunately, I was going slow enough that no one was injured or even knocked over, but I'm convinced that was the best I could have done in that scenario.
The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.
This is the classic Suddenly Revealed Pedestrian test case, which afaik, most NCAP (like EuroNCAP, Japan NCAP) have as part of their standard testing protocols.
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.
Having said all that, full collision avoidance would have been best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
Waymo’s performance, once the pedestrian was revealed, sounds pretty good. But is 17mph a safe speed at an active school dropoff area? I admit that I don’t think I ever personally pay attention to the speedometer in such a place, but 17mph seems excessive even for an ordinary parking lot.
I wonder whether Waymo’s model notices that small children are present or likely to be present and that it should leave extra margin for error.
(My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
> But is 17mph a safe speed at an active school dropoff area?
Now you're asking interesting questions... Technically, in CA, the speed limit in school zones is 25 mph (which local authorities can lower to 15 mph, as needed). In this case, that would be something the investigation would check, of course. But regardless, 17 mph per se is not a very fast speed (my gut check: turning through intersections at >10-11 mph feels fast, but going straight at 15-20 mph doesn't feel fast; YMMV). More generally, in the presence of child VRUs (vulnerable road users), it is prudent to drive slowly just because of the randomness factor (children being the most unaware of critters). Did the Waymo see the kids around in the area? If so, how many and where? And how/where were they running or moving? All of that is investigation data...
My 2c is that Waymo already took all of that into account and concluded that 17 mph was indeed a good speed to move at...
...which leads to your observation below:
> (My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
Yes, I have indeed made that same observation. The Waymos of 2 years ago were very cautious; now they seem much more assertive, even a bit aggressive (though that would be tough to define). That is a driving policy decision (cautious vs assertive vs aggressive).
One could argue whether 17 mph was indeed the "right" decision. My gut feel is Waymo will argue that it was (but they might well make the driving policy more cautious, especially in the presence of VRUs, and child VRUs particularly).
In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?
> In your opinion as an AV safety expert, has Waymo already demonstrated a far higher standard of driving than human drivers in collision avoidance scenarios?
That's a difficult question to answer, and the devil really is in the details, as you may have guessed. What I can say is that Waymo is, by far, the most prolific publisher of research on AV safety on public roads (yes, those are my qualifiers...).
Here's their main stash [1] but notably, three papers talk about comparison of Waymo's rider-only (i.e. no safety driver) performance vis-a-vis human driver, at 7.1 million miles [2], 25 million miles [3], 56 million miles [4]. Waymo has also been a big contributor to various AV safety standards as one would expect (FWIW, I was also a contributor to 3 of the standards... the process is sausage-making at its finest, tbh).
I haven't read thru all their papers, but some notable ones talk about the difficulty of comparing AV vs human drivers [5], and various research on characterising uncertainty / risk of collision, comparing AVs to non-impaired, eyes-on human driver [6]
As one may expect, at least one of the challenges is that human-driven collisions are almost always very _lagging indicators_ of safety (i.e. collision happened: lost property, lost limbs, lost lives, etc.)
So, net-net, Waymo still has a VERY LONG WAY to go (obviously) to demonstrate better than human driving behavior, but they are showing that their AVs are better-than-humans on certain high-risk (potential) collisions.
As somebody remarked, the last 1% takes 90% of time/effort. That's where we are...
> In your experience, where do we find a credible source of info? Do we need to wait for the government's investigation to finish?
Most likely, yes: the NHTSA investigation will be a credible source of info for this case. HOWEVER, Waymo will likely fight tooth-and-nail to keep it from being made public. They will likely cite "proprietary algorithms / design", etc. to protect it from being released publicly. So, net-net, I dunno... Will have to wait and see :shrug.gif:
But meanwhile, personally I would read reports from experts like Phil Koopman [1] and Missy Cummings [2] to see their take.
> Remember Tesla's blog posts?
You, Sir, cite two companies that are diametrically opposite on the safety spectrum, as far as good behavior is concerned. Admittedly, one would have less confidence in Waymo's own public postings about this (and I'd be mighty surprised if they actually made public their investigation data, which would be a welcome and pioneering move).
On the other hand, the other company you mentioned, the less said the better.
In a technical sense, maybe, but it's all going to be about optics. They have a responsibility to handle the situation well even if it's not their fault, and the public will hold them accountable for what they deem the involvement was, which may not be the actual scenario.
The performance of a human is inherently limited by biology, and the road rules are written with this in mind. Machines don't have this inherent limitation, so the rules for machines should be much stronger.
I think there is an argument for incentivising the technology to be pushed to its absolute limits by making the machine 100% liable. It's not to say the accident rate has to be zero in practice, but it has to be so low that any remaining accidents can be economically covered by insurance.
“The event occurred when the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under six mph before contact was made,” a statement from Waymo explains.
Meanwhile, the news does not report on the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.
I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .
> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Meanwhile in my area of the world parents are busy, stressed, and on their phones, and pressing the accelerator hard because they're time pressured and feel like that will make up for the 5 minutes late they are on a 15 minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for a lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.
A story: my grandpa drove for longer than he should have. Yes, him losing his license would have been the optimal case. But pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.
Err, that is not the desirable statistic you seem to think it is. American drivers average ~3 trillion miles per year [1]. That means ~7,000 child pedestrian injuries per year [2] would be ~1 per 430 million miles. Waymo has done on the order of 100-200 million miles autonomously. So this would be ~2-4x more injuries than the human average.
However, the child pedestrian injury rate is only an official estimate (it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent better information, we should default to the calculation of 2-4x the rate.
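For what it's worth, the back-of-envelope arithmetic above fits in a few lines. All figures here are the rough estimates quoted in the thread (3 trillion miles, ~7,000 injuries, 100-200M Waymo miles), not verified statistics:

```python
# Rough rate comparison using the ballpark figures quoted above.
# These are the thread's estimates, not verified data.
US_MILES_PER_YEAR = 3e12             # ~3 trillion vehicle-miles per year
CHILD_PED_INJURIES_PER_YEAR = 7_000  # ~7,000 child pedestrian injuries per year

miles_per_injury = US_MILES_PER_YEAR / CHILD_PED_INJURIES_PER_YEAR
print(f"Human baseline: ~1 injury per {miles_per_injury / 1e6:.0f}M miles")

# One observed Waymo incident over 100-200M autonomous miles
for waymo_miles in (100e6, 200e6):
    ratio = miles_per_injury / waymo_miles
    print(f"At {waymo_miles / 1e6:.0f}M miles: ~{ratio:.1f}x the human rate")
```

This prints a baseline of roughly 1 per 429M miles and ratios of roughly 4.3x and 2.1x, matching the 2-4x range claimed above.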
I suspect that highway miles heavily skew this statistic. There's naturally far fewer pedestrians on highways (lower numerator), people travel longer distances on highways (higher denominator), and Waymo vehicles didn't drive on highways until recently. If you look only at non-highway miles, you'll get a much more accurate comparison.
> we should default to the calculation of 2-4x the rate.
No we should not. We should accept that we don't have any statistically meaningful number at all, since we only have a single incident.
Let's assume we roll a standard die once and it shows a six. Statistically, we only expect a six in one sixth of the cases. But we already got one on a single roll! Concluding Waymo vehicles hit 2 to 4 times as many children as human drivers is like concluding the die in the example is six times as likely to show a six as a fair die.
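One way to make the single-observation point concrete: if Waymo actually had the human baseline rate (~1 injury per 430M miles, per the estimate upthread), seeing one incident in its mileage would not be surprising at all. A minimal Poisson sketch, again using the thread's rough figures rather than verified data:

```python
import math

# If incidents arrive at the human baseline rate, the count over D miles
# is roughly Poisson with mean D / MILES_PER_INJURY_HUMAN. Inputs are the
# thread's rough estimates, not verified data.
MILES_PER_INJURY_HUMAN = 430e6  # ~1 child pedestrian injury per 430M miles

for waymo_miles in (100e6, 200e6):
    mean = waymo_miles / MILES_PER_INJURY_HUMAN
    p_at_least_one = 1 - math.exp(-mean)  # P(>= 1 incident)
    print(f"{waymo_miles / 1e6:.0f}M miles: "
          f"P(at least one incident at the human rate) = {p_at_least_one:.0%}")
```

This gives roughly 21% at 100M miles and 37% at 200M miles: a single incident is quite plausible even at exactly the human rate, so one data point can't distinguish the two rates.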
Would this Waymo incident be counted as an injury? Sounds like the victim was relatively unharmed? Presumably there are human-driver incidents like this where a car hits a child at low speeds, with effectively no injuries, but is never recorded as such?
People's standards for when they're willing to cede control over their lives both as the passenger and the pedestrian in the situation to a machine are higher than a human.
And for not totally irrational reasons: the machine follows its programming and does not fear death, and with 100% certainty the machine has bugs that will eventually kill someone for a really stupid reason, and nobody wants that to be them. Then there's just the general https://xkcd.com/2030/ problem of people rightfully not trusting technology, because we are really bad at it, and our systems are set up in such a way that once you reach a critical mass of money, consequences become other people's problem.
Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.
>People's standards for when they're willing to cede control over their lives both as the passenger and the pedestrian in the situation to a machine are higher than a human.
Are they? It is now clear that Tesla FSD is much worse than a human driver and yet there has been basically no attempt by anyone in government to stop them.
We should all think twice before taking a company PR statement completely at face value and praising them for slowing down faster than their own internal "model" says a human driver would. Companies are heavily interested in protecting their bottom line and in a situation like this probably had 5-10 people carefully craft every single word of the statement for maximum damage control.
Surprised at how many comments here seem eager to praise Waymo based off their PR statement. Sure, it sounds great if you read that the Waymo slowed down faster than a human. But would a human truly have hit the child here? Two blocks from a school with tons of kids, crossing guards, double parked cars, etc? The same Waymo that is under investigation for passing school buses illegally? It may have been entirely avoidable for the average human in this situation, but the robotaxi had a blind spot that it couldn't reason around and drove negligently.
Maybe the robotaxi did prevent some harm by braking with superhuman speed. But I am personally unconvinced it was a completely unavoidable freak accident type of situation without seeing more evidence than a blog post by a company with a heavily vested interest in the situation. I have anecdotally seen Waymo in my area drive poorly in various situations, and I'm sure I'm not the only one.
There's the classic "humans are bad drivers" but I don't think that is an excuse to not look critically into robotaxi accidents. A human driver who hit a child next to a school would have a personal responsibility and might face real jail time or at the least be put on trial and investigated. Who at Waymo will face similar consequences or risk for the same outcome?
Have you been around a Waymo as a pedestrian? Used one recently? I have never felt as safe around any car as I do around Waymos.
It can feel principled to take the critical stance, but ultimately the authorities are going to have complete video of the event, and penalizing Waymo over this out of proportion to the harm done is just going to make the streets less safe. A 6mph crash is best avoided, but it's a scrap, it's one child running into another and knocking them over, it's not _face jail time_.
Do you know anyone who works at Waymo? The cynicism is silly. Just because some people at some companies behave horribly, it doesn't mean all or even most do.
Look at Waymo's history in the space, meet some of the people working there, then make a decision.
You don't have to think anyone is behaving horribly to acknowledge that a company's PR department will tend to put out the version of the story that makes them look best.
I think the reason why people are willing to believe this company's PR statement (and would be much more hesitant to believe some others) is that there have so far been relatively few publicized incidents overall, and AFAIK none where Waymo was caught lying/downplaying.
> Who at Waymo will face similar consequences or risk for the same outcome?
I'd argue that the general pushback against self-driving cars and immense PR and regulatory attention makes the consequences of accidents much more severe for the company than for a driver. (For comparison: How many kids do you think were hit by human drivers in the past month in the same general area, and how many of them made international news?)
I highly doubt a non-distracted driver going at/below the speed limit hitting a child that darted into the road would be at any realistic risk of facing jail time, especially in the US.
It's going to sound batshit insane, what I'm about to say, but the problem is: if we don't praise company PR, the other side will use this as an excuse to push even harder regulations, not allow them in newer cities, and slow down the adoption rate, while ignoring the fact that this is just a safer method of transport. I wish I were not a bootlicker, but I really want robotaxis to be available everywhere in the world at some point, and such issues should not slow them down IF it's better, and especially not worse, than humans on average.
The "a human would do it better" people are hilarious, given how many times I have been hit by human drivers on my bike and watched others get creamed by cars. One time in Boulder, at a flashing crosswalk, a person ran right through it and the biker they creamed got stuck in the roof rack.
For real, I am convinced these are people who never walk or bike, at least around cities like Santa Monica. I am an everyday urban walker and I have to constantly be on alert not to be hit, even when I'm behaving predictably and with the right of way.
Yeah I have to wonder if any of the "humans would do it better" people actually have children and have dropped them off in a school zone. Drivers are on their phones rolling through school zones at 25-30 during pickup/dropoff hours all the fucking time.
I was just dropping my kids off at their elementary school in Santa Monica, but not at Grant Elementary where this happened.
While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid ran behind a car and then right out into the street.
If those rumors are correct, I'll say the kid's/family's fault. That said, I think autonomous vehicles should probably go extra-slowly near schools, especially during pickup and dropoff.
At some point children are capable of pursuing Darwin Awards. Parents may enable this, but ultimately if one’s child does something stupid contrary to one’s guidance and restrictions, they may end up with a Darwin for it. Two hundred years ago the child mortality rate was half, as in you lost one child per two, and most of those were not the fault of the child or parents. Society for quite some years has been pushing that down, to the point that a near-death involving a neglectful parent and a witless child is apparently (?) newsworthy — but the number of deaths will never reach zero, whether humans or robots or empty plains and blue skies. There will always be a Veruca Salt throwing themselves into the furnace no matter how many safety processes we impose onto roads, cars, drivers, and/or robots.
If you want to see an end to this nonsensical behavior by parents, pressure your local city into having strict traffic enforcement and ticketing during school hours at every local school, so that the parent networks can’t share news with each other of which school is being ‘harassed’ today. Give license points to vehicles that drop a child across the street, issue parking tickets to double parkers, and boot vehicles whose drivers refuse to move when asked. Demand they do this for the children, to protect them from the robots, if you like.
But.
It’ll protect them much more from the humans than from the robots, and after a few thousand tickets are issued to parents behaving badly, you’ll find that the true threat to children’s safety on school roads is children’s parents — just as the schools have known for decades. And that’s not a war you’ll win arguing against robots. (It’s a war you’ll win arguing against child-killing urban roadway design, though!)
A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that the child was there. If that driver had also been driving a large SUV, that child would have been pushed to the ground and run over, so probably a fatality. And also functionally nobody would have given a shit apart from some lame finger pointing at (probably) the kid’s parents.
And it is not the child’s or their parents’ fault either:
Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults. And honestly even for adults stepping out a bit from behind an obstacle in the path of a car is an easy mistake to make. Don’t forget that for children an SUV is well above head height so it isn’t even possible for them to totally avoid stepping out a bit before looking. (And I don’t think stepping out vs. running out changes the outcome a lot)
This is why low speed limits around schools exist.
So I would say the Waymo did pretty well here, it travelled at a speed where it was still able to avoid not only a fatality but also major injury.
> A human driver travelling at the same speed would have hit that child at exactly 17 mph, before their brain even registered that child was there.
Not sure where this is coming from, and it's directly contradicted by the article:
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.” The company did not release a specific analysis of this crash.
No, Waymo’s quote supports the grandparent comment - it was about a “fully attentive human driver” - unless you are arguing that human drivers are consistently “fully attentive”?
> And it is not the child’s or their parents’ fault either: Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults.
I get what you are trying to say and I definitely agree in spirit, but I tell my kid (now 9) "it doesn't matter if it isn't your fault, you'll still get hurt or be dead." I spent a lot of time teaching him how to cross the street safely before I let him do it on his own, not to trust cars to do the right thing, not to trust them to see you, not to trust some idiot to not park right next to cross walk in a huge van that cars have no chance of seeing over.
If only we had a Dutch culture of pedestrian and road safety here.
Vehicle design also plays a role: passenger cars have to meet pedestrian collision standards. Trucks don't. The silly butch grilles on SUVs and pickups are deadly. This is more of an argument for not seeing transportation as a fashion or lifestyle statement. Those truck designs are about vanity and gender affirming care. It's easier to make rational choices when it's a business that's worried about liability making those choices.
> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.
So you are around young children with visibility significantly impaired because of double parking. I'd love to see video of the incident because driving 17mph (27kph for metric types) in this context is reckless and not something human would typically do, because a kid popping out from behind one of those cars is not only unsurprising but completely expected.
Another reason you also slow way down in this scenario is one of those cars suddenly swinging open their door which, again, would not be particularly surprising in this sort of context.
We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.
The car reacted quickly once it saw the child. Is that enough?
But most humans would have been aware of the big-picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
If that's the scenario, there is a real probability that a child might appear, so I'm going to be slowing way down pre-emptively even though I haven't seen anyone, just in case.
The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.
One is to merge opposite directional roads into a single lane, forcing drivers to cooperate and take turn to pass it, one car at a time.
For a combined car and pedestrian road (max speed of 7 km/h) near where I live, they intentionally added large obfuscating objects on the road that limit visibility and make it harder to navigate. This forces drivers to drive very slowly, even when alone on the road, as they can't see whether a car or person may be behind the next object.
In an other road they added several tight S curves in a row, where if you drive anything faster than 20km/h you will fail the turns and drive onto the artificial constructed curbs.
In other roads they put a sign in the middle of two way roads while at the same time drastically limiting the width to the curb, forcing drivers to slow down in order to center the car in the lane and squeeze through.
The common thread in each of those is that a human driver's fear of crashing causes them to pay extra attention and slow down.
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.
(It's how I taught my daughters to drive "defensively"—look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag—slow the fuck down.)
> The woman told police she was “eating yogurt” before she turned onto the road and that she was late for an appointment. She said she handed her phone to her son and asked him to make a call “but could not remember if she had held it so face recognition could … open the phone,” according to the probable cause statement.
> The police investigation found that she was traveling 50 mph in a 40 mph zone when she hit the boys. She told police she didn’t realize she had hit anything until she saw the boys in her rearview mirror.
The Waymo report is being generous in comparing to a fully-attentive driver. I'm a bit annoyed at the headline choice here (from OP and the original journalist) as it is fully burying the lede.
This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.
Why is it likely? Are we taking the vendor's claims in a blog post as truth?
Does Waymo have object permanence and trajectory prediction (combined) comparable to that of a human?
Once the video evidence is out, it might become evident.
Generally Waymo seems to be a responsible actor so maybe that is the case and this can help demonstrate potential benefits of autonomous vehicles.
Alternatively, if even they can't get this right then it may cast doubts about the maturity of the entire ecosystem
If that wasn't an option, I guess a fast-reacting driver would have done a bit worse, and an inattentive driver might have even run over the kid.
Maybe. Depends on the position of the sun and shadows, I'm teaching my kids how to drive now and showing them that shadows can reveal human activity that is otherwise hidden by vehicles. I wonder if Waymo or other self-driving picks up on that.
Deleted Comment
As described by the nhtsa brief:
"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"
The "that there were other children, a crossing guard, and several double-parked vehicles in the vicinity" part means that Waymo was driving recklessly by merely obeying the speed limit here (assuming it was 20 mph) in a way that many humans would not.
Deleted Comment
Please please remember that any data from Waymo will inherently support their position and can not be taken at face value. They have significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.
It is also crazy that this happened 6 days ago at this point and video was NOT part of the press releases. LOL
Dead Comment
Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.
> a huge portion of human drivers
What are you basing any of these blind assertions off of? They are not at all borne out by the massive amounts of data we have surrounding driving in the US. Of course Waymo is going to sell you a self-serving line, but here on Hacker News you should absolutely challenge that. In particular because it's very far out of line with real world data provided by the government.
- Their "peer-reviewed model" compares Waymo vehicles against only "Level 0" vehicles. However, even my decade-old vehicle is considered "Level 1" because it has an automated emergency braking system. No doubt my Subaru's camera-based EBS performs worse than Waymo's; still, it's not included in their "peer-reviewed model." That comparison is intentionally pitting Waymo performance against the oldest vehicles on the road -- not the majority of cars sold currently.
- This incident happened during school dropoff. There was a double-parked SUV that occluded the view of the student. This crash was the fault of that double-parked driver. But why was the uncrewed Waymo driving at 17 mph to begin with? Do they not have enough situational awareness to slow the f*ck down around dropoff time immediately near an elementary school?
Automotive sensor/control packages are very useful and will be even more useful over time -- but Waymo is intentionally making their current offering look comparatively better than it actually is.
The UK driving theory test has a part called Hazard Perception: not reacting to children milling around would be considered a fail.
[0] https://www.safedrivingforlife.info/free-practice-tests/haza...
Having an understanding for the density and make up of an obstacle that blew in front of you, because it was just a cardboard box. Seeing how it tumbles lightly through the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment though large will do no damage and doesn’t justify a swerve.
Getting in the mind of a car in front of you, by seeing subtle hints of where the driver is looking down, and recognizing that they’re not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they’re not quite there yet.
Or in this case, perhaps hearing the sounds of children playing, recognizing that it’s 3:20 PM, and that school is out, other cars, double parked as you mentioned, all screaming instantly to a human driver to be extremely cautious and kids could be jumping out from anywhere.
Hey, I'd agree with this-- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1MPH slower would likely have resulted in no contact in this scenario.
But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.
(My back of napkin math says an attentive human driver going at 12MPH would hit the pedestrian at the same speed if what we've been told is accurate).
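The 17^2 - 5^2 > 16^2 observation follows from constant-deceleration kinematics, v_f^2 = v_0^2 - 2ad: over a fixed braking distance, the squared speed drops by a fixed amount. A toy sketch (taking the commenter's 5 mph reading of Waymo's "under 6 mph" contact speed; the deceleration budget is inferred from that, not from Waymo's actual dynamics):

```python
import math

# Toy constant-deceleration model. The "budget" 2*a*d is inferred from
# the reported 17 mph -> ~5 mph slowdown over the available distance;
# it is an illustrative assumption, not Waymo's actual braking data.
V2_BUDGET = 17**2 - 5**2  # mph^2 shed before reaching the pedestrian

def contact_speed(v0_mph: float) -> float:
    """Speed (mph) at the pedestrian's position for initial speed v0_mph."""
    v2 = v0_mph**2 - V2_BUDGET
    return math.sqrt(v2) if v2 > 0 else 0.0

print(contact_speed(17))  # 5.0 -> matches the assumed contact speed
print(contact_speed(16))  # 0.0 -> stops short; 1 mph slower avoids contact
```

Note this model ignores reaction distance entirely (braking starts at the same point regardless of speed), so it doesn't capture the attentive-human-at-12-mph estimate above, which depends on reaction time.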
These giant SUVs really are the worst when it comes to child safety
That day I learned why it was so.
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits and when there are kids about people may go even slower. At the same time, the general rule in CA for school zone is 25 mph. Clearly the car had some level of caution which is good.
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents to see how "most people" would have acted in the minutes leading up to the event as well as the accident itself
Sure but also throw in whether that driver is staring at their phone, distracting by something else, etc. I have been a skeptic of all this stuff for a while but riding in a Waymo in heavy fog changed my mind when questioning how well I or another driver would've done at that time of day and with those conditions.
The UK is such a situation, and this vehicle would have failed a driving test there.
This is the fault of the software and company implementing it.
Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.
That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.
And why they can be ignored and just fed some slop to feel better.
I could lie but that’s the cold truth.
Edit: I'm not sure if the repliers are being dense (highly likely), or you just skipped over context (you can click the "context" link if you're new here)
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
It's the "best outcome" if you're trying to go as fast as possible without breaking any laws or ending up liable for any damage.
German perspective, but if I told people I've been going 30km/h next to a school with poor visibility as children are dropped off around me, I would be met with contempt for that kind of behavior. I'd also at least face some partial civil liability if I hit anyone.
There's certainly better handling of the situation possible, it's just that US traffic laws and attitudes around driving do not encourage it.
I suspect many human drivers would've driven slower, law or no law.
I look for shadows underneath stationary vehicles. I might also notice pedestrians "vanishing". I have a rather larger "context" than any robot effort.
However, I am just one example of human. My experience of never managing to run someone over is just an anecdote ... so far. The population of humans as a whole manages to run each other over rather regularly.
A pretty cheap instant human sensor might be Bluetooth/BLE noting phones/devices in near range. Pop a sensor in each wing mirror and on the top and bottom. The thing would need some processing power but probably nothing that the built in Android dash screen couldn't handle.
There are lots more sensors that car manufacturers are trying to avoid for cost reasons, that would make a car way better at understanding the context of the world around it.
I gather that Tesla insist on optical (cameras) only and won't do LIDAR. My EV has four cameras and I find it quite hard to see what is going on when it is pissing down with rain, in the same way I do if I don't clean my specs.
Isn't the speed limit normally 15 mph or less in a school zone? Was the robotaxi speeding?
Where is the video recording ?
Importantly, Waymo takes full ownership of something they write about positively: "Our technology immediately detected the individual...." But Waymo weasels out of taking responsibility for something they write about negatively.
the "Waymo Driver" is how they refer to the self-driving platform (hardware and software). They've been pretty consistent with that branding, so it's not surprising that they used it here.
> Importantly, Waymo takes full ownership for something they write positively [...] But Waymo weasels out of taking responsibility for something they write about negatively
Pretty standard for corporate Public Relations writing, unfortunately.
I'll just remind anyone reading: they're under no obligation to tell the unvarnished truth on their blog.
Even if the NHTSA eventually points out significant failures, getting this report out now has painted a picture of Waymo only having an accident a human would have handled worse.
It would be wise to wait and see if the NHTSA agree. Would a driver have driven at 17mph in this sort of traffic or would they have viewed it as a situation where hidden infant pedestrians are likely to step out?
Deleted Comment
I was very very lucky.
Those kinds of neighborhoods, where the outer houses face the fast, large roads, are I think less common now, but lots of them are left over from 50+ years ago.
Human reaction times are terrible, and lots of kids get seriously injured, or killed, when they run out from between cars.
Deleted Comment
The vast vast vast majority of human drivers would not have been able to accomplish that braking procedure that quickly, and then would not have been able to manage the follow up so quickly.
I have watched other parent drivers in the car pick up line at public schools for the last 16 years, and people are absolutely trash at navigating that whole process; parents drive so poorly it’s absurd. At least half the parents I see are on their phones while literally feet away from hitting some kid.
> The vast vast vast majority of human drivers ... would not have been able to manage the follow up so quickly
You are saying the "vast vast vast majority of human drivers" wouldn't pull over after hitting a child?
I remember similar blind faith in and unlimited advocacy for anything Tesla and Musk said, and look how that has turned out. These are serious issues for the people in our communities, not a sporting event with sides.
what about all the traffic violations though?
https://news.ycombinator.com/item?id=46814583
Stopped or moved? Is it allowed in CA to move the car at all after a serious accident happens?
I am personally a fan of entirely automated but slow traffic. 10mph limit with zero traffic is fast enough for any metro area.
One better than: We investigated our own system and found ourselves to be at no fault?
> From the Waymo blog
Yeah, like, no shit Sherlock. We'd better wait for some videos before making our opinions.
If it can yell at the kid and send a grumpy email to the parents and school, the automation is complete.
The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.
Having said all that, full collision avoidance would have been best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
[1] Yes, I'm an AV safety expert
[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
(edit: verbiage)
I wonder whether Waymo’s model notices that small children are present or likely to be present and that it should leave extra margin for error.
(My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
Even people being all indignant on HN.
Now you're asking interesting questions... Technically, in CA, the speed limit in school zones is 25 mph (which local authorities can lower to 15 mph, as needed). In this case, that would be something the investigation would check, of course. But regardless of that, 17 mph per se is not a very fast speed (my gut check: turning through intersections at > 10-11 mph feels fast, but going straight at 15-20 mph doesn't feel fast; YMMV). But more generally, in the presence of child VRUs (vulnerable road users), it is prudent to drive slowly just because of the randomness factor (children being the most unaware of critters). Did the Waymo see the kids around in the area? If so, how many and where? And how/where were they running/moving? All of that is investigation data...
My 2c is that Waymo already took all of that into account and concluded that 17 mph was indeed a good speed to move at...
...which leads to your observation below:
> (My general impression observing Waymo vehicles is that they’ve gone from being obnoxiously cautious to often rather aggressive.)
Yes, I have indeed made that same observation. The Waymos of 2 years ago were very cautious; now they seem much more assertive, even a bit aggressive (though that would be tough to define). That is a driving policy decision (cautious vs assertive vs aggressive).
One could argue whether 17 mph was indeed the "right" decision. My gut feel is Waymo will argue that it was (though they will likely make the driving policy more cautious in the presence of VRUs, and child VRUs particularly).
That's a difficult question to answer, and the devil really is in the details, as you may have guessed. What I can say is that Waymo is, by far, the most prolific publisher of research on AV safety on public roads. (Yes, those are my qualifiers...)
Here's their main stash [1] but notably, three papers talk about comparison of Waymo's rider-only (i.e. no safety driver) performance vis-a-vis human driver, at 7.1 million miles [2], 25 million miles [3], 56 million miles [4]. Waymo has also been a big contributor to various AV safety standards as one would expect (FWIW, I was also a contributor to 3 of the standards... the process is sausage-making at its finest, tbh).
I haven't read thru all their papers, but some notable ones talk about the difficulty of comparing AV vs human drivers [5], and various research on characterising uncertainty / risk of collision, comparing AVs to non-impaired, eyes-on human driver [6]
As one may expect, at least one of the challenges is that human-driven collisions are almost always very _lagging indicators_ of safety (i.e. collision happened: lost property, lost limbs, lost lives, etc.)
So, net-net, Waymo still has a VERY LONG WAY to go (obviously) to demonstrate better than human driving behavior, but they are showing that their AVs are better-than-humans on certain high-risk (potential) collisions.
As somebody remarked, the last 1% takes 90% of time/effort. That's where we are...
---
[1] https://waymo.com/safety/research
[2] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[3] https://waymo.com/research/do-autonomous-vehicles-outperform...
[4] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[5] https://waymo.com/research/comparative-safety-performance-of...
[6] https://waymo.com/blog/2022/09/benchmarking-av-safety/
[edit: reference]
> I would say that Waymo's response, per their blog post [2] has been textbook compliance.
Remember Tesla's blog posts? Of course Waymo knows textbook compliance just like you do, and of course that's what they would claim.
Most likely, yes, the NHTSA investigation will be a credible source of info for this case. HOWEVER, Waymo will likely fight tooth-and-nail to keep it from being public. They will likely cite "proprietary algorithms / design", etc. to protect it from being released publicly. So, net-net, I dunno... Will have to wait and see :shrug.gif:
But meanwhile, personally I would read reports from experts like Phil Koopman [1] and Missy Cummings [2] to see their take.
> Remember Tesla's blog posts?
You, Sir, cite two companies that are diametrically opposite on the safety spectrum, as far as good behavior is concerned. Admittedly, one would have less confidence in Waymo's own public postings about this (and I'd be mighty surprised if they actually made public their investigation data, which would be a welcome and pioneering move).
On the other hand, the other company you mentioned, the less said the better.
[1] http://www.koopman.us/
[2] https://www.gmu.edu/profiles/cummings
Why? This is only true if they weren't supposed to be on the road in the first place. Which is not true.
If I program a machine and it goes out into the world and hurts someone who did not voluntarily release my liability, that's on me.
I think there is an argument for incentivising the technology to be pushed to its absolute limits by making the machine 100% liable. It's not to say the accident rate has to be zero in practice, but it has to be so low that any remaining accidents can be economically covered by insurance.
“The event occurred when the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under six mph before contact was made,” a statement from Waymo explains.
I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_ .
> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Meanwhile, in my area of the world, parents are busy, stressed, on their phones, and pressing the accelerator hard because they're time-pressured and feel like that will make up for being 5 minutes late on a 15-minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for a lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.
A story, my grandpa drove for longer than he should have. Yes him losing his license would have been the optimal case. But, pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.
However, the child pedestrian injury rate is only an official estimate (it may be undercounting relative to highly scrutinized Waymo vehicle-miles), and it is a US-wide average (which may not be a comparable operational domain), but absent more precise information, we should default to the calculated 2-4x rate.
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
No we should not. We should accept that we don't have any statistically meaningful number at all, since we only have a single incident.
Let's assume we roll a standard die once and it shows a six. Statistically, we only expect a six in one sixth of the cases. But we already got one on a single roll! Concluding Waymo vehicles hit 2 to 4 times as many children as human drivers is like concluding the die in the example is six times as likely to show a six as a fair die.
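The die analogy can be made concrete with a quick simulation (a toy sketch of the statistical point only, not a claim about the actual crash data):

```python
import random

random.seed(0)
TRIALS = 100_000

# Simulate the "single observation" experiment many times over:
# roll a fair die once per experiment and record whether it shows a six.
# About 1/6 of experiments produce a six on that single roll, so
# observing one six in one roll is completely consistent with a fair die.
sixes = sum(1 for _ in range(TRIALS) if random.randint(1, 6) == 6)
print(f"fraction of single-roll experiments showing a six: {sixes / TRIALS:.3f}")
```

In other words, a single event gives you essentially no power to distinguish a fair die from a loaded one, and by the same logic a single collision cannot distinguish a 1x rate from a 4x rate.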
And for not totally irrational reasons: the machine follows its programming and does not fear death, and with 100% certainty the machine has bugs that will eventually kill someone for a really stupid reason, and nobody wants that to be them. Then there's just the general https://xkcd.com/2030/ problem of people rightfully not trusting technology because we are really bad at it, and our systems are set up in such a way that once you reach a critical mass of money, consequences become other people's problem.
Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.
Are they? It is now clear that Tesla FSD is much worse than a human driver and yet there has been basically no attempt by anyone in government to stop them.
Surprised at how many comments here seem eager to praise Waymo based off their PR statement. Sure, it sounds great if you read that the Waymo slowed down faster than a human. But would a human truly have hit the child here? Two blocks from a school, with tons of kids, crossing guards, double-parked cars, etc.? The same Waymo that is under investigation for illegally passing school buses? It may have been entirely avoidable for the average human in this situation, but the robotaxi had a blind spot that it couldn't reason around and drove negligently.
Maybe the robotaxi did prevent some harm by braking with superhuman speed. But I am personally unconvinced it was a completely unavoidable freak accident type of situation without seeing more evidence than a blog post by a company with a heavily vested interest in the situation. I have anecdotally seen Waymo in my area drive poorly in various situations, and I'm sure I'm not the only one.
There's the classic "humans are bad drivers" but I don't think that is an excuse to not look critically into robotaxi accidents. A human driver who hit a child next to a school would have a personal responsibility and might face real jail time or at the least be put on trial and investigated. Who at Waymo will face similar consequences or risk for the same outcome?
It can feel principled to take the critical stance, but ultimately the authorities are going to have complete video of the event, and penalizing Waymo over this out of proportion to the harm done is just going to make the streets less safe. A 6mph crash is best avoided, but it's a scrap, it's one child running into another and knocking them over, it's not _face jail time_.
Really? My impression is that, for the most part, HN consistently sides with the companies. I say this in the most neutral way possible.
Look at Waymo's history in the space, meet some of the people working there, then make a decision.
> Who at Waymo will face similar consequences or risk for the same outcome?
I'd argue that the general pushback against self-driving cars and immense PR and regulatory attention makes the consequences of accidents much more severe for the company than for a driver. (For comparison: How many kids do you think were hit by human drivers in the past month in the same general area, and how many of them made international news?)
I highly doubt a non-distracted driver going at/below the speed limit hitting a child that darted into the road would be at any realistic risk of facing jail time, especially in the US.
While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid went behind a car and ran right out into the street.
If those rumors are correct, I'll say the kid's/family's fault. That said, I think autonomous vehicles should probably go extra-slowly near schools, especially during pickup and dropoff.
They got the point.
This matches exactly what they said.
That kid is lucky it was a Waymo & not a human driven car.
It is never a 6 year old's fault if they get struck by a robot.
If you want to see an end to this nonsensical behavior by parents, pressure your local city into having strict traffic enforcement and ticketing during school hours at every local school, so that the parent networks can’t share news with each other of which school is being ‘harassed’ today. Give license points to vehicles that drop a child across the street, issue parking tickets to double parkers, and boot vehicles whose drivers refuse to move when asked. Demand they do this for the children, to protect them from the robots, if you like.
But.
It’ll protect them much more from the humans than from the robots, and after a few thousand tickets are issued to parents behaving badly, you’ll find that the true threat to children’s safety on school roads is children’s parents — just as the schools have known for decades. And that’s not a war you’ll win arguing against robots. (It’s a war you’ll win arguing against child-killing urban roadway design, though!)
And it is not the child’s or their parents’ fault either:
Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults. And honestly even for adults stepping out a bit from behind an obstacle in the path of a car is an easy mistake to make. Don’t forget that for children an SUV is well above head height so it isn’t even possible for them to totally avoid stepping out a bit before looking. (And I don’t think stepping out vs. running out changes the outcome a lot)
This is why low speed limits around schools exist.
So I would say the Waymo did pretty well here, it travelled at a speed where it was still able to avoid not only a fatality but also major injury.
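The difference between the two reported contact speeds matters more than it looks, because impact severity scales roughly with kinetic energy, i.e. with speed squared. A back-of-envelope sketch (using only the figures from Waymo's statement, not any analysis of the actual crash):

```python
def relative_impact_energy(v_low_mph: float, v_high_mph: float) -> float:
    """Ratio of kinetic energies at two speeds; vehicle mass cancels out."""
    return (v_high_mph / v_low_mph) ** 2

# Reported contact speed (~6 mph) vs Waymo's modeled attentive-human
# contact speed (~14 mph): (14/6)^2 ~= 5.4x the kinetic energy.
print(relative_impact_energy(6, 14))  # ~5.44
```

So even granting the criticism that 17 mph was too fast for the context, a sub-6 mph contact carries a small fraction of the energy of a 14 mph one, which is consistent with the child walking away.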
Not sure where this is coming from, and it's directly contradicted by the article:
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.” The company did not release a specific analysis of this crash.
I get what you are trying to say and I definitely agree in spirit, but I tell my kid (now 9) "it doesn't matter if it isn't your fault, you'll still get hurt or be dead." I spent a lot of time teaching him how to cross the street safely before I let him do it on his own, not to trust cars to do the right thing, not to trust them to see you, not to trust some idiot to not park right next to cross walk in a huge van that cars have no chance of seeing over.
If only we had a Dutch culture of pedestrian and road safety here.