I'd bet all my money, and all the money I could borrow, that a Waymo would stop or swerve for a child running out before the sensory nerves in a human's eye had even reacted to that child. I just think it's not as egregious a violation when committed by something with a 0.1 ms response time. Still a violation, still shouldn't do it, but the worst-case outcome would be much, much harder to realize than with a human driver.
Also, just to add: the fact that there aren't cases of this from Phoenix or SF seems to signal it's a dumb bug specific to the "Atlanta" build.
It is even a bit more scrambled, such as this part: hcihw selcihev selcihev ot ot ot ot erauqs strap. Looking at the original site, that text sits in various nested structures, with the paragraphs containing that kind of text. There are multiple chunks of it inside an article block whose .is-paywalled class governs various behaviors such as showing ads. The scrambled text is in paragraphs within the separate article portions. Presumably they have a script that decodes it for logged-in users, though I do not understand why they even provide the text. Why not just return it after login? Maybe it is total trash text, just there to pad things out like lorem ipsum. Kind of interesting.
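For what it's worth, the scrambling looks like simple per-word reversal, so the decode step would be trivial. A minimal sketch of what their script might be doing, assuming it really is just reversed words (I haven't read their actual code):

    # Assumes the paywall "scrambling" is plain per-word reversal -- a guess, not confirmed.
    def unscramble(text: str) -> str:
        return " ".join(word[::-1] for word in text.split())

    print(unscramble("hcihw selcihev selcihev ot ot ot ot erauqs strap"))
    # -> "which vehicles vehicles to to to to square parts"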
The words are backward if you go to the original page and turn on reader mode to get around the paywall. “Ha ha!”, they say, “your reader mode powers are no good here!”
Would an AI be better at stopping for children jumping out from behind a stopped school bus, such that stopping isn't as necessary as it is with human drivers?
That being said, just ticket the company and make them pay. Isn’t this how it works with all moving violations? Does Waymo get pulled over for speeding?
The first point is exactly my thought. Self-driving cars are completely different from human drivers. We should not hold them to the same standards while simultaneously holding them to much higher standards. Many driving violations are laws only because they could lead to an unsafe scenario that is purely the fault of the driver.
E.g., stop signs. The only reason a full stop is required is to ensure that drivers take a clear look and give way to drivers at the other stop signs. If there is no other traffic and no other drivers to give way to, why do self-driving cars need to come to a full stop?
You’re probably right in the long term. So, when the world is 100% self-driving cars, we can probably change the rules to favor the machines. In the near-term, however, it’s probably good to make the robots obey the human laws so that the humans don’t start getting the idea that they can disobey them, too.
The laws of physics still apply. A car still takes time to slow down, even with perfect reaction times. Well, maybe you could get it to stop in time, but it might break the necks of everyone in the car.
Given how hidden a child walking in front of the bus is, even if the AI instantly applied the brakes upon seeing the child, would the car slow down in time? Probably not. Better, yes; good enough, no.
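Back of the envelope, with made-up but plausible numbers (25 mph school-zone speed, roughly 0.9 g emergency braking, near-instant software reaction), not actual Waymo figures:

    # Rough stopping-distance check; speed and deceleration are assumed values, not Waymo specs.
    v = 25 * 0.44704       # 25 mph in m/s (~11.2 m/s)
    a = 0.9 * 9.81         # assumed emergency deceleration, ~0.9 g on dry pavement
    t_react = 0.05         # assumed near-instant software reaction time, in seconds

    d = v * t_react + v**2 / (2 * a)
    print(f"stopping distance: {d:.1f} m")   # ~7.6 m (~25 ft)

So even with effectively instant braking, a kid stepping out a car-length ahead of the bumper doesn't have a chance, which is exactly why the law says you shouldn't be moving past the bus at all.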
When I was twelve, a 10-year-old kid from the next town over was hit and killed; his body was thrown over 100 feet when someone sped around a stopped bus with its flashers on.
No, to the CEO and all the managers who approved the process.
In addition to that, fine the company. Calculate the fine as the usual penalty multiplied by the number of vehicles on the road, and suddenly the companies begin focusing on safety.
Or Waymo driving into an active crime scene, loads of cop cars, guns drawn? [1] Cops yelling at it to get away, and instead the Waymo pulls over closer to the crime scene, causing the passengers to panic.
It was the talk of the school. Rumors spread like wildfire. Consensus was that whatever she did, it must have been terrible.
She had driven past a stopped school bus.
If this reaction is acceptable when a person does it, a $1 fine for a company is a slap in the face to law-abiding citizens.
Your quip about stock options is actually funny, because if the engineers were killing people then those stock options shouldn’t be worth so much.
whatever, close page.
So archive.ph is presumably just picking that up.
If it gets to the point where the fine is prohibitively expensive, then the system should in fact be prohibited.
Authorities investigating Waymo over failure to stop for school buses
https://news.ycombinator.com/item?id=46169695
[1] - https://www.youtube.com/watch?v=p2XoMKwZE3o [video][1m42s]