"Waymo estimates that human drivers fail to report 32 percent of injury crashes;"
That seems insanely high to me. Maybe this is just an SF thing?
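For what it's worth, here's a rough back-of-envelope sketch of what 32 percent underreporting would do to the human baseline (the 32 percent figure comes from the quote above; the reported rate below is a made-up placeholder, not a real statistic):

```python
# Back-of-envelope only: if 32% of human injury crashes go unreported,
# the true human rate exceeds the reported one by 1 / (1 - 0.32).
unreported_share = 0.32   # figure quoted from the article
reported_rate = 1.0       # placeholder: reported injury crashes per million miles

multiplier = 1 / (1 - unreported_share)
true_rate = reported_rate * multiplier

print(f"Underreporting multiplier: {multiplier:.2f}x")
print(f"Adjusted rate: {true_rate:.2f} injury crashes per million miles")
```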
These articles seem to be pushing the safety aspect and saying the cars can do just as well as a human driver. This is progress, and could be better than relying on a taxi or Uber driver. But if you want to convince me to use it, it needs to be better than me, not the average. The average human driver is terrible. I would love to hear (and see!) more about how good these systems are at defensive driving. Ok, so you got rear-ended or another car ran the light - not your fault, but what steps did the car take to try to avoid it?
The article doesn't seem to fully address how much of the "humans are to blame" share comes from Waymo cars driving in a manner that isn't normal.
The cases they highlight, such as a multi-collision hit and run, are obviously bad human situations. But this article feels like it's being a bit generous in its interpretation.
After all, I've seen Waymo cars cause wild traffic jams, and that sort of unexpected behavior could absolutely cause collisions.
Being rear-ended (16 out of 23 serious accidents, according to the article) is a pretty clear case of the car not doing anything wrong at all. It's the one case where collision avoidance is going to be useless: the car is waiting at, e.g., a red light, is supposed to be stopped, and has to blindly trust that the cars behind it will do the same.
Your assumption is that it was stopped at a red light. What if it slams on the brakes in the middle of the road because of a mylar balloon, etc.? Does it sense a vehicle approaching quickly and sound the horn to hopefully alert the driver to stop, and pull forward from the stop line into the crosswalk, if it's clear, to provide extra braking distance?
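Purely to illustrate the kind of behavior being asked about here (this is not Waymo's actual logic; the function, inputs, and thresholds below are all invented), a defensive rule while stopped might look something like:

```python
def rear_end_mitigation(rear_gap_m: float, closing_speed_mps: float,
                        crosswalk_clear: bool) -> list[str]:
    """Hypothetical defensive actions while stopped (e.g. at a light).

    Thresholds and sensor inputs are invented for illustration only.
    """
    actions = []
    # Time-to-collision if the car behind keeps closing at its current speed.
    ttc = rear_gap_m / closing_speed_mps if closing_speed_mps > 0 else float("inf")
    if ttc < 3.0:
        actions.append("sound horn / flash hazards to alert the driver behind")
    if ttc < 1.5 and crosswalk_clear:
        actions.append("creep forward into the empty crosswalk for extra braking distance")
    return actions

# Example: car 12 m behind, closing at 10 m/s, crosswalk empty -> both actions fire.
print(rear_end_mitigation(12.0, 10.0, crosswalk_clear=True))
```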
The first mistake here is equating robotaxi drivers with "average drivers." Is that even per mile? Shouldn't the comparison be with other cab drivers? There is some statistical skulduggery afoot.
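To make the normalization point concrete, here's a sketch with entirely invented numbers (none of these rates come from the article) showing why the denominator and the comparison population both matter:

```python
# Every number here is made up purely to show why crashes must be
# normalized by miles driven, and why the comparison group matters.
populations = {
    "all licensed drivers":     {"injury_crashes": 300, "million_miles": 400.0},
    "professional cab drivers": {"injury_crashes": 40,  "million_miles": 80.0},
    "robotaxi fleet":           {"injury_crashes": 5,   "million_miles": 10.0},
}

for name, d in populations.items():
    rate = d["injury_crashes"] / d["million_miles"]
    print(f"{name}: {rate:.2f} injury crashes per million miles")
```

With these placeholder figures the fleet beats the "average driver" baseline but merely matches the cab drivers, which is exactly the kind of distinction a per-mile, like-for-like comparison would expose.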
Waymo has an advantage here: it carefully plans its tests around optimal conditions and locations, which biases the results from the outset.
That's quite an impressive track record. I would still like to see how they perform in a city with bicycles and/or more variation in vehicles, but it at least looks like they are making meaningful progress.
I think the "safer" argument is moot, because as a society we have made a clear choice to prioritize the convenience of the individual over the safety of the community.
If safety were the key concern for travelling by car, highways would have a 30mph limit, cars would require yearly inspection (common in Europe), and city centres would be car-free.
Making the "look how safe we are" argument in order to garner support seems like a fool's errand.
Many states have yearly inspections too. The research seems mixed and does not show strong support that inspections reduce crashes related to mechanical failure.
My intuition tells me that there are quite a lot of dangerous cars on the roads in states where it's a free-for-all, so I'd be interested in the research you're referring to.
If it's just an annual emissions test (as I assume many are, but I'm not well informed), then it's unlikely to do much to prevent mechanical failures on the road.
> Inside companies like Zoox, this kind of human assistance is taken for granted. Outside such companies, few realize that autonomous vehicles are not completely autonomous.
> For years, companies avoided mentioning the remote assistance provided to their self-driving cars. The illusion of complete autonomy helped to draw attention to their technology and encourage venture capitalists to invest the billions of dollars needed to build increasingly effective autonomous vehicles.
> “There is a ‘Wizard of Oz’ flavor to this,” said Gary Marcus, an entrepreneur and a professor emeritus of psychology and neural science at New York University who specializes in A.I. and autonomous machines.
> ...
> When regulators last year ordered Cruise to shut down its fleet of 400 robot taxis in San Francisco after a woman was dragged under one of its driverless vehicles, the cars were supported by about 1.5 workers per vehicle, including remote assistance staff, according to two people familiar with the company’s operations. Those workers intervened to assist the vehicles every two and a half to five miles, the people said.
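Taking the quoted figures at face value (400 vehicles, roughly 1.5 workers per vehicle, one intervention every 2.5 to 5 miles; everything below is just arithmetic on those numbers):

```python
# Figures taken from the quote above; nothing else is implied.
fleet_size = 400
workers_per_vehicle = 1.5
miles_per_intervention = (2.5, 5.0)

support_staff = fleet_size * workers_per_vehicle
interventions_per_100_miles = [100 / m for m in miles_per_intervention]

print(f"Implied support staff: ~{support_staff:.0f} people")
print(f"Implied interventions per 100 vehicle miles: "
      f"{min(interventions_per_100_miles):.0f} to {max(interventions_per_100_miles):.0f}")
```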
The submitted article is about the safety of Waymo cars. The article you're linking to is about non-safety-related [0] remote assistance at a non-Waymo robotaxi company. I'm quite baffled as to why you think there's any connection here.
[0] Remote operators don't control the car in real time, so they certainly would not have any kind of impact on the safety record either way.
> The submitted article is about the safety of Waymo cars. The article you're linking to is about non-safety-related [0] remote assistance at a non-Waymo robotaxi company. I'm quite baffled as to why you think there's any connection here.
The article I linked is about Waymo cars too:
> As companies like Waymo, owned by Google’s parent company, Alphabet...have begun to remove drivers from their cars, scrutiny of their operations has increased. After a series of high-profile accidents, they have started to acknowledge that the cars require human assistance.
Zoox let the NYT peek behind the curtain. The fact that "Waymo and Cruise declined to comment for this story" doesn't mean there's no man behind their curtain; it probably just means they're still following the playbook of exaggerating the technology to drive interest in it.
> I would love to hear (and see!) more about how good these systems are at defensive driving. Ok, so you got rear-ended or another car ran the light - not your fault, but what steps did the car take to try to avoid it?
https://www.reddit.com/r/Damnthatsinteresting/comments/1dllw...