You can read the California DMV's set of autonomous vehicle accident reports.[1] Almost all the Waymo reports are "vehicle was entering intersection, detected cross traffic, stopped, was rear-ended by human driver". There's one Waymo report where someone ran a stop sign and hit them.
The rear-ending problem will be solved as automatic emergency braking becomes standard on cars. Already, it's shipping on almost all high-end cars and 70-80% of midrange cars. There's a US auto industry goal of it being standard by 2022. That's probably the main feature needed for self-driving cars to coexist with human-driven cars.
>The rear-ending problem will be solved as automatic emergency braking becomes standard on cars. Already, it's shipping on almost all high-end cars and 70-80% of midrange cars. There's a US auto industry goal of it being standard by 2022. That's probably the main feature needed for self-driving cars to coexist with human-driven cars.
I really wish we could just enforce proper education of being observant.
Most times when I hear about people getting rear-ended it's because the driver behind them was looking at their phone or otherwise distracted. We shouldn't invent features that allow that to continue.
On the other hand, I think this is fine in some rare cases, like when an accident or some other incident happens up ahead, a driver stomps their brakes, and the person behind them, who doesn't know what's happened, doesn't stop soon enough. But even for the latter case, you could argue that the DMV says to keep enough distance that in most cases you should still be able to come to a stop with barely a moment's notice, as long as you're paying attention. Of course, most people rarely, if ever, leave that much space.
Any solution that relies on "let's just make human beings better" is basically doomed to failure and little more than hand wringing and virtue signalling.
Cars have been pervasive for over a century and virtually none of the actual improvements in safety have come from "hoping humans do better".
Unfortunately, current cars have to be controlled by humans, and the human brain has certain characteristics (selected for by evolution) that make it nearly impossible to react correctly to 100% of driving situations. There are built-in efficiency circuits (similar to CPU branch prediction) that take immediate action based on what they typically see.
So if 99.9999% of the time, when a car in front starts to accelerate, it doesn't slam on the brakes immediately after, the brain assumes the car in front will never slam on the brakes and devotes less brain-CPU time to "Oh no, that car in front just slammed on the brakes" processing.
In general, human drivers don't do this unless they are very inexperienced. You don't mash on the gas and immediately mash on the brakes; instead you make sure there is nothing coming. If you can't see well enough to tell whether something is coming, then you ease up slowly and only mash on the gas when you have enough information that you won't get side swiped going through the intersection.
And yes, intersections that have a "stop line" far enough back that you can't see cross traffic are poorly designed. Or if cars are really expected to stop every time before a crosswalk, then advance 2 feet to stop again, this needs better enforcement so that activity becomes normalized to the point where the car behind would expect that (one way of doing this is to put two stop signs, one before the sidewalk, and one at the intersection).
That last point isn't a "you could argue" or "most cases", it's literally the law. Rear-ending is always the rear car's fault and it's always their responsibility to leave enough space and time to react and stop safely.
If the lead car braked when there was no reason or danger, that driver might also be cited for recklessness or some such, but a rear-end collision is always an error by the rear driver.
> I really wish we could just enforce proper education of being observant.
I know we're talking about California here, but the other straightforward explanation is that self driving cars are braking unpredictably compared to regular traffic, possibly when other drivers have judged that they wouldn't possibly need to brake.
Such accidents are only the fault of the rear car in a simplistic prima facie sense. Unexpectedly stomping on one's brakes in a fit of road rage is generally seen as causing any resulting accident, as should overzealous braking by a self driving vehicle that isn't following the norms of the road.
For example, from the Waymo report of February 16, 2021:
> While in autonomous mode, the Waymo AV was stopped at a red traffic light at the unprotected intersection of 16th Street and Market Street. After the traffic light turned green, the Waymo AV began to proceed and then came to a stop as it yielded to oncoming traffic also entering the intersection. A passenger vehicle behind the Waymo AV then made contact with the stationary Waymo AV’s rear bumper
This could have been the Waymo turning left and needing to yield to oncoming traffic, OR it could have been the Waymo going straight through the intersection and slamming on its brakes due to someone coming in from the right that hadn't stopped before the Waymo's judgement margin. It seems impossible to tell from the accident report, but the two have very different root cause analyses.
> I really wish we could just enforce proper education of being observant.
I know everyone considers themselves to be a good driver. I consider myself to be a relatively good driver and generally keep very conservative follow distances (no reason to be on someone's bumper).
Even when paying attention, I've been in situations where I've had to apply the brakes harder than I should have to avoid a rear end. Sometimes you just don't read the situation correctly and need to play catchup.
Please tell me more about this "proper education of being observant." Do you believe such a thing exists? Does it reliably make people consistently observant? Would it work on me? I would pay a lot of money for such a thing.
> I really wish we could just enforce proper education of being observant.
You'd have to give people whole new brains.
My grandmother had a condition that forced her eyes shut, completely out of her control. She insisted she could still drive without issue. She ignored multiple friends and family telling her she'd murder someone.
Last Saturday I had to dodge out of the way of a car (fortunately moving very slowly). The woman driving was staring right at me as I waved my arms. That she had nearly run me over eventually clicked, and she rolled down her window to apologize.
> I really wish we could just enforce proper education of being observant.
If I had a dollar for every person I see driving around in thick rain, or in full darkness, with their lights off, I could probably buy a nice new MacBook Air.
It's truly shocking how many people are far from alert and observant while driving. Either because they really don't care, or they've been conditioned to take driving very casually.
I use the number of cars I see driving with their lights off as sort of a gauge of how many 'bad' drivers might be on the road, in a 4-5km radius, at any given time... And it's a percentage that is too damn high. My working theory being that if people are putting a car into gear and getting into traffic without checking that their lights are on first, they're probably far from alert and ready in all other driving scenarios as well.
I'm certainly not blaming the victim for getting rear ended, but I've noticed that a lot of people brake a lot harder and more suddenly than needed. It's easy to see how that can lead to getting rear ended, even if the other driver is only distracted for a second.
I try to spread my braking over the space I have to do so. Lower braking force over a longer time is much safer. It both gives people more time to react, and if a collision does occur the speed differential is lower, and less damage will occur. This doesn't save you from getting rear ended while stopped, but that's a minority of rear endings from what I have seen.
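A rough back-of-the-envelope sketch (illustrative numbers only, not measured data) shows why spreading the braking out helps: for the same initial speed, gentler deceleration takes longer to stop, and during a trailing driver's reaction delay the closing speed builds up far more slowly than under hard braking.

```python
# Toy comparison of gentle vs. hard braking from the same speed.
# All figures here are illustrative assumptions, not measured data.

def stop_profile(v0_ms, decel_ms2):
    """Time and distance to stop under constant deceleration."""
    t = v0_ms / decel_ms2
    d = v0_ms ** 2 / (2 * decel_ms2)
    return t, d

v0 = 25.0  # ~90 km/h, assumed initial speed

for label, a in [("gentle braking", 2.5), ("hard braking", 7.5)]:
    t, d = stop_profile(v0, a)
    print(f"{label}: {a} m/s^2 -> stops in {t:.1f} s over {d:.0f} m")

# A trailing driver with an assumed ~1.5 s reaction delay sees the
# lead car shed this much speed before they even touch their brakes:
reaction = 1.5
for label, a in [("gentle braking", 2.5), ("hard braking", 7.5)]:
    closing = min(a * reaction, v0)
    print(f"{label}: closing speed builds to ~{closing:.1f} m/s "
          f"during the reaction delay")
```

Under these assumptions the hard-braking case builds three times the closing speed before the follower even reacts, which is the speed differential that determines damage if contact happens.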
> I really wish we could just enforce proper education of being observant.
The problem: on long stretches of road, especially with tunnels or trees acting as side walls, human biology gets in the way - we literally stop perceiving what's happening properly (and as usual, German has a word for it: Tunnelblick, tunnel vision). The additional factors of people being too tired to drive or stressed out by children, traffic jams, etc. also won't disappear no matter how much enforcement there is... the only things enforcement can suppress are phone usage and drug usage.
While I agree, people are unpredictable. And even when they are paying attention, they may still react too slowly. Even with a competent, attentive driver at the wheel, crash prevention systems can still outperform them.
Anyway, I don't think the goal is to prevent accidents entirely; that would be ideal, but it'll be really hard. More important is to prevent injury and death. A car is just hardware, that's what insurance is for in the end. In that regard, even without automation, cars have gotten safer over the decades with roll cages and airbags everywhere.
When do the automatic emergency brakes kick in? Normally you're supposed to maintain at least five car lengths of distance on the highway -- will it brake to always maintain that, eliminating those idiot tailgaters? :)
I drive an aged car and drove a friend's recent Mercedes long distance. The thing has a feature where it brakes when it thinks it's good to brake, often just when I need to accelerate to get out of a tight spot. I need the vehicle to respond to my inputs directly, not mix my inputs with what it thinks must be done. It's extremely unpredictable and it feels like I'm not in control of the car.
If studies show that automatic emergency braking cuts down on deaths overall, I think it's a worthwhile tradeoff -- I'd even be happy to have it be mandatory/non-disableable. It sounds like the car you were driving had some kind of defective implementation of automatic emergency braking. That doesn't invalidate the good that these types of systems do when implemented properly.
Also I'm wondering what kind of situations there are where you "need to accelerate to get out of a tight spot?" I'm a pretty experienced driver and I think the number of "must accelerate" situations I've ever encountered could be counted on a hand. Almost universally, you can solve whatever problem you're in by being more patient and decelerating instead. For example, if you're trying to make an aggressive merge into the passing lane because someone is coming up quickly behind, you could just wait until that person passes. If you're trying to get up to highway speed as you merge, but someone is slow in front of you, it might be best to leave that person more of a gap and wait until they go. Even in the case of passing on a two lane road (a situation where I doubt emergency braking would trigger erroneously), you've already made a mistake if a rapid acceleration is needed to avoid a dangerous scenario. Etc.
Have you considered reflecting on your driving behavior? I personally have rarely (I honestly don't remember the last time, but let's say less than once a year) had to accelerate quickly to get out of a tight spot.
People who believe they should accelerate out of dangerous situations, and who later comment upon such beliefs, should have their licenses automatically revoked by autonomous natural-language-processing robots that crawl around on the web for that specific purpose.
Or very slow entry speed. My wife rear ended someone while parked because her foot wasn't fully engaged on the brake and the car crept forward until it was touching the car in front. It's a Tesla, so it started beeping, but not until it was too late. No visible damage, but that didn't stop the other driver from collecting insurance info and reporting it.
> The rear-ending problem will be solved as automatic emergency braking becomes standard on cars. Already, it's shipping on almost all high-end cars and 70-80% of midrange cars. There's a US auto industry goal of it being standard by 2022. That's probably the main feature needed for self-driving cars to coexist with human-driven cars.
Not necessarily. I have a Honda, and the system in my car is deliberately designed to only mitigate a collision, not prevent one (e.g. it will only engage at the last second to slow the car, not stop it). I'm speculating, but I'm guessing the idea is to discourage people from "testing it out" or relying too much on it.
The system I have in my Subaru is pretty useful, and it has warned me a few times when the vehicle in front of me suddenly slowed down or stopped. However, it has too many false positives, and if it actually tried to brake every time, the false positives would be so annoying that I would disable it.
> Almost all the Waymo reports are "vehicle was entering intersection, detected cross traffic, stopped, was rear-ended by human driver". There's one Waymo report where someone ran a stop sign and hit them.
I've driven some recent cars with a lot of safety systems and more advanced cruise control, and they always felt like a bad human driver. They seem to be unpredictable and abuse the brakes, especially on highways, where you should just drive with the flow and watch what's going on at least 2 or 3 cars ahead of you. I have the feeling that technically the Waymo cars are behaving as they should, yet some of their actions are too unpredictable or come all of a sudden, surprising the drivers behind them with sudden braking and maneuvers that a human driver wouldn't make, or would make more predictably.
Solving the automated driving problem will require these systems to behave in ways that regular human drivers consider acceptable - at least for a transition period where both share the road. Sudden hard braking in situations where humans see absolutely no need for it is unacceptable behavior to human drivers. Just because it's not your fault doesn't mean you can't do things to prevent it. I have prevented many crashes throughout my life by mitigating others' driving mistakes. We should expect self driving cars to do this at least as well as - not worse than - humans.
I'm speculating here, but it's possible that the Waymo vehicle being rear ended is an optimal outcome. If the Waymo vehicle would have otherwise collided with a car in the intersection, it would probably be a much worse crash than a rear end at low/moderate relative velocity.
This does not, by the way, imply that the machines are working correctly and human drivers are the problem. These cars are rear-ended more often than human drivers, which suggests they stop more suddenly or at times humans don't expect.
Slamming on the brakes is a flaw. Even if every car on the road had the ability to respond instantly to the car ahead, being forced to brake hard still risks loss of control especially in bad conditions.
Also, having automatic emergency braking on all vehicles is at least decades out. It will never work on motorcycles or scooters or bicycles, period. It cannot be retrofitted onto existing cars. Requiring it for all new cars would require pretty hamfisted regulations. I expect human drivers to be widespread for at least the rest of my life.
> Even if every car on the road had the ability to respond instantly to the car ahead, being forced to brake hard still risks loss of control especially in bad conditions
Driving on the road is not some kind of high speed car chase. If one leaves enough space to the car in front, based on the current road conditions and speed, one doesn't have to brake hard. Anyone can adjust their speed to increase the distance to the car in front. Arguments often make driving too close or too fast sound like inevitable facts of life.
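As a sketch of how little margin tailgating leaves: even before any braking happens, a car covers a surprising distance during the driver's reaction time alone. The ~1.5 s reaction figure below is a common textbook assumption, not a spec.

```python
# Minimal following-distance sketch: distance covered during the
# driver's reaction time, before the brakes are even applied.
# The 1.5 s reaction time is an assumed textbook figure.

def min_gap_m(speed_kmh, reaction_s=1.5):
    """Distance travelled before braking even begins."""
    speed_ms = speed_kmh / 3.6
    return speed_ms * reaction_s

for v in (50, 100, 130):
    print(f"{v} km/h: >= {min_gap_m(v):.0f} m just to cover reaction time")
```

At highway speed that is roughly 40-55 m of gap consumed before any deceleration starts, which is why a gap judged "enough" at city speed is nowhere near enough on the freeway.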
> This does not, by the way, imply that the machines are working correctly and human drivers are the problem. These cars are rear-ended more often than human drivers, which suggests they stop more suddenly or at times humans don't expect.
I had a look and couldn't find an answer; does their increased rear-ending frequency result in a decrease in head-on collisions? Is Waymo getting rear ended to avoid being t-boned (or a head-on collision with someone else)?
They can use it to start fixing the roads, not the cars.
Just look at the cross intersection they use as an example, typical of the US. The ROAD is the death trap there. People are crossing each other at high speed, this is the BUG, and should be fixed.
The proper solution is putting a roundabout there, and you will instantly have fewer fatal accidents.
The US was the first to use roundabouts, but those early designs did not work well, and as a result we have those monstrosities like in the picture all around the US.
With electric cars that record accidents, the first bugs that we should fix are those in the roads, like those concrete barriers that have no smooth transition but a front wall out of nowhere.
> Just look at the cross intersection they use as an example, typical of the US. The ROAD is the death trap there. People are crossing each other at high speed, this is the BUG, and should be fixed.
> The proper solution is putting a roundabout there, and you will instantly have fewer fatal accidents.
People are only "crossing each other at high speed" in these simulations because the scenario involves someone running a red light:
> Here, for example, you can see on the bottom of the screen that the simulated Waymo Driver avoids a reconstructed version of a real-life fatal crash by obeying the speed limit—and not running a red light, as the initiator did in real life [emphasis mine]...
I don't think "roundabout all the things" is the answer to every traffic and road safety problem. This looks like an intersection between two 3/4 lane roads, and you'd need a monster roundabout for it, which seems like it would be confusing and/or slow. Also, a road can instantly have fewer fatal accidents by being closed or turned into a traffic jam. IMHO, road engineers need to balance accident avoidance, throughput, and user-friendliness.
I live in Germany. We have some roundabouts. And let me tell you: they rarely work better than traffic lights. Traffic flow requires more than roundabouts: proper planning, green waves, proper sizing, and still dedicated lanes within the roundabout. Don't forget about semis either. Just throwing roundabouts somewhere won't solve anything. I've seen that often enough. That will just create more traffic jams.
I do agree that better road design would drastically cut down on excess speed and crashes. I'm not so sure about roundabouts as a solution, though. Roundabouts may allow more cars to travel through a place, but they are not great for other road users. I think you are also correct about speed: cars are mostly driving too fast. We have the tech to solve this problem today in a rather straightforward manner, but the very idea of it makes people go crazy: speed governors on vehicles. The funny thing is, some places mandate them on things like scooters and e-bikes, but a car with 700 HP that can go 180 mph does not have one and drives on the same road. There is a LOT of low hanging fruit.
I really dislike the way the data is being summarized as "100% avoided or mitigated*".
First, they're omitting the types of case they're failing to mitigate (rear endings in this case). Cue the "60% of the time it works every time" Anchorman scene.
Second, "mitigated" is defined to mean a 25% relative reduction in chance of serious injury. Which is good, but again smacks of "60% of the time it works every time".
Basically, it looks to me like someone wanted to be able to show a bold blue 100% and required the definitions they were using to be massaged to match. And apparently they got their way. This has damaged my trust in future summaries put out by Waymo.
A more truthful summarization style would be e.g. "X fewer serious injuries over 72 incidents", referring to expected injuries and simulated incidents of course.
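To make that style concrete, here's a toy version of such a summary. The 72 incidents and the 25% relative risk reduction for "mitigated" cases come from the figures above; the avoided/mitigated split and the baseline injury probability are purely hypothetical placeholders, since Waymo's post doesn't break those out.

```python
# Sketch of an "X fewer expected serious injuries over 72 incidents"
# summary. The 72 incidents and 25% relative risk reduction are from
# the discussion above; the 61/11 avoided/mitigated split and the 60%
# baseline injury probability are hypothetical placeholders.

incidents = 72
avoided = 61           # hypothetical: crash fully avoided (risk -> 0)
mitigated = incidents - avoided
p_injury = 0.60        # hypothetical baseline P(serious injury) per crash
mitigation_rrr = 0.25  # relative risk reduction in "mitigated" cases

expected_baseline = incidents * p_injury
expected_with_sdc = mitigated * p_injury * (1 - mitigation_rrr)
print(f"~{expected_baseline - expected_with_sdc:.0f} fewer expected "
      f"serious injuries over {incidents} simulated incidents")
```

The point isn't these particular numbers; it's that an expected-injury figure carries the avoided/mitigated distinction that a bare "100%" erases.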
I agree the results are excellent but the way they try to over-present them is very scummy. I don't see why they didn't think the results were good enough to stand on their own.
I suspect that when we have self driving cars, adoption is going to take off very quickly due to insurance. In particular, if you are in a non self driving car and in an accident with a self driving one, it'll be hard not to be found at fault. So I would guess insurance rates will go up based on the percentage of self driving cars on the road, and likely be really cheap for self driving cars.
I wonder if there's a period though where people cut off self driving cars, knowing they won't get hit. I suspect pedestrians in cities will jaywalk a lot more (and maybe that's ok).
I suspect the adoption process will be slow because consumers really don't want this product. Self driving cars will drive the speed limit and infuriate both other motorists and their owners who are getting passed by speeding human driven cars. There will also be well publicized stories of the few times that self driving cars perform worse than humans. All drivers think they are above average and therefore the statistical comparison to the average driver does not apply to them.
If there's one thing I don't care about on the road, it's infuriating other motorists with my safe driving. But, for what it's worth, I drive the speed limit and I pretty much never see anyone get angry at me for it. If you hang out in the right lane, all the type A people driving 20 over are in the other lane, zipping past you. You simply don't encounter the type of people who would get mad at speed limit driving in the slow lane. As long as the self driving cars are implemented to follow passing rules, it's not going to be an issue if they were to drive the speed limit (although as another person pointed out to you this is a counterfactual).
> and their owners who are getting passed by speeding human driven cars.
I personally drive 5-6 mph over the speed limit on freeways when the weather and visibility are good.
But I absolutely 100% do not care if the vehicle is driving the speed limit if I do not have to be controlling it. What do I care if the vehicle takes an extra 10 minutes to get to the destination? It'll just mean 10 extra minutes of work or nap time.
I think people will want it. They'll get to watch netflix or read or take meetings on their commutes. They'll get to live further from the city in a bigger place than they could otherwise.
I also suspect we'll see cars looking more like living rooms or offices.
The other thing is that as we have more self driving cars, speed limits will be able to be relaxed. Most humans can't safely drive at 100mph, but on a road of only self driving cars that seems very possible. So that 1 hour commute radius gets bigger and bigger.
I have two conflicting thoughts on self-driving cars.
* On the one hand I totally agree with you: the majority of people don't care about driving and just want to get from Point A to Point B. Given the choice between sitting back watching Netflix and fighting traffic, plus lower insurance premiums, I think self-driving cars will be a tempting combination.
* On the other hand, outside densely populated places, where there is less traffic and more unpredictable terrain, self-driving cars are going to face an extremely uphill battle. First, with the lack of heavy traffic, there isn't much of a case for a car that drives itself.
Then there's the issue of trust. Say you're out in rural Idaho, the nearest anything 50+ miles away. Out there, all those cool tech gizmos and self driving tech are a massive liability. If you're out in the boonies and your car loses connection and refuses to drive, what do you do? What if some wired internal computer system breaks? This is why cars like the Toyota 4Runner and Tacoma are so popular: they are old school, simple cars, but more importantly, you can take the thing out to the middle of nowhere and trust that you won't get stranded by "cutting edge technology".
On the other hand, if self-driving cars are as good at avoiding accidents as the article says that may cause the price of insurance for non-self-driving cars to fall.
I recall reading a few years ago that insurance companies are looking to partner with car manufacturers. The idea is, when you purchase a full self-driving vehicle, insurance will be included with the cost of the car.
No, tech companies will just hide a "Not actually self-driving, driver takes full responsibility" sticker near the VIN. It'll be easy to blame the human for negligence then.
It's easy to be cynical. Volvo has already said they take responsibility [0], and I assume other manufacturers will also accept blame if their self-driving was involved.
A big caveat to this study is that, in the reconstructed simulations, once the self-driving car (SDC) deviates from what the human-driven car did, the behavior of all other agents becomes unrealistic. Often in these sims, instead of reacting to the new SDC trajectory, they replay their original behavior.
Ideally, once the SDC deviates, all other agents are simulated as well. A tall order, but necessary if these counterfactuals are to hold weight.
> Road safety is a major, global public health crisis. More than 1.3 million people die on the world’s roads every year, according to WHO. That’s more people than die from HIV/AIDS, and is equivalent to a passenger plane’s worth of people crashing every single hour—or one death every 30 seconds.
> I’ve spent over 20 years working in crash avoidance research, in the belief that improved driving technology is the key to reducing these needless deaths.
This is a noble goal, but I find it really damn hard to ignore the possibility that this is all just another ploy by Google to steal our data and violate our privacy. Just imagine trying to push back against a technology that saves lives for something as "frivolous" as privacy.
Why does it seem like the consumer can never win? The free market is supposed to be self balancing, yet I can't remember a time where it actually felt that way in the technology sector.
I think if we want to solve large scale intersectional problems like traffic (where lots of individuals with their own goals and own variables intersect) we will need to give up SOME of our privacy in terms of where we are headed (just heading), speed, car model/make or some kind of capabilities estimate... This could result in a smarter traffic flow, road design, or in Google's case powering safer self-driving cars.
I'm up for any of those outcomes and am also concerned about giving up data unnecessarily. I think the crux of the article, besides hyping up the simulated self-driving driver's better decision making, is that roads aren't designed as safely as they could be, and some minor inconvenience to the user (time, some data given up) could save lots of lives.
1) Would a random driver (random steering and throttle) also have avoided these crashes? It seems like they were fatal because several variables coincided very exactly, and really any other input would have avoided them.
2) What input is the Waymo driver using here? If these were not originally Waymo cars, then is it perceiving simulated video or getting raw access to ground truth vehicle positions?
They should try human drivers in a simulator. You would have to be careful not to signal when the accident was about to happen, because accidents in real life are very uncommon.
You could estimate how avoidable an accident was by the percentage of human drivers in the simulated "responder" vehicle that avoided it.
Maybe some accidents are easily avoided by humans because the real responder driver was distracted. I'm not impressed if Waymo driver can also avoid these.
But some accidents may be very difficult to avoid for humans. If Waymo driver can perform better in these cases then it's truly remarkable.
For 2, Waymo has a simulation environment with a whole simulated physics and sensor suite. They use this environment to train their driving systems on far more virtual miles than the number of physical miles they drive. So presumably they induce the reconstructed initial conditions from the crash and then they run the simulation a lot of times to see if their driver also gets into the crash.
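Presumably something like the following toy loop, though vastly simplified: reconstruct the initial conditions, replay the other agent's logged trajectory verbatim (non-reactive), and let a substitute ego policy drive. Everything here (the 1-D geometry, the policies, the collision envelope) is an invented illustration, not Waymo's actual simulator.

```python
# Toy 1-D counterfactual replay: the other car's logged trajectory is
# replayed verbatim (it does not react to the ego), while the ego
# vehicle is re-driven by a substitute policy. Purely illustrative.

DT = 0.1  # seconds per simulation step

def replay(ego_policy, other_positions, ego_x0=0.0, ego_v0=20.0):
    """Return True if the ego ever collides with the replayed car."""
    x, v = ego_x0, ego_v0
    for other_x in other_positions:
        accel = ego_policy(x, v, other_x)   # policy picks acceleration
        v = max(0.0, v + accel * DT)
        x += v * DT
        if abs(x - other_x) < 2.0:          # crude 2 m collision envelope
            return True
    return False

# Logged trajectory: a car stopped 30 m ahead for 5 seconds.
logged_other = [30.0] * 50

inattentive = lambda x, v, ox: 0.0                          # never brakes
cautious = lambda x, v, ox: -10.0 if ox - x < 25 else 0.0   # brakes hard

print("inattentive policy crashes:", replay(inattentive, logged_other))
print("cautious policy crashes:", replay(cautious, logged_other))
```

This also makes the grandparent's caveat easy to see: because `logged_other` is replayed verbatim, the other car can never react to anything the substitute ego does differently, which is exactly why such counterfactuals get less trustworthy the further the ego deviates from the original trajectory.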
This shows (sort of) that the computer does better in situations where humans are known to fail. It's entirely likely that the computers will have new and unexpected failure modes that humans would never run into.
[1] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...
I really wish we could just enforce proper education of being observant.
Most times when I hear about people getting rear-ended it's because the driver behind them was looking at their phone or otherwise distracted. We shouldn't invent features that allow that to continue.
On the other hand, I think this is fine in some rare cases, like maybe an accident or some other incident happens up ahead, a driver stomps their brakes, and then the person behind them who doesn't know what's happened doesn't stop soon enough. But even for the latter case, you could argue that the DMV says to keep enough distance where in most cases you should be able to still come to a stop with barely a moment's notice as long as you're paying enough attention. Of course, most people don't leave that much space much, if ever.
Cars have been pervasive for over a century and virtually none of the actual improvements in safety have come from "hoping humans do better".
So if 99.9999% of the time, when a car in front starts to accelerate, and doesn't slam on the breaks immediately after, the brain makes the assumption that the care in front will never slam on the breaks and devotes less brain-CPU time to "Oh No, that care in front just slammed on the breaks" processing.
In general, human drivers don't do this unless they are very inexperienced. You don't mash on the gas and immediately mash on the breaks, instead you make sure there is nothing coming. If you can't see well enough that there is something coming, then you ease up slowly and only mash on the gas when you have enough information that you won't get side swiped going through the intersection.
And yes, intersections that have a "stop line" far enough back that you can't see cross traffic are poorly designed. Or if cars are really expected to stop every time before a crosswalk, then advance 2 feet to stop again, this needs better enforcement so that activity becomes normalized to the point where the car behind would expect that (one way of doing this is to put two stop signs, one before the sidewalk, and one at the intersection).
If the lead car braked when there was no reason or danger, that driver might also be cited for recklessness or some such, but a rear-end collision is always an error by the rear driver.
I know we're talking about California here, but the other straightforward explanation is that self driving cars are braking unpredictably compared to regular traffic, possibly when other drivers have judged that they wouldn't possibly need to brake.
Such accidents are only the fault of the rear car in a simplistic prima facie sense. Unexpectedly stomping on one's brakes in a fit of road rage is generally seen as causing any resulting accident, as should overzealous braking by a self driving vehicle that isn't following the norms of the road.
For example, from the Waymo report of February 16, 2021:
> While in autonomous mode, the Waymo AV was stopped at a red traffic light at the unprotected intersection of 16th Street and Market Street. After the traffic light turned green, the Waymo AV began to proceed and then came to a stop as it yielded to oncoming traffic also entering the intersection. A passenger vehicle behind the Waymo AV then made contact with the stationary Waymo AV’s rear bumper
This could have been the Waymo turning left and needing to yield to oncoming traffic, OR it could have been the Waymo going straight through the intersection and slamming on its brakes due to someone coming in from the right who hadn't stopped within the Waymo's safety margin. It seems impossible to tell from the accident report, but the two have very different root cause analyses.
I know everyone considers themselves to be a good driver. I consider myself to be a relatively good driver and generally keep very conservative follow distances (no reason to be on someone's bumper).
Even when paying attention, I've been in situations where I've had to apply the brakes harder than I should have to avoid a rear end. Sometimes you just don't read the situation correctly and need to play catchup.
You'd have to give people whole new brains.
My grandmother had a condition where her eyes would force themselves shut, completely out of her control. She insisted she could still drive without issue. She ignored multiple friends and family members telling her she'd kill someone.
Last Saturday I had to dodge out of the way of a car (fortunately moving very slowly). The woman driving was staring in my direction as I waved my arms. That she had nearly run me over eventually clicked, and she rolled down her window to apologize.
If I had a dollar for every person I see driving around in thick rain, or in full darkness with their lights off, I could probably buy a nice new macbook air.
It's truly shocking how many people are far from alert and observant while driving. Either because they really don't care, or they've been conditioned to take driving very casually.
I use the number of cars I see driving with their lights off as sort of a gauge of how many 'bad' drivers might be on the road, in a 4-5km radius, at any given time... And it's a percentage that is too damn high. My working theory being that if people are putting a car into gear and getting into traffic without checking that their lights are on first, they're probably far from alert and ready in all other driving scenarios as well.
I try to spread my braking over the space I have available. Lower braking force over a longer time is much safer. It both gives people more time to react, and, if a collision does occur, the speed differential is lower and less damage will occur. This doesn't save you from getting rear-ended while stopped, but that's a minority of rear-endings from what I have seen.
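A toy kinematics sketch of why gentler braking lowers the speed differential at impact: the gap in speed that opens up during the follower's reaction time is just deceleration times reaction time. All numbers below are illustrative assumptions, not measured values.

```python
# Toy sketch (illustrative numbers only): how much speed differential
# opens up between a braking lead car and a follower during the
# follower's reaction time, before the follower even starts braking.

def speed_gap_after_reaction(initial_speed_ms, decel_ms2, reaction_s):
    """Speed (m/s) the lead car sheds during the follower's reaction time."""
    shed = decel_ms2 * reaction_s
    return min(shed, initial_speed_ms)  # can't shed more speed than it had

v0 = 25.0        # ~90 km/h
reaction = 1.5   # commonly assumed driver reaction time, in seconds

hard = speed_gap_after_reaction(v0, 8.0, reaction)    # near-maximal braking
gentle = speed_gap_after_reaction(v0, 3.0, reaction)  # spread-out braking

print(f"hard braking opens a {hard:.1f} m/s gap; gentle only {gentle:.1f} m/s")
```

With these made-up figures, slamming the brakes opens a 12 m/s differential before the follower reacts, versus 4.5 m/s for gentle braking, which is the point about lower impact speed.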
The problem: on long stretches of road, especially with tunnels or trees acting as side walls, human biology gets in the way; we literally stop perceiving what's happening properly (and as usual, German has a word for it: Tunnelblick, tunnel vision). The additional factors of people being (too) tired to drive or stressed out by children, traffic jams, etc. also won't disappear no matter how much enforcement there is... the only things enforcement can suppress are phone usage and drug usage.
Better safety systems save lives!
Anyway, I don't think the goal is to prevent accidents entirely; that would be ideal, but it'll be really hard. More important is to prevent injury and death. A car is just hardware, that's what insurance is for in the end. In that regard, even without automation, cars have gotten safer over the decades with roll cages and airbags everywhere.
I drive an aged car and drove a friend's recent Mercedes long distance. The thing has a feature where it brakes where it thinks it's good to brake, often just when I need to accelerate to get out of a tight spot. I need the vehicle to respond to my inputs directly, not mix my inputs with what it thinks must be done. It's extremely unpredictable and it feels like I'm not in control of the car.
Also I'm wondering what kind of situations there are where you "need to accelerate to get out of a tight spot?" I'm a pretty experienced driver and I think the number of "must accelerate" situations I've ever encountered could be counted on a hand. Almost universally, you can solve whatever problem you're in by being more patient and decelerating instead. For example, if you're trying to make an aggressive merge into the passing lane because someone is coming up quickly behind, you could just wait until that person passes. If you're trying to get up to highway speed as you merge, but someone is slow in front of you, it might be best to leave that person more of a gap and wait until they go. Even in the case of passing on a two lane road (a situation where I doubt emergency braking would trigger erroneously), you've already made a mistake if a rapid acceleration is needed to avoid a dangerous scenario. Etc.
It warned with a beep signal, I think twice, when it deemed the car too close to other traffic.
> Forward collision warning plus autobrake is associated to a 50% decrease in front to rear crashes.
Similar stats are reported from around the world.
Low friction surfaces, weird car angles, poor radar/camera visibility and too high entry speed all mean the system isn't going to be 100% effective.
[1] https://en.wikipedia.org/wiki/Collision_avoidance_system
Not necessarily. I have a Honda, and the system in my car is deliberately designed to only mitigate a collision, not prevent one (e.g. it will only engage at the last second to slow the car, not stop it). I'm speculating, but I'm guessing the idea is to discourage people from "testing it out" or relying too much on it.
I've driven some recent cars with a lot of safety systems and more advanced cruise control, and they always felt like a bad human driver. They seem to be unpredictable and abuse the brakes, especially on highways where you should just drive with the flow and look ahead to see what's going on at least 2 or 3 cars in front of you. I have the feeling that technically the Waymo cars are behaving like they should, but some of their actions are just too unpredictable or come all of a sudden, surprising the drivers behind them with sudden braking and actions that a human driver wouldn't take, or would take more predictably.
How is that feature related to self-driving cars? Are you saying self-driving cars get rear-ended more than human-driven cars do?
Slamming on the brakes is a flaw. Even if every car on the road had the ability to respond instantly to the car ahead, being forced to brake hard still risks loss of control especially in bad conditions.
Also, having automatic emergency braking on all vehicles is at least decades out. It will never work on motorcycles or scooters or bicycles, period. It cannot be retrofitted onto existing cars. Requiring it for all new cars would require pretty hamfisted regulations. I expect human drivers to be widespread for at least the rest of my life.
Driving on the road is not some kind of high-speed car chase. If one leaves enough space to the car in front based on the current road conditions and speed, one doesn't have to brake hard. Anyone can adjust their speed to increase the distance to the car in front. Often these arguments make it sound like driving too close or too fast is an inevitable fact of life.
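The "enough space" point can be made concrete with a back-of-the-envelope calculation: during a typical reaction time you cover distance without braking at all, so any gap smaller than that guarantees you lose ground before your brakes even engage. The 1.5 s reaction time below is a common rule-of-thumb assumption, not an official figure.

```python
# Back-of-the-envelope following distance: the ground you cover during
# your reaction time alone, before the brakes do anything.

def min_gap_m(speed_ms, reaction_s=1.5):
    """Distance (m) travelled during the driver's reaction time."""
    return speed_ms * reaction_s

for kmh in (50, 100, 130):
    ms = kmh / 3.6  # convert km/h to m/s
    print(f"{kmh} km/h -> at least {min_gap_m(ms):.0f} m of gap just for reacting")
```

This is only the reaction-time component; actual stopping distance adds the braking distance on top, which grows with the square of speed.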
I had a look and couldn't find an answer; does their increased rear-ending frequency result in a decrease in head-on collisions? Is waymo getting rear ended to avoid being t-boned (or a head on collision with someone else)?
Just look at the cross intersection they use as an example, typical of the US. The ROAD is the death trap there. People are crossing each other at high speed, this is the BUG, and should be fixed.
The proper solution is putting there a roundabout, and you will have instantly less fatal accidents.
The US was the first to use roundabouts, but those early designs did not work well, and as a result we have those monstrosities like in the picture all around the US.
With electric cars that record accidents, the first bugs that we should fix are those in the roads, like those concrete barriers that have no smooth transition but a front wall out of nowhere.
> The proper solution is putting there a roundabout, and you will have instantly less fatal accidents.
People are only "crossing each other at high speed" in these simulations because the scenario involves someone running a red light:
> Here, for example, you can see on the bottom of the screen that the simulated Waymo Driver avoids a reconstructed version of a real-life fatal crash by obeying the speed limit—and not running a red light, as the initiator did in real life [emphasis mine]...
I don't think "roundabout all the things" is the answer to every traffic and road safety problem. This looks like an intersection between two 3/4-lane roads, and you'd need a monster roundabout for it, which seems like it would be confusing and/or slow. Also, a road can "have instantly less fatal accidents" by being closed or turned into a traffic jam. IMHO, road engineers need to balance accident avoidance, throughput, and user-friendliness.
https://www.safetylit.org/citations/index.php?fuseaction=cit...
Deleted Comment
First, they're omitting the types of case they're failing to mitigate (rear-endings in this case). Cue the "60% of the time, it works every time" Anchorman scene.
Second, "mitigated" is defined to mean a 25% relative reduction in chance of serious injury. Which is good, but again smacks of "60% of the time it works every time".
Basically, it looks to me like someone wanted to be able to show a bold blue 100% and required the definitions they were using to be massaged to match. And apparently they got their way. This has damaged my trust in future summaries put out by Waymo.
A more truthful summarization style would be e.g. "X fewer serious injuries over 72 incidents", referring to expected injuries and simulated incidents of course.
I suspect that when we have self-driving cars, they're going to take off very quickly due to insurance. In particular, if you are in a non-self-driving car and get in an accident with a self-driving one, it'll be hard not to be found at fault. So insurance rates, I would guess, will go up with the percentage of self-driving cars, and will likely be really cheap for self-driving cars.
I wonder if there's a period though where people cut off self driving cars, knowing they won't get hit. I suspect pedestrians in cities will jaywalk a lot more (and maybe that's ok).
I personally drive 5-6 mph over the speed limit on freeways when the weather and visibility is good.
But I absolutely 100% do not care if the vehicle is driving the speed limit if I do not have to be controlling it. What do I care if the vehicle takes an extra 10 minutes to get to the destination? It'll just mean 10 extra minutes of work or nap time.
I also suspect we'll see cars looking more like living rooms or offices.
The other thing is that as we have more self driving cars, speed limits will be able to be relaxed. Most humans can't safely drive at 100mph, but on a road of only self driving cars that seems very possible. So that 1 hour commute radius gets bigger and bigger.
> Self driving cars will drive the speed limit
Tesla is proving both to be false, today.
I remember when the first mobile phones came out and everybody said they would stay niche, because why would you need to call if not at home? (Except salespeople.)
Convenience is a very, very strong motivator.
What they don't want is a self driving product where they bear liability for the mistakes the self driving product makes.
* On the one hand I totally agree with you, the majority of people don't care about driving and just want to get from Point A to Point B. Given the opportunity to sit back and watch Netflix vs fighting traffic and having lower insurance premiums I think will make self-driving cars a tempting combination.
* On the other hand though, outside densely populated places, where there is less traffic and more unpredictable terrain, self-driving cars are going to face an extremely uphill battle. Without heavy traffic, there isn't much of a case for needing a car to drive itself.
Then there's the issue of trust. Say you're out in rural Idaho, where the nearest anything is 50+ miles away. Out there, all those cool tech gizmos and self-driving tech are a massive liability. If you're out in the boonies and your car loses connection and refuses to drive, what do you do? What if some weird internal computer system breaks? This is why cars like the Toyota 4Runner and Tacoma are so popular: they are old-school, simple cars, but more importantly, you can take them out to the middle of nowhere and trust that you won't get stranded by "cutting edge technology".
No, tech companies will just hide a "Not actually self-driving, driver takes full responsibility" sticker near the VIN. It'll be easy to blame the human for negligence then.
0 - https://www.caranddriver.com/news/a15352720/volvo-will-take-...
Ideally, once the SDC deviates, all other agents are simulated as well. A tall order, but necessary if these counterfactuals are to hold weight.
> I’ve spent over 20 years working in crash avoidance research, in the belief that improved driving technology is the key to reducing these needless deaths.
This is a noble goal, but I find it really damn hard to ignore the possibility that this is all just another ploy by Google to steal our data and violate our privacy. Just imagine trying to push back against a technology that saves lives for something as "frivolous" as privacy.
Why does it seem like the consumer can never win? The free market is supposed to be self balancing, yet I can't remember a time where it actually felt that way in the technology sector.
I'm up for any of those outcomes and am also concerned about giving up data unnecessarily. I think the crux of the article besides hyping up self-driving simulated driver's better decision making is that roads aren't designed as safely as they could be and some minor inconvenience to the user (time, some data given up) could save lots of lives.
2) What input is the Waymo driver using here? If these were not originally Waymo cars, then is it perceiving simulated video or getting raw access to ground truth vehicle positions?
You could estimate how avoidable an accident was by the percentage of human drivers in the simulated "responder" vehicle that avoided it.
Maybe some accidents are easily avoided by humans because the real responder driver was distracted. I'm not impressed if Waymo driver can also avoid these.
But some accidents may be very difficult to avoid for humans. If Waymo driver can perform better in these cases then it's truly remarkable.
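The avoidability estimate proposed above reduces to a simple fraction: rerun each recorded crash with many simulated human drivers in the responder seat and score the crash by how many of them avoid it. A minimal sketch, with invented function names and made-up example data:

```python
# Hypothetical sketch of the proposed metric: score each crash by the
# fraction of simulated human responders who avoided it. A low score
# marks a hard-to-avoid crash, where an AV doing better is notable.

def avoidability(outcomes):
    """outcomes: list of bools, True if that simulated human avoided the crash."""
    return sum(outcomes) / len(outcomes)

# Invented data: 8 of 10 simulated humans avoid the first crash (easy,
# likely just a distracted real driver); 1 of 10 avoid the second (hard).
easy_case = avoidability([True] * 8 + [False] * 2)
hard_case = avoidability([True] * 1 + [False] * 9)
print(f"easy crash avoidability: {easy_case}, hard crash: {hard_case}")
```

On this scoring, an AV that only matches humans on high-avoidability crashes is unimpressive; outperforming them on the low-avoidability ones is the interesting result.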
Deleted Comment