Once I'd had it pointed out to me that these articles go out of their way never to refer to people on foot as actual people, I just can't unsee it.
The only thing referred to as human here is the driver, and the only person whose safety is discussed is the driver. Clearly at some point during the composition process the writer got bored of using the term "pedestrian", because it looks like he hit up the thesaurus, discovered the term "biped", and decided that would be a good term to use. (But I bet you a fiver these cars won't stop for kangaroos or ostriches other than as part of some generic collision avoidance system.)
In the writer's defence, the sentence about the driver's safety is a mite ambiguous - but it could have been more explicit, so I'm going to be uncharitable here. And I do give him points for sneaking the term "run people over" into the last sentence, because he could so easily have referred to it as an accidental unavoidable pedestrian collision incident or something.
Entire cities are built around our love for cars — the ideology of the superiority and desirability of cars is deeply buried within the fibre of society. It could be that the author thinks they are just being neutral, while the truly neutral thing would be to describe this technology as something that regulates the interaction between equally valuable human beings.
This is car culture 101: expensive cars (and their drivers) are at the top of the food chain, cyclists and pedestrians at the bottom. Depending on where you live this might differ (e.g. in the Netherlands cyclists are on the same level as motorists, and you can really see it in the architecture).
Kind of like how they go out of their way to refer to people operating cars as drivers instead of actual people?
This kind of language is par for the course because these articles basically use a less terse version of whatever language the study or official report uses, and those kinds of documents tend to shy away from things like "people driving cars" and "people walking".
One of the side-effects of using "human centered language" tends to be needless verbosity and resulting clumsy writing as the human being writing the piece of writing labors to appropriately center the human being or human beings being discussed.
To me, the term "bipeds" seemed like a stylistic choice to emphasize the shortcomings of the systems -- the algorithms are not sophisticated enough to understand what a "person" is, so the author used a different word to emphasize that fact.
And the rest of the article is about a test with crash dummies. Why would the author talk about "people" when the tests were performed with dummies?
> And the rest of the article is about a test with crash dummies. Why would the author talk about "people" when the tests were performed with dummies?
Because the upshot of the tests is that if your car is equipped with ACAS that "detects pedestrians," then it may lull you into a false sense that your car will avoid running people over and killing them even if you stop paying attention. The reality is that the ACAS is not foolproof and you shouldn't rely on it. But these systems are likely to create a Volvo Effect, where the people who have them drive much more recklessly than people without them.
Just because they aren't perfect doesn't mean they don't make your car safer. The writing here almost sounds like the author would prefer not having it. Having AEB on your car can't make it more dangerous unless the AEB system accelerates into pedestrians. You have two options:
1. A car that only brakes when you react
2. A car that brakes when you react, and also sometimes when you don't.
The only way that AEB might conceivably make you less safe is if you pick up bad driving habits because you expect the all-knowing robot to keep you safe. This certainly applies to some people, but that's the person's fault, not the manufacturer's fault for rolling out systems that are better than nothing, even if they aren't perfect.
Edit: I'll acknowledge that unexpected braking events might mean that AEB-equipped cars have a higher risk of being rear-ended, but this test was about pedestrian safety. I'm including this edit just because I know someone will bring it up thinking it proves that they are smart, even if it wasn't the point of the study.
The trouble is that, humans being humans, drivers come to rely on these systems and decrease their own vigilance, ultimately resulting in greater danger to pedestrians.
There may be some evidence this is true for certain specific features, but I think particularly with AEB, the vast experience most drivers have with the feature is just the false positives. Most people just get annoyed by it and want it off.
The very few times someone will have a positive experience with AEB it’s likely to be a huge adrenaline inducing panic response. As in, oh fuck fuck fuuu.. phew!
I think it would take quite a bit of psychosis to actually drive in a way where you were “relying” on AEB.
I’ve never seen anyone “rely” on automatic emergency braking to drive more recklessly. Such a concept reminds me of an old Aerosmith video where some kid runs a car into a wall just because they know the airbags will save them.
>The only way that AEB might conceivably make you less safe is if you pick up bad driving habits because you expect the all-knowing robot to keep you safe. This certainly applies to some people, but that's the person's fault, not the manufacturers fault for rolling out systems that are better than nothing, even if they aren't perfect.
Not quite. Imagine you are driving along a narrow but straight section of road. The road is wet, there are leaves on the surface and shallow puddles of water. You're doing 50 mph (or 80 km/h) and you feel comfortable because visibility is good, there are no other drivers, no animals, you can see for hundreds of meters in front of you, and there are empty fields beyond one row of trees on each side of the road. Then suddenly a flicker of light or some shadow makes your AEB system brake hard as if trying to avoid hitting a pedestrian right in front of you. Your car's ABS does its best, but the leaves, the water and the speed mean your car starts sliding, and when the tires recover traction the whole car is angled 20 degrees to the left. You have no time to steer right to recover and you end up crashing into one of those thick trees still going 30 mph (or 50 km/h). If you're lucky you're ok. The car is totalled. Will you buy another car with that automated braking system?
You’re describing an absolute failure of the ABS system. If your car has such a massive flaw little else matters; it’s inherently unsafe. Granted, if you had turned off whatever version of vehicle stability assistance came with the car and ABS had some major defect it’s possible, but again, at that point you’re describing several failures.
I don’t mean to suggest the AEB system is without flaws, but if applying sudden braking causes the brake lines to rupture or whatever, that’s a serious failure on its own.
I didn't once have an unexpected braking event on my old '14 Mazda 3 Astina (6MT sedan w/ radar cruise as well).
LOVED that car so much. They just get the UX so right.
I tested driving through empty cardboard boxes three times from a steady 30 km/h, not moving the inputs a mm:
The first time it detected them and applied braking, but stopped 5 cm into them, only just knocking the top one off the bottom one.
The next two times it stopped just in front. Each time the engine stalled into stop-and-go mode, IIRC, and the hazards came on automatically.
(Was I one of only three drivers who ever verified the system's workings?)
In dense highway traffic, on a few occasions it alerted me to stopped traffic ahead when I was checking mirrors and tired. I was able to stop myself before AEB kicked in.
Found myself using radar cruise all the time after that, as it applies braking earlier.
Loved the rear cross sensor, especially when pulling out from 45° angle parking, and doubly so when parked next to a truck. One does not simply reverse into these. AFAIK Tesla still doesn't offer rear cross alert.
Did I mention BLIS with visual indicators in the mirrors? Or the fully functional HUD? All this for a fraction of the cost of a Tesla or other luxury vehicles offering the same - in 2014 this list was far shorter than today. Really miss it, would buy again if we had to have another 2nd car.
> Just because they aren't perfect doesn't mean they don't make your car safer.
My reading of this article is different: indeed they do make your car safer, if you're driving it. But we're being told that self-driving cars are just around the corner and regulators are beginning to allow self-driving cars on our streets. This article shows just how poor the technology is, and how we are nowhere near ready for self-driving cars on our streets.
>But we're being told that self-driving cars are just around the corner
I'm sure not hearing many people saying that these days. I'm not sure there's a real consensus on exactly where things will be in 10, 20, and 50 years. But there seems to be a pretty broad consensus at this point that pretty much nothing is "just around the corner."
Once most of the people with a vested interest in self-driving being imminent found they couldn't credibly keep to that storyline, things got quiet in a big hurry.
To be fair to self driving cars, none of the vehicles being tested are Waymo vehicles. There are obviously some that are much farther along on this than others.
> The only way that AEB might conceivably make you less safe is if you pick up bad driving habits because you expect the all-knowing robot to keep you safe. This certainly applies to some people, but that's the person's fault, not the manufacturers fault for rolling out systems that are better than nothing, even if they aren't perfect.
When you bring up fault, it implies you view this as a conscious change in behavior.
It could be. But many driving habits can't really be conscious decisions as they're judgement calls that one couldn't explain if pressed. If visibility is poor and you decide to drive slower than the speed limit, you are picking a speed that feels right. The way you approach an intersection or a curve with bad line of sight is based on your sensation of the possible risk.
It seems impossible to figure out how people respond to a moral hazard individually, but there is a strong enough signal in aggregate that the phenomenon is well known by insurers.
If the unconscious factor outweighs the benefits of the system, AEB is a net negative.
Like this article says, in city traffic that's stop and go, these systems could prevent rear-endings. But it could also make drivers more complacent and not pay as much attention in situations where these systems don't work as well (higher speeds on freeways, or with pedestrian detection).
I don't get it. Are you arguing that because the systems quietly encourage habits of mind that the human driver isn't responsible for knowing that these habits exist and correcting them? Because that subtly implies that human drivers shouldn't be at fault for anything the car does. And that's a pretty scary attitude for me considering that I walk and bike on the roads you drive on.
There are good rules of thumb for things like approaching intersections (slow down, move your foot to the brake pedal), and you can use e.g. the dotted lines of the road to decide whether you're going too fast for conditions. The rule of thumb is that you pick a dot that just enters your field of view, and then count out how many seconds it takes for it to disappear below your hood. If it's less than about 5, you are driving too fast for conditions and should slow down. And the rule of thumb for a safe following distance on the highway is 3 seconds for a stripe appearing from behind the car in front of you.
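The second-counting rules are really just speed-times-time arithmetic. A quick sketch of the 3-second following-distance rule (the function name and the specific speeds are mine, purely for illustration):

```python
# Rough arithmetic behind the "count the seconds" rule of thumb.
# The 3-second following-distance rule just converts speed into metres.

def following_distance_m(speed_kmh: float, gap_seconds: float = 3.0) -> float:
    """Distance covered in `gap_seconds` at `speed_kmh`."""
    speed_ms = speed_kmh / 3.6  # km/h -> m/s
    return speed_ms * gap_seconds

# At highway speed, a 3-second gap is a surprisingly long distance:
print(round(following_distance_m(100), 1))  # 100 km/h -> 83.3 m
print(round(following_distance_m(50), 1))   # 50 km/h  -> 41.7 m
```

Which is part of why counting in seconds works so well as a heuristic: the distance scales automatically with your speed.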
Euro NCAP has a category named "Vulnerable Road User Protection". Tests try to determine how forgiving the front of the car is when you run someone over.
Cars can get extra points in that category if they have AEB features. But if it turns out that AEB isn't effective, this just makes dangerous cars (like SUVs or vans) appear safer than they really are.
So I'd say it's fair to criticise AEB. It's misleading to claim "AEB makes your car safer for pedestrians" when in reality a car with a lower hood and bigger windows without AEB would be a lot less dangerous for pedestrians.
Right, but right now there is a real issue (https://www.google.com/search?channel=fs&q=tesla+driver+auto...) with people b̶e̶i̶n̶g̶ ̶t̶r̶i̶c̶k̶e̶d̶ ̶i̶n̶t̶o̶ believing that their cars are more autonomous than they actually are. It's good to assume that your car will not avoid accidents by itself.
Abrupt, unexpected braking could cause a rear-end collision, which could conceivably kill quite a few more people than if the braking had not occurred. Imagine a car braking hard directly in front of a 16-wheel semi-trailer. That could lead to a pile-up and quite a few people dead.
I don't know if you started typing your comment before I got my edit up or not, so I'll give you the benefit of the doubt.
Yes, unexpected braking events can cause rear-end collisions, but that was not the point of this study. If a pedestrian walks in front of a car unexpectedly you can either try not to hit them by braking (or possibly swerving if the situation permits), or you can just take the action you're implying and plow right on through them because your unexpected braking might cause some other people to get hurt.
My 2016 Subaru Forester with Eyesight has gotten confused and applied the brakes while weaving through the gates at military bases (these are low speed and lined with barriers), but other than that I have had no unexpected braking events in my car that weren't justified. In 40k miles the Eyesight system in my car has helped with about six close calls (I can't know if they would have become collisions), and I haven't been rear-ended. Right now I'm going to say that AEB systems add plenty of safety value to offset the increased risk of a rear-end collision.
A rear-end collision is caused by the following driver being too close. A car in front could at any time, for any reason, use full braking force, and you have to plan your following distance appropriately.
That said, a good way to get an angry rant from a truck driver is to talk about cars cutting in front so they no longer have an appropriate stopping distance.
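For a sense of scale on "appropriate" distance, here's the classic reaction-plus-braking kinematics (the 1.5 s reaction time and 8 m/s² deceleration are illustrative assumptions, not measured values for any particular vehicle):

```python
# Minimal stopping-distance sketch: reaction distance plus braking
# distance from constant-deceleration kinematics (v^2 / 2a).
# The reaction time and deceleration below are illustrative only.

def stopping_distance_m(speed_kmh: float,
                        reaction_s: float = 1.5,
                        decel_ms2: float = 8.0) -> float:
    v = speed_kmh / 3.6                      # convert km/h to m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

print(round(stopping_distance_m(80)))   # ~64 m at 80 km/h
print(round(stopping_distance_m(120)))  # ~119 m at 120 km/h
```

Note how the braking term grows with the square of speed, which is why a truck that loses its gap at highway speed cannot simply "brake a bit harder" to get it back.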
The same is true if there is a pedestrian, so the fault lies with the truck driver being too close to stop in time. Perhaps that means the law should mandate automatic braking technology on trucks.
How is this different from a human slamming on the brakes to avoid an accident? Or a human crashing into the truck? These situations can also lead to a rear-end collision the same way as AEB would.
It is irrelevant if AEB fails when a human doesn't; the human and AEB systems are complementary. When the human driver doesn't fail, then it's okay for AEB to fail. AEB only needs to pick up the slack when a human isn't fast enough to get on the brakes. Even if it only works in 10% of those situations, it's still preventing collisions.
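The "even 10% helps" point can be made concrete if you treat the driver and AEB as independent layers of defence (the independence assumption and the failure rates below are made up, just to show the shape of the argument):

```python
# With two independent safety layers, a collision needs BOTH to fail.
# All probabilities here are invented for illustration.

def p_collision(p_driver_fails: float, p_aeb_fails: float) -> float:
    """Chance a hazard becomes a collision when both layers must miss it."""
    return p_driver_fails * p_aeb_fails

# Suppose the driver misses a hazard 1 time in 1000, and AEB catches
# only 10% of those misses (i.e. fails 90% of the time it's needed):
baseline = 0.001                       # no AEB at all
with_aeb = p_collision(0.001, 0.9)     # 0.0009 -> still a 10% reduction
print(baseline, with_aeb)
```

Even a weak second layer lowers the combined failure rate, as long as it doesn't add new failure modes (like spurious braking) that outweigh the gain.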
Agree completely - Eyesight impresses me on a regular basis with the complexity of scenarios it seems to be able to react to, both when it comes to collision alerts / automatic braking and when used for assisted cruise control.
I haven’t had any experiences with pedestrians at speed in front of the vehicle, but the rear automatic braking in my Subaru is very aware of / sensitive to people walking behind it.
I’ve seen that NCAP video and was really impressed by it at the time. It doesn’t jibe with the results in TFA at all - where it sounds like the Tesla (and most of the other cars they tested) hit the dummy every time, and often never even slowed down.
I wonder how to explain the disparity. Could the video you linked have been taken with AutoPilot enabled maybe?
From reading the description it seems like AAA used tests where there was a bend in the road and / or the pedestrian was moving across so they weren't actually in the path of the vehicle until a couple seconds before impact. These systems are going to be conservative in classifying potential obstacles as in need of emergency braking, as they should be - you don't for example want your car slamming the brakes because it sees a pedestrian on the sidewalk next to the lane you're driving on.
That said, it should be noted that the Honda Accord actually did quite well in these tests.
Touchscreen dash; training people to expect their car will do stuff for them (beep if something is behind, alert if someone's in the next lane, brake if something is in the way, drive); cellphones (both drivers' and pedestrians'); pedestrian crosswalks in the middle of busy roads with new signaling most drivers have never been formally trained on; motorcycle lanes between car lanes; bike lanes on the side of the road, sometimes second lane from the side, sometimes on thruways. The list goes on and it's getting worse every day. Keep it simple, stupid.
FYI, worth reiterating, a deep learning vision system will not necessarily recognize a dummy as a human. Particularly if it operates in both visible and IR spectrum.
That's a fair point which also occurred to me while reading the article. It is, I think, indicative of a deeper issue with using ML in these sorts of safety contexts. If the only way to really test your safety system is to actually put people in danger your whole concept may be problematic.
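To make the dummy concern concrete, here's a toy sketch of the kind of thresholding such systems plausibly use (the labels, scores, and threshold are all invented; real perception pipelines are far more complex):

```python
# Toy illustration of why a crash dummy might not trigger braking:
# detectors emit per-object confidence scores, and only a "person"
# above a tuned threshold fires the emergency brake.

BRAKE_THRESHOLD = 0.8  # invented value for illustration

def should_brake(detections):
    """detections: list of (label, confidence) pairs from the vision stack."""
    return any(label == "person" and score >= BRAKE_THRESHOLD
               for label, score in detections)

# A live pedestrian might score high; a foam dummy with no gait or
# body heat may score low and be ignored entirely:
print(should_brake([("person", 0.95)]))  # True
print(should_brake([("person", 0.55)]))  # False
```

If the dummy systematically scores below whatever a real person scores, then dummy-based tests tell you about the system's worst case, not its typical behaviour - which cuts both ways when interpreting results like these.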
That's why Tesla's approach is pretty brilliant IMO. It's easy to collect samples where there was hard braking and there was a real, actual human visible in the path of the car while the car is under human control. No dummies are needed, and AI was not in control of the car, so there's no ethics issue either. Your Tesla will upload such samples automatically if Tesla deep learning system wants them.
So one good thing about the (perhaps unfair? perhaps disproportionate?) media attention that failures of semi-autonomous systems get: over the long term I'm pretty sure it helps focus the teams to make the tech better.
It's how aviation got insanely safe. Every passenger airplane crash is scrutinized. The very distortion in thinking that makes people think flying in airplanes is more dangerous than cars is what motivates every crash to be scrutinized.
> expensive Cars (and their drivers) are the top of the foodchain, cyclists and pedestrians are the bottom
And the modern police force.
> these articles basically use a less terse version of whatever language the study or official report uses
You are finding meaning where there is none.
https://usa.streetsblog.org/2018/03/28/how-coverage-of-pedes...
> there is a strong enough signal in aggregate that the phenomenon is well known by insurers
https://www.wired.com/2011/07/active-safety-systems-could-cr...
It's not nebulous at all.
> Having AEB on your car can't make it more dangerous unless the AEB system accelerates into pedestrians.

Sure it can. If "drivers" rely on this instead of being vigilant themselves, then it absolutely does make the car less safe.
Only if you’re tailgating. Don’t tailgate.
There is no evidence that they make your car safer, but there is evidence that AEB fails in common situations or in ways a human driver wouldn't.
https://www.caranddriver.com/features/a24511826/safety-featu...
https://youtu.be/cMiZa3HgRVE?t=125
The systems work as described and don't claim to work in every scenario.
https://www.tesla.com/sites/default/files/model_3_owners_man...
> It's how aviation got insanely safe. Every passenger airplane crash is scrutinized.
Not by media attention though; it took an authoritative, hard-line government body to reach that level.