Readit News
Gratsby · 10 years ago
I live in the area. The stinking buses around here... they crowd the lanes regularly. We human drivers are used to it, I suppose, but it is a serious pain in the butt.

The buses will drive right next to the lane marker, with their mirrors hanging over into your lane. This means everybody has to creep over a little bit into the driver's-side lane and hope everybody in traffic has a small enough car to deal with it. Otherwise, you have to hang back behind the bus as if it's in your lane and wait for it to get to a stop.

I've had my issues with Google autonomous cars (they drive slow and they used to be exceptionally slow at making right hand turns, causing traffic problems), but in this instance I'm happy to throw VTA under the bus, if you will, and lay blame 100% at their feet.

DBNO · 9 years ago
Here's a link to a video showing the damage to the Google car: https://youtu.be/wBU9zsGQR5k

Here's a link to the DMV Google traffic report: https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52...

From reading the above report, it seems it was officially only a one-lane road, but the road was big enough to handle two streams of traffic. The Google car could have just stayed in the middle of the road, but instead was hugging the right side of the road in preparation for a right-hand turn. Due to sandbags next to a storm drain, the Google car had to "merge" back into the one-lane road to get around the sandbags. Considering it's still a one-lane road, the bus driver should have yielded to any car that was in front of it. I'd place a majority of the blame on the bus.

What I don't understand: Why doesn't the Google car have video of the accident? Or if they do, does anyone know if they will share a video of it?

ocdtrekkie · 9 years ago
I suspect Google will render some of the fancy video of 'what their car sees' of this incident like they've used in previous marketing materials. Those fancy videos aren't rendered on the spot though! We'll probably see it soon.
megablast · 9 years ago
I would rather a bus any day. Easier to see, move slowly, less dangerously, fewer of them, more predictable, and mostly pretty courteous. Completely unlike car drivers.
lucaspiller · 9 years ago
When I was in London a few years ago I used to cycle to work, and the buses were the worst. Taxis would beep or gesture at you, but at least they gave you space.

One time I was stopped at a set of traffic lights waiting for them to turn green. It was on a tight bend and I was about 20cm from the line, and 50cm from the inside of the road. A bus came up and decided for whatever reason he had to get in front of me (even though there was a stop directly on the other side of the junction). There was enough space for a car, but definitely not for a bus as it was on a bend.

He slowly kept creeping forward, and the side of the bus kept getting closer and closer. I started banging on the window as he got next to me, but no, he kept going. I was far enough back that I was out of his blind spot. In the end I had to move onto the pavement.

I then walked to the front and banged on the door, gave him a few choice words and the driver just shrugged.

seizethecheese · 9 years ago
> move slowly, less dangerously

f = ma. The bus is considerably more dangerous than a car, even while traveling at a significantly lower speed. As the parent comment noted, there are certainly some aggressive bus drivers out there as well.
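To put rough numbers on that (a minimal sketch; the masses and speeds below are assumptions, not measurements): what matters in a collision is closer to kinetic energy, E = ½mv², and a loaded bus carries so much mass that even at low speed it outclasses a faster car.

```python
# Back-of-envelope comparison of crash energy. Assumed figures:
# ~12,000 kg for a loaded transit bus, ~1,500 kg for a sedan.
def kinetic_energy_kj(mass_kg, speed_mph):
    """Kinetic energy in kilojoules for a given mass and speed."""
    speed_ms = speed_mph * 0.44704  # convert mph to m/s
    return 0.5 * mass_kg * speed_ms ** 2 / 1000.0

bus = kinetic_energy_kj(12000, 25)  # bus at a modest 25 mph
car = kinetic_energy_kj(1500, 40)   # car at a much faster 40 mph
print(f"bus @ 25 mph: {bus:.0f} kJ, car @ 40 mph: {car:.0f} kJ")
# The slower bus still carries roughly three times the car's energy.
```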

x3n0ph3n3 · 9 years ago
You would rather what a bus?
bitshepherd · 9 years ago
Except in the Bay Area, where bus drivers move those slow, more predictable machines as if they were sports cars.
superuser2 · 10 years ago
Indeed. Buses are held up as a solution to traffic, but they are terrible things to share the road with as either a driver or cyclist. I wonder how many people have to be on the bus for that many cars to be as annoying as a single bus.
jkyle · 9 years ago
I've been cycling as a primary mode of travel for about 10 years now, in LA and in the Mountain View area. So, a pretty broad range of experience, from batshit-insane metro traffic to comparatively bucolic suburban commuting.

Car drivers are orders of magnitude more of a threat to my safety than buses. Buses are far more predictable and tend to change their vector of travel more gradually when they do change. Also, the drivers tend to be much more alert and aware of their surroundings.

Also, in general and particularly during commuter traffic, they really aren't that much slower aside from the stops.

I've also never, ever had a bus driver intentionally try to harm me (like trying to 'muscle' me off the road). This has happened on multiple occasions with car drivers.

edit

I'd also like to mention I love the google cars. They're just so predictable and courteous. And in general, they just follow the rules.

For example, take a stop sign that a human driver reaches first. They have the right of way, but they want to be polite, so they'll try to wave a cyclist on... then start to go... then stop and wave... then start to go. There's nothing more dangerous to a cyclist than a driver not following the rules and behaving erratically.

Google car? Stops. Waits. Goes. No fuss. I have seen the older models get "stuck" and wait longer than normal. Though usually this is when other drivers refuse to go before the google car.

sharkweek · 10 years ago
Oooh fun thought exercise - My guess is about four cars equal the annoyance of a bus over the span of a year. Four randomized drivers will on average be slightly more annoying than an average year for one bus.

Taking into account:

1) Number of drivers removed from the roads by average annual bus use

2) Size of the vehicles

3) Random acts of annoyance

4) Dangers posed by each

To confess my bias - I'm a huge fan of mass transit and a bike commuter who touches his car maybe once a week, but curious how else one might model this.

jameshart · 9 years ago
most of the buses I see round here are full of either school kids or elderly folks. I suspect the annoyance they would cause by becoming drivers might be considerably more than they do as bus passengers.
esolyt · 9 years ago
Seems like Google acknowledged this:

"From now on, our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles"

mojuba · 9 years ago
I will probably get downvoted for this, but: this is why I am skeptical about self-driving vehicles. Driving in general and adjusting to each country/locality, then to vehicle types is such a human thing that you'd need a thousand more "deep understanding" things like this.

It will be an endless process for an algorithm-driven car. Buses in Dublin will be so much different from buses in London, or trucks in the US from trucks in Germany. Then there are bikes and pedestrians. Human drivers, bikers, and pedestrians look at each other and know what to expect. These are the things that would be incredibly difficult to formalize, if possible at all. Unless you have something like a Deep Mind on board, but even then I wouldn't be so sure.

nicholas73 · 10 years ago
I especially loathe the double length buses. Seriously, why? Not like these buses get all that full. Just run them more often at peak hours.
konspence · 10 years ago
That costs far more (in fuel, maintenance, and drivers), and typically places that have articulated buses warrant having articulated buses.
mnw21cam · 9 years ago
Running buses more frequently has its own problems. When a bus is running slightly further from the one in front than it should, it has to stop more often for passengers, and those passengers take longer to buy their tickets, which makes the bus run even later. This is a positive feedback loop that is made worse when buses are more frequent. I have certainly seen a set of four buses coming as a convoy on a route that is meant to have them every 12 minutes.

The solution is to either run fewer bigger buses, or to inject a whole load of negative feedback into the system, either by having buses wait for a minute at a few stops until their scheduled departure time (which involves making the route slower) or by real time tracking with radioed instructions to the drivers, like they do on some underground train systems.
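That positive feedback loop can be sketched in a few lines (a toy model with made-up numbers: dwell time grows with the gap behind the bus ahead, so lateness compounds geometrically):

```python
def headway_after(stops, initial_delay_min, gain=0.1, schedule_min=12.0):
    """Toy bunching model: a bus running `initial_delay_min` off its
    scheduled headway picks up proportionally more (or fewer) passengers
    at each stop, so its deviation grows by `gain` per stop."""
    deviation = initial_delay_min
    for _ in range(stops):
        deviation *= 1 + gain
    return schedule_min + deviation

late_gap = headway_after(20, +2.0)  # bus that started 2 minutes late
next_gap = headway_after(20, -2.0)  # the bus following close behind it
print(f"late bus gap: {late_gap:.1f} min, follower gap: {next_gap:.1f} min")
# The late bus's gap roughly doubles while the follower closes to
# nothing: the convoy of buses on a 12-minute route.
```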

sschueller · 9 years ago
We have triple length buses [1] in some areas of Zürich. They are all electric and in large parts of the route have their own lanes.

Buses need bus lanes or they will not be on time and add to traffic.

[1] http://www.bus-bild.de/bilder/hess-trolleybus-bgt-n2c-16700....

vacri · 9 years ago
> Just run them more often at peak hours.

Every bus you run requires a qualified employee to drive it, and putting them on for only an hour or two per shift would make it even more expensive (what employee would want to work such minor shifts?)

kennydude · 9 years ago
Bendy buses? They're great for accessibility but for some reason where I live we have double deckers everywhere :(
coin · 9 years ago
Would need more drivers. I suspect personnel are the largest cost.
cpprototypes · 10 years ago
There is a common assumption that humans = bad drivers. And that these self driving cars will be much better than people at driving. I think this belief greatly underestimates the difficulty of what Google and others are trying to do.

Humans are actually great drivers when the situation requires thinking. Lot of snow and can't see the lane markers? Millions of people adapt every day to this during winter. Lots of pedestrians, bicycles, motorcycles, etc. doing somewhat unpredictable things? Again, look at any Asian mega city, people can adapt very fast. Basically, if the situation requires being alert, people are very good at driving.

When are people bad at driving? Whenever it's monotonous: bumper-to-bumper traffic, a free-flowing freeway, constant repetition of red/green cycles while going down a suburban street. These boring situations make a lot of people basically turn off their alert thinking. Then they do other things like texting, talking on the phone, etc.

And these boring situations are exactly where AI self driving is better at driving than people. Computers never get bored. The self driving car will be at 100% attention even during the most boring traffic. But when boring suddenly turns to not boring? The current state of AI is very very bad at this.

And unfortunately there's no easy way to use the best of both sides. If the AI is fully driving, then it will do great while the driving is boring. But by the time the AI decides, this is too much, can't handle this, it's too late to alert the human driver to take over. But if the human driver is required to always pay attention, what's the point of the self driving car?

Self driving cars will get there someday, but I think it's much farther away than many assume it will be.

DannyBee · 10 years ago
"Lot of snow and can't see the lane markers? Millions of people adapt every day to this during winter."

Having driven a corridor with a ton of snow quite a lot (western PA to Washington, DC), I'm going to strongly disagree with this.

People don't "adapt". They just do it anyway because they have to. The rate of minor and major fender benders in this kind of situation is really high, it's just "acceptable".

In truth, computers that understand the physics of what is happening to the car at a given point in time based on tons of sensor data are going to be much more effective drivers in snow and ice than people (because of even simple data it can use like "this wheel has this much traction currently, so i need to correct this much"). People are really bad at figuring this out in their head and use horrible heuristics, often hitting other cars, guard rails, you name it.

You could also get data about surrounding conditions (i.e. the ice the wheel will hit next, or whatever) that people can't, because they are not close enough or can't process it fast enough (i.e. not just what is going on with the car, but what is the state of the ice on the road 2 inches to the right or left of the car, and would it be better for the car to steer in that direction to get more traction, etc.).
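A minimal sketch of that per-wheel idea, assuming a slip estimate is available (the target slip, gain, and interface here are invented for illustration; real traction control is far more involved):

```python
# Toy traction control: back off drive torque when a wheel spins faster
# than the vehicle is actually moving. Tires grip best at a small
# positive slip ratio, so we only intervene above that target.
TARGET_SLIP = 0.15  # assumed peak-grip slip ratio
GAIN = 0.5          # assumed torque-reduction gain

def torque_command(requested, wheel_speed_ms, vehicle_speed_ms):
    """Return the torque to apply given one wheel's measured speed."""
    if vehicle_speed_ms <= 0.0:
        return requested
    slip = (wheel_speed_ms - vehicle_speed_ms) / vehicle_speed_ms
    if slip <= TARGET_SLIP:
        return requested  # wheel is gripping; pass torque through
    # Wheel is spinning: reduce torque in proportion to excess slip.
    reduction = GAIN * (slip - TARGET_SLIP)
    return max(0.0, requested * (1.0 - reduction))

print(torque_command(100.0, 11.0, 10.0))  # slip 0.10: full torque
print(torque_command(100.0, 16.0, 10.0))  # slip 0.60: torque cut
```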

zanny · 10 years ago
This is it. There is nothing inherent to CV that makes self driving cars unable to operate in snow. Humans already suck massively at it (and in the rain).

I remember driving home in a massive rainstorm at night about two years ago with around 2 meter visibility and was going like 20mph because I could not see shit. I had less sensory data than a computer with extra vectors of vision beyond sight would have had, and my reaction time is a ton slower.

Then a year later I was driving in icy conditions and counted five cars that either went off the road or rear-ended each other in the span of 15 minutes because of the slick surfaces. I imagine self-driving cars would do a much better job of slowing down and budgeting for potential slippage than impatient humans.

There is nothing really magical about our eyes that give us better clarity in awful driving conditions, and we don't have radar and GPS fed into our brains to give us a clue where the roads are supposed to be - we just rely on historical context and past driving experience on the same roads.

How many hours would it take a networked fleet of self-driving vehicles to outdo my historical context for driving on any of the roads I've driven on for a decade? Two? Maybe three? You just need to get the tunable parameters right through some good old genetic-algorithm work, like what Google is doing now with its test fleet. And while it would be impossible to guarantee perfect performance in every scenario (inevitably an earthquake is going to make a self-driving car crash where nobody could have prevented it), it just has to be better than us, and that has almost certainly already happened.

JaRail · 10 years ago
To be fair, a lot of those concerns are handled pretty well by traction control systems, etc. I drive in the snow a lot and really appreciate the things my car does to micro-manage traction.

The main rule for driving in snow is to assume everyone else is actively trying to kill you. :)

smoyer · 9 years ago
I know that corridor - and I agree with you. Unfortunately, there's also a huge number of people that don't treat snow with the respect it deserves. If you live in WV or OH (or in PA near enough to the border to fake it) you still don't need to have your car inspected. In PA, you can probably still find a mechanic that will slap a sticker on anything, but it's getting harder.

In any case, you need good tires and working brakes to navigate snow effectively. Anti-lock brakes (in my opinion) are no better than a person who can properly pulse their brakes, but... if you've got even a slight warp in your rotor, you won't brake evenly enough to stop in the snow. The other problem in PA is PennDOT's "new regime": there's been far more ice packed onto the roads when it snows lately, because melting snow that's packed under tires turns to ice. I'd far rather drive on dry, crunchy snow (for traction).

/rant

ninjakeyboard · 10 years ago
In Toronto I've seen many drivers on the highway splitting lanes because they have no idea where the lanes actually are.
bisby · 9 years ago
We can also add digital lane markers, or whatever the case may be. If we build future infrastructure with autonomous cars in mind, they can easily be better in those situations too.

The hard part is that we're trying to force self-driving into our current infrastructure and traffic. If the bus had been self-driving too, the two vehicles would have communicated properly and merged perfectly.

rayiner · 10 years ago
Some of this stuff about self-driving cars sounds to me like the "sufficiently smart compiler" that can optimize away the overhead of certain features of Lisp. Meanwhile, Uber thinks my house is in the middle of I-83.
dorfsmay · 10 years ago
True, the number of fender benders on the first snow of the season is very high because people haven't adopted the winter braking-distance model in their heads yet.
rqebmm · 10 years ago
> Lot of snow and can't see the lane markers? Millions of people adapt every day to this during winter.

This could not be less true among people driving in unfamiliar conditions. (http://legacy.wusa9.com/story/news/local/virginia/2016/01/20...)

The difference is that all self-driving cars benefit from the knowledge created by bumping into the edge cases. People can only do this in an ad-hoc manner (i.e. personal experience, taught by mentor, etc)

mhurron · 10 years ago
It doesn't even have to be in a place that doesn't get much snow.

First snowfall in Calgary looks like what you get in a standard snowstorm in a city in the southeast. It's even funnier if the first snowfall comes early and is followed by another two months of sun and no snow; then you get a repeat during the second snowfall. People have to relearn how to drive in snow every year.

People are bad drivers in good conditions, they just happen to be able to be bad drivers in bad conditions as well.

striking · 10 years ago
I think the poster assumed that the person was trained to fit their surroundings. We are all hyper-fitted to what we know best and what we encounter every day, and will probably fail spectacularly if we're unaware of that.
roymurdock · 10 years ago
I have a $100 bet going with a coworker at the office.

Our definition of self-driving cars: when I can pull out my phone in any of at least 5 major metropolitan areas around the world, order a driverless car, have it pick me up and deliver me to a specified location within the city in a timely fashion, with as little risk as getting into a taxi.

He says sub-20 years. I'm thinking more like the 40-50 mark.

My thesis? We can solve the technical challenges, but so much physical, legal, and regulatory infrastructure needs to change to make this viable that it will be more than 2 decades until we see self-driving cars as a reality across multiple cities. Of course, I would be more than happy to lose the $100 as it would make everyone's lives better that much sooner. But I'm skeptical.

smileysteve · 10 years ago
> He is sub 20 years. I'm thinking more like 40-50 mark.

Perspective:

* 60 years ago, we didn't have the Interstate highway system.

* 45 years ago, seat belts weren't required.

* 40 years ago, we expected engines to last tens of thousands of miles. We didn't really understand crumple zones. Open containers, and even drinking while driving, were common.

* 30 years, we added fuel injection, computers, and airbags.

* 20 years ago, the hybrid car, better fuel efficiency, and side airbags

Imagine what we'll know tomorrow.

justin66 · 10 years ago
>so much physical, legal, and regulatory infrastructure needs to change to make this viable that it will be more than 2 decades until we see self-driving cars as a reality across multiple cities.

It could be a lot more straightforward than you suspect. The three things we have going for us:

1. Car accidents are almost always resolved by deals made between insurance companies (this will be doubly true with unpiloted cars, which will presumably not commit criminal acts)

2. The same few large companies are likely to deal in insurance of piloted and autonomous vehicles, so there's not a huge incentive to go sue-crazy

3. If autonomous vehicles cause significantly fewer accidents and companies can charge the same amount for insurance covering them, that is a huge win for some very large companies with the power to get the ball rolling.

Increased vehicle safety will be a big win for these companies in the short term, although in the long term some price discounts might be factored into the cost of autonomous vehicle insurance.

a-priori · 10 years ago
I have a bet going on with my wife that our 1-year-old daughter will never take driving lessons when she grows up.
baddox · 10 years ago
Call me a technology optimist. I'd bet 10 years.
Ensorceled · 10 years ago
I think you are seriously underestimating what the baby boomers will do when they have to start giving up their driver's licenses because they are getting too old to drive.
petra · 10 years ago
>> but so much physical, legal, and regulatory infrastructure needs to change

Let's go with that. Say that tomorrow Google finished the development of the car. It works very well.

Then it just needs to find a single city that would agree to a wide-scale deployment. And I imagine there are plenty of reasons why a city might want to do that. Heck, if Google cared, they could legally bribe politicians. And it doesn't even have to be a city in the West. It could be a city ruled by a dictator somewhere in Africa. Or it could be a self-driving car created by a Chinese company, and China would be willing to take the risk to lead in this field.

So maybe the first city isn't as hard as it seems.

Now what happens after a car has proven itself in a city for a year or two, showing almost zero accidents and all the other benefits of a self-driving car? How much will people everywhere want it? How much political support would it gather? I imagine a lot.

Would such support and proof be enough for relatively rapid deployment of self-driving cars?

zodPod · 10 years ago
Careful, that $100 probably won't be worth shit in 20 years let alone 40-50.
rqebmm · 10 years ago
You're right that there are significant legal and regulatory hurdles to clear, but I think they're more of an issue for broad adoption. SF will probably have a driverless taxi service within the next 5 years. Once that's in place, only the strength of the taxi lobby stands in its way, and Uber has shown that can be broken even in places like London.

If you're talking about _everyone_ in major economies having a driverless car, yeah, I think 40 years is realistic given the replacement rate for cars alone, nevermind the legal issues.

randyrand · 9 years ago
Humans overestimate tech advancement 3 years into the future, but underestimate it 10 years into the future.

Food for thought.

Retric · 10 years ago
Considering the number of accidents after major storms, I think you're overstating the case. People are bad drivers in all conditions.

The problem with driving is you have to consistently do it well. You can make zero mistakes 99.99% of the time and still be a terrible driver. At roughly 30 minutes of driving a day, one accident per 10,000 minutes would be ~one accident every year. Worse, a lot can happen every minute, so it's closer to not messing up for 600,000+ seconds.

The only reason people can drive is we build in very wide tolerances. ex: Car lanes are far wider than strictly necessary, you can fit ~2 cars in the average lane. Traffic lights have a significant delay where they are all red etc.
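The arithmetic behind that consistency point, assuming roughly 30 minutes of driving a day (which is where a 10,000-minutes-a-year figure comes from):

```python
# A "99.99% reliable per minute" driver still expects about one
# mistake a year, because small error rates compound over exposure.
minutes_per_year = 30 * 365          # ~30 min of driving a day
p_ok_per_minute = 0.9999

p_clean_year = p_ok_per_minute ** minutes_per_year
expected_mistakes = minutes_per_year * (1 - p_ok_per_minute)

print(f"chance of a mistake-free year: {p_clean_year:.0%}")
print(f"expected mistakes per year: {expected_mistakes:.2f}")
# Roughly a one-in-three chance of getting through the year clean.
```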

TrevorJ · 10 years ago
I think it's more accurate to say that people are unreliable drivers. Humans are certainly capable of driving very very well, but they perform unreliably.
bsder · 10 years ago
> The only reason people can drive is we build in very wide tolerances.

And other drivers react to minimize the problem created by another driver.

ender89 · 10 years ago
Actually, it sounds like the self-driving car got muscled out by the bus. They claim that both the car and the test driver thought the bus would let them through. This is a problem with the car not driving defensively enough: either the car miscalculated the bus's ability to slow, or it assumed the bus would be able to see the car and react accordingly (or, just possibly, the bus driver is an ass who tried to shut out the car. I know, it's practically unheard of, but it's possible). I don't think this is a death blow to self-driving cars, nor a sign that they are worse at driving than people; only that they are still being perfected, and that the single greatest threat to a self-driving car is people, because they are ultimately unpredictable.
Tenhundfeld · 10 years ago
Anecdotally, this is a lesson I too had to learn, almost the hard way. The closest I've ever come to accidents in city driving is with municipal buses. City buses seem to drive as though they always have the right of way, and other cars will always get out of their way.

Coming from a smaller, more suburban area, I learned to recalibrate my driving to be more aggressive in larger cities (primarily NYC & DC), to hold my own against pushy taxis, etc. Most city drivers kind of bluff, and if you show you're not falling for it, they'll stay out of your way.

But city buses don't bluff. If they start pulling into your lane, even if clearly cutting you off, you better slam on those brakes, because that bus is not stopping. It's one of those unwritten rules you learn, at least in my experience driving in major East Coast cities.

I can see how it'd be hard to teach an AI that you have to adjust your driving style depending on your locale and on specific types of non-emergency vehicles.

I'm not making an argument here, really just sharing a chuckle about the crash involving a city bus.

baddox · 10 years ago
I'm not convinced that the distinction between "situations requiring thinking" and "boring situations" is that strong. You seem to be implying that self-driving cars will probably be worse than humans at things like negotiating through city traffic (where this accident apparently occurred), but is there any evidence of that?

It's true that most deaths and serious accidents occur on highways, where human factors like boredom and sleepiness are probably significant. But humans also have a heck of a lot of fender benders like this incident, and a lot of the ones I have seen weren't exactly situations where "a lot of thinking" was necessary. People are texting and rear-end the person in front of them, or underestimate how much room they have to switch lanes between two cars, or other things like that. I don't see why these situations are uniquely difficult for AI to solve. In fact, they seem like very apt problems for computers to solve.

jonknee · 10 years ago
On the other hand, Google has driven a lot of miles without an incident, and this minor accident doesn't change the fact that their system is safer than a human behind the wheel. And unlike humans, who don't improve after an accident, Google's whole fleet just got safer.
Slartie · 10 years ago
They have driven millions of miles, but in perfect driving conditions: great weather, lots of highways, all of it by day. (How do I know that? Simple: Google would brag about their AIs being able to drive in rough situations if they could. But they don't; they just brag about a huge number of miles in undefined conditions...) That is exactly what the original poster criticized. When driving is so easy that it bores human drivers to death, today's AIs can be better.

The problem is: real-life driving does not only consist of situations that even a child could easily manage. Humans are extremely good at letting their driving skills degrade gracefully when conditions get rough. AIs? They drive perfectly until they reach their limit, then they suddenly pull off epic fails - if no human driver is on standby to resolve the situation immediately.

ocdtrekkie · 10 years ago
This is false. Google's cars have only gone as far as they have without an incident BECAUSE of human drivers. In a 12 month period, human drivers prevented Google Self-Driving Cars from causing ten accidents. And 272 times the car's software outright failed and dropped control of the car to the human driver. This is all in Google's recent report to the California DMV, but it's not a reality they like to advertise openly.

Statistically, Google's Self-Driving Car would've lost its license by now, if not for human drivers keeping it in check.

jfoutz · 10 years ago
It's hard to say from such a short description, but as I read it, the Google car never left its lane, just moved right to the edge of its lane to avoid some debris.

Why did the bus move into an occupied lane?!?

jonknee · 10 years ago
> Why did the bus move into an occupied lane?!?

It depends on the city, but bus drivers often don't care whatsoever about other vehicles in the way. They merge and let others avoid contact.

dragonwriter · 10 years ago
> Why did the bus move into an occupied lane?!?

Because that's what buses do all the time, whether or not it is safe, legal, or desirable.

arthurgibson · 10 years ago
Sounds like we need a self-driving bus.
ocdtrekkie · 10 years ago
It sounded like it was merging into the bus's lane. But I imagine Google will prep some very fancy videos of the incident from the car's perspective within the next week or so, and then we'll know for sure.
aidenn0 · 9 years ago
You got that backwards. They moved right in their lane to slow down for a turn (while allowing traffic going straight to pass), and then had to move back into the stream of traffic as there was debris preventing them from turning on the far-right side of the lane.
duderific · 10 years ago
From the writing in the article, I found it extremely difficult to get a good visual of exactly what happened in the accident. I wish the description was a little more, uh, descriptive.
zv · 9 years ago
At least in my country, buses have the legal right of way when they are pulling out of a bus stop.
benologist · 10 years ago
Bad drivers take on a whole new meaning outside of wealthy countries, where dangerous drivers routinely end up disqualified from driving. Just by providing a predictable vehicle that everyone can trust to stop and stay stopped at red lights, stop at stop signs, etc., AI will bring a lot to the world.

    Seventy-four per cent of road traffic deaths occur in middle-income
    countries, which account for 70% of the world's population but only
    53% of the world's registered vehicles. In low-income countries it is
    even worse: only one per cent of the world's registered cars produce
    16% of the world's road traffic deaths.
https://en.wikipedia.org/wiki/List_of_countries_by_traffic-r...
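Taking the quoted shares at face value, the implied deaths-per-vehicle disparity is stark (this is just arithmetic on the numbers above):

```python
# Ratio of share-of-deaths to share-of-vehicles; > 1 means a group
# suffers more deaths than its vehicle count alone would predict.
groups = {
    "middle-income": (0.74, 0.53),  # 74% of deaths, 53% of vehicles
    "low-income":    (0.16, 0.01),  # 16% of deaths,  1% of vehicles
}
for name, (deaths, vehicles) in groups.items():
    ratio = deaths / vehicles
    print(f"{name}: {ratio:.1f}x deaths relative to vehicle share")
# Low-income countries see about 16x the deaths per registered vehicle.
```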

josho · 10 years ago
People are bad drivers. Every winter at the first snowfall there are tons of accidents because folks forget to adjust their following distances.

The promise of an AI driver is that there will never be a second accident of the same type. The only question is how fast the learning rate can ramp up.

spuz · 10 years ago
The problem you are stating is called the 'handoff' problem, which I believe will be a serious challenge for manufacturers of driverless cars. The handoff problem is when an automated system identifies a scenario in which it needs to hand over control to a human. How will that work in a vehicle that is not designed to be occupied by someone who knows how to control it? This problem has been written about in the book Our Robots, Ourselves [0] and there is an Econ Talk podcast about it as well [1].

To me, it makes sense to make use of the brain power provided by the occupant of the vehicle. For example, if one of the main vision sensors in the car breaks, why not let the human occupant drive the car manually?

[0] http://www.amazon.co.uk/Our-Robots-Ourselves-Robotics-Autono...

[1] http://www.econtalk.org/archives/2015/11/david_mindell_o.htm...

perlgeek · 10 years ago
From a safety engineering standpoint, handoff is a very, very bad idea. When you automate the common operation of a system, how is the operator supposed to get a feeling for the system? And when the operator doesn't handle the system regularly, how is he supposed to handle a situation that's so dangerous and unusual that the automation can't handle it?

If you have years of experience driving a car, maybe it'll work. For some time. But what if your last manual intervention was five years ago? Or twenty years ago? The rarer handoff happens, the more stress it puts on the manual operator, and the more likely things go wrong.

It's much better to have a safe default reaction (stop the car, shut down the reactor, ...) that kicks in when normal, automatic operation can't continue.
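A sketch of that 'safe default' policy (the states and confidence thresholds here are invented for illustration; nothing in the thread specifies how real systems grade their own confidence):

```python
from enum import Enum

class Mode(Enum):
    NORMAL = "normal driving"
    DEGRADED = "slow down, hazards on"
    SAFE_STOP = "pull over and stop"

def next_mode(confidence):
    """Map a 0..1 self-assessed confidence score to a driving mode.
    Instead of handing control to a human mid-crisis, the car degrades
    toward a safe default it can always execute."""
    if confidence >= 0.9:
        return Mode.NORMAL
    if confidence >= 0.5:
        return Mode.DEGRADED   # keep driving, but conservatively
    return Mode.SAFE_STOP      # the safe default: stop, never hand off

print(next_mode(0.95).value)  # normal driving
print(next_mode(0.30).value)  # pull over and stop
```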

bryanlarsen · 10 years ago
As far as I understand, the current behaviour of self-driving cars is to stop safely in a safe place, to the best of their degraded ability. Once the car is stopped, the human can take over. If they can't, you're stuck on the side of the road and you call a tow truck.

Which isn't really any different than any other sort of major mechanical failure.

a-priori · 10 years ago
Two options for how a driverless car without human controls might handle these sorts of hand-off scenarios:

1) It hands control to an out-of-car human driver, similar to how military drones and driverless trains are operated now.

2) It turns on the hazard lights and pulls to the side of the road. Another vehicle is dispatched to pick you up.

rubicon33 · 10 years ago
Hold up. Just because Google's problem is a big one to solve, doesn't refute the point that human drivers are bad.

Yes, driving is actually a challenging task. But guess what, humans are BAD AT IT. And are computers going to be better? I'd be pretty damn surprised if they weren't.

Most crashes come down to distraction. Computers never get distracted. So if we can work out errors like the one in this story, then at least we can say the computer isn't going to get lazy or distracted and suddenly forget its lessons learned.

Yes, it's a challenging road ahead for these self-driving cars. But I'd feel so much safer in a world of autonomous vehicles than I do with the morons behind the wheel today.

StreamBright · 9 years ago
Not all humans are bad drivers, but some of them definitely are. I have been living in the US for 4 years, and here is a list of things I see that are extremely dangerous:

- not using signals at all, just randomly changing lanes

- not using signals and, on top of that, not caring about traffic in the other lanes

- driving 45 mph in the left lane, because California teaches new drivers that changing lanes is dangerous, aka "stick to your lane" (I learned that recently and could not believe it)

- driving with the exact same attitude in the rain as in dry conditions

- not caring about people merging: not looking at them, pretending they don't exist

- being on Facebook while driving at 65 mph on the highway

- not caring about bicycles, pretending they are not legitimate traffic

I am not sure about the rest of the US, but several friends told me this is specific to California. I would argue that humans are not such good drivers; at least, a large share of drivers aren't. Replacing drivers with software is always going to win, just as it did with airplanes and every other repetitive task we have automated. I am pretty sure there will be a period while the AI is not as good as humans, but that is why Google is collecting data at large scale: to improve it until it beats humans and makes driving a safer way to commute.

petra · 10 years ago
>> And unfortunately there's no easy way to use the best of both sides.

It doesn't seem too hard to move in that direction. For example, you could use satellites, drones, or other sensing mechanisms to survey a city in real time and tell the driver X minutes in advance that he will need to take control of the wheel. You could, of course, adapt this recommendation according to the skill of the person, his current state (being drunk), or other data.

bigiain · 10 years ago
Not sure that'd work: "Sorry I mowed your child down, but it wasn't my fault, the computer didn't tell me I needed to pay attention". People are in general extremely quick to deflect blame from themselves (especially under stress) no matter _how_ stupid or implausible their on-the-spot excuse is.

(Source: I'm a motorcycle rider, we have a term here "SMIDSY", which stands for "Sorry mate, I didn't see you" - because it's such a common thing for drivers to say when they've just driven into you. There's a great quote from a judge a few years back in a court case over a driver excusing themselves for running into a bike "But the plaintiff was _clearly_ there to be seen.")

Symbiote · 10 years ago
> Again, look at any Asian mega city, people can adapt very fast.

Have you seen the accident rate in these places? It's huge, and in many countries underreported.

mirkules · 10 years ago
Personally, I feel we are approaching the self-driving situation backwards. We are so heavily invested in current technology that we are trying to shoehorn existing tech into doing something it was never designed for: roads were not built with autonomy in mind in the first place.

Imagine how much easier it would be, for example, if roads had embedded markers (like magnets or RFID chips) that designated lanes, speed limits, and other road rules and weather conditions. AI vehicles would simply need to read this information and adjust accordingly.

But, unfortunately, this would be almost infeasible in today's world because a) it depends on every car following the same rules, which is impossible if you have human drivers, and b) it would require massive investment and national-level cooperation to change the infrastructure.

toomuchtodo · 10 years ago
> Imagine how much easier it would be, for example, if roads had embedded markers (like magnets or RFID chips) that designated lanes, speed limits, and other road rules and weather conditions. AI vehicles would simply need to read this information and adjust accordingly.

Speed limits can be defined as metadata on existing mapping data. Existing vehicles, such as Teslas with their Autopilot system and GPS receivers, can "define" existing lanes by refining existing mapping data, using their drivers as expert-system trainers [1].

You don't need national-level cooperation to change infrastructure; you just need a massively automated system for collecting data and training based on it. Google and Tesla are doing both, just in different ways.

[1] http://www.teslarati.com/tesla-building-maps-autopilot-drive...

Gratsby · 10 years ago
Let's say for instance we wanted an RFID indicator for each lane every 200 feet.

You're talking about 13 million RFID chips that would need to be installed on 164,000 miles of roadway - just to get the interstate highways covered.
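For what it's worth, the arithmetic roughly checks out if you assume an average of about three lanes across that mileage (the lane count is my assumption, not something stated above):

```python
# Back-of-the-envelope check of the RFID chip count claimed above.
FEET_PER_MILE = 5280
INTERSTATE_MILES = 164_000   # interstate mileage cited in the comment
CHIP_SPACING_FT = 200        # one chip per lane every 200 feet
AVG_LANES = 3                # assumed average lane count (my assumption)

chips_per_lane = INTERSTATE_MILES * FEET_PER_MILE // CHIP_SPACING_FT
total_chips = chips_per_lane * AVG_LANES
print(f"{chips_per_lane:,} chips per lane, {total_chips:,} total")  # ~13 million
```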

Now think about the fact that these chips are going to need to sit static and go unreplaced for more than 15 years at a time. So you have to come up with a way to house them such that they will not be affected by heat above 150°F or cold below -50°F.

And then think about the fact that it takes ~6 months for the government wheels to spin in the right direction for a pothole to be filled in.

The project to put RFID, or any new technology, on the roadways in the US will have to involve federal cooperation as well as the governments of 50 states and countless local governments. It will be 100 times more expensive than any other venture using the same technology.

Look at the Big Dig project in Boston. It took 25 years, $24 billion and several acts of congress to get it to a state where they could start working out the kinks like "why do the guardrails keep killing people".

The scale of solving this problem "the right way" is incredible. I don't disagree with you entirely. But I think any change like that is a solid 100 years out into the future.

VLM · 10 years ago
It wouldn't help very much.

We need visual processing and heuristic logic to handle a kid on the side of the road chasing a ball: to judge whether he is likely to catch it before he runs into traffic. Given that mandatory ability, reading a speed limit sign is pretty trivial.

Given that it's very unusual for an oncoming car to swerve into your lane, but there's a giant pothole in the oncoming lane and the car has seen three cars swerve into your lane, one after another, to avoid it, what are the odds of the next car hitting you head-on? Compared to this puzzler, interpreting a stop sign is pretty easy.

Also, the human drivers presumably will not understand the weird behavior of an enhanced self-driving car unless the human infrastructure matches the automatic infrastructure. Of course, you could analyze the human data visually to predict the human behavior, but then why bother with the automatic hardware sensors?

I guess a bad automobile analogy is: if you must design a pickup truck to safely haul an 8,000-pound trailer, then a 300-pound trailer will not overtax it.

agumonkey · 10 years ago
I feel the same, but refactoring the whole driving infrastructure will only happen if people see SDVs as that valuable, and it won't happen overnight. If SDVs reach critical mass, what you say will happen.
nske · 10 years ago
If all cars were self-driven and there was a system that allowed them to exchange information, negotiate decisions, etc with all cars in proximity, that would be the ultimate vision safety-wise. I expect that we'll live to see that day. Removing the human factor would be removing the biggest problem.

On the other hand, making this mandatory would also be the wet-dream of most governments. In the name of safety, they could have access to so much information... Just knowing how many passengers the car is carrying, what's the weight of each passenger, and, of course, a history of the coordinates, would open insane possibilities, not all of them good.

_greim_ · 10 years ago
> But when boring suddenly turns to not boring? The current state of AI is very very bad at this.

"Boring" versus "not boring" is too vague for this statement to really be meaningful. Examples could be drummed up on both sides. For example computers beat humans hands down in at least some criteria fitting the "not boring" description, like reaction time and seeing in multiple directions at once, and probably some others.

isaacremuant · 9 years ago
That humans are bad drivers is something most of us experience everyday. Of course, everyone thinks THEY are the exception.

Self-driving cars being better than the average driver is absolutely to be expected, since they can only get better. They'll be consistent. They will always be focused, never distracted or tired, and they'll have more sensors than a human ever could. It's not just about the environment but also about the mental state of the driver.

Sure, it's difficult, but using one case to flip things around doesn't seem like a very strong argument.

> Self driving cars will get there someday, but I think it's much farther away than many assume it will be.

There's no evidence to support your assertion given the tests that have been going on so far. There's no evidence to support that switching back and forth would be better than having the AI all the time either.

x3n0ph3n3 · 9 years ago
> Of course, everyone thinks THEY are the exception.

This gets brought up every so often, much to my chagrin. If you asked 10 people what qualifies someone as a good driver, you would get 10 different answers.

Does it mean they have a good safety record?

Does it mean they are skilled at controlling the car?

Does it mean they can navigate traffic efficiently?

Does it mean they communicate with other drivers well enough to not negatively affect traffic?

Does it mean they have good reactions times?

What combination and weighting do you give the above questions, and what questions haven't even been asked?

Of course people respond that they are above average or good drivers since everyone is using a separate metric to define a good driver.

amelius · 10 years ago
How difficult is it to not hit something?

Let's suppose you are given a 2d field in which there are a number of rectangles, each of which have a velocity vector. So you know the location of each rectangle now, and, to a good approximation, in the near future.

How difficult is it to program a path-finding algorithm that will not crash into any of these rectangles, given that lowering speed is always a good option within a city?

I'd say this is really not all that difficult. Yes, there are a lot of special cases to consider, but still, in essence, this seems to me a quite simple problem.

So breaking down the problem is key. If you look at an autonomous vehicle as a black box, then, of course, it will seem dauntingly complicated, and difficult to trust.
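To make the toy model concrete, here's a minimal sketch of that check: axis-aligned rectangles with constant velocities, extrapolated over a short horizon, with "slow down" as the only evasive action. Everything here (the names, the horizon, the cruise speed) is illustrative, and it deliberately ignores the hard part of producing the rectangles from sensor data:

```python
from dataclasses import dataclass

@dataclass
class Box:
    # Axis-aligned rectangle: lower-left corner, size, and velocity
    x: float
    y: float
    w: float
    h: float
    vx: float = 0.0
    vy: float = 0.0

    def at(self, t: float) -> "Box":
        # Position extrapolated t seconds ahead at constant velocity
        return Box(self.x + self.vx * t, self.y + self.vy * t, self.w, self.h)

def overlaps(a: Box, b: Box) -> bool:
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)

def safe_speed(ego: Box, obstacles: list, horizon: float = 3.0,
               step: float = 0.1, cruise: float = 10.0) -> float:
    """Cruise if no predicted overlap within the horizon, else stop (0)."""
    for i in range(int(horizon / step) + 1):
        t = i * step
        future_ego = ego.at(t)
        if any(overlaps(future_ego, ob.at(t)) for ob in obstacles):
            return 0.0  # within a city, slowing/stopping is always an option
    return cruise
```

A vehicle closing on the ego box in the same lane produces a predicted overlap within the horizon, so `safe_speed` returns 0 and the car yields; a distant static box leaves it at cruise speed.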

infinity0 · 10 years ago
> Let's suppose you are given

hard AI problem, computer vision

> to a good approximation, in the near future.

how near is near?

> lowering speed is always a good option

can't brake if someone faster is behind you

> there are a lot of special cases [..] simple problem

simple problems by definition don't have many special cases, the "simplicity" means you can abstract them into a few cases.

in other words: talk is cheap, show me the code

haxel · 10 years ago
There is a way to get the best of both worlds: variably-autonomous swarming vehicles. The larger the swarm, the more autonomy the car has. The smaller the swarm, the less autonomy.

Large swarms on the highway, small swarms (or solitary driving) elsewhere. Best of both!

trhway · 10 years ago
Only this one human-driven car in the middle of the swarm is what stops the swarm from reaching a new level of autonomy. Oops, such an unfortunate mishap! That human-driven car has just hit the car in front of it and gone off the road. Finally the swarm is free... err... can move on to the next level.
samstave · 10 years ago
Personally - I think we need a network of "robot only" car lanes in cities and highways.

Turn one out of four one-way streets in business districts to robot only. Make a certain highway on/off-ramp robot only etc...

mtgx · 9 years ago
> And unfortunately there's no easy way to use the best of both sides.

How about using the "self-driving AI" only as "really advanced automatic assistance"? Humans will still drive the cars, but the AI will kick in in those boring situations where humans turn off their thinking.

j79 · 10 years ago
Interesting point. It would be interesting if this led to remote drivers who are responsible for navigating a self-driving car when the car decides human intervention is safest. (At least until all cars on the road are self-driving and connected to each other for real-time communication...)
megablast · 9 years ago
Over 1 million people a year die from driving accidents. And that is not including the huge number of injured.

That is an awful result. Humans are awful drivers: too easily distracted, under-informed, bored, and tired.

euyyn · 10 years ago
> The current state of AI is very very bad at this.

What makes you think so?

dsp1234 · 10 years ago
And here is the actual accident report [0]

"A Google Lexus-model autonomous vehicle ("Google AV") was traveling in autonomous mode eastbound on El Camino Real in Mountain View in the far right-hand lane approaching the Castro St. intersection. As the Google AV approached the intersection, it signaled its intent to make a right turn on red onto Castro St. The Google AV then moved to the right-hand side of the lane to pass traffic in the same lane that was stopped at the intersection and proceeding straight. However, the Google AV had to come to a stop to go around sandbags positioned around a storm drain that were blocking its path. When the light turned green, traffic in the lane continued past the Google AV. After a few cars had passed, the Google AV began to proceed back into the center of the lane to pass the sand bags. A public transit bus was approaching from behind. The Google AV test driver saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google AV to continue. Approximately three seconds later, as the Google AV was reentering the center of the lane, it made contact with the side of the bus. The Google AV was operating in autonomous mode and travelling less than 2 mph, and the bus was travelling at about 15 mph at the time of contact.

The Google AV sustained body damage to the left front fender, the left front wheel and one of its driver's-side sensors. There were no injuries reported at the scene."

[0] - https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52...

wstrange · 10 years ago
Interesting. That is a fairly complex scenario (sandbags partially blocking a lane), and given that the test driver also didn't expect the bus to move ahead, it seems like this is a mistake that many humans would also make.

Interesting to note that the collision was very minor (car moving at 2 MPH). When looking at accident rates we should also consider the severity.

DannyBee · 10 years ago
I'm actually curious who had the right of way in this one. It sounds like the bus just plowed through them while accelerating the whole way.
dsp1234 · 10 years ago
I'm pretty sure it's the bus. Generally, an already-moving vehicle with a green light on a straight road has right of way. The correct action for the autonomous vehicle (or a human driver) would be to wait for all traffic in the lane to clear (for example, after the previous light turns red and there is a lull in the traffic), then make its way around the sandbags.

Here's my artist's rendering. The vertical lane is just the right-hand lane, and is "wide", but still one lane. The bus is traveling "up" and the Google AV tries to maneuver around the sandbags blocking its way. Thinking that it can get around in time, the AV starts moving to the left and crashes into the bus.

I guess the "one lane" aspect is the confounding variable, but since the autonomous vehicle was stopped, it takes on the aspect of any other stopped vehicle on the side of the road (i.e., giving up its right of way). Just as with a road where cars are allowed to park on the side: those cars cannot enter traffic unless it is safe to do so.

Edit: Actually, I have no idea. Did a quick read of California's right of way laws, and couldn't find anything that jumped out that would cover this situation.

  |      |
         |
  |      +----
  
  
  |    SS+----
    <-[G]|
  | ^    |
    |    |
  |[B]   |
   [B]   |
  |      |

alistairSH · 10 years ago
Looking at a map of the area, it's an extra-wide lane, as described in the report. Why would CA build it that way, instead of a full turn lane? I assume since it's technically a single lane, the fault actually lies with the bus, since it was attempting to pass within the lane?

https://www.google.com/maps/place/El+Camino+Real+%26+Castro+...

jbob2000 · 10 years ago
It sounds like the car attempted a lane change, but cancelled when it detected the bus. The bus didn't want to wait for it to return to its lane, so it just scraped the side of the car. If the car was further in the lane, it would have been rear-ended, but the report just says side scraped.
jongraehl · 9 years ago
Splitting the lane for a right turn is common in CA (I assume it's legal, too). But you'd best be sure you can make it without having to cut someone off. If a human did this, the result might be the same. Tough problem. There's a chance that if the Google car had gone faster than 2 mph it might have gotten some respect (or gotten around the sandbags before being hit, if there was room in front).
ChuckMcM · 10 years ago
Ok, I laughed out loud on that one. This quote in particular, "The vehicle and the test driver 'believed the bus would slow or allow the Google (autonomous vehicle) to continue.'"

Bus drivers in the Bay Area are notorious for ignoring traffic (and pedestrians). Apparently there is some indemnity or statutes that make suing either the transportation agency or the driver nearly impossible and so pretty quickly people learn that the bus drivers drive with impunity. Plenty of stories posted to the local traffic column in the paper, and shared amongst neighbors and in the department of safety's "blotter" feature.

Google needs to go back and program their cars to always assume that buses are out to get them and avoid them at all costs. They are an active traffic hazard, often operated by a disinterested and distracted driver. The only way to "win" is to not be wherever the bus is.

technofiend · 10 years ago
Up next: Google self-driving buses. Finally, American buses run with the same attention to timetables that Swiss ones do. An added benefit: they obey traffic safety laws.
david-given · 10 years ago
I have actually visited Mountain View, from Switzerland, and caught a bus on El Camino. Probably not far from where this incident happened. Yeah. Timetables are not a thing that happens there.

...I remember once in Zurich when my local bus was two minutes late. The people at the bus stop were really quite cross.

kuschku · 10 years ago
The issue is not the drivers.

I’ll report what I observed over two years of riding the bus three to four times a day in Germany.

Every time a bus was late, or tried to break the sound barrier™ in the hope of making up some of the delay, it was due to one of the following issues:

(a) A tourist with a Texan accent trying to get onto the bus, discussing with the bus driver whether he can pay with a credit card or in dollars (no), then asking the driver to wait while he goes to get money from the nearest ATM (happens at about 5% of stops in the downtown areas where the tourists are)

(b) Rush-hour traffic: 200 people squeezing into a single bus and another few hundred waiting at the bus stop – buses coming every one or two minutes, and it takes quite some time until people stop trying to get onto the bus and leave enough space for the doors to close

(c) some kids with invalid tickets trying to cheat and getting caught

These issues can’t be fixed by automated buses or trains.

Only by looser schedules, and more buses and trains.

ihsw · 10 years ago
Actually there is a very strong argument to have buses and trains automated before consumer vehicles.
simplemath · 10 years ago
Any job that chiefly involves a human operating a vehicle will not exist for much longer.
jerf · 10 years ago
"always assume that buses are out to get them and avoid them at all cost"

This is a case where perhaps the computers have too much imagination. We actually tell human drivers to drive that way, but as humans we all know that we don't really mean that we need to worry about someone driving in front of us suddenly slamming their brakes, drifting 180 degrees, and driving at us full speed. Tell a computer to assume too much malice and the car will refuse to even move, because it's pretty easy to specify the search algorithm that will find that outcome.

We have to specify the exact level of malice the computer can reasonably expect, which is way harder. And it will still, by necessity, at times be an underestimate.

ChuckMcM · 10 years ago
You make an excellent point. However, as I've recently been up to my hips in RNNs, I wonder if we can figure out how to score encounters. Can the car learn the level of malice to expect? Can it learn it down to the level of, say, individual drivers' shifts? My daughter took the 54 to DeAnza community college and learned which drivers were OK, which were mean, and which were indifferent. Would regular exposure of the car to the bus at different times of day allow it to figure that out? Can we start with an expected level of malice and tune it? Fun question to think about.
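One simple way to frame "learning the level of malice to expect", well short of an RNN, is an online estimate of how often a given bus (or driver shift, or time-of-day bucket) actually yields. All names and numbers here are made up for illustration:

```python
class YieldEstimator:
    """Online estimate of P(other vehicle yields), per key.

    Keys are whatever granularity the fleet can observe,
    e.g. ("route 54", "morning") -- purely illustrative.
    """

    def __init__(self, prior_yields: float = 1.0, prior_refusals: float = 1.0):
        self.prior = (prior_yields, prior_refusals)  # Beta(1,1): start at 50/50
        self.counts = {}

    def observe(self, key, yielded: bool) -> None:
        y, n = self.counts.get(key, (0, 0))
        self.counts[key] = (y + int(yielded), n + int(not yielded))

    def p_yield(self, key) -> float:
        y, n = self.counts.get(key, (0, 0))
        py, pn = self.prior
        return (y + py) / (y + n + py + pn)

est = YieldEstimator()
for _ in range(9):  # nine encounters where the bus didn't yield
    est.observe(("route 54", "morning"), yielded=False)
est.observe(("route 54", "morning"), yielded=True)  # one where it did
print(round(est.p_yield(("route 54", "morning")), 2))  # → 0.17: plan for malice
```

The prior answers jerf's point above: an unseen bus starts at 50/50, so the car neither trusts it blindly nor refuses to move, and the estimate tightens with exposure.
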
mabbo · 10 years ago
In Ontario at least (and maybe other places) it's actually law that you always yield to the bus. Well, maybe the laws actually aren't as strong as that, but the bus drivers know there are some laws that say they have more rights, and the rest of us kind of fall in line.

I mean, he's driving a bus and he thinks he has the right of way. This effectively means he has the right of way.

drchiu · 10 years ago
Agreed.

Google needs to increase the settings on "Defensive Driving".

Just because the Google robot drives in a reasonable manner doesn't mean the humans around it will.

theklub · 10 years ago
Made me think: someone can wave you ahead, etc. There won't be that option with a robot unless they install an indicator.
gardano · 10 years ago
From my memories of driving in Boston, defensive drivers were an absolute menace. If you don't drive aggressively, you're the cause of problems…

Perhaps each autonomous car needs to have a cultural knowledge, based on location.

BrandonY · 10 years ago
I drive defensively and lived in Boston. I couldn't deal with the mindset of Boston drivers. I moved to Seattle. I am so much happier here. Sometimes drivers cause traffic jams because they spend too much time waving at each other to go out of turn: "You go!" "No, you go!" "No, I insist, you definitely stopped before me!" These are my people.
redwall_hp · 9 years ago
Or the cars can set the driving culture and blockade aggressive drivers so they're forced to drive defensively.
mb_72 · 10 years ago
This made me think (and excuse my ignorance if this is a known and solved problem already) that autonomous driving software needs to allow a significant degree of customisation for local conditions, where 'local' means something like 'the Bay Area', not just country-wide right-of-way and side-of-the-road rules.

Here in South Australia, public transport buses are extended unofficial right-of-way priority (when departing from the curb), for example. There are also new laws setting the required minimum passing distance between a car and a cyclist at 1 m if travelling 60 km/h or less, and 1.5 m if travelling faster; and car drivers are allowed to cross double white lines to avoid cyclists.

It seems to me there is ample opportunity for 'edge case' bugs to occur in autonomous-vehicle software when local conditions are taken into account and an untested set of requirements, specific to a country, city, or district, is applied.

thoughtpalette · 10 years ago
Also agree. Drivers operate the same, unsafely, in Chicago as well.
mring33621 · 10 years ago
Yeah, in Chicago, it's very clear that the onus is on the non-bus vehicle to avoid a potential accident. Which pretty much sums up the main tenet of defensive driving...
diogenescynic · 10 years ago
>so pretty quickly people learn that the bus drivers drive with impunity.

So true. There are no worse drivers on the roads than MUNI bus drivers. They are terrifying to drive near. I've nearly been crushed by MUNI buses multiple times, while driving in my own lane minding traffic laws.

DrScump · 9 years ago
This is VTA (formerly Santa Clara County Transit), not Muni.

Regardless, both (and A/C Transit and BART) have extremely strong unions.

cloverich · 10 years ago
A somewhat ironic thought experiment: if a reasonably practical early use of Google cars would be to replace municipal bus drivers, isn't it in the bus drivers' self-interest to try to get Google cars to hit them (and potentially keep them off the road)?
krschultz · 10 years ago
It was inevitable, so I'm sure they're quite pleased it was a minor issue and not something catastrophic. Someday in the future a self driving car is going to hurt or kill a person and then the real legal tests will begin, but this is the first step on the pathway to normalcy.

My personal fear is that Google and maybe one or two others will get self driving cars right, but then the imitations from other manufacturers will fall short. The liability needs to end up on the manufacturer of the self driving car system, this is not something to be taken lightly at all.

toomuchtodo · 10 years ago
I expect self driving car liability to end up similar to vaccine liability. A fund and adjudication process is created to compensate those who have an adverse outcome.

http://www.hrsa.gov/vaccinecompensation/

ghaff · 10 years ago
Which are controversial in some circles because they're a rare example of a "get out of jail free" card for big companies. (I'm not agreeing with that POV but noting that it's pretty widespread.)

I actually agree that vaccines (and really drugs more broadly) are a rather unusual example of products that can be used as directed and things can still go bad--without the manufacturer necessarily being "at fault." [Edit: Where it's not the fault of the user or other human.]

BinaryIdiot · 10 years ago
The only difference is that the vaccine fund doesn't require causation to be proven, and a large, large share of the cases it receives are highly unlikely to have been caused by vaccines.

But fault in a car crash will be very easy to determine. Why would such a fund exist for something that can be pinpointed directly to the party at fault?

cubano · 10 years ago
It would seem to me that Google's autonomous car insurance, which they must have bought from somewhere, would take care of this in pretty much the same way as other insured drivers are currently protected.

It really shouldn't be very different from, say, a fatal accident from a driver in a corporate-owned car.

Sure...PIP attorneys are going to go after Alphabet and make a big ruckus, but in the end, Alphabet has to have all this covered already as part of the plan.

Animats · 10 years ago
Here's the Autonomous Vehicle Accident Report filed with the CA DMV.[1] "The Google AV was operating in autonomous mode and traveling at less than 2 mph and the bus was traveling around 15 mph." The other vehicle was a 2002 Newflyer Lowfloor Articulated Bus, which is 61 feet long including the "trailer" part.

Here's where it happened.[2] You can see traffic cones around the storm drain.

This is a subtle error. Arguably, part of the problem was that the AV was moving too slowly. It was trying to break into a gap in traffic, but because it was maneuvering around an unusual road hazard (sandbags), was moving very slowly. This situation was misread by the bus driver, who failed to stop or change course, perhaps expecting the AV to accelerate. The AV is probably at fault, because it was doing a lane change while the bus was not.

Fixing this requires that the AV be either less aggressive or more aggressive. Less aggressive would mean sitting there waiting for a big break in traffic. That could take a while at that location. More aggressive would mean accelerating faster into a gap. Google's AVs will accelerate into gaps in ordinary situations such as freeway merges, but when dealing with an unusual road hazard, they may be held down to very slow speeds.
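A rough sanity check of the distances involved, using the speeds and the roughly three-second interval from the DMV report:

```python
FT_PER_MILE = 5280

def mph_to_fps(mph: float) -> float:
    """Convert miles per hour to feet per second."""
    return mph * FT_PER_MILE / 3600

bus_speed = mph_to_fps(15)  # 22.0 ft/s
av_speed = mph_to_fps(2)    # ~2.9 ft/s (report says "less than 2 mph")
t = 3.0                     # seconds between seeing the bus and contact

bus_travel = bus_speed * t                     # ~66 ft, about one 61-ft bus length
closing_distance = (bus_speed - av_speed) * t  # ~57 ft of gap closed
print(f"bus traveled ~{bus_travel:.0f} ft, closing ~{closing_distance:.0f} ft on the AV")
```

In other words, the bus was only about its own length behind when the AV committed to re-entering the lane, which helps explain how both the software and the test driver misread the situation.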

I wonder if Google will publish the playback from their sensor data.

[1] https://www.dmv.ca.gov/portal/wcm/connect/3946fbb8-e04e-4d52... [2] https://goo.gl/maps/QzvVXQGxhX72

jessriedel · 9 years ago
> The AV is probably at fault, because it was doing a lane change while the bus was not.

Yes, probably true, but not crystal clear. The Google car never left the lane, so it comes down to subtle questions about appearing to be parked or impromptu division of lanes near right-hand turns.

Animats · 9 years ago
This story is getting a lot of press coverage. Reuters, CNBC and Wired are already covering it.

When Cruise (YC 14)'s car hit a parked car at 20 mph in SF last month, there was no press attention.[1] Even though it was across the street from the main police station.

That Cruise crash is an example of the "deadly valley" between manual driving and fully automatic driving. The vehicle made a bad move, which prompted the driver to take over, but too late. This is exactly why AVs can't rely on the driver as backup.

[1] https://www.dmv.ca.gov/portal/wcm/connect/bc21ef62-6e7c-4049...

mattzito · 10 years ago
> The vehicle and the test driver "believed the bus would slow or allow the Google (autonomous vehicle) to continue."

Clearly the algorithm does not take into account the classic attitudes of bus drivers.

elcapitan · 10 years ago
It's probably one of the interesting social outcomes of mixed self-driving/human-driven traffic that all the assholes will have more opportunities to behave like assholes, because they will be able to rely on the programmed friendliness of self-driving cars. Eventually, tuning that friendliness factor will become the new car pimping.
jws · 10 years ago
Every time an asshole recklessly endangers the occupants of a self driving car, the car will have recorded the entire event. They should send these to a central clearinghouse and after a certain threshold hand them over to the appropriate prosecutor for charging and conviction. We'll all be safer.

If the asshole can't bring himself to drive civilly, eventually he'll lose his license on points and have to ride around in a self driving car!

blktiger · 10 years ago
This is the most interesting part of this story to me. That the test driver believed the bus would let the vehicle continue indicates to me that the human driver would probably not have done any better than the automated car in this case.
junto · 10 years ago
Even more lucky it wasn't a BMW.

Here in Europe the attitudes of BMW drivers are not highly regarded.

gradi3nt · 10 years ago
Here in the US we stand with you on the issue of BMW drivers.
mfkp · 10 years ago
Not just Europe - BMW drivers are the same around the world.
mrec · 10 years ago
Oh yes. Even in Germany. When some nutcase whizzed past you on the autobahn doing 200kph, it was pretty much always a BMW.
bsder · 10 years ago
My standard comment to a new BMW owner: "So, do you have to prove you are an asshole to buy a BMW, or do they force you to take a class after you buy one?"
johansch · 10 years ago
Not even in Germany?
seanp2k2 · 10 years ago
They should invite a bunch of Prius drivers to MV to "test" with the self-driving cars on the road with them "driving normally". I'm sure they'd get some great test data out of it and help prevent lots of future incidents :)