Is this because you specifically asked for a driverless permit? From the post,
> Cruise has had state authority to test autonomous vehicles on public roads with a safety driver since 2015 and authority to test autonomous vehicles without a driver since October 2020.
> Waymo has had state authority to test autonomous vehicles on public roads with a safety driver since 2014 and received a driverless testing permit in October 2018.
Yes. It's a very different permitting process. As you might expect, the bar is very high to jump from driverless testing to operating a driverless service available to the general public.
Congrats! Curious why Cruise got a permit to operate only between 10 PM and 6 AM? Is it a DMV decision because you applied for driverless operations or did Cruise specify operating conditions?
Hope we apply the same standard to human drivers as well and don't allow anyone on the roads until they prove they are attentive 100% of the time and can maneuver like F1 drivers. Monitored cameras every 100 feet would be a good start for finding who's driving badly, along with a required driver-facing camera that records to a black-box-like device that police can pull video from, and which the driver has no control over.
>Fatalities from Crashes
>In 2016, over 33,000 traffic crashes resulting in fatalities, major injuries or minor injuries were reported on Bay Area roadways.
10% of the crashes resulted in more than minor injuries, so about 3,300 serious crashes.
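As a quick back-of-envelope check, taking the quoted 33,000 total and the 10% figure above as given:

```python
# Sanity check of the figures quoted above (both inputs are quoted/assumed, not measured here).
total_injury_crashes = 33_000   # Bay Area crashes with any injury, 2016 (quoted)
serious_share = 0.10            # assumed share with more-than-minor injuries

serious_crashes = total_injury_crashes * serious_share
print(round(serious_crashes))   # 3300
```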
He's trying to be one of the pioneers of robot cars ... does he care how many people his robot cars are going to kill (from their own passengers to the innocent drivers alongside them) in the name of progress? Putting them on the road, learning, and improving the AI is the progress needed to perfect them over years, but it's going to be deadly... unlike creating and fixing bugs on a video game live-streaming site.
Maybe he does care, but how much?
This is not to be flippant (I appreciated and used Justin.TV a ton back in the day; canceled cable TV because everything was there to watch ... 24/7 marathons of your favorite shows), but this is what I view as the harsh reality of putting these things on the road: Uber's self-driving car has already killed a pedestrian. Personally, I'm not sure I could in good conscience work for a robot car company (I was just being recruited by one).
Two questions: 1. Where can we see this level of detail about the permits, for example that Cruise is permitted to have "nobody in the car" while Waymo is required to have a "human safety driver behind the wheel at all times"?
2. Where are Cruise's testimonials etc. for the presumably successful San Francisco non-commercial testing of this "fully driverless, nobody in the car" system you now have commercial deployment for?
Thanks for the clarification, but is there any public resource I can point people to in order to confirm the difference between Cruise and Waymo permits?
I believe you, but skeptics won't be persuaded by "the founder said so on HN".
> Cruise founder here. This is kind of confusing. Short version:
> - Cruise permit is for robo-taxi service, available to public (fully driverless, nobody in the car)
We need to have fewer executives talking about their awesome safe technology and more executives being forced to use their awesome safe technology.
It is extremely unfortunate that the DMV did not make dogfooding a condition of the permit: requiring that every executive of a company applying for a driverless robo-taxi permit, along with the executives' families, including their children, give up their driver's licenses and all other modes of transportation, ensuring that those executives and their families are the constant test subjects of the technology.
Yes. I've been saying that for a few years, as Waymo's disconnect rate improved by a factor of 2 every 18 months or so. This is a hard engineering problem. Now that the "fake it til you make it" clowns have dropped out, there's real progress.
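If the disconnect rate really does halve every 18 months, the compounding is easy to sketch; the starting rate below is purely hypothetical:

```python
# Illustrative decay curve: a disengagement rate that halves every 18 months.
def rate_after(initial_rate: float, months: float, halving_months: float = 18.0) -> float:
    """Exponential improvement: the rate is cut in half every `halving_months`."""
    return initial_rate * 0.5 ** (months / halving_months)

start = 1 / 1_000                 # hypothetical: 1 disengagement per 1,000 miles
print(rate_after(start, 72))      # after 6 years (4 halvings): 6.25e-05
```

The point of the sketch is just that a steady factor-of-2 improvement compounds quickly: six years of it is a 16x reduction.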
From personal experience seeing how their data is generated, I don't trust Waymo reports at all. They straight up lie about the capability and disengage rates to an astonishing degree. I left several years ago, but I see no reason they would have developed honest practices in the meantime.
The reported disengage rate has improved a lot, but Waymo only reports disengagements it classifies as "related to safety", based on proprietary analysis and counterfactual simulations it does not share.
> The California DMV said in a separate release that Cruise driverless "vehicles are approved to operate on public roads between 10 p.m. and 6 a.m. at a maximum speed limit of 30 miles per hour."
Night-only testing at very low speed. It's a good way to get started, but definitely not the autonomous taxi the title implies.
The speed limit for most residential and commercial streets in SF is 25mph - something like 97% of all street segments in SF have a limit of 30mph or lower.
I think one interesting property of nighttime driving is that lots of objects and markers are high contrast: the lights of other cars, highway markings, whatever is illuminated by headlights, etc.
(aside from folks crossing the street in non-reflective dark clothing)
I think the worst time to drive - as a human - is right when the sun is coming up or going down, or under tree cover when you go into and out of shadow.
Some recognition tasks might be easier at night but I expect that's not the reason permission was granted for those hours. It's almost certainly because there's less other traffic (and fewer pedestrians) at night.
> I think the worst time to drive - as a human - is right when the sun is coming up or going down, or under tree cover when you go into and out of shadow.
Fatal accidents do indeed spike around these times, although they seem to be worse in the evenings than mornings, and more pronounced in southern states than northern ones.
For car-pedestrian, car-bicycle, and car-motorbike collisions, yes.
For car-car collisions, not really. Yes, getting into a head-on 30-30 MPH collision is pretty bad, but you're more than likely going to walk away from it, especially if you're in the back seat.
Robocars shuttling actual people around SF at night in 2021 seems like it should surprise a lot of people who have been skeptics. The road to full, global, universal autonomy will probably take decades if that is the standard, but the standard of being certain that autonomy will eventually eat the world seems very close to being met.
First make public transportation as clean and safe as ride sharing; then you'll get demand. Nobody wants to sit next to people who smell like piss/weed/cigs/BO. If you were an Uber/Lyft driver, that kind of thing would get you one star very quickly.
The economics of self driving cars will ensure a rise in public transit usage.
A few years ago, I was helping my friend consider options for saving money. On the table was whether they needed to keep their car. Most of their commute was via public transit. However, we looked at the number of trips per month for which public transit was not viable (visiting friends on the other side of the metro area, running to stores), multiplied by the cost of using a ride-share service. Comparing this number to the car payment, auto insurance, gas, and maintenance, it was no contest: even a few such trips amortized the cost of the car and made keeping it worth it.
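That comparison reduces to a simple break-even count; every number below is hypothetical, just to show the shape of the calculation:

```python
# Break-even trips/month: below this, ride-share beats owning; above it, the car wins.
car_payment, insurance, gas, maintenance = 300, 120, 80, 50   # hypothetical $/month
fare_per_trip = 25                                            # hypothetical ride-share fare

car_monthly = car_payment + insurance + gas + maintenance     # 550
break_even_trips = car_monthly / fare_per_trip
print(break_even_trips)                                       # 22.0 trips/month
```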
Every trip has a cost in dollars, time, and externalities (causing congestion, environmental impact, etc). People will tend towards being efficient with those costs, judged by their relative importance to them personally.
The thing is, owning a car is mostly a fixed dollar cost per month. Once you pass the point where owning one is worth it for a few trips, the marginal cost of additional trips is far lower, while the time savings remain high. The only time I consider public transportation is when parking is a problem, since, all other things considered, driving is preferable.
Once self-driving cars reach a point where they are closer to the amortized per-trip cost of owning a car, the tradeoffs start to change. Those who use their cars for few, but necessary, trips will be the first to ditch owning. I suspect many multi-car families will downsize to a single car. Once you get rid of your car and are using self-driving cars for the trips where they are necessary, the question for every trip becomes "public transit, or self-driving?" Different trips will play out in different ways. However, a lot of people who are currently eating the cost of owning a car, and might as well drive anyway, will suddenly be able to save a few bucks by using public transit instead.
The final magic of all this is that if you're going anywhere more than a few minutes away, there are many other people going from approximately where you are to approximately where you're going. The big killer in public transit is time: time getting to a common pickup point, time waiting for the vehicle to arrive, time spent not going on a direct route, time spent stopping for others to get on or off. With a sufficiently well-used self-driving car network, it could easily join a few rides together at a point nearby and have cars ready and waiting at the point near your destination. Two quick transfers, and that's it. If the self-driving car app says "45 minutes for $20, or 50 minutes for $10," lots of people would save that $10. For destinations such as downtown, the airport, or sports stadiums, there's even only one transfer. Given carpool lanes and such, transferring to an express bus might actually be faster than driving directly.
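The "45 minutes for $20, or 50 minutes for $10" choice above is really a value-of-time calculation; the $30/hour figure below is an assumed value of the rider's time, not anything from the thread:

```python
# Price each option once the rider's time is valued (assumed $/hour).
def effective_cost(minutes: float, fare: float, value_per_hour: float = 30.0) -> float:
    """Fare plus the dollar value of time spent riding."""
    return fare + minutes * value_per_hour / 60

direct = effective_cost(45, 20)   # 42.5
pooled = effective_cost(50, 10)   # 35.0
print("pooled" if pooled < direct else "direct")   # pooled
```

With a lower assumed value of time, the cheaper pooled option wins by even more; with a much higher one, the direct ride starts to win.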
I read a perspective once that posited driverless cars would first be adopted on specific commuter routes as point-to-point taxis, and it seems increasingly plausible there will be such a service available from SFO to select highway-adjacent points around the Bay before long.
>10 pedestrians are killed each year by human drivers in SF, and almost exclusively they are the drivers fault (not only by definition, but also based on an impartial assessment of the situations).
People will not care, and I also do not think these automated cars are going to kill any pedestrians anyway.
> Cruise in March submitted the applications for driverless operations, whereas Waymo in January applied for autonomous vehicle deployment with safety drivers behind the wheel.
Wow, driverless actually means no driver here. But how does that work? What if it makes a mistake? Is there some fallback pilot somewhere that can take over and control it remotely?
Waymo has a similar approach in Phoenix. Passengers can't directly intervene but can call tech support and they'll dispatch roadside assistance to override the car and drive it to safety. Will be interesting to see if Cruise copies this same model.
Didn't they have "chase" cars in the beginning? Maybe I'm thinking of a different company. Cars that would follow around the driverless cars and could respond immediately if something happened.
According to the DMV press release (submitted elsewhere on HN), this is incorrect:
>Waymo is authorized to use a fleet of light-duty autonomous vehicles for commercial services within parts of San Francisco and San Mateo counties. The vehicles are approved to operate on public roads with a speed limit of no more than 65 mph and can also operate in rain and light fog. Waymo has had state authority to test autonomous vehicles on public roads with a safety driver since 2014 and received a driverless testing permit in October 2018.
Full disclosure, I founded a somewhat competing company in the space, comma.ai
Why is this progress? Evaluating progress in self driving cars is extremely complex. Why do people trust that the California government is able to do so?
A milestone is a working product with many user videos on YouTube. This is a press release from the California DMV.
I really respect your skills as a hacker, but it is actually quite scary that you would not understand that the DMV issuing these permits is a strong sign of progress, and that "user videos on YouTube" would somehow be the yardstick by which you believe we should measure it.
It depends what your opinions of those agencies are, and if you believe they have technical competence or not. After the 737 MAX fiasco, I've begun to seriously doubt it.
"you would not understand that DMV issuing these permits is actually a strong sign of progress": explain to me why you believe this. Videos on YouTube convince me something is real and exists. A government press release convinces me that some process was followed, but perhaps one having nothing to do with solving the problem.
Self-driving is almost entirely a software problem, and testing that software is equally complex. Do you believe the CA DMV has an army of good software engineers? Data scientists? If not, how did they evaluate it?
I'm nobody, and I have huge respect for both of you. I do think some transparency would be nice regarding how they made the decision and with how much data.
The shortcomings of Comma.AI and Tesla FSD are very clear due to YouTube. If Cruise has a much safer system, that is great, I would just like to see more data.
This isn’t technical progress but is regulatory progress.
Technical progress is the current limiting factor so I agree that it is more important to emphasize right now.
On the other hand, I disagree that user videos on YouTube are a sign of progress. 1. There is an obvious selection bias, as only a minority of people with access to the software actually upload videos. 2. The videos are qualitative in nature, and it is not feasible for a single human to go through all of them and form an objective opinion on the state of the art, so it's prone to confirmation bias too.
I’d argue that there isn’t a common yardstick of technical progress for game theoretic reasons but I’d love to see one.
Seems like it might be a bit difficult to have user videos on YouTube if you can't legally have users. Getting the permits issued is a good first step.
It's not technical progress, but it's progress nonetheless. There are significant legal / legislative barriers to get through aren't there? This would, at least in my mind, be progress on that front...
Once software capable of driving at a superhuman level (say 10x the average) exists, the regulatory side will be trivial. What regulator wouldn't want to save 30,000 lives?
Sadly, that software is still a ways away. Don't fall for the trap believing that self driving is at all a regulatory problem. It is only a software one.
Huh, Cruise got a permit for "between 10 PM and 6 AM" up to 30 MPH, while Waymo got no time limit and 65 MPH. Presumably Cruise's permit is actually for the 6 AM to 10 PM time period and not the 10 PM to 6 AM one!
Well, because this is a permit for commercial service, not testing. They already have a testing permit. A commercial service that only operates in the middle of the night would not be very successful.
Why does night seem less likely? I'd expect many of the sensors work equally well or better (no sun interference) at night, and there will be a lot fewer other cars to crash into/cause a traffic jam when the car shuts down and has to be recovered.
The fact that the speed limit is much lower backs up, I think, that Cruise's permit is actually less permissive and is most likely night-only.
Waymo has had driver-free cars in Phoenix for over a year, so it would be strange for Cruise (who doesn't have a driver-free robo-taxi service already) to get a more permissive permit than Waymo.
Who is liable when someone gets hit by a self-driving car? It was always a joke in college that the dream was to get hit by a university bus and have your tuition paid off in a huge settlement. I'd imagine if a Waymo car made its way around the more desperate parts of the Bay Area, like the Tenderloin, and liability ended up shifting to a huge company with billions of dollars, we might end up with a Russia-style situation of rampant insurance fraud. Maybe Waymo et al. will just respond by never servicing these areas, which will no doubt open an entire can of worms in the press and among the most virtuous online.
In Russia, the rampant insurance fraud ended with everybody getting dashcams to defend themselves. Waymo and Cruise have way more than just a dashcam: if you can successfully commit insurance fraud while being recorded on a dozen cameras as well as lidar and infrared sensors, you probably deserve the settlement.
> Who is liable when someone gets hit with a self driving car?
Short answer: in the U.S., it will depend on the facts and circumstances. Common law has many drawbacks. But organic adaptability is one of its advantages.
> Who is liable when someone gets hit with a self driving car?
While this is not exclusive of other liability, probably one or both of:
(1) The person hit, if they were breaking the law in a way which made it unreasonable to expect a driver to avoid hitting them,
(2) The manufacturer of the self-driving vehicle, under normal defective product liability principles.
The owner and/or, where different, operator of the vehicle, as well as other people in the chain of commerce may also be liable, especially when (2) applies.
How would you even discern fraud from the real thing? You could just get drunk, act drunker, and stumble onto the road. You'd blow wet on the breathalyzer, and the story is plausible enough that a jury would side with you (the innocent guy on a night out, or the guy down on his luck) over the scary robot-car company. Then a precedent would be set.
- Cruise permit is for robo-taxi service, available to public (fully driverless, nobody in the car)
- Waymo permit is for robo-taxi service, available to public (human safety driver behind the wheel at all times)
- Nuro permit is for robo-delivery, available to public (no human passengers)
> Cruise has had state authority to test autonomous vehicles on public roads with a safety driver since 2015 and authority to test autonomous vehicles without a driver since October 2020.
> Waymo has had state authority to test autonomous vehicles on public roads with a safety driver since 2014 and received a driverless testing permit in October 2018.
More info at the DMV's website (although it's a lot to digest): https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...
On a scale of 1 to 10, how confident are you that your service won't kill someone?
2. Where are Cruise's testimonials etc. for the presumably successful San Francisco non-commercial testing of this "fully driverless, nobody in the car" system you now have commercial deployment for?
there's a way to ask a serious question to get a serious answer, and then there's your way.
> - Cruise permit is for robo-taxi service, available to public (fully driverless, nobody in the car)
We need to have fewer executives talking about their awesome safe technology and more executives being forced to use their awesome safe technology.
It is extremely unfortunate that the DMV did not make dogfooding a condition of the permit: requiring that every executive of a company applying for a driverless robo-taxi permit, along with the executives' families, including their children, give up their driver's licenses and all other modes of transportation, ensuring that those executives and their families are the constant test subjects of the technology.
Same goes for the likes of Musk.
I am pretty sure they recently announced "Full Self-Driving v2.0, for real this time, seriously, guys" and did not drop out.
https://en.wikipedia.org/wiki/Gartner_hype_cycle#/media/File...
They're generally against "create a self-driving startup in 2021, IPO by 2024 at the latest". That's just snake oil salesmanship.
For car-car collisions, not really. Yes, getting into a head-on 30-30 MPH collision is pretty bad, but you're more than likely going to walk away from it, especially if you're in the back seat.
Speed being "slow" or "fast" here is unrelated.
https://www.littlerock.gov/media/2484/the-relation-between-s...
There's got to be a way to make public transportation stop being the least attractive option in low to medium density urban settings.
Wow, driverless actually means no driver here. But how does that work? What if it makes a mistake? Is there some fallback pilot somewhere that can take over and control it remotely?
"Hi, yeah, car's gone berserk, we're approaching a river, all doors are locked, can't get out, need assistance."
"Hi, thank you for calling. Please hold (your breath)."
Why is this progress? Evaluating progress in self driving cars is extremely complex. Why do people trust that the California government is able to do so?
A milestone is a working product with many user videos on YouTube. This is a press release from the California DMV.
Does the same hold for the FAA? The FDA?
The DMV consists of bureaucrats who seemingly just rubber-stamp the permits, hoping for the best.
In that context, the people actually using the technology, testing and evaluating its performance, are like the FAA/FDA, not the DMV.
Technical progress is the current limiting factor so I agree that it is more important to emphasize right now.
On the other hand, I disagree that user videos on YouTube are a sign of progress. 1. There is an obvious selection bias, as only a minority of people with access to the software actually upload videos. 2. The videos are qualitative in nature, and it is not feasible for a single human to go through all of them and form an objective opinion on the state of the art, so it's prone to confirmation bias too.
I’d argue that there isn’t a common yardstick of technical progress for game theoretic reasons but I’d love to see one.
So Waymo then?
https://www.youtube.com/results?search_query=waymo
While FSD beta may still have issues, I see no reason to believe that this doesn't also have similar issues.
As is usually the case with the legal system: whoever has the least political/economic power. So either the safety driver or the victim.
https://en.wikipedia.org/wiki/Death_of_Elaine_Herzberg