> According to the Texas Transportation Code, the owner of a driverless car is “considered the operator” and can be cited for breaking traffic laws “regardless of whether the person is physically present in the vehicle.”
This is all well and good when the owner is Waymo or Cruise and actually has direct control over the code that caused the vehicle to fail, but for consumer cars it brings up important questions of what "ownership" even means these days.
Let's say I own a self-driving car that's able to reliably park itself. I pay close attention the first few months to make sure that it's safe and reliable, and once it's proved itself I start trusting it. I've been as careful as can reasonably be expected, but one day the manufacturer pushes a mandatory, automatic over-the-air update that contains a critical bug, and my car suddenly starts misbehaving and I get ticketed.
This isn't an unlikely scenario—how many of us have "owned" a product that has suddenly changed its behavior in undesirable ways after we purchased it? It doesn't seem reasonable to me to hold an individual consumer liable for bugs in a machine that they are unable to fully control.
Either manufacturers need to be required to provide important software freedoms to vehicle owners, or the manufacturers themselves need to be liable for infractions.
There are two components here that both already have an answer in the real world.
1) making owners of cars responsible for (most) traffic infractions
2) issues caused by third parties, with the first party bearing initial liability
Number 1 is surprisingly common overseas. Some countries have figured out that garden-variety traffic fines clog up the courts, so they’ve lifted simple garden-variety infractions out of criminal law and placed them in administrative law.
In doing so, they’ve made the owner of the car (i.e., the license plate holder) responsible for these types of infractions. The owner then receives the fine by mail (technically a settlement offer to avoid prosecution) with a generous appeal window, after which the fine is finalized. It never ends up on a criminal record, because it’s no longer criminal.
This frees up court resources, allows for ticketing without stops, and shifts responsibility to the car's owner, who can then, if applicable, settle it with whomever they lent the car to.
For more severe infractions, it’s still tied to the driver, with the car owner being the initial presumptive driver.
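To make the mechanics concrete, here's a toy sketch in Python of such an owner-liability pipeline. The names, amounts, and the length of the appeal window are all invented for illustration, not taken from any real statute.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Toy model of an administrative (non-criminal) traffic fine that
# attaches to the license plate holder. All values are invented.

@dataclass
class AdministrativeFine:
    plate: str             # liability follows the plate, not the driver
    offense: str
    amount_eur: float
    issued: date
    appeal_days: int = 42  # the "generous appeal window" (invented)
    finalized: bool = False

    def appeal_deadline(self) -> date:
        return self.issued + timedelta(days=self.appeal_days)

    def finalize_if_expired(self, today: date) -> None:
        # Once the window closes the fine is final, but it never touches
        # a criminal record: it was lifted out of criminal law entirely.
        if today > self.appeal_deadline():
            self.finalized = True

fine = AdministrativeFine(plate="AB-123-CD", offense="speeding, 8 km/h over",
                          amount_eur=60.0, issued=date(2024, 1, 5))
fine.finalize_if_expired(date(2024, 3, 1))
print(fine.finalized)  # True: settled by mail, no court involved
```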
Number 2 is a common situation in the US and overseas, and is resolved by going after whoever is responsible (e.g., the manufacturer) after the fact to be made whole again.
It’s similar to going after the seller of stolen goods after the original owner reclaims their property from you, in jurisdictions where title doesn’t transfer when stolen goods are sold.
>"so they’ve lifted simple garden variety infractions out of criminal law and placed them in administrative law."
Many "garden variety infractions" are human errors. I do not think those were ever a subject to criminal prosecution disregarding whether the ability to clog the courts.
As individuals we are never going to own self-driving cars.
If buying the hardware is even an option to begin with, there's going to be a hefty fee to subscribe to the required 'cloud based' self-driving service.
(And there's certainly no way individuals/small businesses will be able to buy such a car and run a self-driving taxi service profitably when a megacorporation could have all the income from all the cars)
Positive HN discussion: https://news.ycombinator.com/item?id=36927971
I still believe the free market can work for self driving cars. If there's demand for offline self driving cars, someone will make them. Probably someone in Asia will make them affordable.
But I believe regulatory capture and anti-competitive tariffs are likely to make affordable, buy-and-use cars impossible to sell in the US. Regulations will likely change annually, like they do for cars right now. The govt will be worried that China has control of our morning commute.
I hate being cynical, but cars do seem like a lost cause.
I don't have strong opinions about self-driving cars in this context, but the pattern of this argument doesn't track: you can omit "self-driving" and the underlying premise is still the same (that companies have an incentive to lease rather than sell objects to capture service revenues).
But the reality is that people do own cars, and no car manufacturer that I know is also trying to become a taxi operator (except maybe in the indirect investment sense). Or in other words: sales in a market aren't affected by the abstract possibility of vertical integration, because any entity that attempts to vertically integrate will be undercut by other parties making sales at the same time.
Individuals can buy self-driving cars from Mercedes-Benz today. They are only level 3 autonomous, but the manufacturer does assume legal liability for incidents that occur during autonomous operation (traffic tickets may be a separate issue). There is a hefty subscription fee for the Drive Pilot service.
https://www.mbusa.com/en/owners/manuals/drive-pilot
"Never" seems too strong here. In 2000 years, when it has been maybe 1500 years since a human last turned a steering wheel, I can see a situation where an individual owns a self-driving-car-like thing.
Even well before some such future, the type of people who buy a Bugatti Veyron will still want to buy something unique and out of reach of 99.99% of the population. Maybe that is human-driven, but what if human driving becomes illegal?
We already have laws that make the driver liable for infractions, crashes, etc. Even if you own the car, if someone else drives it, the driver gets the ticket. If this weren’t the case, rent-a-car agencies wouldn’t exist.
The same rule should extend to self-driving cars. The manufacturer is driving, ergo they should be liable for everything the car does.
Mercedes’s approach to self-driving feels the most promising here. Afaik they are so far the only manufacturer who has said “we’re driving, we’ll take the liability” – https://insideevs.com/news/575160/mercedes-accepts-legal-res...
> Either manufacturers need to be required to provide important software freedoms to vehicle owners
FOSS robot cars seem like a great way to bypass emissions controls, safety regulations, and plain common sense. It'd take what, half a day, for some teenager to modify the firmware to double the 0-60 and blow up the battery pack at the same time?
I'd want the opposite: for self-driving firmware to be locked down and hardware signed to heck, with an independent auditing body code-reviewing every update.
> or the manufacturers themselves need to be liable for infractions.
Can't it be both? Split legal responsibility between each consumer and manufacturer. Doesn't have to be 50/50, but personal liability (even capped at a few grand) could make people pay attention and take over when the self-driving doesn't seem to be doing its job. We don't need more drivers on the roads who think their wondercars are infallible and blame everyone else for their accidents.
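To put rough numbers on that idea, here's a toy Python sketch of a capped split. The 80/20 ratio and the $3,000 cap are invented knobs, not a proposal anyone here has made.

```python
# Toy apportionment: the consumer pays a bounded share so they keep "skin
# in the game", while the manufacturer bears the open-ended remainder.

def apportion(total_fine: float, consumer_share: float = 0.20,
              consumer_cap: float = 3_000.0) -> tuple[float, float]:
    consumer = min(total_fine * consumer_share, consumer_cap)
    return consumer, total_fine - consumer

print(apportion(500.0))     # (100.0, 400.0): small ticket, split applies
print(apportion(50_000.0))  # (3000.0, 47000.0): the cap bounds the consumer
```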
> FOSS robot cars seem like a great way to bypass emissions controls, safety regulations, and plain common sense. It'd take what, half a day, for some teenager to modify the firmware to double the 0-60 and blow up the battery pack at the same time?
Generally we don't pen laws in order to save a singular idiot teenager -- moreover, the aforementioned teenager bought a product and then ruined it; the teenager already received the punishment for their actions through the destruction of their own property, without the need for jurisprudence and complexity.
Also: nearly every major car manufacturer in the world has been caught semi-recently violating emissions standards. Some have violated safety standards. Given that this hasn't been an impasse for the world's largest companies, why presume that the FOSS equivalents would be worse? Our regulations have failed to constrain the largest companies, which are beholden to the most regulation and law the world has ever seen, but somehow FOSS individuals/collectives need to do better than these billion-dollar behemoths? Nice.
I don't get why people suddenly think that when something is 'dangerous', it's okay to throw in the towel on personal property ownership and take whatever lend-lease-borrow program is offered in exchange. Cars have always been dangerous, and aside from gross safety and emissions laws, we're allowed to modify them.
I don't understand why this changes radically when a vehicle becomes electric. Were gas tanks and high-pressure fuel injectors suddenly safer than I remember them being? "The stupid teen who burns his house down" certainly applied to things in the past, and we didn't go out of our way to lock down the fuel tank into an inaccessible black box in response; we just dealt with the inherent risk of dangerous equipment and somehow avoided full-blown chaos and calamity through educated precaution and training.
Here's what I want: I want the freedom to modify my own property, and when that modification violates an agreed-upon law, I should be punished.
Forget all this pre-crime 'what-if' save-the-children bullshit. Punish offenders and let civilian non-offenders do as they please until such time as they violate a law or standard -- which would be more enforcement than the world is seemingly applying to every other car manufacturer in the world.
This is basically why the NHTSA told car manufacturers not to obey Massachusetts' Right to Repair laws.
https://www.wbur.org/news/2023/06/14/federal-highway-traffic...
> FOSS robot cars seem like a great way to bypass emissions controls, safety regulations, and plain common sense. It'd take what, half a day, for some teenager to modify the firmware to double the 0-60 and blow up the battery pack at the same time?
Hasn't that always been possible? Just because it's in software now, I don't see why it's suddenly different.
That's a reasonable argument. In that case, I argue that the manufacturer who wrote and signed the firmware should be liable as the operator of the vehicle.
> FOSS robot cars seem like a great way to bypass emissions controls, safety regulations, and plain common sense. It'd take what, half a day, for some teenager to modify the firmware to double the 0-60 and blow up the battery pack at the same time?
People already mod their cars, and there are vast communities of teens and early-twenties following exhaustive video tutorials on how to illegally and unsafely make their car faster and louder. Cars aren't quite FOSS, but they're better documented than most FOSS is.
> Let's say I own a self-driving car that's able to reliably park itself. I pay close attention the first few months to make sure that it's safe and reliable, and once it's proved itself I start trusting it. I've been as careful as can reasonably be expected, but one day the manufacturer pushes a mandatory, automatic over-the-air update that contains a critical bug, and my car suddenly starts misbehaving and I get ticketed.
But this is very easy. The self-driving system is a component, same as brakes or lights. If your brakes stop working because of a manufacturing issue and you crash into something, it's the manufacturer's fault. Nothing new here.
I agree that's how it should work, but the TX law that the article quotes changes that rule to make the owner of the vehicle at fault for self driving failures.
> one day the manufacturer pushes a mandatory, automatic over-the-air update that contains a critical bug
This is a hidden two-fault scenario, which isn't always "bad engineering" but is usually a smell in that direction. You're (1) accepting that the operator/owner/whatever of the vehicle is technically responsible, (2) acknowledging that this supervision is likely to work in practice, then (3) imagining a hardware/systems failure in the car and simultaneously (4) supposing that the supervision you took for granted in (2) fails at the same time.
Basically, you don't need point 3 to make your point about safety logically. You added it to change the villain from the driver to the manufacturer.
I set up the scenario to show that no matter how careful the consumer is, their supervision will not work in practice (since the vehicle's behavior can change at any time and without warning) and therefore they cannot be held responsible for the behavior of the vehicle.
Either the manufacturer should not be allowed to market it (because it cannot be operated safely) or the manufacturer should be considered the operator.
> It doesn't seem reasonable to me to hold an individual consumer liable for bugs in a machine that they are unable to fully control.
They chose to buy it. Manufacturers should also be potentially liable for some classes of problem, but consumers are the ones pressing the "run the self-driving car" button. The court system exists to apportion liability properly past that.
No, absolutely not, never ever. This just highly incentivizes Manufacturers to dump trash garbage into the market (since you've just forced consumers to bear all liability for Manufacturer failures).
You do this once, and you might as well kiss all of society away. Imagine a toy that kills children, and you tell parents, "well, you pressed the 'On' button, so it's your fault". Imagine a furnace that burns down your house, and you tell homeowners, "well, you pressed the 'heat' button, what did you expect?". No one would be willing to buy anything at all, because every manufacturer would be in a race to see the shittiest thing they could trick people into buying. Every product on every shelf would be snake oil.
---
Manufacturers should always be fully liable for any and all problems that cannot be directly attributed to extreme gross negligence on the consumer's part.
"wel, consumers are the ones pressing the "run the self-driving car" button" -- absolutely not. Manufacturers sold car with a "self-driving" button, consumers hold zero fault for pressing the button Manufacturers promised would work, unless there's some kind of extreme gross negligence involved. (i.e., pressing the button, while the car is already dangling off a cliff-side, or pressing the button after gouging out every camera and sensor on the car, or whatever).
"But then Manufacturers couldn't sell a self-driving car" -- cool, then the car isn't ready to be sold anyway. If a car company isn't ready to take on the liability for their self-driving product, then it isn't freaking ready for the road -- no exceptions.
They chose to buy a product with a set of known characteristics.
The manufacturer then changed those characteristics.
If a manufacturer makes changes post purchase to a product resulting in harm, they should bear the majority of the liability, if not all of it.
If I had a microwave that for years cooked a potato in 3 minutes safely, but then was modified by an update to catch fire under the same conditions, I shouldn't be the one on the hook for pushing that button.
So we should just plan on every self-driving vehicle ticket ending up in the court system so that we can properly apportion liability between the consumer (who pressed the button) and the manufacturer (who designed the car, wrote the code, advertised the vehicle, and then updated it many times after the consumer purchased it)?
But there are games like Rocket League that have never changed their controls or physics. They changed everything else except those things so that people could predictably build and maintain skill. I don't think that's holding a car company to too high of a standard.
OTA updates would require not only 5 years of simulation testing (done in 5 days with the right computing power) but would also not be applied all at the same time, in order to catch flaws before full fleet deployment.
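A minimal sketch of that staged-rollout idea, in Python. The stage sizes, halt threshold, and simulated fault rate are all assumptions, not any manufacturer's actual process.

```python
import random

# Canary-style OTA rollout: push to a small slice of the fleet, watch an
# incident signal, halt before full deployment if it regresses.

random.seed(0)
FLEET = [f"car-{i}" for i in range(10_000)]
STAGES = [0.01, 0.05, 0.25, 1.00]   # cumulative fraction of the fleet
MAX_INCIDENT_RATE = 0.002           # halt threshold per stage (invented)

def incident_rate(cars: list[str]) -> float:
    # Stand-in for real telemetry: pretend 1% of cars hit the new bug.
    return sum(random.random() < 0.01 for _ in cars) / len(cars)

deployed = 0
for frac in STAGES:
    batch = FLEET[deployed:int(len(FLEET) * frac)]
    deployed += len(batch)
    rate = incident_rate(batch)
    print(f"stage up to {frac:.0%}: incident rate {rate:.4f}")
    if rate > MAX_INCIDENT_RATE:
        print("halting rollout; most of the fleet never got the bad update")
        break
```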
In this case the consumer should of course still be held responsible, and if they think it's wrong they can join in on the class action lawsuit against the manufacturer.
Or, owners should enter into a pre-existing legal relationship with the manufacturer to get reimbursed for any manufacturer error that gets pushed out.
Those who choose to operate driverless vehicles must have some skin in the game in order for this to be fair at all. Bringing your driverless vehicle into an area is the customer's choice, after all.
That's the most overly American thing I've read this year... Make consumers file a class action against car manufacturers (good luck with a Tesla and mandatory arbitration), all to avoid doing the sensible thing: make some rules for this market.
The likes of Musk will moan and bitch about it, but regulation is a perfectly good solution for this.
Going to repeated class action lawsuits every time there is a bad software update sounds like an inefficient use of the court system to me.
I think most people will eventually want the manufacturer to be the defendant in court cases that result from crashes due to a manufacturer updating cars. I can only imagine NHTSA (US Auto Regulator) will evolve to make this the case.
So basically the eventual outcome here is known (industry regulation) but we must take every step to do it the hardest way possible? Intentionally start with a free-for-all then have deaths, a social movement, lawsuits, court cases, politicians talking about it, campaigns, legislative debates, etc to arrive at what we all know is the end game to begin with?
I think there may be some optimization steps here.
I'd be interested in some examples, as I can't think of any offhand, despite your claim that this "isn't unlikely".
Also, when it comes to vehicles misbehaving in traffic, I'd say "harmful" is a more apt term than "undesirable".
Here is a (very) incomplete list. Not all are ones that I've personally experienced, but they're all products that either I or someone I know directly has owned/used.
* Microsoft Windows starts showing ads in new and creative places, even though I shelled out $200 for the Pro version.
* An Android update broke my preferred home screen.
* Roomba vacuums go through regular periods of stupidity in between software updates. One day it's able to dodge obstacles, the next it's sucking up anything and everything.
* Basically every SaaS product changes its UI on a semi-regular basis in ways that throw off existing workflows.
EDIT: Oh, and a bonus one that's more on topic—Tesla recently rushed an update to their software and introduced a lot of bugs:
https://www.autoevolution.com/news/tesla-s-holiday-update-sc...
Really? Hell, my Google Mini has only recently decided to scream "by the way, the mic is off!" on startup every single time. I've had the mic off for years. Google decided to punish me for using it only as a speaker.
Really any cloud-connected device would fit this bill. Many have gone out of business and turned off their backends or changed the way they work in a way that is incompatible with the original purchasing intent of the user. Not to mention software products that evolve in ways the user doesn't like all the time. https://www.bbc.com/news/technology-64249388 is an article with some more specific hardware examples if that's what you're looking for.
Windows, macOS, iOS, iPadOS, Android: all try to ram updates down your throat, and make it difficult or impossible (Android) to disable them.
How I wish dumb TVs were still the norm.
> one day the manufacturer pushes a mandatory, automatic over-the-air update that contains a critical bug, and my car suddenly starts misbehaving and I get ticketed
Don’t buy it if the company who controls it does not put it in writing that they indemnify you properly.
Perhaps there is something we can do before the nearly obvious harm occurs.
The best of capitalism will definitely happen.
It's immaterial. Auto manufacturers' product designers and executives aren't being held responsible for rising pedestrian deaths because their oversized killer vehicles keep running over people when driven by humans. Self-driving is responsible for almost no deaths or injuries, most likely net-negative.
Maybe this is an unpopular opinion here, but "tickets" don't make any conceptual sense for driverless cars. The entire concept of tickets and fines is to act as a deterrent to individual drivers who might make unsafe choices -- drivers who actively choose whether to obey a speed limit or not.
Driverless cars don't seem to be programmed to break laws in the first place. They're incapable of making a choice to break laws to save time. They're simply programmed not to speed, not to run through red lights. And when they do, it's not out of some "choice" -- it's just a bug or deficiency in their code.
Driverless cars absolutely do need to be held to safety standards. But the way to do that isn't through issuing tickets. It's simply for a governmental regulatory agency to set statistical limits for the manufacturer's software for mistakes like running a red light, and if the whole fleet of cars goes above that limit you require the software manufacturer to fix it, risking fines if they don't fix it quickly, or losing the ability to operate at all.
Issuing tickets for driverless cars makes as much sense as issuing tickets if pharmaceutical companies put the wrong amount of a drug in a pill. Tickets are for individuals who make choices; safety regulations with fines and revoking licenses are for corporations and things like driverless cars.
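For illustration, a regulator-style fleet check might look something like the Python sketch below; the threshold and the mileage figures are invented.

```python
# No per-car tickets: the regulator sets a statistical limit on the whole
# fleet and acts on the manufacturer when it's exceeded.

RED_LIGHT_LIMIT = 0.5   # allowed violations per million fleet miles

def fleet_in_compliance(violations: int, fleet_miles: float) -> bool:
    rate = violations / (fleet_miles / 1_000_000)
    return rate <= RED_LIGHT_LIMIT

print(fleet_in_compliance(12, 30_000_000))  # True: 0.40/M miles, OK
print(fleet_in_compliance(25, 30_000_000))  # False: 0.83/M miles -> fix it,
                                            # face fines, or lose the permit
```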
Avoidable costs are still an incentive to improve for the companies owning or running driverless cars. Since it seems unlikely that everything that could ever happen is captured within regulations, fines could still be useful. The bigger question is what to do with things exceeding fines, i.e., stuff that would lead to a custodial sentence for a human driver.
No it doesn't, but they should ticket the cars anyway. The ticket gets paid by Waymo or whoever runs the car. They don't want any more resentment from other drivers.
Tickets also set dangerous incentives. Cities would be rewarded for intentionally creating traffic situations that will be misinterpreted by driverless cars, making the roads less safe for everyone.
Of course this can already be an issue today, but driverless cars might tip it from being a necessary evil to just plain evil.
The benefit is that ticketing is the administrative mechanism we already have, so we can use it to aggregate and judge performance.
I agree that regular, run-of-the-mill tickets wouldn’t motivate driverless car companies. But I disagree that there shouldn’t be any penalty for a company whose fleet does not exceed some threshold of red-light-running, for example. If a vehicle unjustifiably runs a red light, there should be a penalty, regardless of how it’s being driven. The penalties could differ for driverless car companies, to make sure they’re big enough to meaningfully motivate these companies.
Otherwise it feels patently unfair to human drivers, who can’t say “well, my family hasn’t broken many traffic laws recently, so I shouldn’t have to pay a ticket for just running this one red light/almost hitting this one pedestrian.”
> Tickets are for individuals who make choices; safety regulations with fines and revoking licenses are for corporations and things like driverless cars.
Fine the owner of the self-driving car enough and she will sell it and never buy from this company again. Fine enough owners and the company is broke.
Similarly, we do not punish the Coca-Colas of the world for pumping out fossil-fuel based plastic packaging by the ton, but countries do increasingly enforce consumer recycling (with penalties).
Not saying it’s good, but it is what it is.
So you’re suggesting the effect is correct, but it shouldn’t be first-order, and some invisible free-market force will actually work the way you pretend?
I basically agree with all that. I think I'd further say that the way any kind of practical regulation shakes out, it may never make sense for individuals to own self-driving cars.
If we are to hold the creator of self-driving systems responsible for any poor behavior of their creations (as we very well should!), then they must have both the ability to be certain that all of their deployed software and hardware is always in the intended correct working order, and the ability to perform fixes and updates on demand of either themselves or some regulating agency.
Therefore, it seems pointless for it to ever be possible for an individual to "own" such a thing. They would inevitably be mandated to never attempt to modify the system and to accept any updates and maintenance tasks immediately on the creator's demand. So why should any reasonable individual, or banks who would have to be writing loans to those individuals, want to own such a thing when they have so little control over it and so many liabilities?
Consider a situation like this: the car has 6 laser/lidar sensors. It is designed to continue operating as well as possible if one of them malfunctions. If one of them does in fact malfunction, who is responsible for deciding when to replace it and for paying for such service? If you truly own such a vehicle, wouldn't you be the one paying for it? And if you're paying for it, how could you not be the one deciding exactly when it happens? So then inevitably some people would put it off as long as possible and have the vehicle operating in a degraded condition. Maybe it could try to refuse to operate until serviced, but when? Maybe when you're out of town somewhere, or when you have an important meeting to make it to later?
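To make the sensor question concrete, here's a toy degraded-mode policy in Python. The thresholds and mode names are invented; the point is only that someone has to pick them.

```python
# Toy policy for the six-lidar example. Whoever sets these thresholds
# (owner? manufacturer? regulator?) also decides when the car strands
# its owner, which is exactly the conflict described above.

def operating_mode(working_sensors: int, total: int = 6) -> str:
    if working_sensors == total:
        return "normal"
    if working_sensors >= total - 1:
        return "degraded"  # still drives; owner is tempted to defer service
    return "refuse"        # refuses to operate until serviced

for ok in (6, 5, 4):
    print(ok, "sensors ->", operating_mode(ok))
# 6 sensors -> normal, 5 -> degraded, 4 -> refuse
```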
Liability, even conditional, for the end user doesn't seem like a very good solution either. Is it really a good system for every unfortunate incident to become a legal battle between an individual owner and what would necessarily be a gigantic mega-corporation with a fat legal budget for who is really responsible? Yeah, no.
So I think I just proved that self-driving cars pretty much always have to be operated as a taxi-like system as Waymo etc currently do. Lidar sensor breaks? Not my problem, the service needs to send another one, no matter what the location - it can't go to any location it couldn't send a replacement to anyways. If there's any problems, there's always a clear party responsible - a giant wealthy mega-corp that is very easy to regulate and fine and capable of fixing the whole fleet fast.
Am I missing something here? Maybe, but I don't see it right now.
In between tickets and regulations, there's insurance. Say that my car bumps into a parked car. My insurance pays for the repairs. The insurance company figures out how much to charge me. Simply saying that a driverless car pays for the damage that it causes might be the way to manage this issue.
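One way to read that suggestion is plain experience rating, applied to the car itself. A toy Python sketch, with an invented base rate and surcharge:

```python
# Toy experience rating: at-fault incidents feed back into next year's
# premium, so "the car pays for the damage it causes" becomes a standing
# cost signal for whoever runs it.

BASE_ANNUAL_PREMIUM = 1_200.0
SURCHARGE_PER_INCIDENT = 0.25   # +25% per at-fault incident (invented)
SURCHARGE_CAP = 2.0             # never more than 3x the base rate

def annual_premium(at_fault_incidents: int) -> float:
    surcharge = min(at_fault_incidents * SURCHARGE_PER_INCIDENT, SURCHARGE_CAP)
    return BASE_ANNUAL_PREMIUM * (1 + surcharge)

print(annual_premium(0))  # 1200.0
print(annual_premium(2))  # 1800.0: two bumped parked cars later
```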
Who is liable in case a self-driving car causes an accident? Is the liability any different between an accident and a potential accident (i.e., a ticket)?
I for one do not buy the argument that the intent to break the law is not there. To understand why this is the case: imagine a really fast self driving car that enters the runway at an airport. It speeds up as it attempts to take off. It does not take off and in the process - best case - causes delay for everyone using the airport. Now, I could argue that this car is a plane and it was in fact NOT programmed to NOT take off. Therefore everything is good.
In my opinion: self-driving cars do not belong on the road unless there are real consequences to their screw ups. Right now it's a beta test where you might just lose your life and nobody is going to be held accountable.
They shouldn't be immune from traffic tickets, but manufacturers should be the target of those tickets, under the condition that the driverless features were used in the way the manufacturer intended.
I'm not liable for a traffic ticket when I get in a taxi or on a bus and the driver does something wrong. In a driverless system, the "driver" is the manufacturer, or whoever wrote and updates the self-driving software.
But this also needs to come with some regulation. Driverless mode needs to be audited and certified against a standard, by a third party. Similar (in spirit, anyway) to how human drivers take a driving test before being issued a license.
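The routing rule that position implies could be as simple as the following Python sketch; the flag names are assumptions, and real statutes would be far messier.

```python
# Toy routing rule for the position above: the manufacturer is the
# "driver" only while driverless mode is engaged and used as intended.

def liable_party(autonomous_engaged: bool, used_as_intended: bool) -> str:
    if autonomous_engaged and used_as_intended:
        return "manufacturer"   # their software was driving
    return "human driver"       # manual control, or misuse of the system

print(liable_party(True, True))    # manufacturer
print(liable_party(True, False))   # human driver (system misused)
print(liable_party(False, True))   # human driver (they were driving)
```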
When I've taken Waymo rides, the weirdest part of the experience was that it actually obeyed speed limits and stopped at stop signs. The latter actually freaked out other drivers.
I can't wait until the selfishness of humans is no longer a factor when it comes to road safety.
https://youtu.be/kN0MLclnWa0?feature=shared
here you see two waymos turning right, and the one behind actually turns right from the left lane before the first car proceeds. neither were using indicators either. honestly you’d have to be an incredibly selfish human to pull this off.
[0] https://old.reddit.com/r/SelfDrivingCars/comments/18myjc6/dr...
The turn indicators use PWM'ed LEDs and can be invisible in daylight with fast enough shutter speed on the camera.
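A quick simulation of why that happens: if the camera's exposure is shorter than the LED's off-phase, some frames catch no light at all. The PWM frequency, duty cycle, and shutter speed below are guesses, not measured values for any real vehicle.

```python
import random

# A frame can land entirely in the dark part of the PWM cycle and show
# no indicator, even though the LED looks solidly on to the human eye.

PWM_HZ = 200.0        # assumed indicator PWM frequency
DUTY = 0.3            # assumed on-fraction of each cycle
SHUTTER_S = 1 / 4000  # fast daylight shutter

period = 1 / PWM_HZ
on_time = DUTY * period

def frame_catches_light(start: float) -> bool:
    # LED is on during [k*period, k*period + on_time); check whether the
    # exposure window [start, start + SHUTTER_S) overlaps any on-interval.
    end = start + SHUTTER_S
    k = 0
    while k * period < end:
        if max(start, k * period) < min(end, k * period + on_time):
            return True
        k += 1
    return False

random.seed(1)
trials = 100_000
dark = sum(not frame_catches_light(random.uniform(0, period))
           for _ in range(trials))
print(f"{100 * dark / trials:.1f}% of frames show the indicator fully dark")
```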
Makes sense if the 2nd waymo identified the 1st. If I drove around waymo cars enough, I would probably consider driving around them like this at turns where gaps in traffic were typically long enough for humans but not for a waymo.
Traffic tickets will need to belong to the competent supervisor of an autonomous system violating stated restrictions, even if they lack a driver's license. (The need for a driver's license will decline for FSD L4-L5, especially if the human won't be operating the vehicle anymore.)
Also, under what circumstances/edge cases should speed limits and traffic laws be breakable by an autonomous system?
a. If you were bleeding out and needed to get to a hospital, do you honestly want it to come to a complete stop like a Canadian Mormon nerd at every stop sign, rather than drive like the most aggressive NYC cab driver at ludicrous speed, safe-ish enough so you don't die?
b. Evading someone believed intending on causing great bodily harm to reach a safe place like a police station?
c. Following a vehicle containing a kidnapping or human trafficking victim?
d. Commandeering by a LEO in active pursuit of a dangerous suspect?
e. Fleeing an incoming regional disaster threat?
It is probable, even likely, that through over-regulation untold numbers of preventable deaths, injuries, suffering, and property damage will be incurred by vehicles created by corporations to always follow laws blindly rather than spend additional money to handle edge cases or allow exceptions.
The ticket was $100, and I could appeal it by paying $175, of which $100 would be refunded if my appeal was decided in my favor.
Their city ordinance said I was responsible because I owned the car.
I imagine it’s the same for driverless cars. The registered owner will get and pay the ticket. They don’t care who is driving.
An interesting question, though, is whether driverless cars should have different speed limits, as they can respond very differently from humans.
This kind of reminds me of when the Concorde first started flying and technically could cancel IFR flight and proceed under visual flight rules over the atlantic because it was above 60,000 feet. An interesting legal possibility but likely not a practical consideration until driverless cars are afforded the same legal status as a human driver. As it currently stands, an AI would be considered a driving aid under the command of a licensed human driver, no different than cruise control.
> “It seems like while they make fewer of the kind of mistakes that we see from human drivers, they make interesting new kinds of mistakes,” Raicu said. “It has the feel of a human subject mass experiment, right? Without the kind of consent that we usually want to see as part of that.”
"human subject mass experiment", "consent"?
No, she's wrong there - that's the past. For the past 4 years, mass experimentation has been fine. Just do it!