this post was submitted on 23 Mar 2026
28 points (86.8% liked)

Ask Lemmy


A Fediverse community for open-ended, thought provoking questions



The legal challenge is this: a driverless EV runs someone over, but what happens is that corporations (often) bribe the judicial parties into not charging them with a hit & run, even though the victims' families want justice for the vehicles killing pedestrians. The only "prevention" is harm reduction (investing in technology that can detect human presence, and sensors that activate in pedestrian-heavy areas and stop the vehicle).

Usually, when it's an EV with no human driver behind the wheel: is it still considered a criminal offense if the driverless EV runs somebody over and just keeps driving? In that case it's mainly the rideshare companies (i.e. Uber, Lyft, DiDi, etc.) that face criminal liability. Regardless, the companies that dispatch the EVs get sued when their vehicles run somebody over and don't stop.

top 30 comments
[–] Kolanaki@pawb.social 11 points 21 hours ago* (last edited 21 hours ago)

You're fucking lucky if a human driver is held accountable for killing a pedestrian, in the US. Ain't nobody getting held to shit if it's a robot car.

[–] slazer2au@lemmy.world 18 points 1 day ago (2 children)

While criminal charges may not follow, insurance companies absolutely will find someone to blame and to bear the cost.

[–] JackbyDev@programming.dev 2 points 8 hours ago (1 children)
[–] slazer2au@lemmy.world 1 points 6 hours ago

A government safety agency doing its job in this day and age? What a novel idea.

[–] DarrinBrunner@lemmy.world 6 points 22 hours ago

Which for the company is perhaps just the cost of doing business, if they're obscenely profitable.

[–] eksb@programming.dev 13 points 1 day ago (1 children)

Depends where. In America, human drivers are not held accountable for running over pedestrians, so I am not sure why a computer would be.

[–] LordFireCrotch@lemmy.today 3 points 18 hours ago

They're not? Omw to run over all pedestrians free of accountability!

[–] JohnnyCanuck 9 points 1 day ago (1 children)

Why does it matter if it's an EV?

[–] WoolyNelson@lemmy.world 5 points 1 day ago (1 children)

Mainly because EVs are the vehicle type trying out driverless technology.

[–] JohnnyCanuck -1 points 1 day ago (1 children)

Okay, but a quote like: "Usually, when it’s a EV (with no human driver behind the wheel)" makes it seem like the poster thinks that being an electric vehicle is a prerequisite to being driverless. Also, it seems like the poster thinks that "EV" almost implies "driverless".

[–] quediuspayu@lemmy.dbzer0.com 1 points 23 hours ago* (last edited 23 hours ago)

Acronyms: at some point someone stops caring what each letter stands for and makes an assumption, and then other people use the term that way.

[–] bacon_pdp@lemmy.world 8 points 1 day ago (1 children)

The vehicle owner is criminally liable for murder in that case.

I support the corporate death penalty.

[–] village604@adultswim.fan 2 points 20 hours ago (1 children)

I'm not sure that you know what the legal definition of murder is.

[–] bacon_pdp@lemmy.world -3 points 20 hours ago (1 children)

Murder is the unlawful killing of a human being with malice aforethought.

https://www.law.cornell.edu/uscode/text/18/1111

It is no different from firing a bullet from a gun into an area where people are walking. Claiming "it's just software" doesn't change that fact.

[–] village604@adultswim.fan 2 points 16 hours ago* (last edited 16 hours ago) (1 children)

And how exactly does a self driving car exhibit malice? Do you think they're specifically programmed to kill people?

The word you're looking for is manslaughter, or just homicide. Murder requires intent.

[–] bacon_pdp@lemmy.world -3 points 16 hours ago (1 children)

Yes, they have to explicitly tell it in software to kill others (such as pedestrians) to save the lives of the passengers or prevent the theft of the vehicle itself.

[–] village604@adultswim.fan 3 points 12 hours ago (1 children)

I don't think you have a solid understanding of how code like that works.

[–] bacon_pdp@lemmy.world 0 points 9 hours ago (1 children)

It is just a simple OODA loop (observe, orient, decide, act) that needs to have a decision made for when constraints conflict, which by definition means deciding who dies. The manufacturers have been explicitly clear about that point so that they can dodge liability.
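A toy sketch of the kind of loop being described (all names, options, and harm scores here are invented for illustration, not any manufacturer's actual code):

```python
# Hypothetical OODA-style decision loop. Every name and number is made up;
# the point is only that when constraints conflict, *some* rule in the
# "decide" step has to rank the outcomes.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    pedestrian_harm: float  # 0.0 = no harm, 1.0 = likely fatality
    occupant_harm: float

def observe():
    # Stand-in for sensor fusion (cameras, lidar, radar, ...).
    # Here it just returns a fixed, contrived emergency scenario.
    return "obstacle ahead, pedestrian on the right, wall on the left"

def orient(scene):
    # Stand-in for planning: enumerate candidate maneuvers with estimated harms.
    return [
        Maneuver("brake hard",   pedestrian_harm=0.2, occupant_harm=0.1),
        Maneuver("swerve right", pedestrian_harm=0.9, occupant_harm=0.0),
        Maneuver("swerve left",  pedestrian_harm=0.0, occupant_harm=0.8),
    ]

def decide(options):
    # The contested step: a single scoring rule resolves the conflict between
    # protecting the pedestrian and protecting the occupant.
    return min(options, key=lambda m: m.pedestrian_harm + m.occupant_harm)

def act(maneuver):
    print(f"executing: {maneuver.name}")

def ooda_step():
    scene = observe()
    options = orient(scene)
    choice = decide(options)
    act(choice)

ooda_step()  # prints "executing: brake hard" in this contrived scenario
```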

[–] village604@adultswim.fan 1 points 9 hours ago* (last edited 9 hours ago) (1 children)

That's like saying a CPU is just a few switches.

What the computer is doing is picking the least bad option, with more weight towards the occupant's survival (a rough sketch of that kind of weighting is at the end of this comment). But if possible it's going to pick an option where no one dies.

You wouldn't charge a human with murder for making the choice between saving themselves or someone else.

And this is ignoring the fact that a self driving vehicle would be in such a situation far less often than humans are.
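A minimal sketch of that kind of weighting, with entirely made-up numbers and option names (not any real vendor's logic): zero-fatality options always win outright, and the occupant-weighted trade-off only applies among the remaining bad options.

```python
# Hypothetical cost function for choosing between candidate maneuvers.
# The weights and the option list are invented purely for illustration.

OCCUPANT_WEIGHT = 1.0
PEDESTRIAN_WEIGHT = 0.8  # assumption: occupant survival weighted slightly higher

def cost(option):
    fatality_risk = option["occupant_risk"] + option["pedestrian_risk"]
    # Tier 1: an option with essentially no fatality risk beats every option
    # that has one, regardless of weighting.
    no_fatality = fatality_risk < 0.05
    # Tier 2: among risky options, apply the occupant-vs-pedestrian weighting.
    weighted = (OCCUPANT_WEIGHT * option["occupant_risk"]
                + PEDESTRIAN_WEIGHT * option["pedestrian_risk"])
    return (0 if no_fatality else 1, weighted)

options = [
    {"name": "emergency brake", "occupant_risk": 0.02, "pedestrian_risk": 0.02},
    {"name": "swerve left",     "occupant_risk": 0.70, "pedestrian_risk": 0.00},
    {"name": "swerve right",    "occupant_risk": 0.00, "pedestrian_risk": 0.90},
]

best = min(options, key=cost)
print(best["name"])  # "emergency brake": the no-fatality option wins outright
```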

[–] bacon_pdp@lemmy.world -1 points 9 hours ago

We are currently charging women for murder because they have had ectopic pregnancies removed. So no, that is not how the laws about murder are currently being applied in the USA.

And yes, I agree that a Turing complete CPU could be just a few dozen transistors. (Or just one depending on how you count ROM)

[–] Iconoclast@feddit.uk 3 points 21 hours ago (3 children)

These autonomous vehicle trolley problems go just as deep as you want them to - and there are no real right or wrong answers for any of it.

What if the car is faced with a situation where it can either hit a pedestrian to save the passenger or drive over a cliff and save the pedestrian by sacrificing the passenger?

What if a collision is unavoidable but it has the option to choose between hitting a child or hitting a granny?

It's only a matter of time until we have self-driving vehicles that are far safer drivers than humans - but they still won't be flawless, and accidents will keep happening. Can we live with there being no one to blame for it? Or do we just go back to human drivers with higher accident rates - at least then we have someone to point our fingers at?

[–] porcoesphino@mander.xyz 5 points 21 hours ago (1 children)

All this is pretty much true... but a company still did QA on the code/hardware combo. The cost of the deaths they caused should be part of the cost they need to pay. It's part of their negative externalities, even if those are lower than the human driver's (who ideally is charged with theirs).

I'm not claiming the legal system says this, or that it's likely to happen. Just that the logic doesn't seem complicated or ambiguous.

[–] Iconoclast@feddit.uk 1 points 20 hours ago* (last edited 20 hours ago) (1 children)

We punish people for speeding, driving under the influence, or texting while driving because those behaviors are reckless and we want to deter them. That makes sense.

But what about a freak accident where the driver did nothing wrong? Should they still be punished just because it happened to be them in the wrong place at the wrong time? In my opinion, no. If they didn't do anything reckless or negligent, there's no reason to think punishment would teach them anything useful. At that point it just feels like we're satisfying our need for vengeance rather than serving any logical purpose.

With a self-driving car, every accident would basically fall into that "freak accident" category. The car wasn't distracted, drunk, or driving recklessly. Maybe you could argue the company should pay compensation to the victim's family or at least cover medical costs if the person survived as a gesture of good will - but I don't see how the company would be morally responsible in a way that justifies fining or punishing them.

Just thinking aloud here. I don't know what the actual answer is.

[–] porcoesphino@mander.xyz 3 points 20 hours ago (1 children)

Accidents with driverless cars simply don't fall into the freak category like you claim. Thousands of hours were put into making the decisions that led to this point. They were all made by the manufacturer (or by the software + hardware combo with final QA; for now that's the same company, but financial punishments are not difficult to split).

The legal system here is in place for someone to pay for the fact that a person is no longer alive who ideally would be. It's not complicated when reasoning about what caused them to no longer be alive.

Again, the legal system may not come close to agreeing, and society may never either. Kind of like I find it hard to imagine someone being fined for stealing candy from a baby, even though it seems obvious there was harm and who caused it.

[–] porcoesphino@mander.xyz 1 points 20 hours ago

Actually... simple point.

You take a corner, there is a defect in your tires so you can't turn well, you hit someone, and the investigation shows the defect in the tires. Who pays / is to blame?

I'm not saying the trolley problem style arguments aren't true for driverless cars; society will need to adapt. I just think having the companies pay still gets us to safer roads, but with accountability and without society hiding the costs these companies impose.

[–] DougPiranha42@lemmy.world 2 points 17 hours ago (1 children)

OP is asking about an accident and responsibility, not a trolley problem.
You pretend AVs can't make mistakes, they just face difficult choices. That's false.
You also pretend AVs are safer than human drivers. Nobody knows if that’s true so let’s not pretend it is.

And even if they're safer than drivers, someone (other than the victim, ideally) still needs to be responsible when they inevitably hit someone.

[–] village604@adultswim.fan 1 points 21 hours ago

We already have self driving cars that are way safer than humans (Waymo).

But that bar is so low the devil would have to start digging to go under it.

[–] DarrinBrunner@lemmy.world 3 points 22 hours ago

We won't get different laws for them until this happens enough to catch public ire, and we shout about it. Till then, the cops don't care, and neither do the courts or lawmakers. Our "justice" system exists primarily to keep us poors in line and to protect the rich.

[–] Skyrmir@lemmy.world 2 points 23 hours ago

It's the same as when the Bloodline... I mean Bright Line, hits a person. Fault is derived from the person's behavior and the safety protocols between the corporation and the state.