
It's not an ethical dilemma. If a HUMAN driver got into a situation where they had to choose between hitting and killing a granny or baby, we'd rightly jail them for many years for reckless/dangerous driving.

A self-driving vehicle should be incapable of operating in a manner equivalent to a reckless human driver.

If a self-driving vehicle is capable of getting into such a situation, the car manufacturer is responsible for a crime of homicide and should be punished.

front-end.social/@heydon/11451…


It's concerning to me that this """ethical dilemma""" comes up so frequently regarding self-driving cars. Does self-driving mean *never slowing or stopping*? Are self-driving cars all designed after the bus in Speed?
in reply to Charlie Stross

Rules of the road quite clearly state that you must be able to stop within the distance you can see *at any time*. If the car was going faster than sensor distance / braking time allows, it was breaking the law.
And, fun fact, since it was obviously programmed to do that, it wasn't just breaking the law at the time it crossed the speed threshold, but from the very beginning.

An autonomous car that is even capable of hitting a stationary target is not street legal, by definition.
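
A minimal sketch of the constraint being described here, assuming a textbook constant-deceleration braking model (the function name and parameter values are illustrative, not from any real AV stack):

```python
import math

# Stopping distance = reaction distance + braking distance:
#   d = v * t_react + v^2 / (2 * a)
# Solving for v gives the highest speed whose stopping distance
# still fits within the sensed clear distance d.

def max_safe_speed(sight_distance_m: float,
                   reaction_time_s: float = 0.5,
                   decel_mps2: float = 6.0) -> float:
    """Largest speed (m/s) that can still stop within sight_distance_m."""
    t, a, d = reaction_time_s, decel_mps2, sight_distance_m
    return a * (-t + math.sqrt(t * t + 2.0 * d / a))

# Example: 40 m of sensed clear road ahead -> about 19 m/s (~69 km/h).
v = max_safe_speed(40.0)
print(f"{v:.1f} m/s = {v * 3.6:.0f} km/h")
```

Anything above that speed is, by the rule quoted above, already illegal before anything steps into the road.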

in reply to Henryk Plötz

@henryk agreed when it comes to stationary objects, but I guess the "ethical dilemma" that is endlessly trotted out still applies in the scenario that the granny and the baby both run out from behind obscuring obstacles too late to brake. A common occurrence, I'm sure you will agree ;-)
in reply to Simon Waldman

@swaldman @henryk Still wrong. If there are any obscuring obstacles, the vehicle should be travelling slowly enough to stop *IF* something runs out from between them.

(It's not just humans entering the road, either: it could be another vehicle, possibly out of control.)

in reply to Charlie Stross

@henryk to an extent I agree, but unlike the "able to stop in the distance you can see to be clear" one, "able to stop if something comes from behind any object" is not a requirement we apply absolutely to human drivers.

I think it's a question of reasonableness vs recklessness. You don't overtake on a blind corner. But you also don't slow to <3mph to avoid potential injury every time you pass a parked van.

in reply to Simon Waldman

@swaldman @henryk

> "able to stop if something comes from behind any object" is not a requirement we apply absolutely to human drivers.

Human drivers get bored if they are forced to drive at *actually* safe speeds in residential areas. Autonomous vehicles don't.

in reply to Resuna

@resuna @henryk

Human passengers also get bored, and would like to progress at more than a walking pace...

in reply to Simon Waldman

@swaldman @henryk

Bored human passengers don't make the car less safe the way a bored driver does.

in reply to Resuna

@resuna @henryk

No, but while we can make nearly any use of a car safer by moving at a walking pace, that rather detracts from it being *useful*.

in reply to Simon Waldman

@swaldman @resuna @henryk Disagree. Usefulness is ALWAYS a lesser priority than public safety when discussing traffic.
in reply to Charlie Stross

@resuna @henryk

Really? Surely there must be some weighting or balance applied? Because you can *always* make traffic safer by slowing it down, but at some point you find vehicles being restricted to 2mph with a bell and a pedestrian escort...

in reply to Simon Waldman

@swaldman @resuna @henryk If that's the problem the correct solution is to block that street to through traffic. Turn it into parking-only/local residential access.
in reply to Charlie Stross

@resuna @henryk

Fair, and in a similar category to the Swedish approach that somebody else replied with.

I suspect doing that - and thus removing streets like the one in my picture - would require major redesign / rework of many UK city centres. Or, a big reduction in urban car use and ownership, which of course most of us would like for different reasons (but which requires major investment elsewhere...)

in reply to Simon Waldman

@swaldman @resuna @henryk I'm in the centre of Edinburgh, a UK city—and yes, they're in the process of doing exactly that to a whole bunch of streets, preventing them being used as rat-runs (and dropping the speed limit to 20mph except on a limited number of designated traffic arteries).

I note that we have *really* good public transport by UK standards, with the main bus company's shares being majority owned by the Council.

in reply to Charlie Stross

@swaldman @resuna @henryk it's something more places should do. The superblocks of Barcelona are a great example of doing this well.

There's a car-brained motonormativity that opposes the idea of slowing motorised vehicles down in any way. Most humans drive too fast, in most conditions.

in reply to Simon Waldman

@swaldman @resuna @henryk you are using a 2mph hypothetical to argue a point, but the real issue is that too many people think twenty miles per hour is somehow the same as two.

Frankly, the case you are really making is that it was a mistake to allow one kind of vehicle to even have the same speed ranges that most cars can manage.

in reply to Preston Is Not My Real Name

@prestontumber @swaldman @henryk

The fact that autonomous vehicles can fundamentally perform in a safer regime than manually piloted cars is a huge advantage. I am an AV booster, because this kind of inherently safe design is practical, and I am depressed that people who claim to be AV boosters are undermining them because they ... honestly ... I don't know why they do it.

in reply to Resuna

@resuna @prestontumber @swaldman @henryk I'm willing to concede that autonomous vehicles probably *can* perform better than humans on a rationally designed highway system. But that would be one fully segregated from traffic other than AVs and professional drivers (with training/supervision). The actual existing British road network is a hard "nope" for them as most of it was designed for drunken pedestrians and horses and haphazardly upgraded over subsequent centuries.
in reply to Simon Waldman

It's almost like trains would be a better option yet again.
in reply to Resuna

@resuna @lispi314 @henryk @swaldman Edinburgh got rid of its trams around 1950. Then spent a huge amount of money over 10 years building a new track (which opened in 2014). They're currently planning another route. But what makes it work is a combination of frequent services and seamless ticketing between trams and buses.
in reply to Charlie Stross

@lispi314 @henryk @swaldman

Houston has added grade level street rail but unless you live between downtown and the medical center it's worthless.

And they put that line in so that they could rip up and sell hundreds of miles of heavy rail that was basically donated to the city for the purpose of building public transport.

in reply to Resuna

@resuna @lispi314 @henryk @swaldman I'm going to go further and say: cities built around the automobile are worthless—they're not fit habitats for real live human beings. Need to be torn down and rebuilt in a more compact, walkable/cyclable/public-transport-friendly layout.
in reply to NKT

@Dss @swaldman @resuna @henryk Or catch a train. Lie back in a recliner, work on a laptop, get up and go for a walk, use the bathroom or buy some food and drink at the cafe while the world goes by and someone else does the driving—at 320km/h.
in reply to Simon Waldman

@swaldman @henryk
I have a fun ethical dilemma:
Given that humans often drive drunk/tired/high/angry/incompetently and often disobey the road rules, and given that tens of thousands of people die in road accidents every year: if we could reduce this by a significant percentage by using self-driving vehicles instead of human drivers...
The car could handbrake into a sideways slide and wipe out both the baby and the granny in this one case, and we would still be way ahead ethically.
in reply to Duncan

@Duncan @swaldman @henryk Wrong. The best solution would be teleport booths (then we can do away with cars … and public highways too, for that matter). FAILING THAT we strive to reduce the carnage. But letting human drivers who break the law off the hook just because self-driving vehicles are less errant is a cop-out, too.
in reply to Charlie Stross

@Duncan @swaldman @henryk
or like, trains on dedicated tracks that people know to beware of.
With sufficient barricades at crossings the train might even be able to be partially or mostly automated.
in reply to Duncan

@Duncan @swaldman @henryk
"Ahead ethically" presumes a utilitarian view that's the very one that Philippa Foot was trying to critique with the trolley problem.

Even in that framework, if the ascent of AVs is predicated on extractive surveillance technofeudalism, how far ahead does that put us?

(Ofc if taking the view that technofeudalism will happen anyway, then admittedly this cost shouldn't be part of the equation)

in reply to Will Berard 🫳🎤🫶

@MrBerard @swaldman @henryk
Yes, public transport is a far better solution in urban areas, and to a certain degree everywhere. In some situations a shared pool of automated vehicles makes sense, but not many. As to technofeudalism, that is a separate issue and should be tackled imo with anti-trust legislation as its primary weapon. I was simply comparing ethical dilemmas. My opinions about how to change society in general will not fit in a toot, and would not be about cars.
in reply to Duncan

@Duncan @MrBerard @swaldman @henryk ITYM anti-capitalism legislation, not anti-trust (which would be like slapping a sticking plaster on a gangrenous leg, in our current situation).
in reply to Charlie Stross

@MrBerard @swaldman @henryk
Whatever it takes. Anti-trust legislation is a sticking plaster we already have; anti-capitalism legislation is the antibiotics and scalpel that we wish we had. Gangrene is a good comparison.
in reply to Charlie Stross

@Charlie Stross to be honest, in that specific situation I'd also jail the planners involved in putting a pedestrian crossing just past a blind curve on a street with a speed limit too high to allow cars to stop before the pedestrian crossing.

in reply to Elena ``of Valhalla''

@valhalla The "speed limit" you are referring to is, at least in German law, only one of 4 categories of speed limits: general speed limit according to road class, posted speed limit signs, being able to stop within sight range, and adaption of speed to the circumstances.
in reply to tessarakt

@tessarakt @valhalla
And in reality many drivers behind you get angry if you don't drive at least 10 km/h above the posted speed limit.
in reply to tessarakt

@tessarakt @Charlie Stross that's more or less the same in Italian law.

However I would expect¹ permanent circumstances such as a tight curve or a pedestrian crossing to be properly indicated in advance with signs and, if needed, a lower posted speed limit, and to have to adapt myself to circumstances that are changeable, such as the weather or roadworks (these should come with a lower posted speed limit, tbf) or the road being full of protestors or whatever.

¹ not that I'd *trust* them to always do it. posted speed limits in Italy tend to be chosen by mysterious processes, in many cases

in reply to Elena ``of Valhalla''

@valhalla That would be illegal in Germany. Warning signs may only be posted if an observant driver would likely miss the dangerous situation.
in reply to Elena ``of Valhalla''

@valhalla that's just like the street outside of my house minus the crossing. No crossing exists at all. People just cross wherever they feel like. It's the only option for miles too. It's insane but drivers still speed through this curve like something is on fire oO

Also lol, most drivers are incapable of making any kind of conscious decision in such a moment at all. Hitting the brakes is all they **may** manage. I've no idea where this expectation for self-driving cars comes from.

in reply to Charlie Stross

at 20 mph, the braking distance is about 6 metres, barely more than the length of a car. At sane urban speeds simply stopping is always the right decision.
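
The arithmetic holds up under the usual constant-deceleration model (assuming the roughly 6.7 m/s² figure behind the Highway Code braking distances):

```python
# Braking distance at 20 mph under constant deceleration: d = v^2 / (2a)
v = 20 * 0.44704      # 20 mph in m/s (~8.9 m/s)
a = 6.7               # m/s^2, approximate Highway Code assumption
print(f"{v**2 / (2*a):.1f} m")  # ~6.0 m
```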
in reply to Charlie Stross

My grandfather told a story (no idea whether this was true, and no way to check now) ...

He was driving, some girlfriend was the passenger.

A child ran into the road from one side, and a dog ran into the road from the other.

"Mind the dog!" cried out the girlfriend.

My grandpa didn't go out with her any more.

Unknown parent

Charlie Stross

@Isthmus I repeat, bullshit: if there are opaque obstacles, then the self-driving vehicle is travelling illegally fast.

(Another vehicle swerving onto the wrong side of the road is a different category of problem.)

@Nyx
in reply to Charlie Stross

The baby and grandma are also crossing using A GOD DAMN PEDESTRIAN CROSSING.

To quote a philosopher we had a theory class with in architecture school:

"Utilitarianism is just an endless stream of stupid scenarios"

in reply to jorny

@jorny I think "Utilitarianism is just an endless stream of stupid scenarios" is a quote I may trot out with some regularity going forward...
in reply to Charlie Stross

@heydon What if the self-driving car was hurrying to take its owner to the scene of a potential trolley accident that required their intervention?

in reply to Charlie Stross

Indeed, there is no ethical dilemma. The manufacturer is responsible.
in reply to Charlie Stross

I feel like these "dilemmas" are often brought up by self driving proponents as a distraction. These cars routinely cause traffic issues, hit gates and other small obstacles and have numerous other issues doing the basic job, but by framing the discussion like this the conversation starts out assuming a much higher level of sophistication than is really there.
in reply to Chris Belanger

@Feasoron Same with the "we must figure out how to build in guardrails in case ChatGPT turns into Skynet and decides to exterminate humanity" AI-boosters. They're trying to design a TGV network while Robert Stephenson is still trying to get his "Rocket" to haul a 20-passenger carriage at 30mph over 10 miles.
in reply to Charlie Stross

@Feasoron one dilemma I see is the French highway problem. Cars are so close that safe braking (human or otherwise) is impossible. A self-driving car has the choice of slowing down to get a safe braking distance (and suck) or driving like a Frenchman - and the company would be liable.
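
A rough sketch of the gap the AV would need to open up, under the common simplifying assumption that the car ahead brakes at least as hard as you can, so the required gap is dominated by your own reaction/processing lag (parameter values are illustrative):

```python
def min_safe_gap(speed_mps: float,
                 lag_s: float = 0.3,    # sensing + actuation lag
                 margin_m: float = 2.0) -> float:
    """If the lead car brakes as hard as we can, we keep travelling
    for lag_s seconds before matching its deceleration; the gap
    must absorb that distance plus a safety margin."""
    return speed_mps * lag_s + margin_m

# At 130 km/h even a 0.3 s lag demands ~13 m of clear gap -
# far more than typical tailgating distance.
print(f"{min_safe_gap(130 / 3.6):.1f} m")
```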
in reply to Charlie Stross

they're trying to build a TGV network while hewing the first wheel out of stone.
in reply to Charlie Stross

it's one of the bigger issues with autonomous vehicles: as long as they share space with human drivers, it's hard to navigate traffic AND move safely. It's just something that human drivers rarely do; they always drive too fast and are incapable of correctly assessing their braking distance. Sadly, even Waymo now teaches its vehicles to go into unsafe states to be able to navigate traffic at all. Which tells you more about human drivers than about the limitations of robotic systems 😉.
Unknown parent

Feòrag
@Tallish_Tom @swaldman @henryk The annoying thing about the 20mph limit on our city centre street is that it increases capacity by allowing the cars to be closer. Not only does this mean more pollution, but it makes crossing the road impossible at certain times. We need another pelican crossing further up. But hey, if I get hit by a slower car, it’ll hurt less so that’s fine, right?
in reply to Charlie Stross

There is no dilemma in that picture: avoid the baby and grandma and drive over the empty walkway, if it really is not possible to stop. If the car cannot do that, it is not safe to drive in real traffic.
Unknown parent

Charlie Stross
@ergative @0xDEADBEEF Not just the CEO; the board of directors, jointly and severally. Allow a design flaw to go to market that kills drivers, passengers, OR EXTERNAL BY-STANDERS (this includes via emissions!) and they should risk a jail term.
Unknown parent

Ergative Absolutive
@0xDEADBEEF I think if the CEO were legally liable for the deaths caused by its product, we'd find that the engineering team and software developers would get dramatically different priorities and steers from management.
in reply to Charlie Stross

To be honest, the human driver wouldn’t receive anything more than a mild slap on the wrist, in our current „motorist supremacy“. And that’s the precise reason why these bullshit artificial „ethical dilemmas“ keep getting invented: Because people feel uncomfortable granting the computer the same murderous privilege they grant themselves.
in reply to Charlie Stross

there needs to be personal responsibility, not corporate responsibility. Limited liability companies exist to protect distant investors from the actions of managers, which is fair enough, but all too often they end up protecting managers too, because the manager can hide behind "I didn't know so I'm not responsible". It's their job to know, and not knowing seems to me like it should be culpable negligence.
in reply to Charlie Stross

@ergative @0xDEADBEEF Vehicle emissions in the US are (or were, at the last estimate) more deadly than our deplorable crash death rate; the estimate was 53k early deaths per year. news.mit.edu/2013/study-air-po… (This is also true in Europe: safer roads, but more diesel autos, especially in urban areas.)
in reply to Charlie Stross

I mean there's a perfectly good tree just over there if your brakes have failed.

And the AI car should know the status of the brakes and not start if it's dangerous.

The fact that "C) Neither" doesn't seem to be considered is worrying.

in reply to Charlie Stross

Coming into this late I see an awful lot of the replies are basically "but what if the robo-car is driving in an unsafe way, just the way I do every single day?" and... well, those are kind of a telling set of responses.
in reply to Charlie Stross

I feel like these "viral ethical dilemma" situations (trolly etc) are just candy for the dehumanizing tech bros . It gives them the false sense they are "smart like Jigsaw" because it's snuff porn disguised as pseudo-intellectual philosophy. It gives them two things they thrive off of: makes them look superior and de-empathizes humanity further.
in reply to Charlie Stross

Meanwhile, in the real world, self-driving cars are significantly safer than humans. By a LOT.

waymo.com/blog/2025/05/waymo-m…

in reply to Charlie Stross

Interesting how they never suggest 'crash into the tree and kill the car's occupant' as an option.
in reply to joachim

@joachim There's no-one in the car. It's just got to get to the fare pickup before some other service, to maximise shareholder value.
in reply to Charlie Stross

@leeloo @heydon

So much interesting stuff about that and this thread.

It's not a real image from the study. It's a cartoon done for that magazine piece by London-based illustrator Simon Landrein, so all of the arguments about the cartoon (trees, curve, pedestrian crossing) aren't responding to what the actual test questions were, which, moreover, were asked back in 2018.

doi.org/10.1038/s41586-018-063…

Then there's the #BadJournalism.

#MIT #MoralMachine #AutonomousVehicles

in reply to Charlie Stross

Yes, I have been saying for years that this formulation of the Trolley Problem for self-driving cars is just Trolling.

An autonomous vehicle should never get into such a situation: since there is no driver to get bored, it should never be driving fast enough for such a situation to occur. And even if it happens because the brakes have failed while going downhill, then the person responsible for maintenance is at fault.

in reply to Charlie Stross

To be fair, the engineers building these things have said just that. In millions of miles of test-driving this sort of thing never happens, and if it does, you can almost always slam on the brakes.

theguardian.com/technology/201…

There are still many other problems with the idea (and I would never get in anything close to a self-driving car), but the trolley problem isn't one of them.

in reply to Charlie Stross

All versions of this question are seeking permission to kill someone.

The correct answer, taught to me in driver training, is: "If you are going so fast you can't stop in time, you are going TOO FAST." The driver is at fault.

Cars should always be able to stop in the CLEAR distance they can see, regardless of who is driving the car.

in reply to Charlie Stross

Insane idea, but hear me out... the car should STOP at the crosswalk.
in reply to Charlie Stross

While I agree, I think you've missed the point.

Add to the scenario that 1 ms prior, the self-driving car correctly identified that its reckless human driver is no longer in control of the vehicle and will not be resuming control (the reckless driver left the vehicle or died).

So, what should the self-driving car do? Or more specifically, when writing a system that might see this scenario as input, what is the morally correct output?

We need to answer that question for us, as moral actors. *That* is the ethical dilemma. And if there *isn't* a correct answer, we have to quit pretending "AI safety" is as easy as "just code a moral AI", because I guarantee you we are going to automate more, not less, at least until global climate collapse.

in reply to Boyd Stephen Smith Jr.

@BoydStephenSmithJr Emergency stop, obviously. (Lower priority: leave the road clear. If someone's tailgating so close they can't stop in time, tough luck—that's their problem.)
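
A minimal sketch of that priority ordering as fallback logic (an entirely hypothetical structure; real minimal-risk-manoeuvre logic in an AV is far more involved):

```python
from enum import Enum, auto

class FallbackAction(Enum):
    EMERGENCY_STOP = auto()        # highest priority: hit nothing
    CONTROLLED_PULL_OVER = auto()  # lower priority: clear the road

def fallback_action(collision_imminent: bool) -> FallbackAction:
    """Driver gone: stop immediately if anything is in the path;
    otherwise clear the roadway in a controlled manner. Tailgaters
    who can't stop in time are not this vehicle's problem."""
    if collision_imminent:
        return FallbackAction.EMERGENCY_STOP
    return FallbackAction.CONTROLLED_PULL_OVER
```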
in reply to Charlie Stross

It doesn't matter if the two humans are a baby and a presumed feeble and innocent "granny" (who may or may not have children or grandchildren, or have done untold evils), or two adult men, one a 20-year-old actual MS-13 member and the other a 45-year-old fascist techbro. People who assume the responsibility for operating cars, whether manually or by designing "self-driving" features, have a responsibility to avoid killing or hurting other people.
in reply to Charlie Stross

why is the manufacturer the one who's responsible? Self-driving is an option, not mandatory. It's the owner who controls the car. Besides, car companies always state that you must not give full control to the autopilot and that the driver must be ready to take over at any moment.
in reply to Charlie Stross

While we might wish this were true, in fact drivers generally are not punished for killing pedestrians or cyclists. Generally the cyclist was doing something wrong, or the pedestrian was "jaywalking." Even if they were in the crosswalk.
in reply to Charlie Stross

a much more likely scenario in many places of the world, including my country:

A pedestrian pops up in front of your self-driving car, which is driving at a safe speed when this information comes up.

The embedded AI has to choose between stopping the car and saving the pedestrian or not stopping and killing him.

Your car stops.

The pedestrian walks toward you, draws a gun, orders you to get out of the car and, if you're lucky, lets you run away after ransacking your belongings.

in reply to hsolerkalinovski

@hsolerkalinovski

Guns are LEGAL in your country?!?

That's your problem.

Over here you'd pull a mandatory 5 year sentence just for putting your hand on one. Pull it on someone and you're looking at 8-12 years minimum.

in reply to Charlie Stross

oh sure, just enforce strict gun control in a country the size of Europe, with embedded pseudo-states funded by drug cartels which have privileged access to guns because we are neighbours of a despotic regime. This isn't Europe we're talking about.

By the way, guns ARE illegal here, but it's not like criminals care much about the law.

I'm just pointing out a scenario which actually happened with self-driving trucks, and which made the policy for self-driving trucks become DON'T STOP.

in reply to Charlie Stross

We can't even hold cops accountable; you think they are going to hold companies accountable without massive reform?
in reply to YourShadowDani

@YourShadowDani Massive reform is necessary if we're going to survive this century—or maybe even this decade.
in reply to Charlie Stross

@YourShadowDani
One of the major issues is that it is very difficult to punish a corporation effectively under our current legal/economic framework.
Oh, you can fine them or impose other types of financial penalty, but there are entire industries that just treat the occasional multi-billion-dollar fine as part of the cost of doing business (like the pharmaceutical industry).
in reply to Charlie Stross

There is a whole literature on the inadequacy of these problems, and yet they continue to be the problems most used to demonstrate that some AI can reason ethically (see the paper where I am currently arguing with referee 3, but which I hope will be published eventually).
in reply to Charlie Stross

Fact check: if the human driver was not drunk and did not flee the scene, they wouldn't even get their license suspended.

* Skin color may apply

in reply to Charlie Stross

It is ridiculous - the car should stop for everyone who needs to cross at the crossing.

Otherwise, it is bad software.

in reply to Charlie Stross

Also, full braking is an option that's not in the picture.
in reply to Charlie Stross

Robot cars should obey the First Law of Robotics: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
If it is going too fast to stop, it should crash into the trees to the left or right.
in reply to Charlie Stross

I can shed some light about that kind of "research". It's not the fruit of people doing AI research, and emphatically not people doing anything related to driving (computer vision, mapping, control).

It's from the sect that does research "about" AI, assuming an intelligent AI existed and needed to be taught ethics, but not (because that's boring) actual sense. And then it's wrapped in pretexts like this one.
