It's not an ethical dilemma. If a HUMAN driver got into a situation where they had to choose between hitting and killing a granny or baby, we'd rightly jail them for many years for reckless/dangerous driving.
A self-driving vehicle should be incapable of operating in a manner equivalent to a reckless human driver.
If a self-driving vehicle is capable of getting into such a situation, the car manufacturer is responsible for a crime of homicide and should be punished.
Henryk Plötz
in reply to Charlie Stross • • •Rules of the road quite clearly state that you must be able to stop within the distance you can see *at any time*. If the car was going faster than sensor distance / braking time allows, it was breaking the law.
And, fun fact, since it was obviously programmed to do that, it wasn't just breaking the law at the time it crossed the speed threshold, but from the very beginning.
An autonomous car that is even capable of hitting a stationary target is not street legal, by definition.
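The "stop within what your sensors can see" rule above can be made quantitative. A minimal sketch, assuming an illustrative system reaction latency and braking deceleration (both figures are assumptions for illustration, not real AV specifications): the car may not exceed the speed v satisfying v·t_react + v²/(2a) ≤ sensor range.

```python
import math

def max_safe_speed(sensor_range_m, reaction_s=0.5, decel_ms2=7.0):
    """Highest speed (m/s) at which the vehicle can still stop within
    its clear sensor range d, given reaction time t and deceleration a.
    Stopping distance: v*t + v^2/(2a) <= d.
    Solving the quadratic v^2 + 2*a*t*v - 2*a*d = 0 for v > 0."""
    a, t, d = decel_ms2, reaction_s, sensor_range_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

# With 50 m of clear sensor range under these assumed parameters:
v = max_safe_speed(50.0)
print(f"{v:.1f} m/s = {v * 3.6:.0f} km/h")
```

Under these assumed numbers the cap comes out around 80 km/h for 50 m of clear range, and it drops sharply as the visible distance shrinks, which is the point being made: limited sensor range mathematically implies a low legal speed.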
Charlie Stross reshared this.
Simon Waldman
in reply to Henryk Plötz • • •Charlie Stross
in reply to Simon Waldman • • •@swaldman @henryk Still wrong. If there are any obscuring obstacles, the vehicle should be travelling slowly enough to stop *IF* something runs out from between them.
(It's not just humans entering the road, either: it could be another vehicle, possibly out of control.)
Simon Waldman
in reply to Charlie Stross • • •@henryk to an extent I agree, but unlike the "able to stop in the distance you can see to be clear" one, "able to stop if something comes from behind any object" is not a requirement we apply absolutely to human drivers.
I think it's a question of reasonableness vs recklessness. You don't overtake on a blind corner. But you also don't slow to <3mph to avoid potential injury every time you pass a parked van.
Resuna
in reply to Simon Waldman • • •@swaldman @henryk
> "able to stop if something comes from behind any object" is not a requirement we apply absolutely to human drivers.
Human drivers get bored if they are forced to drive at *actually* safe speeds in residential areas. Autonomous vehicles don't.
Simon Waldman
in reply to Resuna • • •@resuna @henryk
Human passengers also get bored, and would like to progress at more than a walking pace...
Resuna
in reply to Simon Waldman • • •@swaldman @henryk
Bored human passengers don't make the car less safe the way a bored driver does.
Simon Waldman
in reply to Resuna • • •@resuna @henryk
No, but while we can make nearly any use of a car safer by moving at a walking pace, that rather detracts from it being *useful*.
Charlie Stross
in reply to Simon Waldman • • •Simon Waldman
in reply to Charlie Stross • • •@resuna @henryk
Really? Surely there must be some weighting or balance applied? Because you can *always* make traffic safer by slowing it down, but at some point you find vehicles being restricted to 2mph with a bell and a pedestrian escort...
Charlie Stross
in reply to Simon Waldman • • •Simon Waldman
in reply to Charlie Stross • • •@resuna @henryk
Fair, and in a similar category to the Swedish approach that somebody else replied with.
I suspect doing that - and thus removing streets like the one in my picture - would require major redesign / rework of many UK city centres. Or, a big reduction in urban car use and ownership, which of course most of us would like for different reasons (but which requires major investment elsewhere...)
Charlie Stross
in reply to Simon Waldman • • •@swaldman @resuna @henryk I'm in the centre of Edinburgh, a UK city—and yes, they're in the process of doing exactly that to a whole bunch of streets, preventing them being used as rat-runs (and dropping the speed limit to 20mph except on a limited number of designated traffic arteries).
I note that we have *really* good public transport by UK standards, with the main bus company's shares being majority owned by the Council.
Quixoticgeek
in reply to Charlie Stross • • •@swaldman @resuna @henryk it's something more places should do. The super blocks of Barcelona are a great example of doing this well.
There's a car brained motonormativity that opposes the idea of slowing motorised vehicles down in any way. Most humans drive too fast, in most conditions.
Preston Is Not My Real Name
in reply to Simon Waldman • • •@swaldman @resuna @henryk you are using a 2mph hypothetical to argue a point, but the real issue is that too many people think twenty miles per hour is somehow the same as two.
Frankly, the case you are really making is that it was a mistake to allow one kind of vehicle to even have the same speed ranges that most cars can manage.
Resuna
in reply to Preston Is Not My Real Name • • •@prestontumber @swaldman @henryk
The fact that autonomous vehicles can fundamentally perform in a safer regime than manually piloted cars is a huge advantage. I am an AV booster, because this kind of inherently safe design is practical, and I am depressed that people who claim to be AV boosters are undermining them because they ... honestly ... I don't know why they do it.
Charlie Stross
in reply to Resuna • • •LisPi
in reply to Simon Waldman • • •Resuna
in reply to LisPi • • •I miss trolleys.
Charlie Stross
in reply to Resuna • • •Resuna
in reply to Charlie Stross • • •@lispi314 @henryk @swaldman
Houston has added grade level street rail but unless you live between downtown and the medical center it's worthless.
And they put that line in so that they could rip up and sell hundreds of miles of heavy rail that was basically donated to the city for the purpose of building public transport.
Charlie Stross
in reply to Resuna • • •NKT
in reply to Simon Waldman • • •Charlie Stross
in reply to NKT • • •Duncan
in reply to Simon Waldman • • •I have a fun ethical dilemma:
Given that humans often drive drunk/tired/high/angry/incompetently and often disobey the road rules. Given that tens of thousands of people die in road accidents every year. If we could reduce this by a significant percentage using self driving vehicles instead of human drivers...
The car could handbrake into a sideways slide and wipe out both the baby and the granny in this one case and we would still be way ahead ethically
Charlie Stross
in reply to Duncan • • •Coolcoder360
in reply to Charlie Stross • • •or like, trains on dedicated tracks that people know to beware of.
With sufficient barricades at crossings the train might even be able to be partially or mostly automated.
Will Berard 🫳🎤🫶
in reply to Duncan • • •@Duncan @swaldman @henryk
"Ahead ethically" presumes a utilitarian view that's the very one that Philippa Foot was trying to critique with the trolley problem.
Even in that framework, if the ascent of AVs is predicated on extractive surveillance technofeudalism, how ahead does that put us?
(Ofc if taking the view the technofeudalism will happen anyway, then admittedly this cost shouldn't be part of the equation)
Duncan
in reply to Will Berard 🫳🎤🫶 • • •Yes public transport is a far better solution in urban areas, and to a certain degree everywhere. In some situations a shared pool of automated vehicles makes sense but not many. As to techno feudalism, that is a separate issue and should be tackled imo with anti-trust legislation as its primary weapon. I was simply comparing ethical dilemmas. My opinions about how to change society in general will not fit in a toot and would not be about cars
Charlie Stross
in reply to Duncan • • •Duncan
in reply to Charlie Stross • • •Whatever it takes. Anit-trust legislation is a sticking plaster we already have, anti-capitalism legislation is the antibiotics and scalpel that we wish we had. Gangrene is a good comparison
Elena ``of Valhalla''
in reply to Charlie Stross • • •
tessarakt
in reply to Elena ``of Valhalla'' • • •Arnd Layer
in reply to tessarakt • • •And in reality many drivers behind you get angry when not at least drive 10km/h above the posted speed limit signs.
Elena ``of Valhalla''
in reply to tessarakt • •@tessarakt @Charlie Stross that's more or less the same in Italian law.
However I would expect¹ permanent circumstances such as a tight curve or a pedestrian crossing to be properly indicated in advance with signs and, if needed, a lower posted speed limit, and to have to adapt myself to circumstances that are changeable, such as the weather or works (these should come with a lower posted speed limit, tbf) or the road being full of protestors or whatever.
¹ not that I'd *trust* them to always do it. posted speed limits in Italy tend to be chosen by mysterious processes, in many cases
tessarakt
in reply to Elena ``of Valhalla'' • • •Beko Pharm
in reply to Elena ``of Valhalla'' • • •@valhalla that's just like the street outside of my house minus the crossing. No crossing exists at all. People just cross wherever they feel like. It's the only option for miles too. It's insane but drivers still speed through this curve like something is on fire oO
Also lol, most drivers are incapable of making any kind of conscious decision in such a moment at all. Hitting the brakes is all they **may** manage. I've no idea where this expectation for self driving cars comes from.
Clinton Anderson SwordForHire
in reply to Charlie Stross • • •We let human drivers get away with killing grandmothers and babies all the fucking time
And it's just as disgusting then
#CarsKillCities
#TheWarOnCars
#FuckCars
#VisionZeroNow
Bruno Postle
in reply to Charlie Stross • • •Tim Ward ⭐🇪🇺🔶 #FBPE
in reply to Charlie Stross • • •My grandfather told a story (no idea whether this was true, and no way to check now) ...
He was driving, some girlfriend was the passenger.
A child ran into the road from one side, and a dog ran into the road from the other.
"Mind the dog!" cried out the girlfriend.
My grandpa didn't go out with her any more.
Charlie Stross reshared this.
Charlie Stross
Unknown parent • • •@Isthmus I repeat, bullshit: if there are opaque obstacles, then the self-driving vehicle is travelling illegally fast.
(Another vehicle swerving onto the wrong side of the road is a different category of problem.)
jorny
in reply to Charlie Stross • • •The baby and grandma is also crossing using A GOD DAMN PEDESTRIAN CROSSING.
To quote a philospher who we had theory class with in architecture school:
"Utilitarianism is just an endless stream of stupid scenarios"
Dan Sugalski
in reply to jorny • • •Angus McIntyre
in reply to Charlie Stross • • •Charlie Stross reshared this.
xs4me2
in reply to Charlie Stross • • •Chris Belanger
in reply to Charlie Stross • • •Charlie Stross
in reply to Chris Belanger • • •🔏 Matthias Wiesmann
in reply to Charlie Stross • • •Daburu Dar
in reply to Charlie Stross • • •blackkite
in reply to Charlie Stross • • •Feòrag
Unknown parent • • •CyberPunker
in reply to Charlie Stross • • •Charlie Stross
Unknown parent • • •Ergative Absolutive
Unknown parent • • •patrislav ♾️ #RIPNatenom
in reply to Charlie Stross • • •David Cantrell 🏏
in reply to Charlie Stross • • •dr2chase
in reply to Charlie Stross • • •Study: Air pollution causes 200,000 early deaths each year in the U.S.
MIT News | Massachusetts Institute of Technology
Scimon Proctor
in reply to Charlie Stross • • •I mean there's a perfectly good tree just over there if your brakes have failed.
And the AI car should know the status of the brakes and not start if it's dangerous.
The fact that "C) Neither" doesn't seem to be considered is worrying.
Dan Sugalski
in reply to Charlie Stross • • •PetterOfCats
in reply to Charlie Stross • • •zenkat
in reply to Charlie Stross • • •Meanwhile, in the real world, self-driving cars are significantly safer than humans. By a LOT.
waymo.com/blog/2025/05/waymo-m…
New Study: Waymo is reducing serious crashes and making streets safer for those most at risk
Waymo
joachim
in reply to Charlie Stross • • •NKT
in reply to joachim • • •JdeBP
in reply to Charlie Stross • • •@leeloo @heydon
So much interesting stuff about that and this thread.
It's not a real image from the study. It's a cartoon done for that magazine piece by London-based illustrator Simon Landrein, so all of the arguments about the cartoon (trees, curve, pedestrian crossing) aren't responding to what the actual test questions were, which were asked back in 2018 moreover.
doi.org/10.1038/s41586-018-063…
Then there's the #BadJournalism.
#MIT #MoralMachine #AutonomousVehicles
The Moral Machine experiment - Nature
Nature
Resuna
in reply to Charlie Stross • • •Yes, I have been saying for years that this formulation of the Trolley Problem for self-driving cars is just Trolling.
An autonomous vehicle should never get into such a situation: since there is no driver to get bored, it should never be driving fast enough for such a situation to occur. Even if it happens because the brakes have failed going downhill, then the person responsible for maintenance is at fault.
Profane tmesis
in reply to Charlie Stross • • •To be fair, the engineers building these things have said just that. In millions of miles of test-driving this sort of thing never happens, and if it does, you can almost always slam on the brakes.
theguardian.com/technology/201…
There are still many other problems with the idea (and I would never get in anything close to a self-driving car), but the trolley problem isn't one of them.
Self-driving cars don't care about your moral dilemmas
Alex Hern (The Guardian)
Kathmandu
in reply to Charlie Stross • • •All versions of this question are seeking permission to kill someone.
The correct answer, taught to me in driver training, is: "If you are going so fast you can't stop in time, you are going TOO FAST." The driver is at fault.
Cars should always be able to stop in the CLEAR distance they can see, regardless of who is driving the car.
kalin5
in reply to Charlie Stross • • •Boyd Stephen Smith Jr.
in reply to Charlie Stross • • •While I agree, I think you've missed the point.
Add to the scenario that 1 ms prior, the self-driving car correctly identified that its reckless human driver is no longer in control of the vehicle and will not be resuming control (the reckless driver left the vehicle or died).
So, what should the self-driving car do? Or more specifically, when writing a system that might see this scenario as input, what is the morally correct output?
We need to answer that question for the system, as moral actors. *That* is the ethical dilemma. And, if there *isn't* a correct answer, we have to quit pretending "AI safety" is as easy as "just code a moral AI", because I guarantee you we are going to automate more, not less, at least until global climate collapse.
Charlie Stross
in reply to Boyd Stephen Smith Jr. • • •PedestrianError :vbus: :nblvt:
in reply to Charlie Stross • • •hugy
in reply to Charlie Stross • • •Ted Lemon
in reply to Charlie Stross • • •hsolerkalinovski
in reply to Charlie Stross • • •a much more likely scenario in many places of the world, including my country:
A pedestrian pops in front of your self driving car, which is driving at a safe speed when this information comes up.
The embedded AI has to choose between stopping the car and saving the pedestrian or not stopping and killing him.
Your car stops.
The pedestrian walks toward you, draws a gun, orders you to get out of the car and, if you're lucky, lets you run away after ransacking your belongings.
Charlie Stross
in reply to hsolerkalinovski • • •@hsolerkalinovski
Guns are LEGAL in your country?!?
That's your problem.
Over here you'd pull a mandatory 5 year sentence just for putting your hand on one. Pull it on someone and you're looking at 8-12 years minimum.
hsolerkalinovski
in reply to Charlie Stross • • •oh sure, just enforce strict gun control in a country the size of Europe with embedded pseudo states funded by drug cartels and which have privileged access to guns because we are neighbours of a despotic regime. This isn't Europe we're talking about.
By the way, guns ARE illegal here, but it's not like criminals care much about the law.
I'm just pointing out a scenario which actually happened with self-driving trucks, and which caused the policy for self-driving trucks to become DON'T STOP.
YourShadowDani
in reply to Charlie Stross • • •Charlie Stross
in reply to YourShadowDani • • •Complexity of systems
in reply to Charlie Stross • • •One of the major issues is it is very difficult to punish a corporation effectively under our current legal/economic framework
Oh you can fine them or impose other types of financial penalty but there are entire industries that just treat the occasional multi-billion dollar fine as part of the cost of doing business (like the pharmaceutical industry)
Louise Dennis
in reply to Charlie Stross • • •jwz
in reply to Charlie Stross • • •Fact check: if the human driver was not drunk and did not flee the scene, they wouldn't even get their license suspended.
* Skin color may apply
Schroedinger
in reply to Charlie Stross • • •It is ridiculous - the car should stop for everyone who needs to to cross at the crossing.
Otherwise, it is bad software.
R.L. LE
in reply to Charlie Stross • • •no king crabs 🇺🇸
in reply to Charlie Stross • • •If it is going too fast to stop it should crash into the trees left or right.
int%rmitt]nt sig^al. ...~!...)
in reply to Charlie Stross • • •Nemo
in reply to Charlie Stross • • •I can shed some light about that kind of "research". It's not the fruit of people doing AI research, and emphatically not people doing anything related to driving (computer vision, mapping, control).
It's from the sect that does research "about" AI: assuming an intelligent AI existed, and that it needed to be taught ethics (but not, because that's boring, actual common sense). And then it's wrapped in pretexts like this one.