You are viewing the historical archive of The Philosophy Forum.

deontology: what is the difference between the trolley problem and Bentham's act utilitarianism?

ernestm January 21, 2020 at 13:23 2775 views 4 comments
A simple YouTube comic-book video about the trolley problem, along with some others describing variations of the issue:

https://youtu.be/bOpf6KcWYyw

and a wikipedia article

https://en.wikipedia.org/wiki/Trolley_problem

As an Oxford philosopher, I regard it as a problem first properly defined by Jeremy Bentham as act utilitarianism. Since the gold rush it has typically been stated historically as the question of whether a Wild West sheriff calling on townspeople to catch a criminal is forming a posse or a lynch mob, and whether you would therefore join it.

Modern ethics people are snide about 'classic' definitions, and I am equally snide about them, because they only seem to respect thoughts of living people who are actually trying to make a profit by restating the problem in modern terms for the journals and philosophy lectures. Am I wrong about that?

Comments (4)

Heiko January 31, 2020 at 01:34 #377343
Quoting ernestm
Modern ethics people are snide about 'classic' definitions, and I am equally snide about them, because they only seem to respect thoughts of living people who are actually trying to make a profit by restating the problem in modern terms for the journals and philosophy lectures. Am I wrong about that?


Modern times need answers to modern problems. The scope of decisions is different.
Take the trolley problem: The question in modern times is not about making a singular decision.
It is about implementing drive assistants. Which philosophical approaches are (legally and philosophically) sound enough to be implemented and executed rigidly by a machine?
The question is: may a machine be programmed to always kill the single person?
A slight deviation: if programming a car, do the occupants count? May a drive assistant evade a certain collision with a truck and kill a cyclist instead? Does it matter if the driver did something (like driving too fast) that arguably made the situation arise in the first place? Does it matter how many occupants there are in each situation? We need answers saying "Yes, 1101".
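The demand for answers like "Yes, 1101" can be made concrete: once a policy must be executed rigidly by a machine, every philosophical ambiguity becomes a hard-coded branch. Below is a deliberately naive sketch of such a rigid rule, entirely my own toy model (the scenario fields and the tie-breaking rule are assumptions for illustration, not any real driver-assistance logic):

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    deaths_if_stay: int    # e.g. occupants killed colliding with the truck
    deaths_if_evade: int   # e.g. the cyclist on the evasion path
    driver_at_fault: bool  # did the driver, say, speed into the situation?

def evade(s: Scenario) -> bool:
    """Toy act-utilitarian rule: minimise total deaths; on a tie, a
    driver at fault does not get to shift the harm onto a bystander."""
    if s.deaths_if_evade != s.deaths_if_stay:
        return s.deaths_if_evade < s.deaths_if_stay
    return not s.driver_at_fault

# Two occupants vs one cyclist: the rule swerves onto the cyclist.
print(evade(Scenario(deaths_if_stay=2, deaths_if_evade=1,
                     driver_at_fault=True)))   # True
# Equal deaths, driver at fault: the rule refuses to evade.
print(evade(Scenario(deaths_if_stay=1, deaths_if_evade=1,
                     driver_at_fault=True)))   # False
```

The point is not that this rule is right, but that a machine forces someone to commit to it in advance, branch by branch, which is exactly the shift away from the classical single-decision formulation.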

At least one could argue that such applications shed new light on those problems, and hence positions have to be revisited, starting with the question of whether implementing or enforcing a certain ethical policy, or declining to, is itself ethically sound.

Reading Wikipedia, surveys were taken among populations asking whether choosing the track with fewer people on it was the way to go. Of what worth is that survey? It seems people were questioned about their own ethical beliefs, not about whether machines should execute those beliefs. This is a fundamental difference. I do not have the impression that "classical" formulations were meant to be put into such a context.

Worth a look: the "Stop Button Problem"
https://www.youtube.com/watch?v=3TYT1QfdfsM&t=6s
Agent Smith May 25, 2022 at 10:02 #700491
Food for thought:

1. Quality: There's happiness/suffering.
2. Quantity: There's amount of suffering/happiness.

Those who'll pull the lever - killing one to save many - are looking at it quantitatively (2).

Those who are in two minds - should I kill one to save many? - are looking at it qualitatively (1).
Possibility May 25, 2022 at 10:57 #700496
Quoting Agent Smith
Food for thought:

1. Quality: There's happiness/suffering.
2. Quantity: There's amount of suffering/happiness.

Those who'll pull the lever - killing one to save many - are looking at it quantitatively (2).

Those who are in two minds - should I kill one to save many? - are looking at it qualitatively (1).


A little more...

1. Quality: There’s different kinds of happiness/suffering.
2. Quantity: There’s different amounts/levels of happiness/suffering.

It seems to me that the notion that one can make good decisions and be right all the time is based on an arbitrary preference for one of these over the other.
Agent Smith May 25, 2022 at 11:25 #700500