You are viewing the historical archive of The Philosophy Forum.

The trolley problem - why would you turn?

Wheatley January 03, 2018 at 15:11 12175 views 46 comments
Suppose you're the driver of a trolley coming down the tracks, and it's headed towards three workers. You press the brakes and realize that they don't work. You can turn onto a different track, but there is a problem: one worker is on the other track, and if you turn you're going to crash into him, killing one but saving the other three. What do you do? Most people would turn. But why? How is it different from murdering one person to save three others?

Comments (46)

CasKev January 03, 2018 at 17:33 #139595
I think there is a tendency in most people to make decisions that will minimize the level of unwanted suffering in others. Being forced to make a decision quickly, with no information other than the number of people that will die, the odds are that there will be less suffering if you choose the track with one person on it.

Of course, having more information may change the decision... For example, the three workers on one track are convicted murderers that show little chance of rehabilitation, and the single worker is a hard-working father of three.
Wheatley January 03, 2018 at 17:51 #139597
Reply to CasKev There are two options: you do nothing, and you're not responsible for the death of the three workers because it's not your fault that the brakes malfunctioned. Or you choose to save the lives of the three workers and commit murder by killing one worker. Do most people think it's okay to commit murder to save three innocent lives?
CasKev January 03, 2018 at 17:57 #139598
There's no murder involved - it's simply minimizing the damage that will occur in the circumstances. It's kill three, or kill one. Not kill three, or murder one. For it to be murder, you would have had to intentionally sabotage the brakes, or know the brakes are malfunctioning, and intentionally arrange for there to be people on the track.
Larynx January 03, 2018 at 18:19 #139602
Not to de-rail (no pun intended) this discussion at all, but a noteworthy point here: these sorts of thought experiments are under fire from some contemporary commentators. The arguments tend to center on practical ethics and the idea that the mis- or overuse of these types of problems creates an unnecessary distance from the actual practice and content of ethical decision making. For instance: while it might be sort of a fun puzzle to try and solve, the trolley problem is not a situation most of us are ever likely to encounter in the bulk of our ethical determinations. The further problem being: the trolley problem alters how we think, or think about thinking, about ethical issues.

A potential third concern is that the trolley problem (like most thought experiments) is structured in much the same way as a mathematical problem. I mean just look at how many variations there are from Foot's original set up, how different parameters are established, how the numbers change for intended effect, etc. And yes, we might say something like, "well the alterations of the variables themselves are irrelevant, what's more important is that we are able to extrapolate some type of broad ethical norms from the scenario." Which might sound good, but probably doesn't accurately reflect the actual business of doing ethics or being in an ethical conundrum. Instead of 'feeling' ethics we just problematize it, and instead of practicing or considering the practice of more real world situations which demand ethical attention we shift our efforts toward a puzzle-solving motif where the focus becomes theory construction and the categorization of ethical attitudes based upon a seemingly unrealistic hypothetical.

Again, I don't say this to de-rail anything. Just to add a potentially useful consideration point to the discussion.
Wheatley January 03, 2018 at 18:30 #139606
Quoting CasKev
There's no murder involved - it's simply minimizing the damage that will occur in the circumstances. It's kill three, or kill one. Not kill three, or murder one.


You're not simply minimizing the damage from three to one. You're choosing to kill different people than the ones destined to be killed by the trolley. It's let the trolley kill three, or choose to kill one. And if choosing to kill an innocent bystander is not murder, I don't know what is. (I'm just playing devil's advocate; I don't necessarily believe this.)
Wheatley January 03, 2018 at 18:46 #139613
Reply to Larynx Good point.
Deleted User January 03, 2018 at 22:32 #139663
This user has been deleted and all their posts removed.
Abaoaqu January 03, 2018 at 22:34 #139664
I would do nothing.

If I face a problem like this and am well aware that if I save them I kill another worker, it would seem to be my responsibility to save them, because I am the only one with the power to do so. However, in this problem, where someone will die no matter what, the question is: who will, or should, die? Now, because I do not have any information about these workers, the question remains unsolved. Even if I had some information (as in the case where the three of them are murderers), I do not believe I have the right to decide who should live and who should die. Either way, it is not my fault that the incident happened, and so I do not bear any responsibility.

Plus, I think that when people say they would turn, it's mostly just talk; if they were actually confronted with that situation they would probably freeze.
Deleted User January 03, 2018 at 23:17 #139672
I would use the emergency brake; and if that failed, yell for people to get off the tracks.
SnowyChainsaw January 03, 2018 at 23:43 #139679
I would throw my own body in the way to stop the trolley.
philoskepsis January 04, 2018 at 06:52 #139722
It's worth considering the different competing views on what you ought to do in the Trolley case. One view states that you should pull the lever because it leads to the best consequences (e.g. maximal happiness). This is the consequentialist view. A second view is the Doctrine of Doing and Allowing, which states that doing harm is worse than allowing harm, so you don't pull the lever.

A third view is the Doctrine of Double Effect: you intend to save three people, and the death of one person is simply a side effect of your action of pulling the lever. According to this view, as long as your intended goal is to save people rather than to kill people, it is morally permissible. There's another view, due to Foot, which states that negative duties (e.g. you ought not to kill) trump positive duties (e.g. you ought to help the poor), so since pulling the lever would violate your negative duties you shouldn't pull it. There's also the view that it's morally permissible to pull the lever because you aren't initiating a chain of events, but rather redirecting one.

There are many other (and equally important) views, but the above are the ones I can think of off the top of my head.
Noble Dust January 04, 2018 at 06:57 #139723
Reply to Purple Pond

Is there a theoretical scenario that is more realistic than people being strapped to tracks?
Wheatley January 04, 2018 at 09:29 #139751
Reply to Noble Dust See modern variations (7 of them) of the trolley problem.
Noble Dust January 04, 2018 at 09:30 #139752
Streetlight January 04, 2018 at 09:36 #139753
Reply to Larynx Damn fine first post! Nicely summarizes why I reckon the trolley problem is basically toy ethics - fun to play with, but almost entirely unilluminating.
Noble Dust January 04, 2018 at 09:36 #139754
Reply to Purple Pond

Ah, so the article is just updated tech vs. trollies.

The real issue with the dilemma that no one seems willing to address is the issue of more loss of life versus less. More pain vs. less. And of course, there's no morally sacrosanct answer. Pain and death are equally incomprehensible, regardless of quantity.
Michael January 04, 2018 at 09:57 #139755
Reply to Larynx I think a real-life example of the trolley problem would be the general who has to decide whether or not to launch a strike against an enemy that would entail civilian casualties. Is it better to not get involved or to intentionally kill a few to save many?
Larynx January 04, 2018 at 10:53 #139760
Reply to Michael I suppose that's true, but only in a sense. Again, we're mostly back to square one if we're talking about generals and civilian killing zones, eh? How many people talking about trolleys are, in fact, also generals in war-time scenarios where they are required to determine whether to kill civilians or not? We're still left with the same problem(s) we started with, but now with an extra layer that seemingly makes it slightly less hypothetical. And to really drive this point home, let me just ask: how many of us - in our frequent ethical determinations - really need to talk about intentionally vs. unintentionally killing persons? Let me be honest with you: I've had a lot of moral and ethical dilemmas come up, but none that have ever had those types of immediate life-or-death stakes.

One thing that might be worth acknowledging is that puzzles like the trolley problem are flashy and relatively simple in terms of structure, yes? If you had magical powers to stop an asteroid from hitting earth, but only insofar as you could divert it away to a less densely populated alien planet, would you? And then we sit around and muse how exciting it would be to have magical powers and what it means to save planets. But aside from the three problems I listed in my first post, we have another, more common-sense issue here: I don't have magical powers, and even if by some minute chance I ever had something resembling magical powers, why would I be in a situation where I would need to decide between two planets for a potential asteroid impact? It sounds a little silly, right?

A real ethical dilemma may not be as exciting. Let me give you a genuine ethical problem I faced last night: it's quite cold in the city I live in. My wife and I were walking our dogs through one of the parks last night and we saw a homeless man under an awning, against the wall of one of the buildings in the park (presumably to protect himself from the chilly wind and potential rain). It was fairly late, and he was trying to sleep, but his sleeping bag looked thin and I didn't get the sense he had any additional blankets. He appeared to be tossing and turning in the near-freezing temperature.

I live approximately six city blocks from this park. Should I go home, get a few old blankets from our closet, and run back there so he has some additional layers? I have an arctic-rated sleeping bag that I barely use any more - should I just give it to him? Then there are other potentially mitigating thoughts: e.g. it's not as cold as it has been - he obviously survived the snow and ice we had, surely he can survive a twenty-degree increase from those conditions? Or is it really my responsibility to provide blankets to the homeless and otherwise indigent? Where does it end? Technically I have lots of non-perishable food I could also give away - should I also give that to him? If I help him out, should I also help out the cold homeless man on the street a few blocks away who sleeps near the bagel place? It's only a bit further, and I have extra blankets after all. Of course, at what point should I just canvass and donate for more homeless shelters in my city? What sort of financial commitment can I afford to make and still support my family? I make enough money to be alright, but I likely do not make enough that I could prudently support a large homeless initiative as well.

That's not a flashy problem. It's not an easily structured problem that really allows us the categorization potential for ethical determinations typically generated from puzzles like the trolley problem. It's not simple to categorize all the divergent thoughts that run through our minds in those situations, and the nuances alone make it a bit less simple to think about in the types of terms we often partition out when dealing with something like the trolley problem. But it stands that this was a very real ethical question I had to think about with my wife, and it's a type of question that many in my city face on a daily basis (from one side or the other) in the winter.

Wheatley January 04, 2018 at 11:29 #139769
Reply to philoskepsis How do the solutions where you actually change tracks deal with the scenario of the fat man? The scenario where you can throw a fat man off the bridge to stop the trolley hitting five workers.
sime January 04, 2018 at 11:55 #139777
The idea that ethical interventions shouldn't interfere with "destiny" undermines the very idea of ethics. And why shouldn't a Jury consider it "destiny" to have pulled the lever?
philoskepsis January 05, 2018 at 02:21 #140034
Quoting Purple Pond
How do the solutions where you actually change tracks deal with the scenario of the fat man? The scenario where you can throw a fat man off the bridge to stop the trolley hitting five workers.


Consequentialism would say that pushing the fat man is morally permissible since it yields the best consequence (e.g. saving more people). However, the Doctrine of Doing vs. Allowing, Doctrine of Double Effect, Positive vs. Negative Duties, and Foot's distinction between initiating vs. redirecting causal sequence would all say that it's morally impermissible.

Since the Doctrine of Doings vs. Allowing states that doing harm is worse than allowing harm, pushing the fat man counts as doing harm and thereby it is morally impermissible. Doctrine of Double Effect states that if the intended goal constitutes harm (e.g. pushing the fat man in order to stop the trolley), then it is morally impermissible. Since your goal is to stop the trolley by pushing the fat man, it is morally impermissible. The view that negative duties trump positive duties would say that pushing the fat man violates a negative duty, so it is morally impermissible. Foot's distinction between initiating vs redirecting causal sequence would say that since pushing the fat man is initiating a causal sequence (e.g. pushing him leads to his death), it is morally impermissible.



Cavacava January 05, 2018 at 16:37 #140148
Thinking about how the Trolley Problem might apply in the world.

It seems that many if not all cars in the near future may have automatic controls (some cars already have versions of this option), so that a person may, or perhaps must, give up control of the car and hand it over to an on-board computer system. The computer will probably face situations similar to the Trolley Problem... where multiple dangerous courses of action are possible and it must make a split-second decision.

Two possibilities:
a) The car has a mandatory value system, common to all autonomous driving vehicles, which is built in and regulates its normal course as well as its course in extraordinary cases.
b) The owner gets some say in the ethics of the car, perhaps by prioritizing the welfare of the car's occupants over the welfare of those external to the vehicle.

My guess is that option a is the only one I can envision insurance companies accepting. Autopilot is expected to reduce accidents and injuries, so if it works as advertised, insurance companies will go along with it, but my guess is that they will want a say in the ethics of any automated driving procedures.
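Purely as an illustration of the difference between the two options (this is a made-up sketch, not any real vehicle software; every name, number, and parameter here is invented), option a vs. option b can be framed as a single weighting parameter in a harm-minimizing decision rule:

```python
# Hypothetical sketch only -- no real autonomous-driving API is implied.
from dataclasses import dataclass
from enum import Enum

class Policy(Enum):
    MANDATORY_SHARED = "a"   # one value system built into every vehicle
    OWNER_CONFIGURED = "b"   # owner may bias the car toward its occupants

@dataclass
class Outcome:
    occupant_harm: int   # expected harm to people inside the car
    external_harm: int   # expected harm to people outside the car

def choose(outcomes, policy, occupant_weight=1.0):
    """Pick the course of action with the lowest weighted expected harm."""
    # Under the shared policy everyone counts equally; under the
    # owner-configured policy, occupant harm is scaled by the owner's weight.
    w = 1.0 if policy is Policy.MANDATORY_SHARED else occupant_weight
    return min(outcomes, key=lambda o: w * o.occupant_harm + o.external_harm)

# Swerving spares the occupant but harms three bystanders;
# staying the course harms the one occupant.
swerve = Outcome(occupant_harm=0, external_harm=3)
stay = Outcome(occupant_harm=1, external_harm=0)

shared_choice = choose([swerve, stay], Policy.MANDATORY_SHARED)   # picks stay
owner_choice = choose([swerve, stay], Policy.OWNER_CONFIGURED,
                      occupant_weight=5.0)                        # picks swerve
```

The only point of the sketch is that option b reduces to letting the owner set the weight on their own welfare, which is exactly the kind of variability one would expect insurers and regulators to resist.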



ssu January 05, 2018 at 17:26 #140158
Quoting Michael
I think a real-life example of the trolley problem would be the general who has to decide whether or not to launch a strike against an enemy that would entail civilian casualties. Is it better to not get involved or to intentionally kill a few to save many?

Especially when it comes to warfare, real-world "ethics" are totally different, starting from the laws and jurisdiction of a nation and the international laws of war, and ending with the military doctrine, strategy, and objectives of the armed forces. After all, in many wars even today, civilians are deliberate targets themselves. The legal and social maze that humanity has built, especially around conflict between nation states, shouldn't be underestimated.

An unintentional accident (where the fault would likely be put on the builder of the brakes or the maintainer of the trolleys) is a bit different from war. And let's not forget that the workers likely understand the dangers of their work on actively used tracks.
charleton January 05, 2018 at 17:54 #140167
There are a series of variants to this problem.
One is the option of physically throwing a fat guy onto the tracks to save ten people.

Turns out that when you have to get up close and personal, people are more reluctant to obey their utilitarian ideals.
David Solman January 05, 2018 at 18:22 #140177
Reply to Purple Pond If you have the time to choose between changing tracks, you likely have the time to scream at the workers to move off the tracks. But for the sake of the post: I think that choosing to kill in any scenario is wrong. It isn't your fault that the trolley's brakes failed, but if you choose to change the track and kill the one worker, then you've played a role in the tragedy and it is your fault that one worker died, whether or not you saved three lives in the process.
Wheatley January 05, 2018 at 18:27 #140180
Quoting David Solman
whether or not you saved three lives in the process


What if it's a hundred workers?
David Solman January 05, 2018 at 18:31 #140181
Reply to Purple Pond It still isn't your fault that the brakes failed, and so your conscience is still clear. If you choose to kill a person, then you made that choice - 10000000 people or not.
CasKev January 05, 2018 at 18:31 #140182
I still say it's better to change tracks and minimize the unwanted suffering. I would change the track even if it were one person and someone's pet lizard, versus one person.
Wheatley January 05, 2018 at 18:42 #140185
You can think of the tracks as contingent in that the trolley is headed towards both groups of people. Would that solve the moral dilemma?
T Clark January 05, 2018 at 18:45 #140187
Quoting ssu
And let's not forget that the workers likely understand the dangers of their work on actively used tracks.


Actually, this is a very good, if not ethically relevant, point. All railroads I've worked for, including commuter rail, have very stringent requirements for working on the tracks. Those include notifications, signals, watchmen, and derailers. Derailers do just that: if a train goes somewhere people are working, it is nudged off the tracks in a (it is hoped) safe way. The need for worker-protection rules, regulations, and work practices is a much more interesting, relevant, and realistic ethical issue than fat guys stopping trolleys.
T Clark January 05, 2018 at 18:49 #140188
Quoting David Solman
It still isn't your fault that the brakes failed, and so your conscience is still clear. If you choose to kill a person, then you made that choice - 10000000 people or not.


Although I agree with those who say the situations described are unrealistic and unhelpful (and silly) from an ethical standpoint, your point is also a good one. There must come a point when the ethical fault caused by actively killing one person is balanced by passively allowing many to die.
Thorongil January 05, 2018 at 18:53 #140189
I agree with the solution presented in this article: https://orthosphere.wordpress.com/2017/09/28/the-trolley-problem-solved/
BC January 05, 2018 at 19:08 #140193
Thank you for providing moral clarity here:

Thorongil's Link: It is not morally permissible to kill innocent people no matter how handy it might be to get rid of someone. The trolley problem is trying to get people to engage in sacrificial conduct.
T Clark January 05, 2018 at 19:17 #140195
Quoting David Solman
It still isn't your fault that the brakes failed, and so your conscience is still clear. If you choose to kill a person, then you made that choice - 10000000 people or not.


Oh, wait. I misunderstood. You think it would be ok to let the Death Star destroy Alderaan rather than drop Jar-Jar Binks down the vent pipe into the reactor core.
David Solman January 05, 2018 at 20:57 #140215
Quoting T Clark
Oh, wait. I misunderstood. You think it would be ok to let the Death Star destroy Alderaan rather than drop Jar-Jar Binks down the vent pipe into the reactor core.

If you are choosing someone to die then you are the cause of someone's death. If you let the 3 workers die then you played zero part in the accident, because none of it was caused by you. Lives will be lost either way, and there will be suffering either way. I don't think you have the right to doom someone's life just to save more lives, if they were safe to begin with.
T Clark January 06, 2018 at 00:59 #140234
Quoting David Solman
If you are choosing someone to die then you are the cause of someone's death. If you let the 3 workers die then you played zero part in the accident, because none of it was caused by you. Lives will be lost either way, and there will be suffering either way. I don't think you have the right to doom someone's life just to save more lives, if they were safe to begin with.


You would let billions die in order to keep your conscience clean? So you can say "It's not my fault?"

Once again I say - Arguments like that are why people don't take philosophy seriously. Hey @Baden - How do I set up a macro that will print that phrase out automatically. I'm getting tired of typing it in so often.
dog January 06, 2018 at 08:09 #140363
Quoting Larynx
Which might sound good, but probably doesn't accurately reflect the actual business of doing ethics or being in an ethical conundrum. Instead of 'feeling' ethics we just problematize it, and instead of practicing or considering the practice of more real world situations which demand ethical attention we shift our efforts toward a puzzle-solving motif where the focus becomes theory construction and the categorization of ethical attitudes based upon a seemingly unrealistic hypothetical.


Excellent point. I usually enjoy these problems as parodies of philosophy at its most tone-deaf. It's like the tragicomedy of a Vulcan working out an algorithm to maximize virtue. Everything profound and high is reduced to a maximization or minimization problem.
TheMadFool January 06, 2018 at 13:03 #140451
Reply to Larynx Very interesting remark. Point well made. The Trolley problem is, as you said, a hypothetical, and you claim it thereby misses the mark as practical ethics.

However, as @Michael showed, the scenario isn't completely unrealistic. I read a true story of a sinking ship where one sailor was put in exactly this situation, and he made the decision to sacrifice the few for the many. I believe he was honored as a hero.

Also, the Trolley problem is specific in its criticism. It's about Consequentialism (maximal happiness) and how its foundational premise, to say the least, needs more work.

Some say rationality is paramount in all human endeavors; that being rational is akin to carrying a bright torch in the darkness. If you believe this then the Trolley problem carries weight for it exposes a hole in Consequentialist moral theory. It needs to be modified or discarded.

Strangely, I think we are all, instinctively, consequentialists in moral outlook. We always look to the effects of our thoughts and actions. Consequentialism seems to be our moral principle. All the reason to evaluate it thoroughly don't you think? The Trolley problem is important, even if only hypothetical.
Larynx January 06, 2018 at 20:33 #140548
Reply to TheMadFool Well, it's not the hypothetical nature of the trolley problem that I believe bothers most critics - as I mentioned, it's the distance from more common real-world applicability and the structure of the hypothetical that is the problem. I would have no problem believing that at some point since the dawn of the railroad there has been a true-to-life example of the trolley problem. But we might simply ask: how many times has it actually happened? How applicable is that one example to all of us? As I mentioned in my second post, those sorts of high-stakes, life-or-death, and relatively flashy ethical dilemmas do not play a role in our day-to-day determinations. To a degree that's fine if our objective is puzzle-solving, right? After all, we can hypothesize various participants, change variables, and have a lot of fun trying to solve a puzzle, but - as I mentioned - that procedure tends not to reflect the actual doing of ethical problem-solving that we encounter in our lives.

Let me re-frame the criticism differently before going too far down the path of practical ethics, as I think I may have miscommunicated the role of the hypothetical. What Michael mentioned with the war-time general, and what you mentioned with a sinking ship, are examples that do happen - that's just fine. But those sorts of events are exceptions rather than rules - specific instances of ethical problems encountered very rarely by a select few, and ethical problems that do not have the common content most of us encounter. Now, I casually mentioned this in my first post, I think: the argument in favor of this suggests that the applicability and/or exceptional nature of the circumstances are irrelevant - instead we're just attempting to extrapolate a mode of thinking from the example; that's essentially the point of these sorts of puzzle-solving activities. But the overarching issue I believe most critics have is that the grossly unrealistic nature of the hypothetical (and please read the term "unrealistic" as "not something likely to ever be encountered by the vast majority of people in real-world ethical scenarios") tends to muddy the waters of our ethical judgement.

Ethical judgement becomes the medium of exchange when looking at the difference between a far more applicable ethical problem (e.g. helping the homeless person you pass on the street corner) and the exceptional case of the standard thought experiment. Take the trolley problem, for instance: our ethical judgement(s) take on a very different character when we have to abstractly determine quantities of potential dead people, the nature of trolleys in the role of their death, and the act of attempting to "solve" the problem. In and of itself, that may not sound as though it would cloud our judgement, but the critical response is to note that, because of the structure of the trolley problem (which is used to categorize, compare, and evaluate ethical attitudes), we might cultivate a propensity to look at the far more common ethical problems in that same way. In order to truly articulate the distance of applicability and highlight the role of ethical judgement, we need only inquire: "why not simply look at real problems that most of us face, have faced, or will face?" And I think that's the approach that is coming into favor with some contemporary ethicists.
TheMadFool January 07, 2018 at 04:59 #140663
Reply to Larynx Well, you're right. Consequentialism has practical uses; as I said, we instinctively look to the effects of our actions. In a sense it's like a scientific theory that is approximate in nature - it works most of the time except in rare instances, e.g. a black hole singularity. I think the Trolley problem and others like it are evidence for the case that a complete and consistent theory of morality isn't possible, or is difficult to achieve. What do you think?
T Clark January 07, 2018 at 05:11 #140665
Quoting TheMadFool
Well, you're right. Consequentialism has practical uses; as I said, we instinctively look to the effects of our actions. In a sense it's like a scientific theory that is approximate in nature - it works most of the time except in rare instances, e.g. a black hole singularity. I think the Trolley problem and others like it are evidence for the case that a complete and consistent theory of morality isn't possible, or is difficult to achieve. What do you think?


Except people aren't particularly rational and we don't make our moral judgments on a primarily rational basis. And that's a good thing. We follow our hearts, and we should. Hearts aren't stupid. They're not ignorant of consequences.

My heart tells me - "Drop Jar-jar down the effing vent pipe. What are you, an idiot."
Larynx January 07, 2018 at 05:42 #140671
Reply to TheMadFool You know, it's funny: when I started teaching ethics a few years ago I was quite a bit more in favor of consequentialist approaches and the standard modes of teaching and talking about that style of ethical theorization. As I've started to talk with students and colleagues a bit more about these sorts of things, and have had the opportunity to see the typical pedagogical structure of ethics in school, I've started to shift away from those standardized models. I think one of the driving factors for me was that I noticed students shied away from personalizing moral reasoning. Complex problems, perhaps problems we might ask each other in a day-to-day situation (e.g. is political policy A good for B, what is our responsibility to disadvantaged group C, etc.), took on a more theoretical character that tended to insulate the student from the criticism of their peers and their grader (me), rather than encourage them to really feel their way through it and determine what they believed was best (I owe a portion of this to reading Bernard Williams, by the way; I don't claim to be the architect).

I mean, you're certainly not wrong - I think the comparison to a scientific theory is fairly appropriate, with one minor addition: scientific theories rely on a specific notion of verification criteria and consensus. Ethical theorization of the variety that comes about through puzzles like the Trolley Problem has the same procedural objective as a scientific theory, but lacks the constituent elements that transform the observation process into one of concrete theory construction. Don't get me wrong: I'm not trying to drive some wedge between science and philosophy here - that's far beyond the scope of this post. Rather, I think it's important to acknowledge the aim of the scientific theory and how that aim is shared in consequentialist ethical theorization, while lacking certain aspects of scientific theorization.

One way to potentially frame that lack is in terms of personalization. We needn't, and oftentimes shouldn't, begin the long and detailed process of scientific theory building with a vested personal interest in the outcome or, technically speaking, in the content and means of interpreting our data. Ethical problems, however, often do take on a deeply personal character - and with that comes a certain level of mental and emotional complexity. What happens if we start to depersonalize ethics, though? What happens if we try and remove the moral agent, with all their feelings, reservations, and thoughts, in favor of a model that facilitates an abstraction of content that might have a serious impact on our lives? Now, to be fair, I am painting this in unfair terms, and I do not wish to suggest that the process of ethical abstraction is sociopathic in some way as compared to a more virtuous structure of personalization (it might well be, but that's a big claim that would need a lot of backing). But it's important to observe that something is lost in translation when we move ethics from the personal to the abstract; the specific to the approximate.

TheMadFool January 07, 2018 at 05:47 #140674
Quoting T Clark
We follow our hearts


That's something very interesting. Contrary to what you think, I feel our "heart" is a misconception of the ancients. Modern science has "proven" that our brains do the thinking AND the feeling. Keeping the terminology for the discussion: do you really think our "heart" reasons through its interaction with the world and ourselves? I don't know. Our "heart" is instinct-based - call it intuition. In neurological terms, our "hearts" are reflexive - the seat of the much-maligned knee-jerk response. Do you think this involves any kind of mental processing to which we can apply the term "rational"?
TheMadFool January 07, 2018 at 06:04 #140687
Quoting Larynx
But it's important to observe that something is lost in translation when we move ethics from the personal to the abstract; the specific to the approximate.


I think @T Clark might have something to say about this. I don't know. Speaking in very general terms, rationality applied to ethics hasn't resulted in anything practically useful. All moral theories, rationally generated, have holes in them. This makes me believe that morality is, well, irrational. [i]The heart has reasons the mind knows not.[/i] Too radical a view?
Pseudonym January 08, 2018 at 13:50 #141240
Quoting TheMadFool
The heart has reasons the mind knows not.


Last time I checked our thoughts did not have labels attached to them; 'heart' or 'mind'.

It's not an unreasonable principle that our intuition (which is what I'm presuming you mean by heart) is privy to information that our conscious brain is not, but then we are still left with distinguishing one from the other. What reason do we have for thinking our first thoughts are more 'intuitive' than our later ones?
TheMadFool January 09, 2018 at 04:38 #141496
Quoting Pseudonym
Last time I checked our thoughts did not have labels attached to them; 'heart' or 'mind'.


[quote=Blaise Pascal]The heart has its reasons which reason knows nothing of... We know the truth not only by the reason, but by the heart.[/quote]

Quoting Pseudonym
It's not an unreasonable principle that our intuition (which is what I'm presuming you mean by heart) is privy to information that our conscious brain is not, but then we are still left with distinguishing one from the other. What reason do we have for thinking our first thoughts are more 'intuitive' than our later ones?


I believe that there are many rational theories on morality out there and also that each one of them has imperfections. The end result is that none of these moral theories can pass as a comprehensive guide for moral decisions.

Yet, we, all of us, have a sense of morality.

What does that speak of?

Maybe it's not correct but, given that the above is the case, I find it convenient to make the distinction of mind and heart. Morality comes from the heart, and the mind - reason - can't make sense of it.