Deciding the Standards for Morality (Moral/Immoral/Amoral)
How do you define morality? I have done research for a number of months and come across many theories, but only one resonated with me. This theory is a logic table that tries to define morality as dependent on will or drive. Any action that is done unconsciously is automatically considered neither moral nor immoral, but amoral, because it was done by accident. Only when an individual aims to directly affect themselves or others can an action be looked at through a moral lens. But the question still remains: what is considered moral, and what immoral?
Different moralities promote different strategies for cultural survival. Conflict between moralities is conflict over what culture has dominance among a people.
Decide what kind of "states of the soul" you want to promote, and you will have your morality. Keep in mind you don't have to be egalitarian, i.e., different classes can have different moralities.
The honest thing to call this is "polytheism", but we're all far from honest.
It's partly something people are naturally inclined to (genetically, and of course that varies across ethnic groups and races), but it's then reinforced - precisely because the variance is on a bell curve, some people need to be encouraged to do more of what they already slightly intend to do, while others already strongly intend to do it. The ones who are stronger in the trait and more numerous keep in line the ones who are weaker in the trait and less numerous, by various means (social shaming, social sanction, laws, etc.).
There's a subjective and relative component, and an objective component. Part of morality is a shared survival/flourishing strategy, and that is obviously objective (i.e. somewhere out there in possibility space, there's an ideal set of rules for interpersonal interaction in the group, given the nature of people in the group, and the nature of the world).
But at the same time individuals still have a choice whether to stick to those rules or not, or to live by other rules (if they can get agreement, or move to another group). That's how morality evolves over time (sometimes outliers can influence the tendency of the group, analogously to the way mutations can spread through a population).
Religions have been the main carriers of morality historically, but more recently (i.e. over the past 3,000 years or so) philosophy has hived off some of the guidance practice - moral rules are subject to intense intellectual scrutiny, which helps the process of discovering what's best (relative to whatever - to a given group, or to humanity as a whole, etc.).
I'm not knowledgeable enough to really know, but this sounds kind of Kantian to me. There was a recent discussion where this came up and I got beaten up for disagreeing that reason is required for moral action. It was Being or Having: The Pathology of Normalcy.
And I do disagree. For me, impulses from the heart underlie all moral action. They may or may not pass through reason on their way to implementation. A lot, maybe most, of human action is not mediated by reason. And that's not a bad thing, although many seem to think it is.
In my opinion, the source of all moral action is our human nature. People are made to like other people, to live with other people. How could we be successful social animals without some sort of drive to kindness and fellow feeling? As for the specific mechanism, I would have to step off into rank speculation.
In general: Morality is treating each being according to its proper ontological value (value as being). I.e., treat God as God, man as man, animals as animals, plants as plants, and objects as objects. Any other combination would be immoral. E.g., treating God as an object or an object as a god.
Concerning man only: All men have equal ontological value, therefore it is moral to treat all men as equal, including yourself. Thus we get the Golden Rule of ethics: do unto others as you would want them to do unto you. Failing the Golden Rule entails that there is an unequal intention of treatment somewhere.
This is not any kind of definition.
It's not clear whom you are responding to. You should use the "quote" function. Highlight the text you want to quote and push the "quote" button that pops up. The quote and a reference will be copied down to your response.
If you're responding to me, then I disagreed with what you wrote in the original post. I don't think morality is based in reason.
My bad. I was responding to the OP. I have changed my previous response above.
While I am here, I might as well respond to your previous post.
Quoting T Clark
I agree. The first principles of morality come from our 'conscience', or likely what you call 'heart'. Then reason is used to determine the correct actions that comply with these principles.
That said, reason is necessary for morality because only reason can know universals, as is the case for first principles of morality. I think 'ethics' is also called 'practical reason'.
Then what does one call compassionate or socially conscientious behavior which is not mediated by reason?
To my knowledge, conscience only gives general principles; it does not inform us of the specific morally right action for the specific situation; which can only be obtained by reason. And I think the first principle is "act justly" or "obey the golden rule".
Example: For a given situation, even if lying would make things easier on me, I choose to tell the truth, because truth is what I would want to hear in that situation.
This is not consistent with my personal experience.
What would be an example from your experience?
My personal experience with myself behaving morally. I can watch my feelings and behavior. I perform most actions, including moral actions, without the intercession of reason. I know what's right and wrong. I don't need to be told, even by myself, what to do.
But if you claim that conscience informs you without the use of reason, then it seems you are told what to do, by your conscience, without understanding the reason why it is morally good. In which case, how could you know that the information is morally good, and not information arising from some selfish desire?
"The conscience" is no less me than my reason. I don't need to understand any reason to know what's right, and, on a day-to-day basis, that's not how I make moral decisions. For a lot of things, there are no reasons.
Alright. What if the information told by your conscience contradicts the information told by somebody else's conscience? The usual way to resolve a conflict is through reasoning.
Sure, I talk about moral issues and try to figure things out with my friends, family, and coworkers all the time.
So reason is present in the topic, even if it is not made explicit in every moral action.
Reason is secondary, a tool. I make moral decisions, not my will, or my reason, or my consciousness, or my intention, or my mind. All of me.
Yeah, it may be the case that after a while, through good habits, we perform a morally good act with little to no reasoning. The ability to rationalize the act may still be present, just not necessarily activated.
It is wise to establish a definition, but stupid to assume that your definition is a perfect form that has existed since the dawn of time - absolute, objective, and universal.
Claiming such a definition is a great way to establish an ideal code and to assess other moral positions from that perspective. It is not a means to stand on a high horse and demand that your standards are the only ones.
By asserting a definition it is possible to say how and why the consequences of that position could work for others. Kant asserted the Categorical Imperative, Bentham Utilitarianism. Both positions are useful and assessable through comparison; neither is natural and neither is objective, though each may co-opt objective knowledge and the natural "is" of conditions; neither entails an "ought".