
When Right Is Not Right

Why do we confuse what is morally right with what is factually right?

Dick: "Harrisburg is the capital of Pennsylvania."

Jane: "Good. That is the right answer."

Dick: "When the teacher asked me if I cheated on the test, I admitted that I had."

Jane: "Good. That was the right answer."

Does "right" mean the same thing in both conversations?

Not really. In the first conversation, "right" means "correct" or "true." In this sense, "right" is referring to what is really, actually the case. Harrisburg really is the capital of Pennsylvania.

In the second conversation, "right" means what is morally good. In this sense, "right" is referring to what a person ought to do.

The same point can be made by looking at the two meanings of "wrong."

Dick: "Philadelphia is the capital of Pennsylvania."

Jane: "That is wrong."

Dick: "I cheated on the test."

Jane: "That is so wrong."

Descriptions of "what is" and prescriptions of "what ought to be" are two different kinds of statements. Even though it is right (true or factually correct) that humans have been slaughtering each other for thousands of years, this does not mean that this kind of slaughter is right (morally good). "What is right" is not always "what is right."

None of what I've written so far is news. The difference between is- and ought-statements was discussed masterfully by the Scottish philosopher David Hume in the 1700s. Although I do not usually suggest Wikipedia articles as the first source on a topic, I can recommend the Wikipedia article on the is-ought problem.

The new bit of scientific analysis I would like to add to the is-ought discussion is my perspective on the idea of "moral truths." To many of us, the existence of moral truths is obvious. For example, the statement "we ought to protect and care for young, helpless children" strikes most of us as an obvious moral truth.

But wait a minute. Doesn't the notion of "moral truth" confuse the two meanings of "right"? Moral statements containing the word "ought" are referring to what is morally good, not what is objectively true. "It is right to protect and care for helpless children" expresses a feeling about the appropriate way to treat children, and it is meant to encourage others to treat children this way. In contrast, "it is right that Harrisburg is the capital of Pennsylvania" is not an expression of feelings and is not meant to encourage any particular kind of behavior. Morality concerns feelings about what we ought to do, not descriptions of what is objectively true, so the expression "moral truth" seems to confuse "right-as-is" with "right-as-ought."

Still, the notion of "moral truth" remains compelling. And it can be even more compelling when we talk about what is not right, that is, what is obviously wrong. For example, "it is wrong to torture and abuse helpless children" certainly seems like a moral truth. Only a sadistic sociopath could disagree about the wrongness of torturing children. To us non-sociopaths, the wrongness of torturing children is not just an expression of how we feel; it is an objective truth about reality.

Now, a number of philosophers have tried their best to find a way to justify the idea that morals reflect objective truths. If you want to see what they have said, you can start with the aforementioned Wikipedia article.

But my position as a psychological scientist rather than a philosopher is that moral ought-statements cannot be objective is-statements. I would like to explain why moral truths seem to exist when they really do not.

From my study of this topic, I offer the following explanation. First, we know that moral issues typically involve feelings—sometimes very strong feelings. To see or even imagine a helpless child being physically abused automatically arouses in us strong feelings of revulsion, sadness, and/or anger. In contrast, watching parents caring for their children puts a smile on our face and gives us a warm glow inside.

This is a brute fact about morality, firmly established by scientific research—we make immediate judgments about what is morally right or wrong based on how we feel. Ordinary judgments about moral right and wrong in everyday life are based on emotions, not dispassionate analyses of what is objectively true. If we are asked to explain why we feel that something is right or wrong, we can sometimes conjure up such an explanation after the fact, but the feeling comes first, signaling whether something is right or wrong. In some cases, we cannot come up with any explanation for why something feels morally wrong—it just feels wrong. This is called moral dumbfounding.

But why, if I point out to you that your moral judgments are based on emotions rather than objective descriptions of reality, does it still seem to you like the rightness or wrongness of certain thoughts, feelings, and behaviors is an objective moral truth?

Here is where I think it gets interesting. People often assume that we arrive at truth by observation and rational thought, not by emotions. That is often the case, but sometimes we use our emotions as an indicator of what is true. Have you ever worked on a difficult puzzle for a long time and, when you suddenly found the answer, experienced a burst of joy? In science, we call this the "aha" or "Eureka" moment. We even know what it looks like neurologically—there is a burst of electrical activity in the right temporal lobe. That burst moves us to a sense of certainty about what is true.

So, in some (many?) cases, truth is signaled by an emotional reaction. However, even though emotions can give us a sense of certainty about what is true, emotional truth-signaling is not 100% reliable. Research has shown that strokes, seizures, and electrical stimulation of the temporal lobe can cause a sense of bliss and profound truth. But those abnormal brain states are presumably not indicating what is actually true.

If I am on the right track, the strong emotions that accompany "ought-statements" are tricking us into thinking that they are "is-statements" about truth, even though the two kinds of statements are logically different.

I have one further hunch about the reason our emotions trick us into thinking that there are moral truths. The disposition to feel that an "ought" is an "is" allows us to better persuade and encourage each other toward moral behavior. If we realize that so-called moral truths are just feelings, we might start to doubt their validity. But that is less likely to happen if we present moral rules as if they are natural laws.

Swiss psychologist Jean Piaget demonstrated that young children are naturally predisposed to accept moral rules as if they were law-like truths. Although he claimed that children outgrow this and begin to see morals as human conventions for getting along with each other, the fact is that most adults are still predisposed to view a "moral right" as a "factual right," even after an explanation such as this one about the difference between "ought" and "is."
