Moral Response and Reflection
Letitia Meynell and Clarisse Paron
1.1 Moral Response
Often when we make moral judgements, we find they are tied up with our emotional reactions. For instance, we typically feel happy when good things happen to good people and angry when we witness things that are unjust. We may also feel personal satisfaction at having done the right thing and pride in having it recognized. Similarly, we often feel guilt for acting badly and shame when others call us out for it. These familiar experiences are as much moral judgements as they are emotional reactions.
Although emotions can be important and instructive by alerting us to moral issues, they are sometimes not well justified on reflection. Indeed, in some instances, once we reflect on our emotions, we may find that they are ethically quite misleading. Even positive emotions, like love, may lead us to misjudge a situation, prompting us to defend friends or family members who have, in fact, behaved badly. Negative emotions can be equally misleading. Most of us have had the experience of being in a fit of anger and doing something (or at least thinking of doing something) that we later recognize was morally wrong. The Roman historian Tacitus believed that many people have a tendency to hate those whom they have injured.[1] Our emotional reactions to our own bad behaviour might distort our perception of our victims in ways that would make us prone to harm them yet further. This should trouble anyone who is inclined to let their emotions govern their actions. Indeed, philosophical traditions that foreground moral emotions tend to emphasize the importance of cultivating virtuous or appropriate emotional responses (as we will see in Chapter 5).
If our emotions can be fallible guides to moral action, what else might we consider? We might think about how others will judge our actions or how they would act were they in our place. Again, this can alert us to moral considerations (as we shall see in section 5.3 and section 6.1). Nonetheless, it is typically insufficient for reaching a justified moral judgement, and for good reason: our society contains many biases and many people who behave badly. If we simply judge as others judge and follow what others do or what they expect us to do, we may end up making some terrible judgements and engaging in some heinous behaviour.
It can be deeply disturbing to discover that those who hold a respected place in our community or the people we love have immoral attitudes or have engaged in morally repugnant behaviour. Nonetheless, if we truly care about doing the right thing, we must be open to making such discoveries. We may even discover that attitudes or conventions that are widely accepted in our society are nonetheless morally pernicious.
Of course, many social conventions are perfectly morally acceptable. Some may even be morally required. After all, conventional norms and practices offer a set of rules for behaviour that help the members of society understand one another and fruitfully interact with each other. However, in order to distinguish conventions that are useful and good from those that are bigoted and bad, we need to go beyond the conventions themselves. This is where normative ethics, philosophical analysis, and argument come in.
Stop and Think
Take a moment to consider a norm or a practice that was (or perhaps is) thought to be ethically acceptable in some culture or society (perhaps even your own) that you believe is morally wrong.
Now try to articulate the reasons why it’s wrong.
You have just started doing moral philosophy!
1.2 Reflection
Now, one might wonder how we can discover that we ourselves or members of our community have been following customs that are morally wrong if we live in societies and communities that follow those very customs. This is where moral theory, conceptual analysis, and argumentation come in. We can use moral theories to assess the norms, conventions, and practices of our own communities. Even so, it is difficult to understand how things might be different from within our own culture. This is where outside perspectives are particularly valuable.
As a number of philosophers who study the theory of knowledge have argued, the critical eye of people whose beliefs, norms, and values are very different from our own can be extremely useful for assessing the claims we endorse and the things we do. The idea is that if a claim or practice can withstand criticism from a wide variety of perspectives with very different assumptions, then it must be pretty good, or at least it is likely to be morally acceptable. It is rather like using various experiments to test the same hypothesis. If your hypothesis is confirmed using a wide array of very different experimental designs, then your scientific investigations have given you good reason for thinking it is likely right.
Notice that this process does not give us grounds for dogmatically claiming that the matter is permanently decided, in either science or ethics. Moreover, our assessments must be done in good faith. If we value scientific knowledge, we should welcome having multiple rigorous tests of our favoured theories. In the same way, if we want to do the right thing, we should be open to criticism from a wide variety of people whose views are very different from our own. Of course, others may or may not be right in their criticisms. Either way, being able to understand and assess them will give us insight into the relevant ethical issues and better justification for our own ethical decisions.
Unfortunately, we often don’t have access to a variety of people from many different backgrounds to give us feedback on our ideas and activities. Even if we do, these folks may have better things to do than help us with our moral dilemmas. Fortunately, we do have access to published work by thinkers from around the globe, and we can draw on this and our own imaginations to guess what those who disagree with us might say. This kind of dialogic reasoning is characteristic of philosophical work (as we will see in Chapter 2). If you want to do the right thing, then sincerely considering arguments both for and against the various possible actions open to you is one of the best ways of ensuring that you do.
Stop and Think
Can you remember a moment of your life in which someone with a completely different background or perspective said or did something that prompted you to reconsider one of your own cherished ethical or political commitments?
What was the difference that made the difference?
If you have never had such an experience, why do you think that might be?
Now, it might reasonably be asked whether such a process of rational reflection, judgement, and action will always provide the right answer. Philosophers have disagreed on this point. However, the very fact of their disagreement suggests that, for practical purposes, all philosophers will have to admit that seemingly rational people do in fact disagree about moral issues, and that sometimes these disagreements are intractable.
1.3 Disagreement
It is worth articulating the different ways in which philosophers disagree, as this will help us better analyze and assess competing theories. Sometimes philosophers disagree about the facts. For instance, two philosophers might share the same basic normative views but disagree about relevant features of the world. Suppose two philosophers agree that what matters morally is to make people as happy as possible. However, one believes that, psychologically speaking, what actually makes people happy is ensuring their safety, while the other believes that happiness depends on maximizing people’s freedom. Both agree that happiness is a particular emotional state, but they disagree about the facts regarding what causes it. Notice that if they both really care about doing the right thing, they are probably going to want to look at some empirical work here. For example, they might examine research in social psychology to see what really does make people happy.
Another possibility is that the philosophers disagree about what happiness means or, alternatively, what type of happiness is morally relevant. One philosopher might think that true happiness is an emotional state that is experienced moment to moment, while the other might think that true happiness depends on achievement and overcoming various struggles and obstacles over a lifetime. These philosophers are effectively disagreeing about what a certain concept means. Scientific investigations are unlikely to be helpful here. Before science can discover what causes happiness, we must first determine what we’re talking about when we refer to happiness. This brings us back into the realm of philosophy.
Notice that this question about what a moral concept means is intimately related to who counts. Here, again, our philosophers might disagree. After all, many nonhuman animals appear to experience emotional states like happiness, in which case the first philosopher should, presumably, include these animals in their moral decision-making. The second philosopher might not agree. They might argue that other animals can’t formulate the kinds of life projects that are required for happiness, and claim that only humans (or perhaps most humans and a handful of other species) count.[2] While the sciences might be invaluable for identifying which animals (and humans) have the capacity to be happy, they can only do this work after philosophers have defined what happiness is.
Finally, we might simply accept different moral theories and values or rank them differently in importance. One philosopher might think that maximizing happiness is the single most important moral goal while another thinks it is irrelevant because freedom is the only thing that matters morally, whether it makes people happy or not. Here again, there is philosophical work to be done.
Summary of the types of moral disagreement
- Disagreement about the facts
- Disagreement about what a key philosophical term means
- Disagreement about who counts
- Disagreement about which moral theories or values are right or relevant
Notice that if we agree about the facts, the meaning of moral concepts, who counts, and the applicable moral theory or values, we should agree about the right course of action. If we are reasoning carefully and still disagree about the right course of action, it is almost certainly because we disagree about the relevant facts, the meaning of moral concepts, who counts, or the relevant moral theories or values (or their relative importance).
Importantly, whatever we decide to do, we are morally responsible for that decision and its outcome—good or bad. We should expect to be held accountable for our actions. Happily, if we have carefully considered our options, listened to and learned from those who disagree, and looked at the situation through each ethical lens and from all relevant perspectives, we can expect to have a robust and convincing justification for our actions.
In applied contexts, there is the possibility that even if we disagree about the facts, the interpretation of moral concepts, who counts, and the correct normative theories, we may nonetheless agree about what the right action is in a given situation. This gives us another reason for not just choosing one lens or theory over the others but instead taking a more pluralist approach. If we can show that the same action is required by a broad set of very different moral views, then this becomes very powerful evidence that the action is morally required. So, even if you are inclined to think that one of the approaches discussed below is right to the exclusion of the others, you may be able to provide far more compelling arguments if you notice when they agree.
- Tacitus, The Germany and the Agricola of Tacitus (Project Gutenberg, 2013), Agricola para. 42, https://www.gutenberg.org/files/7524/7524-h/7524-h.htm.
- Notice that restrictive views about who counts morally may lead us to exclude some nonfetal humans too, such as the very young and some of the very old, so such restrictive approaches to moral status may turn out to have unacceptable implications.