It is easy to see the faults of others, but difficult to see one’s own faults. One shows the faults of others like chaff winnowed in the wind, but one conceals one’s own faults as a cunning gambler conceals his dice. (Buddha, the Dhammapada)
Why do you see the speck in your neighbor’s eye, but do not notice the log in your own eye? … You hypocrite, first take the log out of your own eye, and then you will see clearly to take the speck out of your neighbor’s eye. (Matthew 7:3-5)
One of the “great truths” about human nature is that we have difficulty seeing our own ethical failures as clearly as we see those of others. Why don’t we see how often we betray our own ethical standards? A large part of the answer is that the human decision-making system—like the human visual system—has blind spots. Ethical blind spots often obscure important aspects of an ethical decision. As a result, we don’t realize that the decisions we make have ethical implications and we make unethical choices without knowing it. It is no surprise, then, that the great majority of people believe themselves to be more ethical than the average—a statistical impossibility!
A gap exists between who we think we are and how we actually behave. People need to more clearly understand their own behavior and, in the process, raise themselves up to the ethical standards they already hold.
IDEAS TO APPLY (Based on research covered below)
- Be more humble.
To overcome your blind spots, you first need to realize that you are not as ethical as you think you are and that you won’t behave as ethically as you think you will. Research has demonstrated that thinking about the motivations that will be present at the time of the decision will increase the accuracy of your predictions.
- Examine and correct the reward systems that lead to ethical fading and motivated blindness.
What are you really rewarding your employees to do? What are you rewarding them to see, and to not see? Are you rewarding sales revenue without concern for how those sales are obtained? Are you promoting individuals who contribute to the bottom line but lack integrity?
- Use data.
What data are you collecting that would prove, or disprove, how ethical you and your organization are? What’s being done with the data? Is it being tracked and reported on, or getting shuffled into a dark drawer, never to be seen again?
- Set the stage for “psychological safety” when it comes to ethics.
How often are you publicly describing your own growth as an ethical person? If you are not willing to discuss the process of learning about your blind spots, no one else will be either. Amy Edmondson describes psychological safety as a team member’s belief that it is safe to take risks, such as discussing failures or confessing uncertainty. Set the stage for psychological safety about ethics in your organization by making it “something we talk about,” not just something about rules and laws.
- Examine the euphemisms that hide ethics from the decision maker.
Do you call it “earnings management” or “earnings manipulation”? Do you call your customers “clients” or “muppets”? Is your factory producing pollution or run-off? Similarly, people are more comfortable being described as cheating than as cheaters, and as lying than as liars; the language of verbs lets us off the hook by not forcing us to define what we are doing as representative of who we are. Nouns are about who we are.
AREAS OF RESEARCH
Much of the research on blind spots in ethical decision making is based on the concepts of bounded ethicality and ethical fading.
How might bounded ethicality affect ethical decision-making?
Bounded ethicality is rooted in psychologist Herbert Simon’s groundbreaking concept of bounded rationality, a framework that describes the systematic, predictable, and biased psychological processes that contribute to the gap between our true preferences and our behavior. Bounded ethicality refers to the systematic and predictable ways in which people engage in unethical acts without their own awareness that they are doing anything wrong. Chugh, Bazerman, and Banaji (2005; Banaji, Bazerman, and Chugh, 2003) argue that all of us engage in behaviors that are inconsistent with our actual ethical preferences. The goal of the bounded ethicality approach is not to preach to people about how we should behave, but rather to help raise us to the ethical level we would endorse upon greater reflection about our own behavior.
How might ethical fading impact ethical decision-making?
Ethical fading refers to the process whereby ethics are removed from the decisions we face, a process that contributes to bounded ethicality. When I don’t see the ethical implications of a decision, ethical considerations aren’t part of the decision criteria. As a result, I behave unethically and against my values, yet I am not aware of this inconsistency. Ethical fading can be caused by situational factors, which can change the type of decision that the decision maker believes they are making. For example, one study found that introducing a compliance system designed to reduce undesirable behavior actually reduced the likelihood that people saw a question as an ethical decision; it made it easier for them to see it purely as a business decision. It’s as though the introduction of the compliance system allowed decision makers to off-load their moral responsibility. Undesirable behaviors became more frequent, not less.
Why don’t we recognize our unethical behavior?
One reason is that our actions are sandwiched between predictions that we will behave ethically and recollections that we behaved more ethically than we really did. In other words, when we are predicting how we will behave, we believe that we will behave in line with our “should” self and behave ethically; however, at the time of the decision our “want” self wins out and we behave unethically. Yet when we recall that behavior, we see it through our “should” self and believe we behaved more ethically than we actually did. As a result, our long record of unethical behavior remains hidden to us.
One of the best examples of prediction errors comes from a study by Woodzicka and LaFrance, who asked college-aged women to predict how they would respond if asked the following questions during a job interview: “Do you have a boyfriend?” “Do people find you desirable?” “Do you think it is important for women to wear bras to work?” While a majority of the women predicted that they would refuse to answer these inappropriate and sexually harassing questions, their actual behavior was quite different: everyone in that situation answered the questions.
When recalling actions that did not fit our standards, we engage in “revisionary” ethics and revise the behavior so that our inflated image of our own ethicality remains intact. Revisionary ethics can take the form of biased attributions (“blame the other guy”) or of changing our perception of how wrong the behavior actually was. It has been found, for example, that when people are in an environment that makes cheating easy, they change their perception of how morally wrong cheating is. Similarly, research has found that the more tempted people are to behave unethically, the more likely they are to rationalize that behavior by believing that “everyone else is doing it.”
Research demonstrates that we are blind not only to our own unethical behavior but also to the unethical behavior of others when it is not in our best interest to see it. This is known as motivated blindness. So if I am an auditor who hopes for future auditing and consulting business from you, I will be less likely to see your unethical behavior than if such future rewards were not possible. Motivated blindness is more likely to occur when:
- the unethical behavior is committed by a third party or intermediary.
- the unethical behavior occurs gradually, with actions representing only small deviations from one’s standard.
- the unethical behavior leads to good rather than bad outcomes.
How does hypocrisy figure into decision-making?
There is a long line of research from Daniel Batson on hypocrisy, showing that people will make unethical choices in part by applying standards flexibly, until they can find a way to justify the outcome that they want for themselves. (See similar findings on our Cheating and Honesty page; people cheat up to the point that they can continue to believe that they are honest.)
CASE STUDIES
- The Madoff Ponzi scheme: Why didn’t investors notice the unusually consistent returns?
- The Challenger disaster: Why didn’t management “see” the problems associated with the launch?
- Sears: How did the reward system lead to unethical behavior and eventually lawsuits and reduced customer confidence?
- Ford Pinto: Why did they decide against fixing the problem when it would have saved lives?
- News International: Why did the phone-hacking continue for so long without anyone noticing?
OPEN QUESTIONS
- What organizational factors – beyond incentives – increase ethical fading? How can we design organizations that keep ethical concerns from fading out of business decisions?
- How do we prepare for the onslaught of the “want” self at the time of the decision, which often makes us behave unethically?
- How can we get people to follow the morality that they would advocate with greater self-awareness?
TO LEARN MORE
- The “Maximize Profits” Trap in Decision Making, Harvard Business Review, September 2016.
- 20 Cognitive Biases That Screw Up Your Decisions, Business Insider: Australia, August 27, 2015.
- Bazerman, M.H., & Tenbrunsel, A.E. Blind Spots: Why We Fail to Do What’s Right and What to Do about It. Princeton University Press, 2011. (public library)
- Gino, Francesca. Sidetracked: Why Our Decisions Get Derailed and How We Can Stick to the Plan. Boston, MA: Harvard Business Review Press, 2013. (public library) [Important context: this author has several publications retracted or under review]
- Haidt, J. (2006). The Happiness Hypothesis. (public library) You can read ch. 4, on the causes of our hypocrisy, here.
- Max Bazerman, at Queen’s School of Business, discusses Why Are We Strangely Blind to Our Own Ethical Lapses (summary included).
- Max Bazerman speaking at the European Science Association’s European Conference at the University of Cologne.
- A slide presentation by Max Bazerman: Behavioral Decision Research and Management Accounting: The Case of Conflicts of Interest.
- Ann Tenbrunsel discusses business ethics and blind spots.
- Adam Grant on diagnosing a culture of (un)ethical decision-making.
- Francesca Gino talks about what can derail our decisions.
- Dan Ariely, in one of his TED talks, on decision making.
- See more from Francesca Gino and many other experts discussing decision making concepts and cases on our Decision Making playlist, at the Ethical Systems YouTube channel.
Roy Baumeister has studied the role of self-control in decision-making. Given how hard it is to resist temptations in the moment (when our “want” self wins out over our “should” self), it makes sense that factors that reduce self-control (including low blood sugar and high cognitive load) may make it harder to make good ethical decisions.