Yesenia Guitron knew something was wrong at the bank branch where she worked. She was getting complaints from customers—many of them immigrants from Mexico, some undocumented—that they were being charged for accounts they had never opened and were receiving debit cards they had never requested. Guitron, a personal banker at a Wells Fargo branch in the Napa Valley town of St. Helena, began to realize that some of her colleagues, under intense pressure to open accounts, were doing so without customers’ knowledge.
This was back in 2008. Guitron, now 35, didn’t know then that the behavior at her local branch was part of a nationwide scandal, one that wouldn’t fully come to light until eight years later. But one thing she did know: It was wrong and she wasn’t going to go along with it. So she called the Wells Fargo ethics hotline. She told her manager. She told human resources. She spoke up in meetings, despite the acrimony directed at her by some of her colleagues. The bank’s practices didn’t change, but Guitron felt herself the target of retaliation by managers, who were suddenly scrutinizing her every move. “It was a toxic atmosphere,” she said.
Then, in 2010, after nearly two years at the bank, she was fired for failing to meet her quarterly sales goals and for insubordination. It was a heavy price to pay, but Guitron doesn’t regret her actions. “A lot of people needed jobs,” she said. “I did too. But I wasn’t going to carry that on my conscience.”
Whistle-blowers like Guitron are generally portrayed—usually long after the fact—as heroes, people who stood up against wrong when others failed to do so. Their actions are seen as spurred by honesty and bravery, rare and admirable attributes that the rest of us too often lack. But increasingly, academics and ethicists are trying to understand what makes some of us do the right thing, in the hope that it can be taught and replicated.
Don A. Moore, a professor at Berkeley’s Haas School of Business who teaches a first-year core MBA course called “Leading People,” said many MBA programs increased their ethics course offerings in the early 2000s in response to high-profile corruption scandals at companies such as Enron, Arthur Andersen, and Halliburton. There was a corresponding push to integrate ethics into general MBA course work and not “ghettoize” the subject. “I was personally shaken by those scandals and the questionable accounting practices that led to the Great Recession,” he said. “I took seriously the need to address it in my class.”
How to do that, he says, was another issue, one he continues to grapple with.
A basic principle of psychology is that someone in a position of power can, by virtue of that position, make most subordinates do their bidding, even when the directives are immoral. In the past, those who studied and taught ethics believed that students who clearly understood the principle could resist it, explains Brooke Deterline, CEO of Courageous Leadership, a consulting firm that offers workshops and programs on business ethics. “But that’s not true—just because we rationally know” that we shouldn’t follow immoral orders “doesn’t mean we don’t do it. It’s much more complicated than that.”
Deterline, whose training includes certificate programs in corporate social responsibility and corporate board management at Haas, says that, in addition to understanding, we need strategies to overcome such ingrained behavior. So while it’s important to know that most people dread conflict and ostracism, and to appreciate how deep-seated that dread is, it’s even more crucial to learn how to cope with those fears and how to prioritize, at the critical moment of decision making, long-term values over short-term anxieties.
It’s also crucially important to understand that most people believe they are moral and have confidence that they will act courageously when it counts. Most of us, reading about Yesenia Guitron, for example, identify with her, not with the managers who fired her. But research suggests we’re fooling ourselves. “Everyone talks a great game when nothing is on the line,” said Thomas White, a professor of business ethics at Loyola Marymount University.
Two notorious social psychology experiments vividly demonstrate how easily we capitulate to authority: the Milgram experiment in 1961 at Yale University, named after Stanley Milgram, the professor who ran the project; and, a decade later, the Stanford Prison Experiment designed by Professor Philip Zimbardo.
In the first project, participants who didn’t know the real point of the experiment were assigned the role of “teacher” and instructed to administer a word-pairing test to a “learner,” who was actually an actor working off a script provided by the researchers. Teachers were invited to witness the learner being strapped into a chair with an electrode pasted to their arm. Then the teachers were told to administer a shock when a learner failed to get a pairing right, and to increase the voltage with each missed answer. The participants did as instructed, and most teachers administered the maximum voltage, despite the screams of the learners.
In the Stanford experiment, undergraduates were recruited to serve as either “guards” or “prisoners,” to evaluate the power dynamics in prisons. Within a very short time, the guards, in spite of having been given instructions to stay within normative behavioral bounds, turned sadistic, denying food, water, and sleep to the prisoners, spraying them with fire extinguishers and stripping them naked. The ersatz guards became so abusive so quickly that the intended two-week experiment was halted after just six days.
Although the ethics of both experiments were themselves called into question, the findings—that people were willing to submit to authority even under manufactured circumstances they could easily have walked away from—chillingly demonstrated a human tendency toward conformity. Of course most of us firmly believe we would have behaved morally, but Loyola Marymount’s White says it’s not that simple. “You can’t predict how you will act when you have to make a decision that harms your own interest for some greater good. At moments like that, you’ll be awash in conflicting emotions, and you can’t rely on your gut feeling because it may be wrong. Or it may be right, but not strong enough.”
Berkeley Haas’s Moore says his graduate students, most of whom already have corporate experience, viewed the Wells Fargo scandal “with a mixture of disgust and empathy. They were horrified at the outcome, but understood how the employees got there,” he said.
In truth, it doesn’t take much for us to do the wrong thing. We often cut corners just to make our lives easier. In one experiment, David DeSteno, a professor of psychology at Northeastern University, had subjects flip a virtual coin to determine whether they would do a short, fun task or one that was long and tedious. The coin toss was rigged so that, at first, it would land on the onerous task several times in succession. Unbeknownst to the participants, they were being video-recorded. In the end, only 10 percent of them performed the experiment honestly; 90 percent lied or kept tossing the coin until it came up the way they wanted. Yet, when the cheaters were asked if they had acted fairly, most said they had. And when they watched another participant cheat—actually a confederate planted by the researchers—they indignantly denounced the actions, even though they themselves had behaved the same way.
“They had to create a story for themselves, saying they weren’t really a bad person,” DeSteno said. But here’s the really interesting part. If participants were asked to hold seven digits in their mind before being asked about their own actions, they were more likely to condemn themselves just as they had the other participant. The implication is that their minds, preoccupied with remembering the numbers, were too busy to rationalize the bad behavior.
Such findings may seem depressing, but the insights they offer into human behavior are critically important to the teaching of ethics. And research can also show us ways to be better people. As Moore said of his students, “These are bright, capable people, and I don’t need to teach them right from wrong. Rather, I aim to teach how to anticipate an ethical conflict and respond in a way that is compatible with their own values. Evidence suggests that those who act in unethical ways later regret it because they haven’t thought enough about it beforehand.”
The good news is that even seemingly small measures can nudge our behavior in the right direction. For example, numerous experiments have shown that attesting to the validity of a document such as a tax filing, test, or expense report before we fill it out tends to make us more honest. Another way to encourage decency is to ask those facing a moral predicament to change their thinking from “What should I do?” to “What could I do?” Francesca Gino, a professor of business administration at Harvard University, said five different studies found that the latter frame of mind opens up a range of possibilities rather than narrowing the choices down to simple right and wrong.
Gino is working with Dan Ostergaard, who founded the company Integrity by Design. They have developed an app called Social Compass that a user can open when faced with a moral dilemma. The app prompts them with questions about impact, motivation, perception, and purpose to help them analyze their choices. For example, it asks a user whether their potential decision is in line with their values, whether it would make them a good role model, and whether it would damage their reputation. The ready availability of the app is a potential key to its utility. “The problem with compliance programs is that they have to hit exactly when needed,” said Ostergaard, a former chief ethics compliance officer with the healthcare company Novartis. “Companies spend 99 percent of their resources on programs that have no impact on behavior.”
Or maybe they do, but the company itself is the problem. Take Guitron. When she was hired at Wells Fargo, she was paid to attend a month-long training session—all day, every day—and “code of conduct and ethics were a big part of it,” she said. “We were told that we need to follow the rules, and if we see something suspicious, we should report it.” But when she did just that, she was thwarted instead of rewarded.
Getting people to understand and rethink typical responses to conflict and fear is one of the goals of Philip Zimbardo, professor emeritus of psychology at Stanford University, who conducted the Stanford Prison project. Zimbardo spent a career examining immorality and humans’ basest behavior, but later in life he switched his focus to the question of why some of us do the right thing, and whether moral behavior could be taught. In 2008, he started the Heroic Imagination Project, which endeavors to teach young people about the dangers of obedience to authority and how individuals can recognize and circumvent abuses when they arise. “We have to become aware of the power of the dark side,” Zimbardo said.
Through classes and workshops, using video clips and real-life situations, teachers raise awareness of our less admirable predispositions in order to resist them. Instructors explain group dynamics, peer pressure, obedience to authority, anonymity, and the “bystander effect” (the social psychology phenomenon in which most people will not help someone in need if many other people are standing around doing nothing). They also encourage students to share personal stories of standing up for what’s right, or failing to do so. The project has found, through questionnaires and focus groups, that students exposed to such teachings reported, among other things, an increase in empathy and a greater tendency toward collaboration.
Teaching (or reminding) people of their “ethical roots” is even more crucial for people in power, said Berkeley’s Dacher Keltner, professor of psychology and co-director of the university’s Greater Good Science Center.
“Power and wealth make people act unethically,” said Keltner, much of whose research has focused on the relationship between money, authority, and morals. He believes that although people generally gain power through such virtues as empathy, collaboration, openness, and fairness, these same qualities begin to disappear as a person’s authority increases. And it occurs on the most mundane level. One of Keltner’s experiments, called the “cookie monster” study, assigned participants to a group writing task and randomly put one person in charge. A half hour into the task, a plate of freshly baked cookies was placed before the group: one cookie for each team member, plus an extra. The person who was named the team leader almost always ate the last cookie and was more likely to eat it rudely, mouth open, lips smacking, etc. Power not only corrupts, the experiment suggests, it also makes us vulgar.
To avoid falling into what he calls the power paradox, Keltner says those in positions of authority need to be aware of the tendency. As he writes in the Harvard Business Review, “new studies in neuroscience find that by simply reflecting on those thoughts and emotions … we can engage regions of our frontal lobes that help us keep our worst impulses in check.”
Power’s companion, money, can also induce ethical blindness. Laura Kray, a professor at Haas, and her colleagues Jessica A. Kennedy of Vanderbilt University and Gillian Ku of the London Business School, conducted a meta-analysis that found that women tend to act more ethically than men, unless there is a strong financial incentive inducing them to act differently. In one experiment, women participants were told they were managers who had to get a “job applicant” to commit to the lowest salary. They were also told they couldn’t promise the applicant stability because, unbeknownst to the applicant, the job would be eliminated in six months. The manager would be awarded a certain amount of money for getting the applicant to commit to a lower salary. Resistant at first, the women managers were more likely to prevaricate or lie in the negotiations as the financial reward increased. Men were quicker to deceive the applicant.
The difference is not explained by any innate biological difference, Kray contends, but by the fact that women are socialized to care more about fairness and cooperation, while men care more about being seen as strategic and as winners. Other research also found that women tend to act more unethically—by telling a lie, for example—when they’re trying to help someone else than when advocating for themselves. This information is important in trying to change behavior, Kray said. Women should realize that “when there are big shiny incentives, they have to be more aware” of the potential to become just as unscrupulous as their male counterparts. For men “who aspire to the highest ethical standard, they should know how deeply embedded … self-interest and masculine identity [are] driving their decisions.” And women who go overboard for others should ask themselves, “Would I do the same thing on my own behalf?”
Deterline, of Courageous Leadership, says one of her goals when teaching a workshop is to normalize the feeling most people have when faced with a stressful situation. “Our natural survival instinct easily overcomes doing the right thing, and the thought is ‘How do I protect myself in this moment?’ That doesn’t mean you’re a coward.” In her workshops, people discuss instances when they did and didn’t act as they wanted to, and explore what triggers might have sent them down the wrong path. Do they tend to crumple before authority? Are they too eager to fit in? Do they avoid conflict at any cost? They are also taught to recognize physical manifestations of stress, such as sweaty palms or a constricted throat, which could signal the need for them to step back and consider what is happening.
And, perhaps most importantly, Deterline said, people need to practice, through role playing, how to act courageously despite fear and confusion. It’s not that such feelings will disappear, but rather that you can learn to go forward despite them. “Courage feels like discomfort—it doesn’t usually feel powerful. It feels uncomfortable and awkward,” Deterline said. “We’ve got to get used to acting with discomfort.”
But it’s not just a matter of taking a deep breath and spending a moment in the bathroom, cautions DeSteno. That may simply give you time to justify bad behavior, as shown in his coin-toss experiment. “You have to be very careful not to consciously talk yourself into something. If you want to behave correctly, know where your weaknesses lie. We tend to be irrational and self-serving. The mind is a motivated reasoner.” And, he noted, “it’s easier to be ethical in good economic times than bad.”
White, of Loyola Marymount, recalled a top executive who visited his class telling his students, “Odds are you’re going to be asked to do something you know is wrong at some point—and either you can do it or you can leave. Everyone has to have a line they won’t cross.” It’s easier, he told them, to figure out that line before coming to it. After all, the consequences of doing the right thing can be costly.
After she was fired, Guitron, a single mother, filed a lawsuit challenging her firing, and lost. She was out of work for two years, and believes she was effectively blacklisted by the banks. “I had to go on food stamps, and that was not a good feeling,” she said. Although she’d prefer to still be in banking, she now works as a property manager. Nonetheless, she knows her children are proud of her, and she has received messages from former colleagues who are grateful for her actions. In any case, Guitron says, she never considered any other course.
“I’ve always stood up to the bully. I think everyone can tell right from wrong—I think we all have that understanding. But the big driver of this was money.”
Alina Tugend is a New York–based writer and author of Better by Mistake: The Unexpected Benefits of Being Wrong. She can be reached online at www.alinatugend.com or @atugend.