Our buggy moral code - Dan Ariely
I want to talk to you today a little bit about predictable irrationality. My interest in irrational behavior started many years ago in hospital. I was burned very badly, and if you spend a lot of time in hospital, you'll see a lot of types of irrationality. The one that particularly bothered me in the burn department was the process by which the nurses took the bandages off me.
Now you must have all taken a Band-Aid off at some point and you must have wondered what's the right approach. Do you rip it off quickly, short duration but high intensity, or do you take your bandage off slowly? You take a long time but each second is not as painful. Which one of those is the right approach? The nurses in my department thought that the right approach was the ripping one.
So they would grab hold and they would rip, and they would grab hold and they would rip. And because I had 70 percent of my body burned, it would take about an hour. As you can imagine, I hated that moment of ripping with incredible intensity. I would try to reason with them and say, why don't we try something else? Why don't we take it a little longer, maybe two hours instead of one, and have less of this intensity?
The nurses told me two things. They told me that they had the right model of the patient, that they knew what was the right thing to do to minimize my pain. And they also told me that the word patient doesn't mean to make suggestions or interfere. This is not just in Hebrew, by the way, it's in every language I've had experience with so far.
You know, there wasn't much I could do and they kept on doing what they were doing. About 3 years later, when I left the hospital, I started studying at the university. One of the most interesting lessons I learned was that there is an experimental method that if you have a question, you can create a replica of this question in some abstract way and you can try to examine this question, maybe learn something about the world.
So that's what I did. I was still interested in this question of how you take bandages off a burn patient. Originally I didn't have much money, so I went to a hardware store and bought a carpenter's vise. I would bring people to the lab, put their finger in it, and crunch it a little bit.
I would crunch it for long periods and short periods. The pain went up and the pain went down, and with breaks and without breaks, all kinds of versions of pain. When I finished hurting people a little bit, I would ask them, so how painful was this? Or how painful was this? So if you had to choose between the last two, which one would you choose?
I kept on doing this for a while and then, like all good academic projects, I got more funding. I moved to sounds, electrical shocks. I even had a pain suit that I could get people to feel much more pain. But at the end of this process, what I learned was that the nurses were wrong. Here were wonderful people with good intentions and plenty of experience, and nevertheless, they were getting things wrong predictably all the time.
It turns out that because we don't encode duration the way we encode intensity, I would have had less pain if the duration had been longer and the intensity lower. It turns out it would have been better to start with my face, which was much more painful, and move toward my legs, giving me a trend of improvement over time; that would have been less painful, too. It also turns out it would have been good to give me breaks in the middle to recuperate from the pain.
All of these would have been great things to do, and my nurses had no idea. From that point on, I started thinking: are the nurses the only people in the world who get things wrong in this particular decision, or is it a more general case? It turns out it's a more general case; there are a lot of mistakes we make. I want to give you one example of one of these irrationalities, and I want to talk to you about cheating.
The reason I picked cheating is because it's interesting, but also because it tells us something, I think, about the stock-market situation we're in. My interest in cheating started when Enron came on the scene and exploded all of a sudden. I started thinking about what was happening here. Is it the case that there are a few bad apples who are capable of doing these things, or are we talking about a more endemic situation, where many people are actually capable of behaving this way?
So, like we usually do, I decided to do a simple experiment. Here's how it went. If you were in the experiment, I would pass you a sheet of paper with 20 simple math problems that everybody could solve, but I wouldn't give you enough time. When the five minutes were over, I would say, pass me the sheets of paper and I'll pay you a dollar per question. On average, people solved four problems, so I would pay them $4.
Other people I would tempt to cheat. I would pass the sheet of paper and when the 5 minutes were over, I would say, please shred the piece of paper, put the little pieces in your pocket or in your backpack, and tell me how many questions you got correctly. People now solved seven questions on average.
Now it wasn't as if there were a few bad apples, a few people who cheated a lot. Instead, what we saw is a lot of people who cheat a little bit. Now, in economic theory, cheating is a very simple cost-benefit analysis. You say, what's the probability of being caught? How much do I stand to gain from cheating? And how much punishment would I get if I get caught? You weigh these options out, you do the simple cost-benefit analysis and you decide whether it's worthwhile to commit the crime or not.
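The rational-crime model described above is just an expected-value calculation. Here is a minimal sketch of it (the function names and numbers are my own, purely illustrative; none of this comes from the talk itself). Under this model, cheating should rise with the payoff and fall with the odds of detection, which is exactly the prediction the experiments then test.

```python
# Illustrative sketch of the simple economic cost-benefit model of cheating:
# cheat only when the expected gain outweighs the expected punishment.
# All names and values here are hypothetical, chosen for illustration.

def expected_value_of_cheating(gain: float, p_caught: float, punishment: float) -> float:
    """Expected payoff of cheating: keep the gain if not caught, pay the
    punishment if caught."""
    return (1 - p_caught) * gain - p_caught * punishment

def should_cheat(gain: float, p_caught: float, punishment: float) -> bool:
    """The 'rational' decision: cheat whenever the expected value is positive."""
    return expected_value_of_cheating(gain, p_caught, punishment) > 0

# High gain, low chance of being caught: the model says cheat.
print(should_cheat(gain=10.0, p_caught=0.1, punishment=5.0))  # True
# Tiny gain, high chance of being caught: the model says don't.
print(should_cheat(gain=0.10, p_caught=0.9, punishment=5.0))  # False
```

The experiments that follow vary exactly these two inputs, the gain (10 cents up to $10 per question) and the probability of being caught (how much of the sheet was shredded), and find that real behavior tracks neither of them.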
So we tried to test this for some people. We varied how much money they could get away with, how much money they could steal. We paid them 10 cents per correct question, 50 cents, a dollar, $5, $10 per correct question. You would expect that as the amount of money on the table increases, people would cheat more. But in fact, it wasn't the case.
We got a lot of people cheating, but each still by a little bit. What about the probability of being caught? Some people shredded half the sheet of paper, so there was some evidence left. Some people shredded the whole sheet of paper. Some people shredded everything, went out of the room, and paid themselves from a bowl of money that had over $100 in it. You would expect that as the probability of being caught goes down, people would cheat more. But again, this was not the case.
Again, a lot of people cheated just by a little bit, and they were insensitive to these economic incentives. So we said, if people are not sensitive to the rational economic explanations, to these forces, what could be going on? We thought maybe what is happening is that there are two forces: on one hand, we all want to look at ourselves in the mirror and feel good about ourselves, so we don't want to cheat.
On the other hand, we can cheat a little bit and still feel good about ourselves. So maybe what is happening is that there's a level of cheating we can't go over, but we can still benefit from cheating at a low degree, as long as it doesn't change our impression of ourselves. We call this the personal fudge factor.
Now, how would you test a personal fudge factor? Initially we said, what can we do to shrink the fudge factor? So we got people to the lab and we said we have two tasks for you today. First, we asked half the people to recall ten books they read in high school or to recall the Ten Commandments. Then we tempted them with cheating.
It turns out that the people who tried to recall the Ten Commandments (and in our sample nobody could recall all ten), given the opportunity to cheat, did not cheat at all. It wasn't that the more religious people, the ones who remembered more of the Commandments, cheated less, and the less religious people, the ones who could remember almost no commandments, cheated more.
The moment people thought about trying to recall the Ten Commandments, they stopped cheating. In fact, even when we gave self-declared atheists the task of swearing on the Bible and we gave them a chance to cheat, they didn't cheat at all. Now, the Ten Commandments is something that is hard to bring into the education system, so we said, why don't we get people to sign the honor code?
So we got people to sign, "I understand that this short survey falls under the MIT Honor Code." Then they shredded it. No cheating whatsoever. This is particularly interesting because MIT doesn't have an honor code.
So all this was about decreasing the fudge factor. What about increasing the fudge factor? In the first experiment, I walked around MIT and distributed six-packs of Coke into the refrigerators. These were common refrigerators for the undergrads, and I came back to measure what we technically call the half-life of Coke: how long does it last in the refrigerators? As you can expect, it doesn't last very long; people take it.
In contrast, I took plates with six $1 bills and left those plates in the same refrigerators. Not a single bill ever disappeared. Now, this is not a good social-science experiment, so to do it better I ran the same experiment I described to you before. A third of the people we passed the sheet to gave it back to us. A third of the people we passed it to shredded it, came to us and said, "Mr. Experimenter, I solved X problems, give me X dollars."
A third of the people, when they finished shredding the piece of paper, came to us and said, "Mr. Experimenter, I solved X problems, give me X tokens." We did not pay them with dollars; we paid them with something else, and then they took this something else, walked 12 feet to the side, and exchanged it for dollars.
Think about the following intuition. How bad would you feel about taking a pencil home from work? How bad would you feel about taking 10 cents from a petty-cash box? These things feel very different. Would being a step removed from cash for a few seconds, by being paid in tokens, make a difference? It did: our subjects doubled their cheating.
I'll tell you what I think about this and the stock market in a minute, but this did not yet solve the big problem I had with Enron, because at Enron there was also a social element. People see each other behaving. In fact, every day when we open the news, we see examples of people cheating. What does this do to us?
So we did another experiment. We got a big group of students to be in the experiment, and we prepaid them: everybody got an envelope with all the money for the experiment, and at the end we asked them to pay us back the money they didn't make. The same thing happened: when we give people the opportunity to cheat, they cheat, just by a little bit, all the same.
But in this experiment, we also hired an acting student. This acting student stood up after 30 seconds and said, "I solved everything. What do I do now?" And the experimenter said, "If you've finished everything, go home; that's it, the task is finished." So now we had an acting student who was part of the group. Nobody knew he was an actor, and he clearly cheated in a very, very serious way.
What would happen to the other people in the group? Would they cheat more, or would they cheat less? Here is what happened: it turns out it depends on what kind of sweatshirt he was wearing. Here's the thing: we ran this at Carnegie Mellon in Pittsburgh, and in Pittsburgh there are two big universities, Carnegie Mellon and the University of Pittsburgh.
All of the subjects sitting in the experiment were Carnegie Mellon students. When the actor got up wearing a Carnegie Mellon sweatshirt (he was in fact a Carnegie Mellon student, a part of their group), cheating went up. But when he was wearing a University of Pittsburgh sweatshirt, cheating went down.
Now, this is important, because remember: the moment the student stood up, it was made clear to everybody that they could get away with cheating, since the experimenter said, "You've finished everything, go home," and he walked away with the money. So it wasn't so much about the probability of being caught again; it was about the norms for cheating. If somebody from our in-group cheats and we see them cheating, we feel it's more appropriate, as a group, to behave this way.
But if it's somebody from another group, somebody we don't want to associate ourselves with, from another university, another group, then all of a sudden people's awareness of honesty goes up a little bit, like in the Ten Commandments experiment, and people cheat even less.
So what have we learned from this about cheating? We've learned that a lot of people can cheat, and that they cheat just by a little bit. When we remind people about morality, they cheat less. When we get a bigger distance from the object of cheating, from money for example, people cheat more. And when we see acts of cheating around us, particularly within our in-group, cheating goes up.
Now, if we think about this in terms of the stock market, think about what happens when you create a situation in which you pay people a lot of money to see reality in a slightly distorted way. Would they not be able to see it this way? Of course they would. What happens when you do other things, like removing things a step from money? You call them stock, or stock options, derivatives, mortgage-backed securities.
Could it be that with those more distant things (it's not a token for one second; it's something that is many steps removed from money, for a much longer time) people would cheat even more? And what happens to the social environment when people see other people behaving badly around them?
I think all of those forces worked in a very bad way in the stock market. More generally, I want to tell you something about behavioral economics. We have many intuitions in our lives, and the point is that many of these intuitions are wrong. The question is: are we going to test those intuitions?
We can think about how we're going to test these intuitions in our private lives, in our business lives, and most particularly when it comes to policy, when we think about things like No Child Left Behind, when we create new stock markets, when we create other policies: taxation, healthcare, and so on. The difficulty of testing our intuition was the big lesson I learned when I went back to talk to the nurses.
So I went back to talk to them and tell them what I had found out about removing bandages. I learned two interesting things. One was that my favorite nurse, Etti H, told me that I did not take her pain into consideration. She said, "Of course, you know, it was very painful for you, but think about me as a nurse, taking the bandages off somebody I liked, and having to do it repeatedly over a long period of time. Creating so much torture was not something that was good for me, either."
She said maybe part of the reason was that it was difficult for her, but it was actually more interesting than that, because she said, "I did not think that your intuition was right; I thought my intuition was correct." So if you think about all of your intuitions, it's very hard to believe that your intuition is wrong.
And she said that given the fact that she thought her intuition was right, it was very difficult for her to accept doing a difficult experiment to try to check whether she was wrong. But in fact, this is the situation we're all in, all the time. We have very strong intuitions about all kinds of things: our own ability, how the economy works, how we should pay schoolteachers.
But unless we start testing those intuitions, we're not going to do better. Just think about how much better my life would have been if these nurses had been willing to check their intuition, and how much better everything would be if we just started doing more systematic experimentation on our intuitions.
Thank you very much. [Applause]