The field of social psychology is sometimes accused of doing no more than ratifying common sense, so it’s worth paying attention when its findings are genuinely surprising. Case in point: the discovery that when we are rewarded for doing something, we tend to lose interest in whatever we had to do to get the reward.
This outcome has been confirmed scores of times with all sorts of rewards and tasks, and across cultures, ages and genders. Yet many teachers, parents and bosses persist in using versions of what has been called “sugarcoated control.”
Psychologists often distinguish between intrinsic motivation (wanting to do something for its own sake) and extrinsic motivation (for example, doing something in order to snag a goody). The first is the best predictor of high-quality achievement, and it can actually be undermined by the second. Moreover, when you promise people a reward, they often perform more poorly as a result.
I began tracking, and trying to explain, these lines of research back in the late 1980s. This year I reviewed as many studies as I could find that had been conducted since then. The conclusions that rewards frequently kill both interest and excellence have, if anything, grown more solid in the intervening decades.
A number of studies, for example, have shown that children are apt to become less concerned about others’ well-being if they were rewarded earlier for helping or sharing. Students, meanwhile, become less excited about learning once they’ve been given a grade (or some other artificial inducement) for doing so. And even though the average American corporation resembles a giant Skinner box with a parking lot, no controlled study has ever, to the best of my knowledge, found a long-term enhancement in the quality of work as a result of any kind of incentive or pay-for-performance plan.
Over the years, researchers have investigated some intriguing questions that stemmed from these basic findings. For instance: What if the reward is really large and luscious? (Answer: It’s apt to do even more damage to intrinsic motivation.) Are rewards destructive because they distract people from the task? (Apparently not, because other distractions don’t have the same negative effects.) Which is worse, giving people a set reward for doing something or making it contingent on how well they do it? (The latter, by a long shot.)
Give a bunch of adults or children a puzzle to solve or, say, a poem to write. Promise half of them a reward if they’re successful — and then watch as they end up being less creative and less interested in the task than those promised nothing. Such studies have led some observers to conclude that rewards should be avoided for interesting tasks but that they may be harmless or even appropriate when people with more power want to make people with less power do boring stuff. (If there’s one thing this field has taught me, it’s that rewards, like punishments, are ultimately about power.)
But newer research finds that rewards may also backfire when they’re offered for doing things that aren’t especially interesting, particularly if you watch to see what happens after the rewards stop coming. For example, a study by Thane Pittman of Colby College and his colleagues found that when people put off doing something — which often happens when a task seems unappealing — a reward offered for finishing early either didn’t help or led to increased procrastination.
Or what about rewarding people just for showing up? In 2015, researchers at Hong Kong University and New York University studied 9-year-olds in a very low-income area of India whose school attendance was spotty. These children were promised a reward if they came to school at least 32 out of 38 days. During that period, not surprisingly, many kids’ attendance improved. Afterward, however, it promptly dropped — either back to the earlier low levels or, in the case of students on whom the reward hadn’t had even a temporary effect, to a level much lower than it had been to begin with.
Another study, conducted by Carly Robinson at Harvard and her colleagues, and released as a working paper this past summer, followed more than 15,000 students in 14 California school districts, watching to see whether those who received a reward for exemplary attendance in the fall would come to school more often in February as compared to those who hadn’t been rewarded. Again, the rewards either had no effect or led to poorer attendance.
Even more striking than studies that challenge popular assumptions about human behavior are studies whose results surprise the researchers themselves. (Hint: Watch for the phrase “contrary to hypothesis.”) That was the case with both of these attendance studies — even though their findings really were predictable in light of earlier research.
There was one aspect of the California study, however, that wasn’t already settled science. The researchers tried offering students a reward unexpectedly, after the fact, and this tactic turned out to be even more damaging than telling the students about the reward in advance. Earlier studies were mixed on this question, with some showing that unexpected rewards, while unhelpful, at least weren’t damaging. But that was only in contrived, one-shot lab studies. If you get an unexpected reward in the real world, you’re likely to expect another one next time. Then you may feel resentful if you don’t get one — and manipulated if you do.
By now it should be clear that the trouble doesn’t lie with the type of reward, the schedule on which it’s presented, or any other detail of how it’s done. The problem is the outdated theory of motivation underlying the whole idea of treating people like pets — that is, saying: Do this, and you’ll get that.
Indeed, various researchers over the last half-century have admitted to being surprised by the ineffectiveness or destructiveness of rewards when money was offered to adults for succeeding at a tricky task, when movie tickets or praise was offered to children for tasting an unfamiliar beverage (the kids liked the beverage less than those who received neither a tangible nor a verbal reward), when merit pay failed to improve teachers’ performance, and when incentives didn’t increase seatbelt use or help people lose weight and keep it off.
The best that carrots — or sticks — can do is change people’s behavior temporarily. They can never create a lasting commitment to an action or a value, and often they have exactly the opposite effect … contrary to hypothesis.
Working with people to help them do a job better, learn more effectively, or acquire good values takes time, thought, effort and courage. Doing things to people, such as offering them a reward, is relatively undemanding for the rewarder, which may help to explain why carrots and sticks remain stubbornly popular despite decades of research demonstrating their failure.
In the case of attendance, it’s a lot easier — and much less threatening to those in positions of authority — to reward students and workers for showing up than it is to reconfigure schools and workplaces so that people are more likely to want to show up.
Alfie Kohn is the author of books on human behavior and education, including, most recently, a new edition of “Punished by Rewards.”