The cause of every social problem, we can all agree, is that people get rewarded for doing the wrong things. Academic fraud, dysfunctional healthcare systems, good-for-nothin’ politicians—all cases of bad incentives.
I used to nod along to these conversations like yes, yes, of course, the incentives! But then I started paying attention and was like, wait, what are we all talking about?
I think there is a monstrous theory of human behavior lurking here—one that I myself have believed—that needs to be dragged out into the light and thoroughly stomped.
PUT ANOTHER DIME IN ME, BABY
Here’s the usual take on bad incentives:
Humans do stuff in exchange for rewards—money, power, prestige, etc. Unfortunately, bad behavior often pays more of those rewards than good behavior does. “Fixing the incentives” means trying to make doing the right thing more lucrative, broadly defined.
Call this the jukebox theory of human behavior: you get people to do what you want by inserting coins and pushing buttons.
I’m skeptical of jukebox theory because it seems to be an explanation for how other people work. “I care about more than just my bank account,” jukebox theorists imply, “But other people? You can take a giant dollar sign, hook ‘em by the nostrils, and yank ‘em around.”
People really do believe this and are happy to tell you about it. For example, in one classic study, 63% of participants said they would give blood for free, and 73% said they would do it for $15, a difference that wasn’t statistically significant. Meanwhile, they estimated that only 32% of their peers would give blood for free, and that 62% would do it for $15.1 As in, “I would give my blood freely, but other people need to be bribed for it.”2
Similarly, when you survey people about what motivates them at work, they go “Feeling good about myself! Having freedom, the respect of my coworkers, and opportunities to develop my skills, learn things, and succeed!” When you survey people about what motivates others, they go, “Money and job security!” In another survey, people claimed that they value high-level needs (e.g., finding meaning in life) more than other people do.3
I’m saying “people” here as if I wasn’t one of them, but I would have agreed with all of the above. It was only saying it out loud that made me realize how cynical my theory of human motivation was, and that I applied it to everyone but myself. Yikes!
YOU'VE BEEN HIT BY // YOU'VE BEEN STRUCK BY // A SECRET CRIMINAL
Some people believe in an even darker theory of motivation, which is that humans don’t do anything useful unless enticed with a carrot or thwacked with a stick. Deep down, this theory goes, we’re a bunch of slackers and delinquents, and the only thing that keeps us from quitting our jobs and doing crimes instead is the fear of losing a paycheck or gaining a prison sentence.
Call this secret criminal theory. Jukebox theory says that intrinsic motivation doesn’t exist; secret criminal theory says that our intrinsic motivation is to be idle or evil.
(Unlike jukebox theory, people seem willing to think of themselves as secret criminals; see Excuse me but why are you eating so many frogs.)
Secret criminal theory shows up in the bizarre forms of everyday policing we all encounter in our jobs. For instance, universities have offices full of people whose job it is to review the money you spend on research and conference travel to make sure that you’re not embezzling any of it. This process routinely takes weeks and involves chains of multiple administrators. Once, the university gave me a few bucks less than I asked for because I tipped 20% at a restaurant and they only cover tips up to 18%!
(Come to think of it, my Olive Garden receipts were scrutinized more thoroughly than my research ever was.)
At the time, this seemed annoying but normal. Now it seems really weird. Academics should want to do research and go to conferences. That’s the job! If you’re worried that the people you hired are going to embezzle money, why did you hire those people in the first place? It’s like getting married and immediately paying a private detective to follow your spouse around and make sure they aren’t cheating on you.4
LITTLE ORPHAN DRUG ACT
The problem with cynical theories of human behavior is that they do work, kind of, but in a stupid way.
When you rejigger incentives in the hopes of changing behavior, you attract the people who are most motivated by the incentives themselves, and these are the people you want to attract the least. Incentive-hunters are bent on Goodharting you, that is, doing exactly what it takes to extract the reward, even at the expense of what you actually wanted them to do.
Here’s an example. A few years ago, somebody at Wells Fargo had a great idea: “Let’s push our employees really hard to sell multiple financial products to each customer; reward those who make their sales targets and punish those who don’t.” And indeed, sales went up! But it was because those employees, faced with impossible sales targets, were simply opening fake accounts in customers’ names. Those who refused or complained were fired. Those who remained were willing to do things like convince a homeless woman to open “six checking and savings accounts with fees totaling $39 a month.” Using incentives to manipulate people looks like a good idea until you’re facing billions of dollars in fines and lawsuits.
Another example. The Orphan Drug Act of 1983 was supposed to incentivize pharmaceutical companies to develop drugs for rare diseases, an enterprise that might not have been profitable otherwise. “Oh that’s a funny coincidence,” the pharmaceutical industry responded, “because actually, all of our drugs are orphan drugs!” It became standard business practice to get “orphan” status for a drug that would have been commercially successful anyway, allowing companies to enjoy tax breaks, grants, faster approval, and government-enforced monopolies. In 2014, seven of the ten bestselling drugs were “orphans”. What looks like a win for incentives—“Look at all these orphan drugs we have now!”—may actually be a big, expensive failure.
THE GREATEST SOURCE OF ENERGY IN THE UNIVERSE IS PEOPLE WHO GIVE A HOOT
So yes, the incentives are often bad, but they’re bad because they’re built on bad theories of motivation. A better theory would start with the fact that other humans are just like us: they care about money and prestige, sure, but they also care about a bunch of other things, and they’ll work really hard to get those things.
And I mean really hard. Witness the infinite energy that a teenager will expend on figuring out whether their crush likes them back, or the weekends people will sink into pruning their rose bushes, or the hundreds of hours that fans will pour into making cosplay costumes. Behold this 170-page guide on how to identify locations in Mongolia so you can get better at GeoGuessr, a game where you see one random screenshot from Google Street View and you have to figure out where you are.5 Tremble at the man who, after losing his wife because it took too long for help to arrive after an accident, spent the rest of his life hewing a faster path to the hospital through a mountain.6

All sorts of discoveries and commercial achievements have, in fact, come from people pursuing their interests even without much expectation of becoming rich and famous, like:
When I was training research assistants in graduate school, I used to think I wanted the students who already knew the most, or the ones who could learn the fastest. Eventually I realized, no, what I really want are the students who give a hoot. Unmotivated students would show up like, “I hit a roadblock, what should I do?” while the motivated students would be like, “I hit a roadblock so I tried a million different things to get unstuck, and now I have a completely new set of problems to discuss with you.” It’s not that hard to give people skills. It’s way harder to give them interests.
So yes, you can get people to do stuff by dangling carrots and shaking sticks. But you need some big honkin’ carrots and some awfully pointy sticks to get people to do things they don’t want to do. Trying to substitute external incentives for internal incentives is like trying to power your country with whale blubber when everybody is walking around with a hydrogen fuel cell inside them.
IT'S TOO BAD I HAVE TO LIE TO THE SECURITIES AND EXCHANGE COMMISSION, BUT WHAT YOU GONNA DO?
The best way to use incentives, then, is to:
1) find the people who already want what you want
2) help them survive
This is the theory I’ve been using to recruit people for my prototype Science House. Lots of people are skeptical of doing science outside of academia, to the point where they’d call me names and threaten my job over the idea. I cannot fathom the size of the carrot or the stick that would make them feel otherwise. It’s so much easier to find the people who already think it’s a cool idea—so cool, in fact, that they’re already doing it, or willing to start right now—and give those people a home. It’s also how I think about Substack—I give money to people who are doing work I think is important because I want them to keep doing it.
Anybody who’s like, “I hate lying to the Securities and Exchange Commission, but those are the incentives in finance!” or “I hate writing crappy papers, but those are the incentives in academia!” or “I hate making hamburgers that are mostly sawdust, but those are the incentives in the restaurant industry!”—they’re admitting that they’re only going to do the right thing when it’s convenient. I want to find the people who are willing to do the right thing even when it’s inconvenient, and hand them some money so they can keep doing that.
THE SECRET GENIUSES OF WISCONSIN
In this view, incentives are primarily selection effects, which are powerful but hard to think about.
For example, Wisconsin has an average SAT score of 1236 (out of 1600), far higher than neighboring Illinois (970) or wealthy states like Connecticut (1007) and New Hampshire (1035). If you’re like me, when you see that you start wondering: How does Wisconsin achieve these amazing results? Is it school funding? Smaller class sizes? Innovative curricula?
No, it’s mainly telling their students not to take the SAT. Only 2% of Wisconsin students take it, in fact, compared to 96% in Illinois, 93% in Connecticut, and 82% in New Hampshire. The only Wisconsinites posting SAT scores are the overachievers who want to go to selective colleges.8 That’s how potent selection effects are: you could spend $1 billion on test prep, smaller classes, etc., and still not come close to achieving the 300-point boost that you get from telling your worst test-takers to stay home.
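If you want to see for yourself how much selection alone can move an average, here’s a minimal simulation. All the numbers are illustrative assumptions (a bell curve of ability, a top-2% cutoff), not real SAT data—the point is just that the same cohort of students produces wildly different averages depending on who shows up to the test.

```python
import random
import statistics

random.seed(0)

# One cohort of students with identical underlying ability in both
# scenarios. (Illustrative numbers, not real SAT data.)
N = 100_000
scores = [min(1600, max(400, random.gauss(1000, 200))) for _ in range(N)]

# Scenario A: nearly everyone takes the SAT, so the reported average
# is close to the true population mean.
state_a_avg = statistics.mean(scores)

# Scenario B: only the top ~2% (the college-bound overachievers) take it.
cutoff = sorted(scores, reverse=True)[N // 50]  # top-2% threshold
state_b_avg = statistics.mean(s for s in scores if s >= cutoff)

print(f"Everyone tested:   {state_a_avg:.0f}")
print(f"Top 2% tested:     {state_b_avg:.0f}")
print(f"Selection 'boost': {state_b_avg - state_a_avg:.0f} points")
```

No test prep, no smaller classes, no curriculum reform—the hundreds-of-points gap comes entirely from which students sit for the exam.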
We usually think about selection effects as bugs in our experiments that prevent us from finding the truth, and that’s right. But we can also harness their terrifying power and use them to our advantage, or we can neglect them and pay the price.
For instance, I was once in a department where an internal survey revealed low morale among the graduate students. A town hall was convened to investigate the issue. The students knew that one of the biggest problems was that a handful of professors terrorize and neglect their underlings, and the fastest way to fix this would be to put those faculty members on an ice floe and push it out to sea. This, of course, was difficult to bring up (some of those faculty members were in the room), and so instead we talked about minor bureaucratic reforms like whether there should be some training for advisors, or whether bad advisors should have fewer opportunities to admit students. Nobody could name who these mysterious bad advisors were, of course, so even these piddling suggestions went nowhere.
We created that demoralized department by ignoring selection effects. If you hire someone based on the shininess of their CV and then hope that, somehow, the employee handbook will show them how to also be a good person and not just a prolific paper-producer, you’re going to end up with a department full of sad graduate students. There are people out there who would trade letters of recommendation for sexual favors, and no amount of carrots or sticks can turn them decent. Your only hope is not hiring that person in the first place. Once they’re inside, it’s too late—they’ll end up in charge of the Carrot and Stick Committee and they’ll vote for more carrots and fewer sticks for themselves, and they’ll hire more people just like them.
I’ve seen the same thing happen in all the improv groups I’ve ever been in, which have whipsawed wildly between Golden Ages and Dark Ages depending on who we put in charge that year. If you’re writing a constitution or a code of conduct, by all means, do a good job. But if you’re counting on something like “SUBSECTION 3A: Being evil is not allowed” to stop people from being evil, or if you think Robert’s Rules of Order are going to turn an insecure despot into an enlightened ruler, well, strap in for some Dark Ages and some bad improv.
So when people are like, “It’s too bad these bad incentives turn good people into bad people!” I’m like, “No, it’s too bad these bad incentives allow bad people to exist and succeed.” The good people are the ones who don’t turn bad even when it’s lucrative to do so. In other words:
Or, since we’re all into Dune right now:
Good governance never depends upon laws, but upon the personal qualities of those who govern. The machinery of government is always subordinate to the will of those who administer that machinery. The most important element of government, therefore, is the method of choosing leaders.
And what’s true about government is also true about every organization:

It’s not that I think people are unchangeable; even villains can be redeemed. But you can’t do it by making rules or writing checks or handing out ribbons.
CUT OFF THE CORRECT LEG, PLEASE
There are two reasons why jukebox theory and secret criminal theory persist, even though they’re the wrong model of human behavior.
First, they make great self-fulfilling prophecies. If you believe that people only do the right things when you dangle the right incentives in front of them, you’re going to attract exactly that kind of person. (There are enough of these people, it seems, to staff most Wells Fargo locations.) And when you put people inside Carrot-and-Stick World, you never get to see what they would do if you knocked down all the walls and let them do as they pleased.
When you ask people what they would do in the absence of carrots and sticks, you sometimes get surprising answers. For instance, a majority of people who received Fast Grants for covid research said they would change their research a lot if their regular funding allowed it:
It seems like we’re giving millions of taxpayer dollars to pandemic scientists and telling them what to research and they’re like, “Thanks for the money, but I wouldn’t spend it the way you’re forcing me to spend it,” and I dunno, that seems bad!
Second, discovering your inner motivations takes time and experience, and we gum up the process with lots of strong opinions about what should motivate us. I spent most of my 20s as a resident advisor, hanging out with college students who were trying to talk themselves into things they didn’t really want to do. “Oh, I’d love to [build an orphanage/design a better sneaker/reform the immigration system], but I know that's unrealistic, so I should probably just go work for Airbnb.” I wish I had told them: what’s unrealistic is thinking that you can reshape your desires to be more convenient, as if your soul is made of Silly Putty.
(Many of these students were trying to psych themselves up to be doctors. Please don’t become a doctor if you don’t really want to be one! If you’re gonna cut people open and deliver babies and stuff, you should like doing those things! Otherwise one day you’ll be so burnt out that you amputate the wrong leg, or something!)
I was no better as a college student—I talked myself into doing a master’s degree because I could get a fancy scholarship to do it. I then proceeded to waste two years on a program that taught me nothing, wondering the whole time, “Why don’t I like this? It would be way easier if I liked it!”
MR. WATSON PLEASE LET ME OUT OF THIS BOX
Here’s one of the most famous quotes in all of psychology, uttered by John Watson in 1924:
Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I’ll guarantee to take any one at random and train him to become any type of specialist I might select—into a doctor, lawyer, artist, merchant-chief and yes, even into beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations and race of his ancestors.9
Watson’s cherished field of behaviorism famously failed, but this idea got stuck deep in our collective consciousness, where it remains 100 years later. Whenever we try to perfect our fellow humans through bylaws and bonuses, we are channeling Watson. If you believe that people need to be treated like jukeboxes or secret criminals, you are accepting the behaviorist premise that we need to put people inside a giant operant conditioning chamber that dispenses food pellets for good behavior and electric shocks for bad behavior. To say “the incentives are bad!” is just to say “the chamber is dispensing pellets and shocks for the wrong things!”

I don’t like shocks and I would be happy to receive some pellets, but mainly I don’t want to live in the box. And neither does anyone else. At best, we only think the box is a good place to put other people, and at worst, we’re willing to stick ourselves there as punishment when we can’t get our to-do lists done. We act as if the improvement of humanity is an engineering problem, when really it’s an unleashing problem.
The only way to solve that problem is to climb out of our own boxes and to help other people climb out of theirs. That, of course, takes courage and trust. Perhaps instead we should wait around and hope that some geniuses from Wisconsin fix things for us.
This is a small study from 1998 and it only included undergraduates, so you should take it with a grain of salt. Fortunately a recent preregistered replication on a much larger and more diverse sample got almost exactly the same results.
In this case, people are right that $15 wouldn’t make a difference for themselves; they’re just wrong in assuming it would make a difference for others. This meta-analysis finds “no conclusive effect” of paying people to donate blood, although the studies aren’t very good. So if there’s any effect at all, it’s probably pretty small, and much smaller than people seem to expect. Also, only about 3% of people actually donate blood, so the 63% of people who say they’re willing to do it for free are being awfully generous with their definition of “willing.”
In fact, people claimed that all needs are more important to themselves than they are to others, but the effect was largest for high-level needs. As in, “I care more about finding meaning in life than other people do, and I also care more about finding lunch than other people do, but the difference is larger for meaning than for lunch.”
Never mind that paying the salaries of an office full of money-police almost certainly costs more than what you would lose to fraud, especially because you also have to hire police to police the police, etc.
“After more than 1000 games played in this country,” its author (who goes by Kommu) writes, “each place still amazes me just as much. This has been the homeland during my long days of confinement. So I have decided to pass on all my knowledge so that you too will no longer be afraid of going to meet the endless steppes.” What a mensch!
The document was originally written in French, and it was translated to English by someone who just thought it would be nice to bring it to more people. What a mensch!
Originally written as erotic Twilight fanfiction
I got the example of selection effects and state average SAT scores from this article.
While time hasn’t proved Watson right, two things are admirable about this quote: he offers a bold hypothesis (rare these days!), and he holds that a person’s race doesn’t dictate their abilities, long before that was a mainstream view.
I am teaching "social preferences" to my (econ) students. One of my first slides has this anecdote from Dawes and Thaler (1988):
In the rural areas around Ithaca it is common for farmers to put some fresh produce on the table by the road. There is a cash box on the table, and customers are expected to put money in the box in return for the vegetables they take. The box has just a small slit, so money can only be put in, not taken out. Also, the box is attached to the table, so no one can (easily) make off with the money. We think that the farmers have just about the right model of human nature. They feel that enough people will volunteer to pay for the fresh corn to make it worthwhile to put it out there. The farmers also know that if it were easy enough to take the money, someone would do so.
I can buy the idea that this is probably true in academic research, which, as you said, is a strong link problem, but this only means lots of waste is probably worth it.
Most of the world isn’t a strong link problem. I go to my job because they pay me. If they didn’t pay me, I’d do something that I personally find more rewarding. If I got paid the same no matter what I did, well, I’d probably make a bunch of music, exercise all the time and try to sell my ideas to other people. This is probably true of most people outside academia: They work for pay and would not do the work if they weren’t being paid. Does this make us “morons” or “cowards?”
Here’s the rub:
> I want to find the people who are willing to do the right thing even when it’s inconvenient, and hand them some money so they can keep doing that.
You want to find people that are doing what YOU think is the right thing, and hand them money so that they will do what YOU think is right.
So you already do believe in incentives, because you’re using them too! Clearly, they work. The people who don’t share your values have made it clear they think what you’re doing is wrong and evil. If they could, I bet they’d stop you. But fortunately for the world, they can’t, because you have the ability to create enough financial incentive to make it worthwhile for people to join you.
I think what you’re doing is the same mistake you put at the top: yeah, some people use incentives to produce bad outcomes, but not you. How is this different than imagining others are more motivated by money than you are?
Maybe you should consider that the people saying how they’d behave in hypothetical situations are actually deluding themselves, and that the reality is their descriptions of other people as being motivated by profit and incentive is more accurate than their own glowing self assessment. After all, do you really think 63% of people actually give blood for free? Or are they just saying that because we all like to imagine ourselves as more virtuous than we truly are? I’ll bet actual blood drives get much closer to 32% participation than 63% participation, because our assessment of others as being motivated by rewards is generally more accurate than our noble assessment of ourselves.