Ulysses Lab: Too Good to Be True?
Did we fool you?
Yesterday we announced the launch of an exciting new initiative, the Ulysses Lab. Endorsed and advised by many of the top behavioral science experts around the world, it promised to get you to your goals with perfect certainty. It seemed almost too good to be true – and it was. In case the outrageous punishments suggested by our tool didn’t give us away, the April 1st launch surely did.
To be painfully clear: We do not recommend you follow the “programs” assigned by the Ulysses Lab. It’s a good example of how NOT to apply behavioral science, and here’s why.
How not to design a commitment device
- You should know what you’re committing to
The Ulysses Lab took some bona fide insights about how pre-commitment can help us reach our goals, but applied them in the realm of the absurd. Publicly committing to behaving a certain way before we are in the throes of temptation can be a very effective way of sticking to our goals. However, in the Ulysses Lab tool, we asked people to commit to the program before they knew what it was. Of course, this would never fly in the real world. Blind pre-commitment not only seems crazy; it is unethical. No one should blindly agree to a commitment contract where it turns out they have to give away their first born child if they miss a gym date (or one of the other outrageous punishments we created, which you can see in Exhibit A).
Thankfully, there are already much more behaviorally sound tools available that leverage commitment and implementation intentions.
- Punishments have to be deserved
When the reason you fail at something is out of your control, punishment is not the correct approach. Punishment you consented to is only deserved if the failure to do what you committed to was actually your fault. For example, if you commit to exercising more often, but break your legs and are told by your doctor you can’t exercise, you shouldn’t be punished for not exercising. In the Ulysses Lab, there was no distinction between deserved and undeserved punishments.
- An intervention should never cause genuine harm
Interventions to change people’s behavior always tread a fine line between the benefit they provide and any potential harm. Worrying about the consequences of failing to stick to our promises can help us achieve what we want, but it should never cause genuine harm. So, for example, we do not recommend following the Ulysses tool’s advice and placing yourself on a Segway running off a cliff, swapping your fingers and toes with each other, or getting tarred and feathered medieval style.
To avoid terrible applications of behavioral science, we need experts
These days, more and more organizations are seeing the power of carefully applying behavioral science to solve their users’ problems. There is an increased demand for behavioral scientists to work both within companies and as consultants. Yet many companies give their employees a popular science book about behavioral science, then expect them to be experts. That usually doesn’t work very well: It’s like having someone watch Grey’s Anatomy and then putting them in the operating room to perform surgery.
Behavioral scientists have a specific set of skills that make them uniquely qualified to ply their trade: they understand the scientific literature, the benefits and constraints of applying it in specific situations, and how to test whether it’s providing the intended benefits. When behavioral science is applied incorrectly, it can backfire. In a recent example, United Airlines tried to replace employee bonuses with a lottery incentive. They quickly had to backtrack when employees became (understandably) enraged, especially after hearing how much money the company would save. There is a time and place for lottery incentives, but this certainly wasn’t it. A seasoned behavioral scientist would have recognized the factors at play, and realized that launching a program like this across a big company such as United without testing it first would be a massive risk, unlikely to end well.
Finally, we would like to thank our partners in crime for making this April Fools’ joke a success – organizations and individuals who work tirelessly to apply behavioral science around the world: Behavioral Science & Policy Association, Ogilvy Change, Busara, Hunting Dynasty, People Science, the Dishonesty Project, Action Design Network, Irrational Labs, BE Works, Joep Lange Institute, Science of People, FehrAdvice, Lemonade, Shapa, Dan Pink, Katy Milkman, Angela Duckworth, Charles Duhigg, Adam Grant, Francesca Gino, Andreas Staub, Michael Norton, David Pizarro, Robert Cialdini, Tim Harford, Paul Bloom, Leslie John, Peter McGraw, Vanessa Van Edwards, Yoel Inbar, Dean Karlan, Jonah Berger, Laurie Santos and Todd Rogers. These are all amazing, talented and fun people, and we hope our next collaboration lasts longer than one day!