Cat Hicks

The Uncertainty

I recently had reason to remember and re-read this important piece: Use caution when applying behavioural science to policy.

I also had reason (recent, ancient, and in perpetuum) to reflect on what it means to give recommendations from and about science. As we continue to put out more work in my lab, I reflect often on the pressure that any scientific work receives to generate recommendations and actions regardless of what stage you (the scientist) think it’s at. We write papers knowing that many people will only read paragraphs. We write toolkits knowing we need to select only the strongest and most robust findings and keep them simple, so people aren’t too bogged down to try something. We write bullet points knowing many people will only read someone else summarizing their glance at our bullet points. This is the telephone game of the world, and we’re all playing it.

Where, then, do we put the caveats and the cautions and the confidence ratings? The multidimensional reflections on our own work?

It is a hard thing and I don’t think we talk about it enough. The most successful thing that can happen for science is that it gets used: so many of us in science have struggled against a world that won’t hear us. The world needs evidence desperately, particularly in situations where the primary driver of decisions is not “different evidence from mine,” but no evidence at all. Yet if we’re wrong, the worst failures that can happen for science are when it gets used. We scientists are in this world we’re trying to change. We’re in the mess of it. Not all of our evidence is ready, and epistemic humility has not particularly been the chosen defining trait of most branches of science.

The Use Caution paper argues, so well, that we need to develop practices, like conceiving of evidence for interventions as existing somewhere on an evidence ladder. They write: “the assessment of how ‘ready’ the intervention is must be included when persuading decision-makers to apply social and behavioural science evidence, particularly in crisis situations when lives are at stake and resources are limited. Not doing so can have disastrous consequences.”

Amen. I study beliefs, so a friction point that always catches my attention is this: why do we struggle to conceive of nuance to scientific evidence? What belief about science holds us back from this beautiful vision of evidence nuance?

A way to name this kept floating to the forefront as I turned over many of my own recent experiences assessing evidence, my own decisions about what type of claim we could make with what type of evidence, my own struggles with a software industry milieu on social media that frequently acts out a fanatical position as its most efficient fuel for the ravenous engine of engagement. I call resistance to all that The Uncertainty. 


People frequently believe that science is about the opposite of uncertainty: they think it’s about proving things. 

Headlines do us a disservice. Scientists are always “discovering” and “finding” and “proving,” even in cases where you are able to find the paper (journalists never link it, lord almighty) behind the article and realize two-thirds of it is spent describing how we already knew this thing but we hadn’t described a particular nuance of it. Scientists are also always “testing” and that testing is always in a “lab,” even though observational and descriptive science is a massive endeavor, and usually more useful to the practitioners trying to find an entry point into science.

It is difficult but important to realize that a core part of the logic of doing science is not that we are revealing the entire truth. Believing this is made even more difficult because large areas of science do indeed like to think that’s what they’re doing. 

In the introduction to the glorious The Effect, Nick Huntington-Klein writes: “How does the world work? That’s the big question, isn’t it? And it’s one for which our answers will always be ever so slightly incomplete. That incompleteness is a curse, but also a blessing.” And near the finish (as someone who likes to write long-form fiction, I always check that the end is as good as the beginning–beginnings get workshopped way more than endings but endings are what matter): “If you’ve taken the time to sit down and actually try to design an analysis to answer a research question (as I certainly hope you have by this point), you’ll very quickly come to a realization about how many choices you have to make in that process, and how many of those choices you’re unsure about.”

Ah my friend. The Uncertainty.

But here’s the thing. The surprise, the gift, the really cool trick. Pinning all your credibility, expertise and power to the premise that you will never be wrong does a lot of things–in this world it might get you headlines and panel spots and letters after your name–but it also does one thing in particular. It makes you fragile.

If your position in the world, your expertise and your work, relies on you never being wrong? What on earth will happen when you are.


Antifragile science, by contrast, owns the Uncertainty. It maps it. I see the Uncertainty in confidence ratings and validity checks and ecological validity questions. I see it in the development of statistical methods that give us a field of most likely answers over single estimates. I see it when we tip over our own hypotheses (in one of our latest studies, our pre-registration means you can see I hypothesized several relationships we did not find). I see it when scientists welcome collaborators who come to them with it didn’t work for me and they say how cool let’s test it. 

I do think that you lose things when you do this: panels and mentions on Substack and sometimes, the support of people who valued you only as much as they thought you were going to provide them with neatly-stacked findings, evenly weighted, for all time.

But antifragile, process-based, Uncertainty-honoring science also frees you. It frees you from the need to censor down what you really think until it is a butterfly on a pin, a precision you don’t have. It lets you stop at a higher level: it’s in this realm, it’s in this direction, I welcome your help getting more precise. This was what we got. And often, I think I was wrong, this is a little more right, or at least a little less wrong. 

Less-wrong, Uncertainty-honoring science acknowledges that our problems in the real world are more multidimensional than we would like. We learn to think of findings not as equally weighted static entities but as differently weighted, differently evidenced forms of knowing. 

Plus, there’s human stuff in here (humanity inside MY science?! It’s more likely than you think). The Uncertainty lets you be tired. Because you were always honest about the possibility that in the future your model will be more wrong than the next model, they lose the ability to pin you, like a butterfly, when it all goes wrong: the scientists didn’t warn us!

(The scientists are warning you, all the time, when we say I can’t answer that and stuff is hard and it’s complicated and no thanks to that media interview and we drag ourselves out of our labs to ruefully try to put years of work into words knowing it’s a coin flip if anyone passes them on accurately. We’re warning you when we quit, just like anybody else in any other job; we’re warning you when we stop teaching or are too tired or too scared to teach anymore; we’re warning you when we can’t get jobs to do the work that would answer the questions you keep asking, because you don’t want to pay for it.)

In the inoculation theory for combating misinformation, it is emphasized that perhaps you can teach people a weakened form of the existing bad arguments and that this will teach people to expect the tilt of misinformation, the bad arguments that will fly at them in harder, harsher, more persuasive situations. In this paper on the drivers of misinformation, they write: “Prebunking seeks to help people recognize and resist subsequently encountered misinformation, even if it is novel.” Getting people comfortable with The Uncertainty before you need to get people to see it in a real goddamn big-time decision might be a version of this.

 It seems to me that antifragile science requires not only that we accept the possibility of wrongness, but the inevitability of it. This is a difficult purifying fire to move through. This is scary. A world reduced to contest cultures will punish us for this, because when our opponents are more wrong but more certain, they will look more expert. 

But the trick remains. Because antifragile science says sure. Ok. You won. But what if we were not opponents at all. As Ana Hevesi taught me in our recent collaboration on community design and community-based research together, what if some of us are playing a different game altogether.


The Uncertainty frees you in that it lets things not just be wrong or right but be genuinely true sometimes and genuinely not true other times. That’s right – the Uncertainty exists over time, not just between things. This is infuriatingly rad. Truly, being a scientist is a mindfuck. But move through it and your allegiance can be to the world you’re studying instead of the effects you try for in it. In their ‘Seed and Soil’ paper on psychological affordances, Walton & Yeager wrote: “We must study the world as it comes to us. In extending an intervention to diverse populations, especially to those most afflicted by a problem, we must ask in what kinds of contexts an intervention will be effective and in what kinds of contexts it will not. This means contending with the complicated root causes that made the problem appear in the first place; sometimes these may be remedied through psychological intervention and sometimes not.”

You get to breathe better when you do this. I have found it frees you from the crushing down of your scientist role into merely an automaton of opinion, like a factory line robot that could be moved around the floor but can only regurgitate the same motions over and over again. Psychology is not a lift-and-shift discipline. 

And yet also, acknowledging that the getting-less-wrong is central to science can paradoxically free us to act with fierce confidence when we need to, to make large lateral strides of progress by moving insight from one domain to another. By better getting to know our uncertainty we can see where it is minute enough on a real issue that it hardly matters and powerful forces are using demands for precision to delay the cure, to kill more people – as in the famous work that finally freed us to acknowledge the immense human toll of smoking on human health.


The jitter of the world is what lets us do science on it in the first place. If nothing ever varied, nothing could be compared. Without variation, there would be no differences between which we could rappel in our quest for cause, no small changes to make us believe that big change might be possible. The statistics that we use for quantitative social science are deeply and profoundly rooted in the Uncertainty. 

Variation (a form of The Uncertainty) is not only an incidental and bothersome thing we need to deal with in our research and our statistics; it is also often a characteristic of the thing itself. In this paper on commitment to purpose, they write: “constructs that have traditionally been studied as stable dispositions also vary meaningfully within individuals.” Varying meaningfully is key to grokking where progress over time comes from in the first place: “people show substantial variability from day to day in their ability to craft goals to be more meaningful and when successful, these behaviors are linked to greater task engagement.”

Many business translations of applied psychology seem to argue that if only we can put people into a single state (satisfaction, motivation), they will stay there and everything will be good for all time. “Up and to the right.” Against this, the rigorous social science of variation looks positively shaky – yes, this effect exists but it varies day to day. Yes this thing emerges but we don’t know enough to intervene on it. But it’s less wrong. Things can be important without being stable. Things can be interesting but not good intervention targets. Things can work here and not there. Things can be strongly felt and wrong. 

This balance beam, this very thin line where on one side you are speaking to scientists and criticizing methods and on the other side you are speaking to leaders and authorities and organizations and trying to encourage them to trust (good) science, this is a really hard fucking balance beam and I’m not always a good gymnast. But I try to remember (we’re not opponents) that social science in the end is not about what you deserve or get or even experience, it is about what the people in your research experience and whether the world hears it. 

I have a conviction that antifragile science will get us there, but perhaps not without embarrassment. You have to face down the Uncertainty, and look those opponents in the eye, and tell them squarely that you will not accept a world in which you are opponents, and you’re going to tell them about the fact that you might be wrong and more than that, you have to faceplant directly into the wrongness and describe it and characterize it and keep it close. And you will therefore have many losses but your losses will be honest. 

That is power, maybe. A very different kind.


Here’s something that helps me personally with all this. You can think of social science as being about sampling people, but you can also think of social science as being the experimental moment of gathering a measurement with (not from) a person, as a collaboration with people to sample the situation. This conviction has made it harder to cheerfully brand myself as a People Scientist! With all the Answers! in tech, but it makes me happy. It keeps me happy. The Uncertainty is hard but does it have to be suffering? Not if you learn to like it. I’d rather be a Situation Scientist, an Antifragile Scientist, a scientist who welcomes the Uncertainty. As Rumi says: This being human is a guesthouse, every morning a new arrival.

Perhaps the best thing we can do as scientists is to believe that our ideal state is not knowledge, but Uncertainty. It will be there whether we like it or not; we may as well be friends.

