Humans are remarkably good at adapting to change. We find ways to live in just about every climate on Earth, and circumstances that seem tough at first can quickly begin to feel normal. Underlying these adaptive skills is a flexibility in how we see and understand the world around us.
Our intuition tells us that our eyes are merely a window onto reality. But our brains actually interpret input from our senses to build practical impressions of reality, rather like an impressionist painter depicting a scene in a distorted but charming way.
This means that two people can have very different perceived realities, and the same person’s reality can change over time. One clue that your brain uses to determine how that reality should change is background abundance.

🥅 Abundance shifts your goalposts
In 2018, a paper published in Science showed just how malleable human perception is. The researchers were interested in measuring “creep” in mental concepts. When commercial products expand beyond their original scope, we call it “feature creep”, and when military aims expand beyond their initial intentions, we call it “mission creep”. In this paper, they were investigating what you might call “perceptual creep”.
In their first study, the researchers asked people to look at colored dots on a computer screen and decide whether they were blue or purple. Each person saw 1000 dots, each presented individually. The color of each dot was randomly sampled from the spectrum I’ve created below. Everything on the left side of the midpoint dot was considered objectively purple while everything on the right side was considered objectively blue.
You might place the boundary between these colors somewhere other than the objective midpoint of the spectrum, but subjective differences between people didn’t matter here. The researchers wanted to test how overall color perception across a group of people would change depending on context.
For the first 200 dots that everyone saw, 50% came from the purple side and 50% came from the blue side of the spectrum. But for half of the people in the study (the experimental group), things changed for the remaining 800 dots—they gradually saw fewer and fewer blue dots, with the final 650 dots having only a 6% chance of appearing from the blue side of the spectrum. Almost everything they saw was purple.
For the control group who saw blue vs purple at a 50/50 rate throughout the experiment, responses didn’t change for the first 200 vs the last 200 dots. They defined “blue” and “purple” at the same rate for the duration of the task.
But the experimental group showed a shift in their perception over time. As blue dots became rarer in the experiment, people started calling more purple dots “blue”.
It didn’t matter whether people knew that the colors of the dots would change over time. When people were told that the probability of seeing blue dots would “definitely decrease” later in the experiment, they still reported more purple dots as “blue”. In fact, even when the researchers explicitly instructed people to “be consistent” in their concept of “blue” and offered monetary bonuses, people still couldn’t help but adjust their definition of “blue” as the background abundance of blue vs purple changed.
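To make the design concrete, here’s a minimal toy simulation in Python (my own sketch, not the researchers’ code or model). It follows the trial schedule described above and adds a deliberately simple observer that judges each dot relative to the hues it has seen recently; the intermediate ramp between trials 200 and 350 and the drifting boundary are illustrative assumptions. Running it, the observer starts calling more objectively purple dots “blue” once blue dots become rare, while the stable-prevalence run barely shifts.

```python
import random

# Toy sketch of the dot experiment: 1000 trials, the first 200 with a 50%
# chance of a "blue" dot, and (in the decreasing-prevalence condition) the
# final 650 trials with only a 6% chance. Hues run from 0 (purple) to 1
# (blue), with 0.5 as the objective midpoint.

def blue_probability(trial, decreasing=True):
    """Chance that the dot on this trial comes from the blue half."""
    if not decreasing or trial < 200:
        return 0.5
    if trial < 350:
        # Assumed linear ramp between trials 200 and 350 (illustrative only).
        return 0.5 - 0.44 * (trial - 200) / 150
    return 0.06

def sample_hue(trial, decreasing=True):
    if random.random() < blue_probability(trial, decreasing):
        return random.uniform(0.5, 1.0)   # objectively blue
    return random.uniform(0.0, 0.5)       # objectively purple

def run(decreasing=True, n_trials=1000, window=60):
    recent = []   # hues seen recently; these drive the drifting boundary
    calls = []    # (hue, called_blue) for each trial
    for t in range(n_trials):
        hue = sample_hue(t, decreasing)
        # Toy adaptive criterion: halfway between the recent average hue
        # and the objective midpoint, so rare blues pull the boundary down.
        boundary = 0.5 if not recent else (0.5 + sum(recent) / len(recent)) / 2
        calls.append((hue, hue > boundary))
        recent = (recent + [hue])[-window:]
    return calls

def blue_rate_for_purple_dots(calls):
    """Share of objectively purple dots that the observer called blue."""
    purple = [(h, b) for h, b in calls if h < 0.5]
    return sum(b for _, b in purple) / len(purple)

random.seed(1)
for label, dec in [("stable prevalence", False), ("decreasing prevalence", True)]:
    calls = run(decreasing=dec)
    first, last = calls[:200], calls[-200:]
    print(label,
          "| purple dots called blue: first 200 =",
          round(blue_rate_for_purple_dots(first), 2),
          "last 200 =",
          round(blue_rate_for_purple_dots(last), 2))
```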
So the researchers had found perceptual creep in how people viewed and defined the world around them. But they wanted to explore whether it would also apply to more consequential scenarios. After all, who cares if you occasionally call a purple dot “blue”?
They created two final versions of the same experiment. In the first, instead of showing colored dots, they presented people with images of human faces that varied in how threatening they looked and asked whether each face seemed threatening or not. When threatening faces became rare, people started detecting more threat in non-threatening faces.
In the final version of the experiment, the researchers asked people to play the role of a university ethics board that audits study proposals to decide whether they should go ahead or not. If a study is deemed unethical, it can’t be approved.
People saw 240 study proposals that varied in how ethical they were. At the most ethical end of the scale, proposals would say something like “Participants will make a list of the cities they would most like to visit around the world…”. At the most unethical end, proposals would say something like “Participants will be asked to lick a frozen piece of human fecal matter…”.
Similar to the previous studies, the task was designed to show fewer and fewer unethical proposals over time. As unethical proposals became rarer, people started rejecting more ethical proposals. People reported ethical violations where they previously saw none, purely because the background frequency of ethical vs unethical proposals shifted.
Perceptual creep could have serious consequences in the real world. Apart from making life more difficult for academics trying to get their studies approved, the authors of the research above suggest that low violent crime rates could bias police officers to expand their definitions of “assault”. Similarly, improving societal health could mean that radiologists are increasingly likely to detect tumors in healthy medical images. Some types of decision-making are important enough that we need to keep a close eye on drifting standards.
The most direct takeaway from the ethics board experiment is that perceptual creep can change what we view as ethical vs unethical. This flexibility is great when it helps us notice that some historically accepted actions were evil all along. But in more immediate time frames, it’s possible that rapid change misleads our morality instincts.
The researchers don’t adopt a particular moral or political stance in their paper: “We take no position on whether these expansions [from perceptual creep] are good or bad. Rather, we seek to understand what makes them happen”.
Perceptual creep is likely to be useful in some situations and dysfunctional in others. The only obvious conclusion is that we need to start paying more attention to it.
⭐️ Takeaway tips
Your perception creeps: Subjective realities change dramatically depending on context. Avoid overconfidence when it comes to your personal views, because your view could be totally different tomorrow!
Notice and appreciate progress: It’s easy to lose track of progress—both personal and societal progress—as we shift our goalposts over time. A constant desire to improve is a good thing, but it often masks our view of how far we’ve come, which is an important part of learning from mistakes. As immoral behavior becomes less common, we’re likely to throw more ethical actions into the unethical pile.
Life is filled with ambiguity: Decision-making is tough in the noisy world we live in, and tradeoffs are common. To use another ethics example, someone who calibrates their brain to be as sensitive as possible to ethical violations will rarely miss immoral behavior, but they will often falsely condemn innocent people. And someone who calibrates their brain to avoid condemning the innocent at all costs will rarely make a false accusation, but they’ll often miss real ethical violations. If you improve your awareness of the ambiguities and inherent tradeoffs in the world, you can strike a more informed balance in how you make decisions.
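To put rough numbers on that last tradeoff, here’s a small toy simulation (my own illustration, nothing from the paper): each case carries noisy “evidence” of a violation, and two judges simply apply different thresholds to that evidence. The 10% violation rate, the noise level, and the thresholds are arbitrary assumptions; the point is only that lowering one kind of error raises the other.

```python
import random

# Toy tradeoff: cases are either real violations or innocent, and the
# observable "evidence" is the truth plus noise. Each judge condemns a
# case whenever the evidence crosses their personal threshold.

random.seed(2)
cases = []
for _ in range(10_000):
    violation = random.random() < 0.1                       # 10% are real violations
    evidence = (1.0 if violation else 0.0) + random.gauss(0, 0.8)
    cases.append((violation, evidence))

def judge(threshold):
    """Return (miss rate among violations, false-accusation rate among the innocent)."""
    misses = sum(1 for v, e in cases if v and e < threshold)
    false_accusations = sum(1 for v, e in cases if not v and e >= threshold)
    total_violations = sum(v for v, _ in cases)
    total_innocent = len(cases) - total_violations
    return misses / total_violations, false_accusations / total_innocent

for name, threshold in [("sensitive judge", 0.2), ("cautious judge", 1.3)]:
    miss_rate, false_rate = judge(threshold)
    print(f"{name}: misses {miss_rate:.0%} of violations, "
          f"falsely condemns {false_rate:.0%} of the innocent")
```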
💡 A final quote
“Normally, we take our reference frame for granted; we mistake it for ‘reality’.”
~ K. C. Cole, The Universe and the Teacup (1998)
❤️ If you enjoyed this, please share it with a few friends. If you’re new here, sign up below or visit erman.substack.com
📬 I love to hear from readers. Reach out any time with comments or questions.
👋 Until next time,
Erman Misirlisoy, PhD