The Gap Between Who You Think You Are and What You Actually Do
Self-serving attribution bias, misread Dunning-Kruger data, and behavioral research show that your self-concept and your actual behavior diverge more than you think.
You probably think you're a pretty good person. Honest, fair, competent. Maybe even a little humble. And there's a decent chance that if we looked at what you actually do — not what you intend, but what you do — the picture would be messier than that.
This isn't a "be better" post. It's more of a "you might not be seeing yourself clearly" post. Which is, if anything, worse.
Start with how people explain things when they go right versus wrong. When you ace a presentation, it's because you're sharp and prepared. When you bomb one, it's because the audience was difficult or the tech failed or honestly the whole setup was unfair. Psychologists call this self-serving attribution bias, and it's not a quirk; it's the default. Studies going back to the 1970s show people consistently claim credit for successes and offload blame for failures onto circumstances. You're not doing this consciously. That's sort of the point.
Now layer in how we think about our own abilities. You've probably heard of the Dunning-Kruger effect, usually summarized as "dumb people think they're smart." That's not actually what the research showed. The original 1999 paper by Justin Kruger and David Dunning found that people who performed in the bottom quartile on tests of logical reasoning and grammar overestimated their performance, but so did people in the middle. The top performers, interestingly, slightly underestimated themselves. What the study really showed is that people are generally bad at knowing where they stand, and that specific skill deficits make self-assessment harder, because competence and the ability to recognize competence often come from the same place. The pop-culture version, "idiots have no idea," flattens something that's actually more universal and more unsettling.
So you've got a built-in tendency to credit yourself for good outcomes, deflect blame for bad ones, and assess your own abilities with less accuracy than you'd probably like to admit. Fine. But does this actually show up in behavior?
Yeah, it does, pretty consistently. Studies on what's called the "intention-behavior gap" show that people overpredict how often they'll follow through on things like exercising, donating, and helping strangers, compared to what they actually do. A review by Sheeran and Webb (2016) in Social and Personality Psychology Compass found this gap persists even when people are highly motivated and have specific plans. We're not just failing to follow through on vague goals. We're failing on things we genuinely want to do and explicitly said we would.
This extends to ethics, too. Research on "moral licensing" suggests that doing something good, such as volunteering or recycling, makes people measurably more likely to behave selfishly shortly afterward. The self-concept stays intact. The behavior drifts.
Put it together and you get a picture where the internal story ("I'm honest, I try hard, I care about people") and the behavioral record can diverge pretty significantly, not because you're lying to yourself exactly, but because the mechanisms that build self-concept are just not that tightly wired to the ones that govern what you actually do.
Which is a strange thing to sit with. Your self-image isn't a mirror. It's more like a painting, made by you, of you, in the most flattering available light.
The data suggests most of us are looking at a pretty generous self-portrait.