The Promise

Over the past two decades, growth mindset theory has become one of the most influential ideas in education. The core claim is straightforward: students who believe their abilities can be developed through effort — those with a “growth mindset” — outperform students who believe their abilities are fixed. From this premise, researchers and educators developed interventions designed to shift students toward a growth mindset, promising significant gains in academic achievement (Burnette et al., 2022).

The appeal is obvious. If a relatively simple psychological intervention could meaningfully improve student outcomes, the implications for schools and policy would be enormous. Growth mindset programs spread rapidly through classrooms, school districts, and education systems worldwide.

But how strong is the evidence behind them?

The Study

Burnette and colleagues (2022) set out to answer that question systematically — something that, despite the interventions’ widespread adoption, had not yet been done rigorously. Their review involved two parallel efforts: first, evaluating whether existing studies met basic standards for drawing causal conclusions; and second, conducting three separate meta-analyses across 63 studies involving 97,672 students.

What They Found

The results were sobering on multiple fronts.

Study quality was poor across the board. When the authors evaluated empirical studies against a set of best practices essential for causal inference, they found major shortcomings in study design, analysis, and reporting throughout the literature (Burnette et al., 2022). These are not minor technical quibbles — they are the kinds of flaws that can make an ineffective intervention appear to work.

There were strong signs of bias. Researchers with a financial incentive to report positive findings — those with commercial or professional stakes in the interventions — published significantly larger effects than researchers without such incentives (Burnette et al., 2022). This is a serious red flag. It suggests the published literature may not reflect what the interventions actually do, but rather who is doing the studying.

The overall effect was tiny and fragile. Across all 63 studies, the pooled effect size was d = 0.05 — a figure so small as to be practically negligible (by Cohen's widely used benchmarks, even a "small" effect is d = 0.2). Crucially, this already-small effect became nonsignificant after correcting for potential publication bias (Burnette et al., 2022). In other words, once you account for the likelihood that negative findings went unpublished, even that modest average effect may not be real.
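One way to build intuition for how small d = 0.05 is: convert it to a "probability of superiority," the chance that a randomly chosen treated student outscores a randomly chosen control student. The sketch below assumes normally distributed outcomes with equal variances (a standard textbook simplification, not part of the authors' analysis):

```python
import math

def prob_superiority(d: float) -> float:
    """Probability that a random treated-group score exceeds a random
    control-group score, given Cohen's d.
    Under the normal, equal-variance model: P = Phi(d / sqrt(2)),
    where Phi is the standard normal CDF."""
    z = d / math.sqrt(2)
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# At d = 0.05, the "treated" student wins barely more than a coin flip.
print(round(prob_superiority(0.05), 3))  # ~0.514
# For comparison, a conventionally "small" effect of d = 0.2:
print(round(prob_superiority(0.20), 3))  # ~0.556
```

Put differently, an intervention with d = 0.05 moves the odds of beating a control-group peer from 50% to about 51.4%, before any correction for publication bias.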

Moderators did not hold up. The theory predicts that interventions should work better for certain groups of students — those under greater academic stress, for instance, or those from disadvantaged backgrounds. None of these theoretically meaningful moderators reached significance (Burnette et al., 2022), meaning the data offered no clear picture of for whom, if anyone, these interventions reliably work.

Drilling Down: Higher Standards, Smaller Effects

The authors did not stop at the full sample. They conducted two additional meta-analyses applying progressively stricter criteria — and the effects shrank further at each step.

When they examined only the 13 studies that could actually demonstrate the intervention had shifted students’ mindsets as intended (the basic mechanism the theory requires), the effect was nonsignificant: d = 0.04 (Burnette et al., 2022). An intervention that does not change the mindset it targets cannot reasonably claim to work through mindset change.

When they restricted the analysis further to the six highest-quality studies — those best positioned to support causal conclusions — the effect was again nonsignificant: d = 0.02, with a confidence interval ranging from -0.06 to 0.10 (Burnette et al., 2022). The best evidence, in other words, points to essentially no effect.
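The logic of "pooled effect with a confidence interval that straddles zero" can be sketched with a toy inverse-variance (fixed-effect) meta-analysis. The study values below are invented for illustration only; they are not the six studies from the paper:

```python
import math

def pooled_effect(effects, std_errors):
    """Fixed-effect inverse-variance pooling: weight each study's d
    by 1/SE^2, then report the pooled d with a 95% confidence interval."""
    weights = [1 / se**2 for se in std_errors]
    d_pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se_pooled = 1 / math.sqrt(sum(weights))
    return (d_pooled,
            d_pooled - 1.96 * se_pooled,
            d_pooled + 1.96 * se_pooled)

# Hypothetical studies: small effects of mixed sign, modest precision.
d, lo, hi = pooled_effect([0.10, 0.02, -0.03], [0.05, 0.04, 0.06])
print(f"d = {d:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
# The interval straddles zero, so the pooled effect is nonsignificant —
# the same situation as the paper's interval of -0.06 to 0.10.
```

A confidence interval that contains zero means the data are compatible with no effect at all, which is exactly what the authors report for their highest-quality subset.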

The Conclusion

Burnette and colleagues (2022) are direct about what this means: the apparent benefits of growth mindset interventions on academic achievement are likely attributable to inadequate study design, reporting flaws, and bias — not to genuine effects of the interventions themselves.

This is not a claim that mindset is unimportant, or that beliefs about ability are irrelevant to learning. It is a claim about the interventions built on that theory, and about the quality of the evidence used to support them. The two are not the same thing, and conflating them has allowed a popular idea to outrun the data behind it.

Reference

Burnette, J. L., Billingsley, J., Banks, G. C., Knouse, L. E., Hoyt, C. L., Pollack, J. M., & Simon, S. (2022). A systematic review and meta-analysis of growth mindset interventions: For whom, how, and why might such interventions work? Psychological Bulletin. Advance online publication. https://doi.org/10.1037/bul0000352