Avoiding Confirmation Bias When Implementing Evidence-Based Instructional Practices
The phenomenon of confirmation bias has received increased attention in the current era of partisan politics. Defined as “the tendency to process information by looking for or interpreting information that is consistent with one’s existing beliefs,” the term has been used to describe the resistance of individuals on one side of the political spectrum to the arguments of the other. We recently found ourselves reflecting on a similar tendency, which we observed in our work introducing new instructional practices in schools. Of course, we don’t suggest that we encountered resistance per se, but rather an inclination to fit what is new into what is familiar.
The project, Cultivating Excellence in English Learner Instruction (CEELI), was designed to bring evidence-based practices for improving English-learner literacy into classroom routines in five medium-sized urban or suburban school districts. Introduced to relevant research by scholars and experts at an initial convening, participating district teams then identified a specific instructional strategy, based on the research, that they wanted to tackle. Districts worked together as a networked improvement community (NIC) to implement their strategies, gathering virtually on a monthly basis over a six-month period. The work of the NIC was intentionally classroom-based. The goal was to see how research actually made its way into classroom practice versus how it was designed to do so.
NIC participants were open-minded and enthusiastic about the research findings to which they were exposed. And they were impressive when it came to designing plans that would capitalize on the research in their unique contexts. Over time, however, we began to notice that the interventions often got lost, or were at least significantly diluted, amid participants’ natural instinct to incorporate them into practices they already knew and were using.
StandardsWork served as facilitator of the NIC and supported teams throughout their six-month series of plan-do-analyze-reflect (PDAR) cycles. Given that most of the participating districts were implementing high-quality, comprehensive English language arts curricula, it was natural for these PDAR cycles to become increasingly nested in the curriculum, a development we considered desirable. It made sense to us that participants would best understand the research findings in relation to their current context, pedagogical routines, and past training, which is to say, through their confirmation bias.
What we were unprepared for was the distraction that resulted from this bias. In fact, we have concluded that there is a real tension between “intentionality of intervention” and understanding research findings in relation to one’s confirmation bias, and that there is significant risk that research-based interventions may suffer under this tension.
We noted almost from the outset of the NIC’s collaboration the tendency of participants to describe their practices in terms inherited from previous professional training that was not at all aligned with the intervention being implemented (and could even be unhelpful to it). For example, one district seemed preoccupied with learning styles, e.g., the importance of kinesthetic learning and “total physical response,” which had nothing to do with their action plan or the PDAR cycles they were implementing. Another district, focused on increasing expressive language, layered an unrelated step-by-step vocabulary routine, carried over from a past intervention that wasn’t entirely research-based, onto their new CEELI-inspired intervention, “dialogic reasoning.” A third district intended to increase the use of academic vocabulary from rich, grade-level text, yet we discovered that, in practice, they were relying on language objectives as their main intervention. While language objectives are a common practice to support English learners, they did not represent new learning from the collaborative, and they pulled away from a refined focus on iterative learning around the CEELI-inspired intervention for building academic language.
These kinds of conflations became one of the most notable features of the project, something we were, frankly, ambivalent about throughout. On one hand, efforts to incorporate the intervention into existing practices displayed an encouraging sense of ownership of the strategy being implemented. On the other hand, the inability to isolate the intervention as a discrete repertoire of teacher and student behaviors often rendered the PDAR cycle meaningless. Too much was going on for practitioners or researchers to discern how to productively refine planning and delivery in the next cycle of implementation.
The tendency of educators, indeed all of us, to engage in “confirmation bias”—to want to fit what is new into what we know and believe—is often at odds with learning new practices. How do we avoid diluting the impact of newly acquired evidence-based strategies through our natural instinct to make them fit into what we know and have always done? The lesson StandardsWork learned in its role as NIC facilitator, which we think has implications for others implementing new evidence-based practices, is to design training and support activities that deliberately emphasize intentionality of intervention. That is, if we hope to train practitioners to use new interventions with fidelity, it may be necessary to first crisply and explicitly differentiate between previous and new instructional practices.
Through interviews with practitioners prior to training, trainers can elucidate the contours of current practices, the vocabulary used to describe them, and the origins of both in past episodes of professional development. They can then incorporate into their training role-plays and other concrete demonstrations of those practices that draw explicit comparisons and contrasts with the new instructional regimen that is the topic of training. With an increased capacity for self-awareness about where past practice ends and new practice begins, practitioners should be better able to engage in the iterative learning necessary to implement meaningful PDAR cycles and thereby improve teacher and student performance.