Digest, Issue 8: Winter 2022-23

Building Evidence Systems to Integrate Implementation Research and Practice in Education

As inequities in education persist and expand, so does the urgency of improving practice, along with demands for better research use (ESSA, 2015). As a field, we need to improve not just the quality of research evidence (Ming & Goldenberg, 2021), but also the quality of research use and the processes supporting effective implementation of evidence-based practices. Whether framed as a problem of practice or of research use, understanding implementation is critical for identifying and redressing the inequitable distribution of resources, facilitators, and barriers in our systems, which gives rise to disparities in student outcomes.

Applying implementation science to education

Analyzing education through the lens of implementation science illustrates its potential for addressing fundamental challenges in the field (Century & Cassata, 2016). Learning emerges from rapidly changing experiences and high-intensity social interactions, which requires instruction to adapt continually. These adaptations vary in content and impact, possibly improving quality or undermining fidelity (Hill & Erickson, 2020; LeMahieu, 2011; O’Donnell, 2008; Penuel & Means, 2004). The complex social dynamics of classrooms and schools further compound this variation: Teaching is “dense with discretionary spaces,” in which educators make hundreds of micro-decisions daily (Ball, 2018). These decisions are mediated by how educators interpret them, underscoring the importance of alignment between educators’ conceptual understanding and the theory underlying a practice in order to avoid misconceptions, biases, and inequities (Cohen, 1990; Jones & Wiliam, 2022).

Understanding and managing this variation requires moving one step back in the causal chain, as heterogeneity and adaptations occur not just at the level of “what works” for students (educational practices), but also at the level of “how to make it work” by educators and school systems (implementation strategies). Alongside staff quality and capacity, school success depends on other factors such as instructional systems, school climate, and community partnerships (Bryk et al., 2010). These are further shaped by organizational models (Rowan & Miller, 2007), governance (Mitra, 2022), finance (Baker et al., 2014; Lafortune et al., 2018), and myriad policies around standards (Darling-Hammond, 2004), certification (Cochran-Smith et al., 2015), data use (Marsh et al., 2006), school turnaround (Redding & Nguyen, 2020), and other factors. These strategies may be provided in idiosyncratic combinations yet evaluated in isolation (or vice versa), complicating efforts to clarify which strategies are more effective under which conditions. Without a coherent understanding of the interplay across these multilevel strategies, we risk leaving educational improvement up to chance.

Implications

Applying implementation frameworks to improve education research and practice requires a shift in how we conceptualize, produce, and use evidence. These shifts span systems for data collection, evidence generation, and knowledge sharing.

First, we need more evidence that extends beyond the adoption phase to encompass the implementation phase (Aarons et al., 2011; Moullin et al., 2015). This applies both to guiding intended implementation and to documenting actual implementation. Studies demonstrating evidence of impact may not include sufficiently detailed evidence about implementation, especially to support adaptation (Ming & Goldenberg, 2021). Reporting guidelines can establish minimum standards for both synthesizing and applying the research (e.g., Hoffmann et al., 2014; Michie et al., 2009; Phillips et al., 2016). More substantively, educators need to unpack the “black box” of programs to identify which components are essential (Chorpita et al., 2005; McLeod et al., 2017) and which adaptations are warranted (Chambers et al., 2013; Stirman et al., 2019). This requires greater clarity around the theories guiding implementation (Nilsen, 2015), as the transferability of knowledge across settings depends on coherent ontologies for structuring that knowledge (NASEM, 2022b). As educators perpetually adapt practices to fit their local contexts, tracing these adaptations over time and capturing their rationales offer a valuable stream of evidence to enrich our understanding of what works and why.
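To make adaptation-tracing concrete, the sketch below shows one possible shape for a structured adaptation record. It is a minimal illustration in Python: the field names loosely echo the reporting dimensions in adaptation frameworks such as FRAME (Stirman et al., 2019), but the schema and example values are invented here, not drawn from a published standard.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AdaptationRecord:
    """One educator-reported modification to an evidence-based practice.

    Field names are illustrative only, loosely inspired by the kinds of
    dimensions adaptation frameworks (e.g., FRAME) ask reporters to
    document; this is not a published schema.
    """
    practice: str          # the practice being adapted
    component: str         # which component was modified
    what_changed: str      # description of the modification
    who_decided: str       # e.g., "teacher", "grade team", "district"
    rationale: str         # why the change was made
    planned: bool          # proactive adaptation vs. reactive drift
    core_component: bool   # does it touch a component deemed essential?
    recorded_on: date = field(default_factory=date.today)

# Example: a teacher documents shortening a routine to fit the bell schedule.
record = AdaptationRecord(
    practice="peer-assisted reading",
    component="partner practice time",
    what_changed="reduced from 20 to 12 minutes",
    who_decided="teacher",
    rationale="shortened periods on assembly days",
    planned=True,
    core_component=True,
)
print(record)
```

Even a lightweight record like this, accumulated across classrooms, would let researchers and educators look for patterns in which components get modified, by whom, and why.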

Second, we need better evidence about the relationships between the factors that influence successful implementation and the strategies that support it. Since adoption and implementation decisions are often made by different people, working from different information and facing different demands, locating and bridging those gaps is essential. Implementation science offers frameworks for examining interventions that shape individual-level behaviors (Michie et al., 2011); taxonomies of strategies for implementing systems-level change (Powell et al., 2012, 2015; Waltz et al., 2015); and protocols for mapping effective strategies to specific implementation determinants (Fernandez et al., 2019; Kok et al., 2015). Examples include training, facilitation, audit and feedback, infrastructure, incentives, and regulation, all of which have close analogues in education. Some strategies may target the individual-level knowledge, attitudes, and skills of implementers, while others may target organizational-level culture, communication, or information systems. Applying implementation science offers opportunities to build coherence and clearer connections across such diverse literatures as workforce development, standards and accountability, curriculum design, resource allocation, and school reform. Bridging these fields can yield better guidance for tailoring strategies across multiple levels of the system to support implementation.
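As a toy illustration of what mapping determinants to strategies might look like, the sketch below pairs hypothetical implementation barriers with candidate strategies. The categories and pairings are invented for illustration, in the spirit of implementation mapping (Fernandez et al., 2019) and strategy taxonomies (Powell et al., 2015); they are not drawn from any published protocol.

```python
# Hypothetical lookup pairing reported barriers with candidate strategies.
# In practice, such a map would be built and vetted locally against
# evidence and context; these entries are invented for illustration.
CANDIDATE_STRATEGIES = {
    "limited staff knowledge": ["training workshops", "ongoing coaching"],
    "low buy-in": ["local opinion leaders", "stakeholder feedback sessions"],
    "weak data routines": ["audit and feedback", "shared dashboards"],
    "resource constraints": ["reallocated planning time", "incentives"],
}

def suggest_strategies(barriers: list[str]) -> dict[str, list[str]]:
    """Return candidate strategies for each reported barrier."""
    return {
        barrier: CANDIDATE_STRATEGIES.get(barrier, ["needs local diagnosis"])
        for barrier in barriers
    }

# Example: a school reports two barriers from its needs assessment.
print(suggest_strategies(["low buy-in", "weak data routines"]))
```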

Third, we need to rethink our broader evidence systems to shift norms in education practice and research. These systems encompass expectations for evidence, infrastructure for gathering evidence, and mechanisms for knowledge exchange.

Expectations for evidence

Whether due to political concerns or capacity constraints, districts typically do not collect or share systematic data about the conditions and practices inside classrooms and schools. Even when well intentioned, a hyper-focus on student outcomes leaves invisible the root causes that produce those outcomes. Meanwhile, narrow standards of rigor in education research privilege controlled experiments on interventions over nuanced descriptions of the complex interactions within systems. We need to blend multiple forms of evidence across multiple levels, developing a continuum between research standards and practical standards for useful knowledge to improve implementation.

Evidence infrastructure

Establishing practical methods for collecting useful implementation data requires building systems on a trustworthy, understandable structure for organizing and connecting data across levels. Existing efforts range from fine-grained learning process data from digital platforms (Steinkuehler, 2017), to data schemas for education technology research and development (BIRD-E, 2022), to ontologies for characterizing behaviors (NASEM, 2022b) and core components (Scher & Martinez, 2022). Yet educators also need flexible platforms for rapidly collecting and visualizing practical measures of processes for continuous improvement (Takahashi et al., 2022). Beyond the technical infrastructure of measurement standards and data systems, they need a social infrastructure for collecting implementation data efficiently and reliably. Since this is difficult to operationalize at scale, researchers could help by developing guidance on practical sampling strategies and by collaborating with educators to make data collection part of their professional practice.
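As one hypothetical example of a practical sampling strategy, the sketch below draws a small stratified sample of classrooms for observation, so that light-touch data collection still covers every grade band rather than only the most willing volunteers. The roster, strata, and sample sizes are invented for illustration.

```python
import random

# Hypothetical roster: (classroom_id, grade_band) pairs for one district.
classrooms = [(f"C{i:03d}", band)
              for i, band in enumerate(["K-2", "3-5", "6-8"] * 20)]

def stratified_sample(units, strata_key, per_stratum, seed=42):
    """Draw the same number of units from each stratum, so small
    observation samples still represent every part of the system."""
    rng = random.Random(seed)
    strata = {}
    for unit in units:
        strata.setdefault(strata_key(unit), []).append(unit)
    return {stratum: rng.sample(members, min(per_stratum, len(members)))
            for stratum, members in strata.items()}

# Example: three classrooms per grade band for this month's observations.
sample = stratified_sample(classrooms, strata_key=lambda c: c[1], per_stratum=3)
for band, picked in sample.items():
    print(band, [cid for cid, _ in picked])
```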

Mechanisms for knowledge exchange

Strengthening connections across the research and practice communities would further support capturing and learning from variation in implementation. A translational research-to-practice “push” model might frame this as educators adopting evidence-based practices, then testing adaptations locally (Lavis et al., 2003). However, the knowledge gained from those adaptations may not be consistently evaluated or shared elsewhere. A “pull” model might position practitioners as demanding that researchers study questions more useful for their contexts (Snow, 2016). Still, the supply of researchers and the timescale of co-production are dwarfed by the sheer diversity of implementation. An alternative approach could cultivate more practice-based evidence (Green, 2008) through continuous improvement cycles, developing and sharing insights with other practitioners and researchers for ongoing study in different contexts (Bryk et al., 2011, 2015; Russell et al., 2017). Systematically investing in this capacity within education agencies would strengthen both the use of evidence in major decisions and the generation of new, practice-relevant research knowledge (Farley-Ripple et al., 2022).

Conclusion

Applying implementation science to education illuminates numerous opportunities to strengthen our evidence systems to better support practice and research. As described here, the most immediate goal is to improve education practice and outcomes (engineering). A second goal is to advance the evidence base in education about what works, for whom, under what conditions, and why, in order to achieve success more consistently (science). A third goal is to advance the evidence base about research use and impact in education so that decision-makers may use evidence more reliably to improve practice (meta-science).

Collectively, these transformations offer promise for creating a more robust system for integrating research and practice through shared understanding and evidence about implementation. Achieving these changes at scale requires funders and policymakers to align resources and incentives with these goals. This will support the research and practice communities in collaborating to rebuild and repair our educational systems to achieve more equitable outcomes through more equitable implementation.

In this issue

Learning Across Contexts: Bringing Together Research on Research Use and Implementation Science
The fields of implementation science and the study of research use in policy and practice travel on many of the same roads and share similar goals, chief among which is ...

Reducing Educational Inequality After the COVID-19 Pandemic: What Do We Know, and What Research Do We Need?
Today, as the pandemic recedes, its effects are still with us, in education as well as in other domains. We have considerable knowledge about how to respond to growing inequality, ...

Emergency Exits: Avenues for New Research to Improve Youth Outcomes After COVID
As the effects of COVID linger on, researchers have a crucial role to play in cataloging them. Even more important, by developing knowledge about exactly who is affected and how, researchers can provide evidence that points the way toward successful ...
