Proposing Research on Reducing Inequality: Studying Mechanisms and Investigating the “How and Why” Behind Intervention Outcomes

In this post, I encourage researchers to study the mechanisms through which inequality can be reduced, and I offer some suggestions on what this might look like in practice.

Since 2015, the William T. Grant Foundation has funded research on programs, policies, and practices to reduce inequality in the academic, social, behavioral, or economic outcomes of young people ages 5-25 in the United States. Among the letters of inquiry we receive for studies centered on academic outcomes, many propose to examine outreach and bridge programs as interventions that can disrupt disparities in college going and completion between historically marginalized youth and their more privileged counterparts.

Though similar in their goals, outreach and bridge programs intervene at different points in the educational lifecycle. Outreach interventions, such as the federally funded Upward Bound, target students by supporting skill development for college entry and success.1 Bridge interventions, on the other hand, target students by supporting their transition to postsecondary institutions, most often in the summer before they enter college.2

Building on Existing Evidence

Currently, outreach and bridge interventions present an interesting dilemma for researchers. They are widely implemented, and participants benefit from them, but little research evidence confirms why this might be. To advance our knowledge of why bridge and outreach interventions produce positive youth outcomes, studies must build on existing research findings. Unfortunately, this is something that many letters of inquiry proposing to investigate interventions fail to do, despite a considerable body of effectiveness evidence. For instance, a What Works Clearinghouse Intervention Report3 identified more than 100 studies investigating the effect of summer bridge interventions. Familiarity with these and other similar findings is key to making the case that the proposed research will advance research and practice.

Conceptualizing the Intervention as a Collection of Practices

As Schultz et al. (2011)4 note, many interventions operate under the assumption of effectiveness. As a result, we lack sufficient knowledge of the mechanisms that produce success. The programs operate as black boxes: Transformation happens, but the mechanisms leading to it remain obscured.

One way to get at the mechanisms that produce positive effects is to conceptualize interventions as programs that contain practices, i.e., “materials and activities through which youth development is enabled.” A few of the materials and activities that make up an intervention include the staff, curriculum, instruction, and services offered to students. Viewing an intervention in this manner invites analyses that attempt to isolate how programmatic choices influence youth outcomes. Moreover, situating an intervention as something that consists of materials and activities encourages researchers to articulate a theory of change. That is, a mechanism-based study asks researchers to elucidate how or why exposure to certain materials and activities promotes academic or behavioral growth among participants and, therefore, reduces inequality.


Research that proposes to clarify the mechanism by which interventions reduce inequality, then, would hypothesize that the materials and activities included in an intervention are chosen because they are expected to produce positive outcomes for the targeted youth (e.g., first-generation students, Black students, low-income students). Whether it’s the number of staff hired to operate the program or the decision to offer writing courses, each choice should have a relationship to the youth outcomes under investigation. Analyses of the intervention become an opportunity to determine the theory of change’s plausibility.

For example, in their study, Schultz et al. (2011)5 noted that most interventions include similar components, such as research, mentorship, and financial aid. The team sought to isolate the effect of the components within an NIH-funded training program and found that participating in a research experience had a greater effect on sustaining interest in a science career among participants than receiving mentoring did.

Other research could focus on how the number of staff or staff qualifications influence youth outcomes. Another study might instead emphasize how remote versus in-person instruction influences youth outcomes. The key is that studies of this type underscore the importance of connecting an intervention’s materials and activities to its ability to reduce inequality in youth outcomes.

To illustrate, here are two examples of funded studies with the potential to isolate whether and how intervention participation changes student behavior in ways that can reduce inequalities in college success.

IES-Men of Color College Achievement (MoCCA) Project

This study investigated an intervention designed to increase college completion rates among African American males. The program aimed to form a community that would encourage students to build resources and skills that can help them persist, succeed, and ultimately graduate from college. In conjunction with a randomized controlled trial to assess the intervention’s efficacy, the researchers also proposed to use a survey to measure whether participants and nonparticipants accessed services or academic support differently.

Connected Scholars: A Mixed Methods Investigation of a Social Capital Intervention for First-Generation College Students

This study examined a social capital intervention designed to teach first-generation college students how to enlist the support of mentors. The proposal theorized that engaging in help-seeking behaviors would deepen the networks crucial to the students’ academic performance. The researchers planned to use pre- and post-intervention surveys to assess how help-seeking, self-advocacy, and networking behaviors changed over time among participants and nonparticipants.

Building Evidence

Importantly, studies like these will help untangle perplexing findings that emerge from the literature. For instance, among studies that seek to statistically isolate the effect of participating in an intervention, it often remains unclear why program participants and nonparticipants differ.6 Understanding why an intervention produces statistically significant outcomes requires moving away from studies that treat participation as the main variable of interest and toward those that analyze the materials and activities within an intervention as malleable factors that influence youth outcomes.


1U.S. Department of Education, Office of Postsecondary Education, Student Service. (2021). Fast Facts Report for Upward Bound and Upward Bound Math-Science Programs: 2017–18. Washington, D.C.

2U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse. (2016, July). Supporting Postsecondary Success Intervention Report: Summer bridge programs. Retrieved from


4Schultz, P. W., Hernandez, P., Woodcock, A., Estrada, M., Chance, R., Aguilar, M., and Serpe, R. (2011). Patching the Pipeline: Reducing Educational Disparities in the Sciences Through Minority Training Programs. Educational Evaluation and Policy Analysis, 33(1): 95–114.


6Murphy, T., Gaughan, M., Hume, R., and Moore, Jr., S. (2010). College Graduation Rates for Minority Students in a Selective Technical University: Will Participation in a Summer Bridge Program Contribute to Success? Educational Evaluation and Policy Analysis, 32(1): 70–83.

Melissa Wooten is a Program Officer at the William T. Grant Foundation.
