Digest, Issue 1: Summer 2016

Moving from Data to Research to Policy: What Does It Take?

To make headway in reducing inequality and expanding opportunity for youth, the evidence base for policymaking needs to draw on the best data available. One effort to help institutionalize and regularize researcher access to public administrative records is the newly established Commission on Evidence-Based Policymaking. Federal legislation broadly charges the Commission to recommend ways to expand access to and use of government data and to address “how data and results can be used to inform program administrators and policymakers.” This will likely involve discussions about technical challenges in linking disparate data, concerns about privacy, rules for accessing the data, and approaches to managing and documenting the content of data sets.1

These considerations are essential, but we know from Foundation-supported work that there is also a social side to research use: how decision makers will use the resulting research, and the relationships, incentives, and supportive structures it takes to bring those considerations into the research process. Thus, while promising efforts are underway to facilitate access for researchers, we need to do more to ensure that what is learned from these data is useful—that research is not simply produced and pushed out for consumption without careful consideration of how, and by whom, it will be used.

But connecting decision makers’ interests with researchers’ pursuits is a significant challenge. Work by those who study the use of research evidence offers some promising paths for making that connection.

Working with decision makers

The move from data to use-ready research must account for how the evidence generated might be used. Input from decision makers on the types of questions they want answered could inform the types of data sets that need to be connected, the level of analysis (e.g., city, district, school, or student), and the identification of salient variables. For example, state education leaders may want to know how specific investments at the school level—say, technology, textbooks, and physical improvements—affect achievement. This list may reflect these leaders’ working hypotheses about competing expenditures and what matters for student success. Not all of these variables may be on the minds of researchers, however. The construction of data sets, then, begins the path to generating research evidence—and this beginning delimits the range of questions researchers are able to answer with big data.

Conversely, decision makers may be experts on their systems, but they may not have the capacity to anticipate the analytic structure of data sets, design research studies, or critically evaluate, prioritize, and interpret research evidence. These skills can, however, be nurtured through partnerships or other opportunities that allow users to engage with research evidence and make sense of how it might apply to their organizations.

Research-practice partnerships and similar collaborations offer a sound strategy for involving decision makers. Reciprocal participation in the social process of building and using evidence strengthens the joint effort between researchers and decision makers. Partnerships give decision makers opportunities to articulate their needs, which improves the likelihood that the resulting research evidence will be useful to them. At the same time, frequent exchanges help researchers build familiarity with the needs and motivations of decision makers. This, in turn, can help researchers construct a process and communicate findings in ways that are accessible and oriented toward use.

The literature has strong examples of the value of partnerships when working with administrative data, especially at state and local levels. The University of Chicago Consortium on School Research, a long-standing partnership between researchers at the University of Chicago and within Chicago Public Schools, for instance, has used administrative data to identify indicators of success, chart improvement, and conduct theory-driven evaluations of within-district programs and policies. The Child and Adolescent Services Research Center at Rady Children’s Hospital in San Diego has likewise demonstrated the value of marrying research expertise and experience with national data sets to local knowledge of system operations and information needs. By embedding researchers within the local organization, the Center has linked local administrative systems, data from nationally representative longitudinal survey samples, and Medicaid data to show that court referrals were associated with racial disparities in the use of mental health services.

Thus, partnerships between decision makers and researchers, either within their organizations or at research institutions, may produce work that, from the outset, is likely to penetrate the realms of policy and practice.

Incentives to facilitate collaboration

Incentivizing collaboration and knowledge integration is fundamental to fostering broader uses of research in ways that benefit youth. For example, the federal Every Student Succeeds Act, the recent reauthorization of the Elementary and Secondary Education Act, includes provisions that call for local decisions about programs and activities to be “evidence-based.” Further, elements of the law link funding to defined tiers of evidence. This is just one example of how policy can encourage the integration of research evidence in decision making. Creating these and other kinds of incentives may strengthen collaborations by structuring expectations and routines that promote use from beginning to end—from data set construction to evidence production to research use.

Beyond legislation, though, what structures might incentivize links between research institutions or trusted intermediaries and decision makers? To some extent, demand for program evaluations may encourage researchers and decision makers to work together. Still, more direct incentives may be needed to upend existing tendencies to focus on the demands and rewards of one’s own institution. This might include providing direct support for sustained research collaborations. On the research side, incentives might include course releases that allow time for partnership activity and recognition that partnering constitutes valuable service to the community. For decision makers, partnering would be more attractive if state and local agencies had expedient mechanisms to receive funds and award contracts, or if they could reallocate resources from programs deemed ineffective to those with a stronger theoretical or empirical evidence base.

Other strategies and supports for improving the use of research evidence

Another promising avenue for improving use is interventions designed to develop skills in accessing and appraising research. Since partnerships and other one-on-one collaborations are often not possible, technical assistance is an important alternative for cultivating such skills. Technical assistance providers often have deep knowledge of both the research and the decision makers’ system, as well as sustained contact with decision makers. It is worth imagining how this supportive infrastructure might be refined to build leaders’ capacity to access and process research evidence and to create opportunities to discuss how new findings might be integrated with existing evidence. In turn, technical assistance providers may emerge as important conduits for sharing their knowledge with researchers or intermediaries engaged in analyzing data sets. This kind of give-and-take can lead to new ways of seeing old problems and help inform action, thereby contributing to a culture in which using research evidence in decision-making processes is the norm for leaders.

Lastly, because not all research will be interpreted with a researcher on hand, structures need to be in place to make sense of the abundance of information that accumulates from big data projects like the American Opportunity Study. Currently, numerous clearinghouses provide access to research evidence, but their structures are far from uniform. These outlets differ in the level of rigor required for inclusion, the elements reported about each study, supporting documentation, and synthesis across studies. What’s more, the kinds of information decision makers need—such as user reviews from peers, implementation context, and infrastructure, training, and cost requirements—are often omitted. This variation, unfortunately, makes it difficult for decision makers to identify research evidence they can use.

But there are promising examples of approaches that respond to these challenges. In the United Kingdom, for instance, systems (and funding) are in place to routinely conduct systematic reviews of emerging evidence bases, and centralized, well-developed frameworks ease the organization and sharing of research evidence. The London-based Education Endowment Foundation, for example, rigorously evaluates strategies to improve the use of research evidence and offers roadmaps for developing supportive structures for using research.

Moving between data, research evidence, and use: What does it look like?

Beyond generalities about building bridges and translating research to practice or policy, how do we create the real-world conditions for research use? What will the infrastructure actually look like? What strategies and incentives might it comprise?

As described above, the potential of getting from data to useful science is realized only when the research is actually taken up by decision makers. And this is more likely to happen when researchers produce work that is relevant to the leaders who make decisions about policies and practices affecting young people.

One example of how administrative data can be leveraged to generate research evidence that informs decision making is CalYouth, a collaboration between researchers at the University of Chicago and members of the California Child Welfare Co-Investment Partnership, including leaders from the California Department of Social Services, the County Welfare Directors Association of California, the Judicial Council of California, and a group of philanthropic funders. The collaboration is responding to a mandate from the state legislature to evaluate the implementation of extended foster care.

Federal legislation gives states the option to extend services and receive reimbursement for youth in care until age 21, but policymakers in California and elsewhere want to know how and under what conditions extended care is benefiting youth, and at what cost. Financial resources from a number of private funders have allowed CalYouth to link administrative data on youth foster care histories, unemployment insurance wage claims, use of public assistance programs, Medicaid, college engagement, and arrests. A sampling frame generated from the administrative data supported extensive interviewing and web-based surveys, and state and county administrators and supervisors, youth in foster care, and others provided input on the tools. Over time, collaborators have used the assembled data to generate research evidence that describes youth and the services they receive, tackle concrete problems related to caseload sizes, and respond in real time to policy debates about the value of extended care. Thus, both the mandate to evaluate the implementation of extended services for youth and an influx of financial resources have helped to incentivize the production of research.

This example shows data being leveraged with an orientation toward use, but it also gets at a larger point about the use of research evidence. As we’ve said, linking and analyzing data sets can contribute to strong, useful bodies of evidence, as long as we approach the data with purpose and remain mindful that big data is, by itself, only a means to an end. But, as the example illustrates, to increase the likelihood that the evidence we produce is used to inform policies, practices, and programs that benefit youth, we need an infrastructure that encourages stakeholders and researchers to engage one another and benefit from each other’s perspectives. We need a framework that cultivates the use of research evidence by incentivizing collaboration and establishing structures that focus attention and resources on integrating the perspectives of decision makers into the research production process.

Work by Foundation-supported researchers and others suggests that merging research evidence into these processes requires structured discussions about research. In the CalYouth example, research evidence—cost-benefit studies in particular—was included in deliberations about whether to extend care. And negotiations about what administrative data to link and what items to include in the survey incorporated input from decision makers and other stakeholders. In this way, the data extraction and subsequent data collection efforts were designed for use from the outset; the team solicited stakeholders’ feedback down to specific survey items and data fields.

Lastly, the CalYouth example also highlights some outstanding questions. Providing incentives for cross-sector work does not guarantee a productive collaboration. Although there was some evidence that the CalYouth cross-sector partners collaborated in critical deliberations, it is unclear whether these practices were happenstance or intentional and routine. We need to know more about the extent to which such practices were codified. For example, were there formal agreements or memoranda of understanding that established guidelines for working with one another? Was there a coordinating body that guided the process? Examining the extent and value of such formal structures may inform strategies for facilitating research use.

Conclusion

Linking big data sets is a promising first step toward producing research evidence that is used by decision makers. But it is not enough by itself. Efforts to link large-scale data sets from diverse sources have the potential to rapidly enhance what can be learned from surveys, experiments, evaluations, and qualitative data about the lives of young people, as well as about ways to improve their outcomes. Yet we must be aware that “many factors shape what research is sought, how it is shared, and the ways in which it is evaluated, used, contorted, or dismissed.”

This knowledge demands intentional efforts to orchestrate and incentivize the move from big data to research evidence to use. And we have accumulated promising lessons about what it takes to design for and promote research use. For instance, research is more likely to inform decision making when it comes from a trusted source and is deeply understood. Participation in the research process and sustained relationships may enhance trust. Structured opportunities to discuss and push back on research appear to deepen knowledge. And these processes are hypothesized to increase appropriate uses of research evidence. When it comes to harnessing the potential of big data, these lessons should not sit idle.

At the same time, we need to know how to create these conditions, and here the knowledge base is relatively weak. Partnerships between researchers or intermediary organizations and decision makers present one promising vehicle. But we need to create and test these and other mechanisms for improving the use of research evidence. Toward that end, our Foundation welcomes studies that identify, build, and test new strategies for improving the use of research evidence. These proposals should describe the body of evidence that is ripe for use; how use is being conceptualized, operationalized, and measured; and why use is expected to improve decision making and, ultimately, youth outcomes. These questions are a key line of inquiry for our Foundation, and we suspect these pursuits will lead to new insights about the infrastructure needed to produce and use research in ways that benefit youth.

Footnotes
  1. For full citations and a complete reference list, download the PDF of this essay.