Studying Ways to Improve the Use of Research Evidence: Distinguishing Data from Research Evidence

We recently analyzed the applications we received over the last two years for research grants on improving the use of research evidence. One of the top three reasons applications were not selected for funding was a focus on the use of data rather than research evidence, as defined by the Foundation. We recognize that the distinction between data and research evidence may not always be clear, and we wanted to provide some additional guidance.

Our application guidelines define research evidence as follows:

A type of evidence derived from studies that apply systematic methods and analyses to address predefined questions or hypotheses. These include descriptive studies, intervention or evaluation studies, meta-analyses, and cost-effectiveness studies conducted within or outside research organizations. For the purposes of this program, data, practitioner knowledge, and expert opinions do not meet the definition of research evidence. However, when these types of evidence are tested or analyzed with systematic methods and analyses to address predefined questions or hypotheses, the resulting findings constitute research evidence.

The Foundation recognizes many different kinds of evidence that inform decision-making: data, research findings, clinical experience, and community input. The Foundation’s priority is studying how, when, and under what conditions evidence from research is used to inform policy and practice, and ways to improve the use of that evidence.

To distinguish data from research evidence, Adam Gamoran wrote in 2016, “data are the ingredients; research evidence is the meal….” That is, data may highlight a problem, but the data, in and of themselves, do not indicate why the problem exists or what to do about it. Evidence from descriptive studies examining the underlying causes of problems and from intervention research that tests solutions is important for decision-making. Dumont and Smeeding (2016) note that access to data by itself does not ensure research evidence is produced or used.

To help clarify the distinction between data and research evidence, we’ve developed the following hypothetical examples modeled after prior submissions to the Foundation:


Continuous quality improvement (CQI) and related methodologies can be an important tool for schools, community-based agencies, and other youth-serving providers to strengthen their policies and practices. CQI guides decision-makers to identify a problem and test potential solutions. Both data and research evidence are important during CQI. First, data can help establish that a problem exists and motivate leadership to commit resources to address it. The quality improvement team can then use research evidence in the form of evidence syntheses, meta-analyses, or evidence-based practices to identify potential solutions to the problem. Teams may then try out those solutions and use data to track whether they are influencing the problem at hand. Proposals that focus on the use of research evidence to identify potential solutions, and on how to improve that use of research evidence, would fit our priorities. Proposals focusing on data collection and analysis alone would not.


Data can be a powerful tool to identify problems and track progress. For example, many schools use data to examine student academic progress or attendance, community-based agencies may use data to identify a local need their services could address, and local data may motivate a politician to prioritize a policy. However, the data by themselves cannot identify potential next steps. Research evidence can be a powerful tool for organizations to identify why a problem may be occurring as well as to indicate potential solutions. Proposals that examine how to improve the use of research to inform decision-making about the causes and consequences of a problem would fit our priorities.


Data analyzed using systematic methods to answer predefined questions or hypotheses generate research evidence. Often, administrative data are linked with other sources of data to provide a broader picture of the problem and context. One example is a project to understand how and under what conditions the policy of extending foster care until age 21 benefits youth outcomes. Researchers merged multiple administrative datasets with data from surveys and interviews to answer this research question, which was co-designed with state-level child welfare administrators. Similarly, administrative data can be used to evaluate the impact of specific programs. Proposals that include partnerships between researchers or intermediary organizations and decision-makers to identify, build, or test new strategies for improving the use of research to support decision-making are welcome.


Data paired with research evidence can be a powerful motivator for action. For example, many organizations need to apply for funding to support their services or to persuade local leadership to act on a policy priority. Local data can provide local information on the magnitude of an issue. When local data are combined with research evidence, the broader empirical literature can illuminate why the problem may be occurring and thus better target solutions. For example, local data may highlight high rates of infant mortality among Black mothers in a specific community and underscore the need to address the problem. Research evidence can inform why infant mortality occurs and, specifically, why Black mothers may be at greater risk. Pairing research expertise with local knowledge of systems operations and with local and national datasets can spark action. Proposals examining how and under what conditions research evidence is used strategically, and when and why the strategy is effective, would align with the Foundation’s priorities.

We hope these examples illustrate our definition of research evidence and help you understand how data and research evidence differ. Please reach out to Senior Program Officer Lauren Supplee if you have questions about this topic.
