The online application is currently closed. 2024 application dates will be published in November.
- 2023 Application Guidelines for Research Grants on Improving the Use of Research Evidence
- Application Submission Instructions
- Frequently Asked Questions
- Research Grants on Improving the Use of Research Evidence: An Overview of the Program and How to Apply (Webinar, June 2023)
- Learn about our program to recruit early-career peer-reviewers of proposals for studies on improving the use of research evidence
This program supports research on strategies to improve the use of research evidence in ways that benefit young people ages 5-25 in the United States. We want to know what it takes to produce useful research evidence, what it takes to get research used, and what happens when research is used. We welcome letters of inquiry for studies that pursue one of these broad aims.
Replicable methods, activities, or policies intended to improve the use of research evidence or to maximize its positive impact on decision-making and youth outcomes.
A type of evidence derived from studies that apply systematic methods and analyses to address predefined questions or hypotheses. These include descriptive studies, intervention or evaluation studies, meta-analyses, and cost-effectiveness studies conducted within or outside research organizations. Data, practitioner knowledge, and expert opinions are other important types of evidence, but they are distinct from our definition of research evidence. However, when they are analyzed using systematic methods and analyses to address predefined questions or hypotheses, the resulting findings constitute research evidence.
Use of research evidence:
The use of research evidence refers to the multiple ways research can be used, including: applying research evidence directly to a decision (instrumental use), the influence of research evidence on decision-makers’ understanding of problems and potential solutions (conceptual use), supporting existing stances or positions (strategic use), building trust with colleagues or educating constituents (relational use), or mandating decision-makers to engage with research (imposed use).
Those who create policies or make other high-level decisions that shape practice in youth-serving systems. Decision-makers include but are not limited to agency leaders; organizational managers; school district and local youth-serving program administrators; federal, state, and local policymakers; and intermediaries such as think tanks, advocacy groups, technical assistance providers, and professional associations that shape the production of research evidence or facilitate its use.
Background
Research evidence can be a powerful resource for policymakers, agency leaders, organizational managers, and others who make high-stakes decisions that shape youth-serving systems. In addition to informing policy formation and service delivery, evidence from systematic research can deepen decision-makers’ understanding of issues, generate reliable assessment tools, support strategic planning, and guide program improvement. But only if it is used.
Studying ways to improve the use of research evidence will require new and innovative ideas, and we welcome creative studies that have potential to advance the field. Proposals for studies are evaluated on the basis of their fit with our interests; the strength and feasibility of their designs, methods, and analyses; their potential to inform improvements to research use; and their contribution to theory and empirical evidence.
The research on research use
Prevailing strategies to bring research evidence into policy and practice rest on models that increase decision-makers’ access to rigorous evidence and incentivize or mandate the adoption of programs with evidence of effectiveness. Despite large-scale initiatives and major investments of this kind, research evidence remains under-used.
Recent scholarship points to the limitations of models that prioritize research production and dissemination without adequate attention to would-be users’ realities. Decision-makers may be experts on their systems, but, even with access to rigorous research, they may not have the capacity or resources to critically evaluate, prioritize, and apply the findings. What’s more, the research may not be relevant to their specific contexts or communities.
In order to harness the full power of research evidence, decision-makers need deeper engagement and support. Across disciplines and policy areas, studies are remarkably consistent in their identification of specific conditions that enable the use of research evidence:
- research is timely and relevant, addressing decision-makers’ needs and local contexts
- trusted relationships between researchers, intermediaries, and decision-makers enable collective sense-making of research and deliberation over how to use it
- evidence use is integrated into decision-makers’ existing routines, tools, and processes
Toward new strategies
While an extensive body of knowledge provides a rich understanding of specific conditions that foster the use of research evidence, we lack robust, validated strategies for cultivating them. What is required to create structural and social conditions that support research use? What infrastructure is needed, and what will it look like? What supports and incentives foster research use? And, ultimately, how do youth outcomes fare when research evidence is used? This is where new research can make a difference.
Vivian Tseng outlines the Foundation’s call for proposals on research to illuminate strategies for improving research use in ways that can improve outcomes for young people.
This program supports research on strategies focused on improving the use, usefulness, and impact of evidence in ways that benefit young people ages 5-25 in the United States. We welcome impact studies that test strategies for improving research use as well as whether improving research use leads to improved youth outcomes. We also welcome descriptive studies that reveal the strategies, mechanisms, or conditions for improving research use. Finally, we welcome measurement studies that explore how to construct and implement valid and reliable measures of research use.
NOTE: We are particularly interested in research on ways to improve the use of research evidence by state and local policymakers, mid-level managers, and intermediaries. These decision-makers play important roles in deciding which programs, practices, and tools to adopt; deliberating ways to improve existing services; shaping the conditions for implementation; and making resource allocation decisions.
We invite studies from a range of disciplines, fields, and methods, and we encourage investigations into various youth-serving systems, including justice, housing, child welfare, mental health, and education. Previous studies have drawn on conceptual and empirical work from political science, communication science, knowledge mobilization, implementation science, and organizational psychology, among other areas.
In addition to studies that build and test theory, we are interested in measurement studies to develop the tools necessary to capture changes in the nature and degree of research use. Finally, we welcome critical perspectives that inform studies’ research questions, methods, and interpretation of findings.
We welcome studies that pursue one of three aims:
This may include:
- Studies of strategies, mechanisms, or conditions that foster more routine and constructive uses of existing research evidence by decision-makers.
- Studies that test the effects of deliberate efforts to improve routine and beneficial uses of research in decision-making.
For example, prior work suggests that decision-makers often lack the institutional resources and requisite skills to seek out and apply research, and certain organizational norms and routines can help overcome those barriers. Studies might examine efforts to alter the decision-making environment by comparing the effectiveness of different ways (e.g., technical assistance, research-practice partnerships, cross-agency teams, etc.) to connect existing research with decision-makers, or by exploiting natural variation across decision-making environments to identify the conditions that improve research use.
This may include:
- Studies to identify strategies for altering the incentive structures or organizational cultures of research institutions so that researchers conduct more practice- or policy-relevant studies and are rewarded for producing research that decision-makers consider useful.
- Studies to identify the relationships and organizational structures that lead to the prioritization of decision-makers’ needs in developing research agendas.
- Studies that examine ways to optimize organized collaborations among researchers, decision-makers, intermediaries, and other stakeholders to benefit youth.
For example, one might investigate the effectiveness of funders’ efforts to incentivize joint work between researchers and decision-makers. Others might test curriculum and training initiatives that develop researchers’ capacity to conduct collaborative work with practitioners.
This may include:
- Studies that examine the impact of research use on youth outcomes and the conditions under which using research evidence improves outcomes.
The notion that using research will improve youth outcomes is a long-standing assumption, but there is little evidence to validate it. We suspect that the impact of research on outcomes may depend on a number of conditions, including the quality of the research and the quality of research use. One hypothesis is that the quality of the research and the quality of research use will work synergistically to yield strong outcomes for youth.
- Studies to identify and test other conditions under which using research evidence improves youth outcomes.
For example, recent federal policies have instituted mandates and incentives to increase the adoption of programs with evidence of effectiveness from randomized controlled trials, with the expectation that the use of these programs will lead to better outcomes. Do these policies actually increase the use of those programs and improve child outcomes?
NOTE: These research interests call for a range of methods, including experimental or observational research designs, comparative case studies, or systematic reviews.
- Where appropriate, consider using existing methods, measures, and analytic tools for assessing research use so that your findings can be compared and aggregated across studies (see Gitomer and Crouse, Studying the Use of Research Evidence: A Review of Methods).
- Existing measures may not be well-suited for some inquiries, so you may also propose to adapt existing measures or develop new ones. We strongly encourage applicants to use a new open-access methods and measures repository that shares existing protocols for collecting and analyzing data on research use.
- Mixed methods studies that collect and integrate multiple types of data may be particularly advantageous given the difficulty of relying solely on self-report methods to study evidence use in complex deliberations and decision-making contexts.
Major research grants
- $100,000 to $1,000,000 over 2-4 years, including up to 15% indirect costs.
- Studies involving secondary data analysis are at the lower end of the range (about $100,000-$300,000), whereas studies that involve new data collection can have larger budgets (typically $300,000-$600,000). Generally, only proposals to launch experiments in which settings (e.g., schools, child welfare agencies, justice settings) are randomly assigned to conditions are eligible for funding above $600,000.
Officers’ research grants
- $25,000–$50,000 over 1-2 years, including up to 15% indirect costs.
- Studies may be stand-alone projects or may build off larger projects. The budget should be appropriate for the activities proposed.
NOTE: In addition to financial support, the Foundation invests significant time and resources in capacity-building for research grantees. We provide opportunities to connect with other scholars, policymakers, and practitioners, and we organize learning communities that allow grantees to discuss challenges, seek advice from peers and experts, and collaborate across projects. To strengthen grantees’ capacities to conduct and implement strong qualitative and mixed-methods work, the Foundation also provides access to a consultation service focused on those methods.
- The Foundation makes grants only to tax-exempt organizations. We do not make grants to individuals.
- We encourage proposals from organizations that are under-represented among grantee institutions, including Historically Black Colleges and Universities (HBCUs), Hispanic-Serving Institutions, Tribal Colleges and Universities (TCUs), Alaska Native-Serving Institutions, Native Hawaiian-Serving Institutions, and Asian American and Native American Pacific Islander-Serving Institutions (AANAPISIs).
Eligible Principal Investigators
- The Foundation defers to the applying organization’s criteria for who is eligible to act as a Principal Investigator or Co-Principal Investigator on a grant. In general, we expect that all investigators will have the experience and skills to carry out the proposed work.
- We strive to support a diverse group of researchers in terms of race, ethnicity, gender, and seniority, and we encourage research projects led by Black or African American, Indigenous, Latinx, and/or Asian or Pacific Islander American researchers.
- Only studies that 1) align with the stated research interests of this program and 2) relate to the outcomes of young people between the ages of 5 and 25 in the United States are eligible for consideration.
- We do not support non-research activities such as program implementation or operational costs, nor do we make contributions to building funds, fundraising drives, endowment funds, general operating budgets, or scholarships. Applications for ineligible projects are screened out without further review.
All letters of inquiry are initially reviewed by internal staff with social science expertise. On occasion, internal reviewers will request more information from applicants or solicit expert opinions to better assess a project. In general, however, given the breadth of studies proposed in letters of inquiry, internal reviewers may lack deep knowledge of an applicant’s specific area of work, so avoid disciplinary jargon and use language appropriate for an educated lay audience.
We begin application reviews by looking at the importance of the research questions or hypotheses. Then we evaluate whether the proposed research designs and methods will provide strong empirical evidence on those questions.
NOTE: For major research grant applications, based on internal review of the letter of inquiry, the Foundation either invites a full proposal for further consideration or declines the application. We do not accept unsolicited full proposals. Officers’ research grants are awarded on the merit of the letter of inquiry alone.
The letter of inquiry functions as a mini-proposal and is reviewed against the following criteria:
- The proposed study aligns with this program’s research interests and pursues one of three aims:
- Building, identifying, or testing ways to improve the use of existing research evidence.
- Building, identifying, or testing ways to facilitate the production of new research evidence that responds to decision-makers’ needs.
- Testing whether and under what conditions using research evidence improves decision-making and youth outcomes.
- The proposed study relates to the outcomes of young people between the ages of 5 and 25 in the United States.
- The letter of inquiry reflects a mastery of relevant theory and empirical findings.
- The letter of inquiry provides a clear operational definition of the use of research evidence for the purposes of the proposed project.
- The letter of inquiry states the theoretical and empirical contributions the study will make to the existing research base.
- The letter of inquiry discusses how the findings will be relevant to policy or practice.
- The proposed study employs rigorous methods (quantitative, qualitative, or mixed) that are commensurate to its goals.
- The study’s design, methods, and analysis plan fit the proposed research questions.
- The description of the research design makes clear how the empirical work will test, refine, or elaborate specific theoretical notions.
- Quantitative analyses might emphasize hypotheses and plans for testing them, while qualitative analyses might elaborate on how the research will illuminate processes underlying specific programs, policies, or practices.
- Plans for case selection, sampling, and measurement clearly state why they are well-suited to address the research questions or hypotheses.
- For example, samples should be appropriate in size and composition to answer the study’s questions. Qualitative case selection, whether critical, comparative, or otherwise, should also be appropriate to answer the proposed questions.
- The quantitative and/or qualitative analysis plan demonstrates awareness of the strengths and limits of the specific analytic techniques and how they will be applied in the current case.
- (If proposing mixed methods) Plans for integrating the methods and data are clear and compelling.
- (If proposing quantitative methods) The letter of inquiry demonstrates that the study will have adequate statistical power to detect meaningful effects.
- The letter of inquiry demonstrates adequate consideration of the gender, ethnic, and cultural appropriateness of concepts, methods, and measures.
- The proposed methods, time frame, staffing plan, and other resources are realistic.
- The letter of inquiry provides assurance that data can be successfully collected, describes the team’s prior experience collecting such data, and identifies strategies for maximizing response rates and access to data sources.
- Prior training and publications demonstrate that the research team has a track record of conducting strong research and communicating it successfully.
- Be sure to demonstrate that the research team is well-positioned to address the varied tasks demanded by the study’s conceptualization and research design. This might include combining expertise across disciplines or methods.
- Be specific about the value of each member’s contributions to the team. We strongly discourage teams that comprise many senior investigators for very limited time and effort or otherwise make cursory nods to multi-disciplinary or mixed-role project teams. Instead, clearly justify the unique value of each team member and the specific role each will play in different stages of the project.
Where appropriate, we value projects that:
- harness the learning potential of mixed methods and interdisciplinary work
- involve practitioners or policymakers in meaningful ways to shape the research questions, interpret preliminary and final results, and communicate their implications for policy and practice
- combine senior and junior staff in ways that facilitate mentoring of junior staff
- are led by members of racial or ethnic groups underrepresented in academic fields
- generate data useful to other researchers and make such data available for public use
- demonstrate significant creativity and potential to advance the field, for example by introducing new research paradigms or extending existing methods, measures, and analytic tools to allow for comparison across studies
For major research grants, the review process for a successful application—beginning with the submission of a letter of inquiry and ending with approval by our Board of Trustees—is 10 to 15 months. If you are invited to submit a full proposal, you will be offered two deadlines to submit it, ranging from approximately six weeks to six months from the time of the invitation.
In general, the full proposal follows a format similar to that of the letter of inquiry, and includes a proposal narrative of about 25 pages, a complete budget and budget justification, and full curriculum vitae or resumes for key investigators and staff. If you are invited to submit a full proposal, we will provide additional detailed instructions on developing the proposal. Institutional Review Board Approval is not required at the time of the proposal’s submission but is required before issuing grant funds. Full proposals are reviewed using a scientific peer review process involving two or more external reviewers with content, methodological, and disciplinary expertise in the proposed work.
Following external review, the Foundation’s Senior Program Team reviews promising proposals and offers additional feedback. Applicants who receive positive reviews with critiques that can be addressed within a short time frame are asked to provide written responses to internal and external reviewers’ comments. Applicants’ responses to external reviews are then further reviewed by the Senior Program Team. Finally, the team makes funding recommendations to the Program Committee and the Board of Trustees. Approved awards are made available shortly after Board meetings, which take place in March, June, and October.
For complete instructions on applying for a research grant on improving the use of research evidence, download the 2023 Application Guidelines.