Digest, Issue 1: Summer 2016

Harnessing Discovery: Writing a Strong Mixed-Methods Proposal

As co-directors of the Fieldwork and Qualitative Data Research Laboratory in the UCLA Center for Culture and Health, Dr. Tom Weisner and I work with a range of funding organizations and agencies to support research teams in thinking through and writing mixed-methods proposals and putting research plans into motion. Our part in the William T. Grant Foundation’s longstanding effort to encourage and support mixed methods research has centered on helping mixed-methods teams clearly articulate their research questions, formulate designs that are well-aligned to those questions, and develop plans to implement the design in as efficient a manner as possible.

In reflecting on this work, four elements stand out as critical to the development of high-quality mixed-methods proposals: foregrounding the research question, methods planning and consistency throughout the proposal, code system development and application, and co-mingling data and mixed-method data analysis.

Foregrounding the research question

In our experience, strong proposals make a compelling case for the inquiry that is motivating the research. Successful teams, regardless of their methodological approach, foreground the research question throughout their proposal. We often find that strong proposals first set the essential stage of establishing the research question and articulating its importance, and then begin to identify the methods that are best suited to answering the question. For instance, does the question warrant quantitative, qualitative, or mixed methods? If qualitative methods, which specific variety? Why? Strong proposals clearly show reviewers that the methods can be used to collect and analyze data that will yield strong evidence on the research question.

Methods planning and consistency throughout the proposal

For questions that are well suited to qualitative or mixed-methods approaches, we often advise teams to start with a clear rationale for the value added by these methods, specifying as much as possible what these different perspectives will bring to the project.

We do not expect researchers to enter the field already knowing the answers they will find. We commonly encourage researchers, however, to think backwards when making these decisions. Thinking about what one wants to learn at the end of a project can be a great guide in making initial decisions about the study sample and methods that will ultimately yield the data needed to support the study conclusions.

One approach is to incorporate qualitative work into a nested design, in which a sub-sample of participants is thoughtfully selected from the larger population that will provide survey or other quantitative data. We’ve seen good examples of this in proposals where the ultimate goal of a project is to unpack the processes of programs that are known to produce consistent and positive outcomes. When we see, for example, innovative school programs showing great success in serving immigrant English language learner high school students, we may want to know more about the specific practices and contextual factors at work. Such findings will help develop more contextualized understandings of how the programs actually work to improve youth outcomes. With such deeper understandings, study findings may be more likely to inform interventions in other settings or encourage policy or funding decisions to support program expansion.
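The sub-sample selection in a nested design can be sketched in code. The following is a minimal illustration, not a prescribed procedure: the participant records, strata, and sample sizes are all hypothetical, and real studies would weigh many more selection criteria.

```python
import random
from collections import defaultdict

def nested_subsample(participants, stratum_key, per_stratum, seed=0):
    """Draw a stratified sub-sample for qualitative follow-up from a
    larger survey population (a 'nested' mixed-methods design)."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for p in participants:
        strata[stratum_key(p)].append(p)
    sample = []
    for stratum, members in sorted(strata.items()):
        # Cap each stratum's draw at the stratum size.
        k = min(per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical survey population: each record has an id and a school setting.
survey = [{"id": i, "school": s}
          for i, s in enumerate(["urban", "suburban", "rural"] * 20)]

# Select five interviewees per school setting for in-depth qualitative work.
interviewees = nested_subsample(survey, lambda p: p["school"], per_stratum=5)
```

Stratifying the sub-sample (here, by school setting) helps ensure the qualitative data can speak to the same subgroups that the quantitative analysis will compare.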

In sum, we have found that strong proposals use several tactics to demonstrate the strength of the data collection tools and the plans to field those tools. In strong proposals, there is a clear articulation of why and at which stages in the project qualitative data will be collected, the nature of the protocols that will be used, and how these data will be analyzed and integrated with any quantitative data to inform the key research questions. In our experience, strong mixed-methods proposals thread the methodological decisions consistently throughout the proposal in order to paint a clear picture of what the process will look like as it unfolds and where and how the selected methodology will add value.

Code system development and application

While there is a wealth of guidance on the development of code systems based on theory, we are often approached for support with more practical considerations: How to develop a useful code or code system? How best to excerpt and code qualitative content? How to use systems to assist in data reduction?

Put simply, coding is the practice of marking content for later retrieval. Coding sustains the mechanics of a study, and the coding system and its content need to speak to the research question. It is important to understand that code systems will evolve and then stabilize into a set of concepts or themes that will help capture all important meanings in the qualitative data. For instance, proposal writers may initially identify a priori codes related to major themes that must be addressed, following an interview or observation protocol and drawing from theory. Then, as the qualitative data are explored, more emergent codes will be identified. Describing this process in the proposal is one important way to assure reviewers of the team’s preparedness for this work.
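The movement from a priori to emergent codes can be pictured as a growing, documented code system. This is only a schematic sketch; the code names and definitions below are invented for illustration.

```python
# A priori codes drawn from the interview protocol and from theory.
code_system = {
    "program_access": "How participants learn about and enter the program",
    "family_support": "Family involvement in program participation",
}

def add_emergent_code(codes, name, definition):
    """Extend the evolving code system without silently overwriting
    an existing (e.g., a priori) code."""
    if name in codes:
        raise ValueError(f"code {name!r} already defined")
    codes[name] = definition
    return codes

# An emergent code identified as the qualitative data are explored.
add_emergent_code(code_system, "transport_barriers",
                  "Unanticipated theme: travel obstacles to attendance")
```

Keeping the system in one explicit structure, with additions logged rather than improvised, is one way to document the evolution that reviewers want to see described.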

Strong proposals anticipate the emergence of important themes and acknowledge that this discovery will take place in the analysis phase, while also articulating how this process will relate to the central research questions. Tracking the emergence and discovery of themes is a fundamental part of mixed-methods and qualitative work, but a persistent focus on the research question is essential to identifying which themes and findings are relevant, rather than those that are more fractured or idiosyncratic.

We’ve seen that outlining how this work will unfold can contribute to a proposal’s strength. Codes and their rules for application are generated, tested with real data, modified, and re-tested (and this process is documented). The process continues until the system itself can be consistently applied (tested and documented via inter-rater reliability) to capture something meaningful across the research population that makes important contributions to the study findings. We generally recommend a process that begins with broader codes applied to larger sections of text or other qualitative content before moving to more narrowly defined and nuanced codes as the meanings in the data are better understood. Keeping in mind that the qualitative data are intended to speak to how and why phenomena occur, the broader context is important to capture as codes are developed and applied.
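One common inter-rater reliability check is Cohen's kappa, which compares two coders' agreement against the agreement expected by chance. The sketch below implements the standard formula from scratch; the code labels and excerpt counts are hypothetical, and teams typically use established software for the real computation.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two coders assigning one code label each
    to the same ordered set of excerpts."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of excerpts on which the coders agree.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two coders' labels for the same six excerpts (illustrative only).
coder_1 = ["engagement", "engagement", "stress", "stress", "engagement", "stress"]
coder_2 = ["engagement", "stress", "stress", "stress", "engagement", "stress"]
kappa = cohens_kappa(coder_1, coder_2)
```

A kappa well below conventional thresholds signals that code definitions or application rules need another round of modification and re-testing before the system stabilizes.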

Finally, we often recommend including strategies for data reduction in the proposal. One strategy could be describing the possibility of “pulling back the lens” should the analysis process become overly narrow. Under these circumstances, codes can be merged or organized into broader “categories,” in order to communicate the broader findings. Another strategy involves the development of code ratings or weightings that can represent variation in quality, strength, emphasis, degree of use, etc.—virtually anything that can be overlaid on a numerical dimension. For example, perhaps all teachers talk about the extent to which they make use of a new teaching strategy. Some might describe how it penetrates all facets of their teaching, while others describe how it is used from “time to time.” All such content might be tagged with a “fidelity of use” code, but indexed across a dimension that represents the actual depth of adoption. Distributions such as these introduce another dimension to a project’s dataset and can be used in a range of creative approaches in data analysis.
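The ratings-and-weightings strategy amounts to overlaying coded content on an ordinal scale. The following sketch uses the "fidelity of use" example from above; the scale values, teacher identifiers, and excerpt records are all hypothetical.

```python
# Hypothetical ordinal scale for a "fidelity of use" code: each coded
# excerpt is rated by the depth of adoption it describes.
DEPTH_SCALE = {"time to time": 1, "regular use": 2, "all facets": 3}

# Illustrative coded excerpts from teacher interviews.
coded_excerpts = [
    {"teacher": "T1", "code": "fidelity_of_use", "rating": "all facets"},
    {"teacher": "T2", "code": "fidelity_of_use", "rating": "time to time"},
    {"teacher": "T3", "code": "fidelity_of_use", "rating": "regular use"},
    {"teacher": "T4", "code": "fidelity_of_use", "rating": "all facets"},
]

# Overlay the qualitative ratings on a numerical dimension.
depth = {e["teacher"]: DEPTH_SCALE[e["rating"]] for e in coded_excerpts}
mean_depth = sum(depth.values()) / len(depth)
```

Once rated this way, the distribution of depth-of-adoption values becomes another variable in the project's dataset, available for the quantitative side of the analysis.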

Co-mingling data and mixed-method data analysis

We often advise applicants intending to use mixed-methods approaches to include clear plans for how data will be integrated and analyzed. Ideally, data will be collected in a “within-subject” manner, whereby qualitative and quantitative data are collected from the same group of participants in order to create natural connections between the different types of data.

Beyond these natural connections, code ratings and weightings, as described above, can also serve as mechanisms to connect qualitative, demographic, and quantitative data. Strong proposals draw on the quantitative findings to inform how the qualitative content will be approached and explored as well as how qualitative findings can be converted to other forms where they can be analyzed in more quantitative ways. It is important for research teams to describe the details of this process and justify the proposed approach in their proposals. This kind of detailed information reassures reviewers that the team understands the strengths and limits of various analytic techniques and has an analysis plan that deepens understanding or validates findings across methods.
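In a within-subject design, the mechanical core of data integration is a merge on participant identifiers. The sketch below joins hypothetical survey scores with a qualitative-derived variable (a stressor count, echoing the example that follows); all identifiers and values are invented for illustration.

```python
# Quantitative survey scores and a qualitative-derived variable,
# both keyed by the same participant ids (within-subject design).
survey_scores = {"P1": 72, "P2": 58, "P3": 90}
stressor_counts = {"P1": 1, "P2": 4, "P3": 0}  # from coded interview data

# Merge the two data sources on participant id.
merged = [
    {"id": pid, "score": survey_scores[pid], "stressors": stressor_counts[pid]}
    for pid in sorted(survey_scores)
    if pid in stressor_counts
]

# Compare quantitative outcomes across qualitative-derived groups.
high_stress = [r["score"] for r in merged if r["stressors"] >= 2]
low_stress = [r["score"] for r in merged if r["stressors"] < 2]
```

Because both datasets share participant ids, the qualitative findings can be converted into variables and analyzed alongside, or as predictors of, the quantitative outcomes.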

In our experience, strong proposals also communicate a readiness for discovery. For example, the project and coding processes may bring to the surface variation on some unanticipated dimension within the qualitative content that exposes important distinguishing characteristics in the sample population. Describing preparedness for these findings and clear plans for how these data may enhance analysis can bring strength to a proposal as an innovative use of qualitative data.

For instance, qualitative data may yield an unexpected understanding that allows us to look at mean differences between participants. Subjects from different environments—urban, suburban, or rural, for example—may talk about the same things in different ways. Variables, then, often emerge from the qualitative data. A good example is the New Hope for Families and Children study, where a research team evaluating an anti-poverty intervention initially found that some groups of people took up services while others did not. The qualitative data showed that the individuals who did not take up the services were experiencing a number of stressors that impacted their lives. Looking at this finding as a quantitative variable led to the insight that a given number of stressors would predictably prevent subjects from taking up services. When the team moved beyond the simple treatment and control comparison and subdivided the intervention group by number of stressors, the picture of what was actually happening became much clearer.

Conclusion

In our experience working with researchers affiliated with a variety of foundations and agencies, we’ve seen that strong mixed-methods proposals most often begin with a well-conceived and clearly articulated research question that reflects a genuine appreciation of some mystery that quantitative research has not resolved or cannot resolve on its own. These proposals delineate clear plans with concrete, well-justified action steps, providing the kind of detail that reviewers need to evaluate their merits.

Oftentimes, it is helpful to work backward from where we want to end up. In conjunction with quantitative work, strong mixed-methods or qualitative projects can provide the sort of context necessary to inform real-life policies, programs, and practices. And articulating the value of the chosen methods to support the question and conclusions is an important step to approaching this work.

Proposals that demonstrate awareness of the discovery process and how it can be harnessed in an open-minded but disciplined way that responds to the research question are likely to bear the most fruit. With qualitative and mixed-methods work, we need the ability to remain flexible in dealing with the data. But we also need to have clear plans and goals. Successful proposals show careful consideration of the time and resources that collecting, managing, and analyzing qualitative data requires. In a mixed-methods study, for instance, although quantitative data may be collected from a large group of individuals, it is almost always impractical or even impossible to conduct 30-minute interviews with every individual in the group. We have found that strong mixed-methods proposals will often incorporate qualitative research into a study design that acknowledges and respects these considerations. We’ve seen that the teams who make a strong case for their work give careful thought to how they intend to marry qualitative and quantitative methods—for instance, by using representative subgroups in nested studies—and articulate the rationale for their plans.

In the end, proposals that reflect a strong understanding of these issues, and address them with fluency and awareness at all stages—from clearly articulated research questions to well-chosen methods and designs to a keen eye on how findings will support meaningful conclusions—often make the most convincing case that they will lead to important and sound research.
