The discipline of political science has been engaged in vibrant debate about research transparency for more than three decades. In the abstract, augmenting transparency implies the same steps in all types of political science scholarship: making the empirical information that underpins our work meaningfully accessible; elucidating how that information was collected or generated; and describing how the information was interpreted and/or analyzed.Footnote 1 Nonetheless, the way in which transparency is pursued—and the type and difficulty of the challenges that pursuing it presents—vary across research traditions.
Scholars who collect, generate, and draw on qualitative evidence in their work are relative newcomers to the debate about transparency. Their more vigorous engagement during the past decade has brought important new voices and viewpoints to the conversation and raised new issues and questions. In particular, the recently completed Qualitative Transparency Deliberations (QTD; see www.qualtd.net), directed by Tim Büthe and Alan Jacobs, represent a crucial step forward. Called for during the business meeting of the Qualitative and Multi-Method Research (QMMR) section of the American Political Science Association (APSA) during the 2015 APSA conference, the QTD involved 13 working groups (and hundreds of political scientists beyond those groups), and ultimately produced 13 thoughtful final reports.Footnote 2 Among their many lessons, the QTD demonstrated that practices for making scholarship more transparent—and the challenges that doing so poses—vary among forms of qualitative inquiry.Footnote 3
This article first briefly reviews the literature on transparency in qualitative inquiry, describing what we see as its evolution. Next, we highlight some considerations that shape how and how much researchers pursue transparency. We then describe a set of exciting, creative techniques that scholars are developing and pioneering to enhance the transparency of qualitative research. These strategies can help scholars to illustrate research practices, clarify the empirical underpinnings of their work, and facilitate its evaluation, as well as balance the various considerations that bear on achieving transparency. The diversity of these emerging strategies demonstrates that transparency is not an all-or-nothing proposition and can be pursued in many different ways. The conclusion summarizes and offers thoughts on the way forward.
STATE OF THE DEBATE
During the past decade, political scientists who generate, collect, interpret, analyze, and publish scholarly work based on qualitative data have engaged in energetic dialogue about research transparency.Footnote 4 One way to characterize the arc of the debate is to suggest that it began with thoughtful consideration of “whether” scholars who use qualitative data and methods can and should seek to make their work more transparent, and then progressed to the question of “what” information scholars should share about data production and analysis, and what data they should share, in pursuing transparency. The debate only recently has begun to consider “how” transparency can be achieved—that is, which concrete techniques and strategies scholars can use to augment the transparency of their work.
This section presents a very general overview of the literature that addresses the first two of these questions. We take up the third question in the third and fourth sections of the article, first discussing some challenges related to achieving transparency and then offering a series of strategies for doing so. The literature on the first two questions is rich and extensive: it includes interventions by scholars in various academic disciplines (e.g., political science, education, health, and sociology); by practitioners in information schools, university libraries, and data repositories; and by scholars from around the globe. Our review, by necessity, is incomplete given space limitations.Footnote 5
The question of whether political scientists who engage in qualitative research can and should make their work more transparent has mainly played out (in written form) in a series of journal symposia published since 2010.Footnote 6 In the opening articles (Büthe and Jacobs 2015; Golder and Golder 2016) and in contributions to these symposia, scholars have discussed the intellectual benefits—for producers and consumers of scholarship based on qualitative data and methods—of making such research more transparent, as well as the epistemological, ethical, legal, and practical challenges of doing so. In addition to the contributions to these symposia, several stand-alone articles written by scholars from various disciplines have addressed these questions, with some advocating for transparency (e.g., Corti 2006; Elman, Kapiszewski, and Lupia 2018; Gleditsch and Janz 2016; Miguel et al. 2014) and others registering concerns (e.g., Monroe 2018; Schwartz-Shea and Yanow 2016; Tripp 2018; Tsai et al. 2016).
The issue of what information about data generation and analysis and which data scholars should share likewise has been considered in a range of written work. The QTD advanced the debate in productive ways, offering novel insights on the meaning and “content” of transparency. In their overview article, Jacobs et al. (2019, 25–27) provided useful lists of information that can be shared to increase transparency, and various reports (e.g., Elkins, Spitzer, and Tallberg 2019; Schneider, Vis, and Koivu 2019; and others mentioned elsewhere in this article) offered guidance about what can be shared at low cost, low risk, and/or efficiently to achieve transparency. Additional examples of work considering these questions include Barnes and Weller’s (2017) discussion of what information can elucidate analytic processes in process-tracing work and Tuval-Mashiach’s (2017, 130–34) suggestion that scholars answer three reflective questions in pursuit of transparency (i.e., what they did and how and why they did it).
CONSIDERATIONS IN MAKING RESEARCH TRANSPARENT
All scholars weigh and balance various factors and pressures as they consider how to make their work more transparent. Among these are two obligations whose fulfilment trumps other aims: pursuing transparency ethically (see, e.g., Carusi and Jirotka 2009; Chauvette, Schick-Makaroff, and Molzahn 2019) and legally. For example, scholars must obtain the informed consent of the people they involve in their research in order to ethically share the information that those “human participants” convey. Scholars and participants must reach unambiguous agreement—ideally, through a consultative process—on what, when, where, how, and with whom information can be shared, and scholars must adhere strictly to those agreements, without compromise.Footnote 7 Likewise, scholars cannot legally share work that is under copyright if permission cannot be secured. These issues are discussed in more detail in the next section.
Scholars also consider other factors when deciding how to pursue research transparency. These include (but are not limited to) intellectual considerations (i.e., how to pursue transparency in ways that will showcase the rigor and power of research; see, e.g., Elman and Lupia 2016, 51; Fujii 2016, 25–26); resource considerations and opportunity costs (i.e., how much time and money to spend on pursuing transparency and what the cost of not spending those resources elsewhere will be; see, e.g., Saunders 2014, 694–97); and expositional considerations (i.e., how practically to pursue transparency while ensuring that the text remains readable and suitable for standard publication formats; see, e.g., Moravcsik 2012, 36).
TRANSPARENCY IN PRACTICE
This section considers the question of “how” transparency can be achieved in qualitative inquiry. We outline a set of techniques that scholars can use to ethically and legally increase their work’s transparency while balancing other considerations relevant to their situation and project. The discussion draws on the literature on transparency in qualitative research and on our experiences working with scholars who pursue it. Exciting and promising techniques beyond those discussed are surely being developed and used. Scholars should consider which strategies to use before beginning research because their choices bear on how they track the research process as they carry out their work.
Preregistration
Preregistration entails specifying a research project’s rationale, hypotheses, design, and plan for data generation and analysis before initiating data collection. Interest in preregistration for qualitative work has been increasing (Haven and Van Grootel 2019; Jacobs 2020; Kern and Gleditsch 2017; Piñeiro and Rosenblatt 2016).Footnote 8 There are good reasons to be skeptical of the need for and utility of preregistration in qualitative research given the often exploratory nature of such work (Haven and Van Grootel 2019, 6–8). Nonetheless, having a timestamped record of the original research and analysis plan, as well as changes made during the research process, can help scholars to stay on track and to carefully consider and justify (for themselves and their readers) changes to their design. Creating and maintaining such a plan may be a relatively low-cost way for scholars to demonstrate the rigor of their research without overloading a publication with methodological description. A pioneering example of a preregistered qualitative case study is presented in Christensen, Hartman, and Samii (2019, 26–31), who used case-study evidence to validate (and extend) quantitative findings.
Methodological Appendices
Methodological appendices—that is, supplementary material that discusses how an author collected, generated, and analyzed data—can advance transparency in various types of research and can take several forms.Footnote 9 Creating these appendices allows researchers to augment the transparency of their work without affecting its readability and length—even when sharing the data underpinning the work is not possible. Journals rarely place length limitations on these appendices,Footnote 10 affording scholars great latitude in describing their research process.
For instance, scholars who engage in ethnography or interpretive research can provide an extended discussion of these issues in a stand-alone document that supplements the space-constrained text of research articles (see Lester and Anders 2018, Reyes 2018, and the appendix in Goffman 2014 for notable examples). Scholars who conduct interviews also can increase the transparency of their work by including appendices. Shesterinina (2016) provided a particularly impressive example, describing in detail how she organized her fieldwork in Abkhazia, her interview settings and strategies, and how she recruited respondents and gained their trust. Bleich and Pekkanen (2013) proposed a formalized “Interview Methods Appendix” comprising a descriptive list of interviews conducted (including, e.g., the source of the interview contact, structure, and length); Bleich (2018) included such an appendix for a recent article.
Scholars who conduct archival research can augment their work’s transparency by providing a log that describes how cited primary or secondary historical sources were originally produced and why, among those consulted, a subset was selected for inclusion in the research (Gaikwad, Herrera, and Mickey 2019, 2; Verghese 2016, appendix 3, discusses the use and selection of secondary historical sources). Scholars whose work relies heavily on qualitative coding can enhance transparency by including with their publication a “coding appendix” that details how they arrived at their initial coding, resolved intercoder disagreements, and refined their schema. For instance, Fuji Johnson (2017) included such an appendix to describe how she coded legislative discourse on sex work in Canada. Scholars who use process tracing can generate appendices to bolster analytic claims and make their role in an overall argument more explicit (Bennett, Fairfield, and Soifer 2019, 9–10; Fairfield 2013).
Other forms of research documentation also can be shared to increase transparency. For instance, scholars in education research have pioneered the use of reflective journals in which they record in detail the steps of the research process (Ortlipp 2008). Also, some scholars with large research teams periodically conduct “debriefing interviews” in which team members describe their decisions and actions throughout the research process (see, e.g., Collins et al. 2013, which discusses the use of this practice in Onwuegbuzie et al. 2011). Sharing these journals and interview transcripts as part of a methodological appendix illuminates and clarifies key research steps and choices for readers.
Most generally, scholars can include as a methodological appendix the information that they assembled following “reporting guidelines” that set thresholds for information provision about data collection and analysis. Most commonly used in medical research, these guidelines exist for in-depth interviews and focus groups (COREQ: Tong, Sainsbury, and Craig 2007); for synthesizing qualitative research (ENTREQ: Tong et al. 2012); and for qualitative research in general (SRQR: O’Brien et al. 2014). Although they need to be adapted for use by different political science research communities, these guidelines strike us as a potentially fruitful way to consider, organize, and systematize ideas about what should be shared to achieve transparency.
The potential utility of methodological appendices notwithstanding, they can be hard to locate and their discussions difficult to connect to particular arguments in the text (Grossman and Pedahzur 2020, 2f.). Especially when placed on an author’s personal website, online appendices also are at risk of eventually becoming unavailable (Gertler and Bullock 2017, 167). As an alternative, scholars can publish methodological companion articles to augment the transparency of a primary publication. Doing so can lower the opportunity cost of pursuing transparency, allowing scholars to enhance both the transparency of their work and their publishing record.Footnote 11
Annotation
Annotation also can help scholars to achieve transparency. Two forms of annotation developed for political science inquiry are Active Citation (AC), pioneered by Moravcsik (2014; 2019), and Annotation for Transparent Inquiry (ATI; see qdr.org/ati) (Gaikwad, Herrera, and Mickey 2019, 15–17), developed by the Qualitative Data Repository (with which both authors are affiliated). ATI builds on AC, using more sophisticated technology and placing greater emphasis on the value of sharing underlying data sources. ATI uses open-annotation technology to allow researchers to link specific passages in a publication to digital annotations comprising “analytic notes” and extended excerpts from data sources, as well as to the data sources themselves when they can be shared ethically and legally (Karcher and Weber 2019). Analytic notes can elucidate data generation or analysis, make explicit the link between a source and a claim in a published text, or discuss other aspects of the research process. Extended excerpts facilitate transparency even when sharing underlying data sources is not possible (Ellett 2016; and see the discussion of this project in Shesterinina, Pollack, and Arriola 2019, 23). These annotations, together with an overview of the research process, constitute an ATI data supplement. By associating methodological discussion and underlying data with the precise point in the text to which they relate, ATI enhances scholars’ ability to demonstrate the rigor of their work without disturbing narrative flow. Given the relative novelty of the approach, however, researchers may find the creation of ATI supplements time-consuming.
QDA Software Output
Bringer, Johnston, and Brackenridge (2004) highlighted how scholars can use qualitative data analysis (QDA) software—in particular, the memo/note function—to provide an “electronic audit trail” of the research process and the development of a project.Footnote 12 Similarly, based on their work in international business and management, Sinkovics and Alfoldi (2012) argued that QDA software can help to capture the nonlinear back-and-forth between data collection and analysis that is characteristic of much qualitative work, thereby improving its transparency and trustworthiness. QDA software also can help scholars to provide a coherent image of their data as a whole. Corti and Gregory (2011) have long advocated for the sharing of QDA-generated qualitative data, and some researchers have shared excerpts from their QDA projects (Luigjes 2019; O’Neill 2017). Although the proprietary nature of QDA file formats had stymied these efforts, the recent emergence of an open-exchange format for QDA data—supported by the major software projects (see www.qdasoftware.org)—should help scholars to be transparent about the generation and analysis of data and to share the QDA data themselves. Sharing QDA-generated qualitative data has low opportunity costs because doing so does not entail the creation of a separate product (as with, e.g., a methodological appendix). It may be difficult, however, to disentangle shareable data from those that cannot be shared (for legal or ethical reasons) because such data typically are exported in their entirety.
Data Sharing
A final strategy that scholars can adopt to increase the transparency of their work is making the underlying data accessible to other researchers. This strategy intersects with some of those mentioned previously. For instance, some scholars include data as part of their methodological appendices or ATI annotations.
Achieving transparency does not require that scholars share all of the data that underpin a publication but rather calls on them to make careful choices about which data to share. For instance, ethical and legal constraints may limit which data can be shared. As noted previously, if human participants in a scholar’s research do not consent to the information they provide being shared more broadly, it cannot be shared. Likewise, it may not be possible to share documents that are under copyright. However, scholars can petition the copyright owner for permission to share such material (see, e.g., newspaper articles shared in association with Holland 2019), and documents that are in the public domain can be freely shared. For instance, Hitt (2019) shared papers of US Supreme Court justices that they had dedicated to the public domain.
Sharing data ethically, and ensuring that they are useful to others, may require scholars to take preparatory steps including cleaning, organizing, and documenting the data. The earlier in the research process that scholars take these steps, the less time-consuming they may be. Also, to protect human participants, scholars may need to de-identify data—that is, remove “direct identifiers” (i.e., pieces of information that are sufficient, on their own, to disclose an identity, such as proper names, addresses, and telephone numbers) and “indirect identifiers” (i.e., contextual information that can be used—often in combination with other information—to identify a participant).Footnote 13 Contreras (2019, 11–16) explored three strategies for “partially disclosing” information about participants in dangerous research: semibiographical disclosure, partial spatial disclosure, and invitational disclosure (which involves inviting people to a field site to meet participants); see also Shesterinina, Pollack, and Arriola (2019, 15–16).
Scholars can make their research data available in many venues. Best practice is to do so in institutions such as data repositories (Kapiszewski and Karcher 2020). Scholars who share data in these venues can help to address ethical concerns about the data’s availability by placing “access controls” on the data that limit the number or type of individuals to whom they are available. Scholars also can combine strategies. For instance, Camp and Dunning (2015) shared de-identified transcripts of interviews with political brokers in Argentina, describing the general region but not the specific location where the data were collected, and restricted access to the data to researchers with clearly specified research plans.
CONCLUSION: CONTINUING FORWARD
Debates about the challenges and benefits of research transparency, about the “content” of transparency, and about how precisely to achieve transparency in scholarly work, are proceeding across academic disciplines and geographies. Multiple innovative techniques have been developed to aid scholars to increase the transparency of their work within ethical and legal limits, and to help them balance the considerations that bear on the pursuit of transparency. The creation and use of these techniques highlight that “transparency” is not an all-or-nothing prospect: most work is neither left completely opaque nor made completely transparent but rather falls somewhere in between.
Indeed, it is important to remember that transparency is a means to an end, not an end in itself. As discussed here, transparency adds value by facilitating comprehension and assessment of our scholarship. The goal and necessity of assessment are in no way new: our research is assessed informally every day by individual scholars and more formally periodically through the peer-review process. However, increasing transparency facilitates new ways to evaluate qualitative inquiry. The availability of shared data and materials also raises compelling questions. Can (and should) we use data and materials shared to augment the transparency of qualitative work to verify claims made in that work? If so, how can we develop forms of evaluation that accommodate the diverse epistemological commitments and methodological practices that make qualitative research such a rich and powerful form of inquiry? How can and should shared qualitative data be valued compared to traditional scholarly outputs (e.g., published articles)? Can shared qualitative data and materials be used in qualitative-methods instruction in ways similar to those in which their quantitative analogues are routinely used in quantitative-methods courses?
It is critically important that scholars who use qualitative data and methods continue to discuss all of these topics, to engage with one another within and across different qualitative research traditions, and to listen to and learn from one another. Broad ongoing involvement is crucial to the productivity of the conversation. Ultimately, however, we believe that the large and heterogeneous community of qualitative researchers will develop the best answers to the questions raised in this article and the broader symposium by actively seeking to make their work more transparent, employing the techniques discussed here and others that emerge. As they do so, research communities can draw on their examples to develop community-specific norms and practices for transparency, which funders, journal editors, and other institutions then can adopt. Both continued conversation and engaged practice are necessary for transparency to be deployed to its best purpose: to demonstrate the rigor of qualitative research and its valuable contributions to the production of knowledge.
We are extremely grateful to the editors of this symposium for inviting us to participate and to this article’s three reviewers—Alan Jacobs, Tamarinde Haven (in signed reviews), and an anonymous reviewer—for their suggestions and recommendations, which made the article much stronger. Any remaining problems are our responsibility alone. This article is based upon work supported by the National Science Foundation under Grant No. 1823950.