
Blogs, Online Seminars, and Social Media as Tools of Scholarship in Political Science

Published online by Cambridge University Press:  19 March 2018

Justin Esarey
Rice University
Andrew R. Wood
Rice University


How do political scientists use online tools as part of their scholarly work? Are there systematic differences in how they value these tools by field, gender, or other demographics? How important are these tools relative to traditional practices of political scientists? The answers to these questions will shape how our discipline chooses to reward academics who engage with “new media” such as blogs, online seminars (i.e., webinars), Twitter, and Facebook. We find that traditional tools of scholarship are more highly regarded and used more often than any new media, although blogs are considered most important among new media. However, we also find evidence that webinars are used and valued at rates comparable to traditional tools when they are provided in ways that meet political scientists’ needs. Finally, we observe that women and graduate students are substantially more likely than men and tenure-track academics to report that webinars and online videos are important sources of new ideas and findings.

Copyright © American Political Science Association 2018 

Journal articles, books, conferences, lectures, and seminars have been basic tools to communicate knowledge for centuries and are staple resources for academics. In the past decade, these traditional instruments have been supplemented by new tools that make use of modern technology: blogs, online videos and seminars (i.e., webinars), and social media including Twitter and Facebook. Political scientists must decide as a discipline how they will choose to value contributions made through these new media relative to their traditional counterparts. For example, is blogged scholarship as “serious” or “important” as an essay in an edited volume? How does a scholarly web lecture compare with a conference presentation as a research and service contribution? Should creating an online teaching tool used by thousands of people a year count in one’s promotion and tenure package?

In this article, we study how blogs, online videos, and social media are being used by political scientists as tools of scholarship. Our aim is to provide descriptive information useful for evaluating the importance of these tools to the scholarly community: how often new media are used, how scholars evaluate their utility, and the purposes for which they are valued. Although we do not seek to test any particular theory, we can determine whether there are systematic differences in how political scientists value new and traditional tools of scholarship by field, gender, seniority, and other demographics. We believe that this information will help us determine how decisions about the importance of new media could impact diversity in the field and facilitate the education of the next generation of scholars.

Four findings of our study stand out as especially important. First, blogs are a commonly utilized and valued tool for academic discussion and the dissemination of new ideas in political science. Second, although at present online seminars (i.e., webinars) and videos are used less often than blogs, there is substantial latent demand for topically relevant resources of this type, which are widely used when made available. Third, women are more likely than men to report that online and offline modes that maximize personal interaction (i.e., webinars, Facebook, conferences, and small groups) are important for learning about ideas and research findings; they rate impersonal exchanges (e.g., blog posts) as less important when compared to men. Finally, graduate students are substantially more likely than tenure-track academics to report that online videos and webinars are important sources of new ideas and findings.

Based on our findings, we speculate that the discipline would benefit from a greater focus on producing online videos and webinars. We also surmise that investing in these tools would disproportionately benefit the next generation of political scientists in graduate school, as well as underrepresented groups in the discipline. We find that online videos and blogs are currently not as important to scholars as search engines and the traditional tools of scholarship (e.g., journals and conferences). However, data from a cross-sectional survey of the discipline and from usage patterns for two widely used online political science resources lead us to believe that there is a large current audience with potential for significant future growth.



Our data originate from the following three sources:

  • an Internet-based survey of political scientists in the most research-active departments in the United States

  • viewership data collected as part of the International Methods Colloquium (IMC) project, an online seminar series of research talks and roundtables related to political methodology

  • readership data collected by The Political Methodologist (TPM), the newsletter of the Society for Political Methodology (i.e., the American Political Science Association’s organized section for methodology)

Survey Data

In August 2015, we used SurveyMonkey to distribute a questionnaire to an e-mail list of 9,840 political scientists. The list was created by manually collecting e-mails on websites from the following three sources:

  1. The e-mail address was listed as that of a faculty member or graduate student on the website of a PhD-granting political science departmentFootnote 1 in the United States.

  2. The e-mail address was listed as that of a faculty member on the website of a political science department at an institution designated as RU/VH, RU/H, or DRU in the Carnegie Classification.

  3. The recipient participated as a viewer or presenter in the IMC.Footnote 2

We received 909 responses that answered at least one question in the survey.Footnote 3 The survey initially asked whether a respondent had viewed an IMC session. If respondents had viewed the IMC once or more, they were asked eight additional questions about their experience. Respondents then were asked five demographic questions (i.e., occupation; gender; age; field of interest and expertise; and proportion of time spent on research, teaching, and other activities) and 36 questions about their experience with and interest in various online and offline tools of academic work. The full survey questionnaire is included in the online appendix.

Figure 1 shows demographic characteristics of the survey respondents, which included roughly equal numbers of tenure-track academics and graduate students.Footnote 4 As expected from our sampling frame, the sample’s representation of non-tenure-track academics, emeritus faculty, and political scientists working in industry was much smaller and possibly unrepresentative. The sample included generous proportions of faculty working in comparative politics, American politics, international relations, and methodology.Footnote 5 There was substantially less representation of political scientists in public policy and political theory, counseling caution in generalizing our findings to these subfields. As expected, the age distribution skewed young, given that graduate students comprised a substantial proportion of the sample and tended to have a compressed age range relative to those in other positions. In our sample, 62.1% identified as male.

Figure 1 Demographic Descriptors of Survey Respondents

To further assess the representativeness of the sample relative to the population of political scientists, we compared the demographic characteristics of our survey respondents to those of a 2015 membership survey for the American Political Science Association (APSA).Footnote 6 In general, the survey closely approximated the gender distribution of APSA membership but skewed substantially younger and contained a disproportionate representation of graduate students relative to faculty members. For example, whereas only 13% of APSA respondents reported being younger than 30 years of age, slightly more than 28% of respondents in our survey stated that they were younger than 30. In addition, compared to the APSA sample, a substantially larger proportion of our sample identified as having an interest in methods. This discrepancy may be in part because the APSA survey allowed respondents to indicate only one subfield, whereas our survey allowed them to select multiple options.


International Methods Colloquium Data

The International Methods Colloquium is an online seminar series that hosts periodic research presentations and roundtable discussions by political methodologists; it is supported by a grant from the National Science Foundation. Attendance at these seminars is freely available to the public, including the possibility for real-time questions and answers as well as discussion among multiple participants (International Methods Colloquium 2016). The IMC began hosting talks in spring 2015 and has hosted a total of 41 presentations as of January 2018. After each live seminar concludes, the video is uploaded to YouTube for later viewing.

The GoToWebinar software used for live broadcasts in these three seasons tracked the number of participants (including audience members) in each session (GoToWebinar 2016). The minimum number of participants was four (i.e., the speaker, moderator, and two production assistants), although on at least one occasion only one production assistant was present. YouTube also tracked the number of video views over the lifetime of each video (YouTube 2016a; 2016b). These tracking statistics provide insight into the size of the audience for research-related video presentations—at least among political scientists interested in (quantitative) methodology.


The Political Methodologist Data

In late 2013, TPM started a WordPress blog to run alongside its biannual print edition.Footnote 7 WordPress collects detailed statistics about the number of unique visitors to each article and to the blog as a whole (WordPress 2016). These statistics provide a direct measure of interest in blogged academic research from the political-methodology community. They also provide insight into which types of blog posts attract the most interest.



Our project is distinguished from prior work in two important ways. First, although political scientists have written frequently about blogging and social media use by academics, much of this work studies these activities as strategies to engage with policy makers and the larger public beyond political science (Carpenter and Drezner 2010; Farley 2013; Farrell and Drezner 2008; Farrell and Sides 2010; Gruzd, Staves, and Wilk 2012; Klunk 2012; Lynch 2016; McKenna 2007; Nyhan, Sides, and Tucker 2015; Sides 2011; Walt 2010) or as a teaching tool to educate students (Sjoberg 2013). By contrast, this article primarily discusses how political scientists use blogging, social media, and webinars as tools for scholarship, including learning about new findings and updating their own research toolkit.

We consider using online tools for research purposes as complementary to using them for outreach and teaching. For example, greater acceptance of blogs as a forum for scholarly discussion within the academy presumably lends legitimacy to blogging as an academic activity, encouraging scholars to blog more and thereby to communicate with journalists and policy makers.

Second, most prior studies of blogs, online videos/webinars, and social media as research tools draw inferences from small-scale intensive interviews (Acord and Harley 2012; Dawson and Rascoff 2006; Esposito 2013; Maron and Smith 2008; Papalexi et al. 2014). The few extant large-scale surveys mostly employed convenience samples of large and heterogeneous groups of academics from many locations and disciplines (Gruzd and Goertzen 2013; Ponte and Simon 2011; Procter et al. 2010; Rowlands et al. 2011). By contrast, our survey specifically studied the research-active community of political scientists in the United States and used a sampling frame targeted to this group.Footnote 8 Consequently, our study makes a novel contribution of particular interest to the discipline of political science.


We begin by describing the self-reported experience of our survey respondents with online tools. Figure 2 depicts responses on a six-point scale to the question “About how often do you use online tools as a part of your work in the following ways?” Across the nine possible activities listed in the figure, 82.0% of respondents engaged in at least one new-media activity as part of their work “once a month” or more, and 68.1% engaged in at least one activity “2–3 times a month” or more.

Figure 2 Experience Working with Online Tools


The most commonly performed online activity covered by our survey was reading a blog post. Our question asked specifically about “blog post[s] related to your academic work” rather than non-academic content or content in unrelated fields. The modal respondent in our survey reads an academic blog post “once a week or more.” On average, respondents reported reading academic blog posts between “once a month” and “2–3 times a month.”

Our survey’s responses for blogs matched well with readership data collected from TPM; these data are shown in figure 3. The findings show steady growth in TPM’s page views from September 2013 (i.e., the first month for which data are available) to June 2016, with the readership leveling off after this point to about 4,000 page views a month. By comparison, as of August 1, 2016, the e-mail listserv of the Society for Political Methodology had 3,258 members.

Figure 3 Page Views from The Political Methodologist

From this information, we infer that blogs are an increasingly important source of information for political scientists. However, we should not yet overstate the importance of blogs relative to other more traditional tools of scholarship. This point is emphasized by political scientists’ self-reported importance scores for sources of new ideas and research findings. Our survey respondents rated nine different sources on a five-point scale (i.e., 1=not at all important; 5=extremely important); their ratings are shown in figure 4.

Figure 4 Sources of New Ideas and Research Findings Rated by Importance

Figure 4 reveals that search engines and one-on-one/small-group conversations with colleagues were rated as the most important sources of information, with the modal respondent rating them as “extremely important.” Journals and conferences also were comparatively high-rated sources of information, with the modal respondent rating them as “important.” In our survey, blogs had not yet achieved this level of importance. Respondents were roughly equally likely to report that blogs were “slightly important,” “somewhat important,” and “important,” with an average rating of 2.91 on the five-point scale. However, blogs were rated as considerably more important compared to other online sources (i.e., webinars and social media).

Data from TPM indicated that not all blog posts were of equal interest to the scholarly community. These data suggest that blogs serve as a source of practical advice and discussion of “inside-baseball” issues of disciplinary importance rather than as an outlet for original research findings. Of the 10 most-viewed posts on TPM between September 2013 and August 2016, five were either technical or career adviceFootnote 9; the remainder consisted of commentaries on issues of disciplinary significance. By far the most popular post was a piece by Leeper (2013) about creating high-resolution graphics for manuscripts that will appear sharp and clear when printed in a journal or book.Footnote 10


Collaboration and Learning via Online Video

Figures 2 and 4 appear to indicate that, at present, online videos and seminars play a secondary role as tools for scholarship in political science. The modal respondent to our survey “never” attended a web seminar or used an online video or guest lecturer in class, rating them as “not at all important” as a source for new ideas and research findings. However, online video–based resources have a role in scholarly work: the modal respondent used an online video to learn a new skill or collaborate with a coauthor “a few times per year.”

Perhaps the reason that political scientists do not use online video resources is not because they are not interested but rather because it is difficult to find high-quality resources targeted to researchers’ interests and that are convenient for scholars to use. Figure 5 indicates that our modal survey respondent is “interested” in “learning about new research findings” and “receiving feedback on [his or her] own work” via online video resources; however, figure 2 indicates that they rarely do so. This conclusion is consistent with Procter et al.’s (2010, 4052) conclusion that “among occasional users, there is considerable enthusiasm [for new technologies] that has not yet been translated into routine use.”

Figure 5 Interest in Video-Based Online Resources for Types of Scholarly Work

Our survey provided information about the features that high-quality online videos and webinars should possess. We asked an array of questions pertaining to “what factors would make [the respondent] more or less likely to attend a webinar/online presentation.” The responses for all nine questions are available in the online appendix.Footnote 11 Of the nine factors analyzed, two stood out as particularly important in determining whether respondents would attend: (1) a presentation on a topic relevant to the respondent’s core interests; and (2) the availability of a recorded video that can be viewed at any time. Presentations meeting both criteria made almost all survey respondents “much more likely” or “somewhat more likely” to view an online presentation.

The data from the IMC project provide insight into how a source of high-quality online research seminars would be used by the scholarly community. The number of participants (including speakers, moderators, and staff) in each IMC presentation is shown in figure 6. The numbers indicate that live attendance at an IMC seminar is typical of what we might expect for attendance at a conference panel at a subfield meeting. However, unlike a conference panel, these seminars are widely used after the fact: a seminar with a live audience of ≈30 people can expect ≈200 subsequent views on YouTube.Footnote 12 It may even be the case that this relationship is nonlinear: a live audience three times larger than the previous example (≈90) is predicted to receive a far larger number of views (i.e., almost 1,500). However, the small number of data points with ≥45 attendees makes any inferences in this range tentative at best.
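The two specifications described in footnote 12 (a simple linear fit versus one adding squared and cubed attendance terms) can be sketched as follows. This is an illustrative reconstruction only: the attendance/view pairs below are hypothetical stand-ins, not the IMC data plotted in figure 6.

```python
import numpy as np

# Hypothetical (illustrative) attendance / YouTube-view pairs; the real
# IMC data appear in figure 6 and are not reproduced here.
attendance = np.array([12, 18, 25, 30, 33, 40, 44, 60, 90])
views = np.array([95, 130, 170, 205, 220, 280, 310, 600, 1450])

# Footnote 12 describes a simple linear model and a model adding
# squared and cubed attendance terms (i.e., a cubic polynomial).
linear = np.polyfit(attendance, views, deg=1)
cubic = np.polyfit(attendance, views, deg=3)

def predict(coefs, x):
    """Evaluate a fitted polynomial (highest-order coefficient first)."""
    return np.polyval(coefs, x)

# Below ~45 attendees the two fits track each other closely; beyond
# that range the cubic extrapolates much faster, so predictions there
# are tentative (few data points with >=45 attendees).
for x in (20, 30, 40):
    print(x, round(predict(linear, x)), round(predict(cubic, x)))
```

The cubic term is what allows a ≈90-person audience to be predicted far more than three times the views of a ≈30-person audience, which a purely linear model cannot capture.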

Figure 6 International Methods Colloquium, Attendance versus YouTube Views

The strong live-attendance numbers of the IMC seminar, and even stronger delayed-viewing statistics for IMC recordings, combined with our survey results, lead us to two conclusions: (1) there is a strong latent demand for topically relevant video-based online resources; and (2) making recorded videos available for viewing at any time is an important part of meeting that demand.

Thus, although the usage and importance ratings for online seminars and videos lag substantially behind those for blogs and traditional tools of scholarship, we believe that these resources have significant potential for future growth in political science.


Our final task was to determine whether certain types of political scientists are more or less inclined to consider online tools an important resource for their scholarly work. Therefore, we created a model that predicted survey responses to the question “How important would you say the following sources are for you in terms of hearing about new ideas and research findings related to your work?” for several online tools (i.e., blogs, webinars/online videos, Facebook, and Twitter) as well as traditional sources (i.e., conferences, journals, and small groups of colleagues and students). These raw data are depicted in figure 4. The dependent variable is ordinal with five levels. We therefore used an ordered probit regression using the polr function in the MASS package in R (Venables and Ripley 2002). We predicted responses using gender, field of expertise or interest,Footnote 13 current position, and proportion of work time spent teaching. The estimated coefficients from our models are in the online appendix.Footnote 14

Although the raw coefficients were not particularly informative, we observe that men gave statistically significantly lower importance ratings than women to webinars, Facebook, conferences, and small groups, and higher ratings to blogs.Footnote 15 A substantive interpretation of these coefficients is facilitated by figure 7. As in our descriptive results, we focused on the perceived importance of blogs and webinars (i.e., the two “new media” tools rated as most important in figure 4). We compared blogs and webinars to conferences, a venerable and important mode of scholarly activity. Figure 7 shows the predicted probability of each importance rating for conferences, blogs, and webinars/videos separately for men and women; independent variables other than gender are held at fixed values.Footnote 16

Figure 7 Model Predicted Importance of Online Tools, by Gender

According to our model, women were about 7 percentage points less likely than men to rate a webinar as “not at all important” (i.e., 43.3% for women, 50.7% for men). Women were also about 9 percentage points more likely than men to rate conferences as “extremely important” (i.e., 30.0% for women, 20.6% for men). By contrast, women were 3 percentage points more likely to rate blogs as “not at all important” compared to men (i.e., 15.7% for women, 12.7% for men). Although our research design was not set up to determine why these differences exist, we observed that what conferences, webinars, small groups, and Facebook have in common (and blogs do not) is that they encourage interpersonal interaction among scholars. Our findings are especially interesting in light of Procter et al.’s (2010, 4044) conclusion that “there is a gender bias” in users of Web 2.0 technologies for scholarly communications, “with men making up two-thirds of frequent users, while women make up a slight majority in non-users.”

We note another interesting finding: relative to tenure-track faculty members, graduate students were more likely to rate webinars/online videos and Twitter as important sources of ideas and findings.Footnote 17 As shown in figure 8, graduate students were more than 10 percentage points less likely to rate webinars as “not at all important” (i.e., 50.7% for tenure-track academics, 39.9% for graduate students); this difference was distributed over the higher categories of importance (i.e., 21.1% of graduate students rated webinars and videos as “somewhat important” relative to 15.9% of tenure-track academics). There also were differences between tenure-track academics and people in other positions (e.g., those working in industry). However, our confidence in these measurements is considerably diminished because our survey’s sampling frame was not designed to systematically sample those populations.

Figure 8 Model Predicted Importance of Online Tools, by Position


To summarize our findings: political scientists have limited experience with, but substantial interest in, using online tools as a part of their work. Other than search engines, blogs were the most widely used and highly rated online tool that we studied. However, traditional tools of scholarship (e.g., conferences and journal articles) are, on average, still considered more important than online tools such as blogs, webinars, online videos, and social media. At the same time, certain segments of the discipline (i.e., women and graduate students) were more likely to consider webinars and online videos important sources of ideas and information relevant to their work. Moreover, political scientists’ limited use of online video resources may have more to do with the scarcity of high-quality, topically appropriate resources than with a lack of latent demand (which we measured to be relatively high).

Based on our findings, we surmise that the discipline would benefit from investing in the creation of more online tools for scholarly work. Although at present political scientists consider most online tools less important than journals, conferences, and in-person interactions with colleagues, our usage data from the IMC and TPM are evidence that current demand for scholarly blog posts and webinars is already strong. Moreover, the substantial interest expressed in our survey for video-based online resources indicates that there is potential for significant growth in the utilization of these resources if they are created. The results of our survey, especially as shown in figures 7 and 8, lead us to speculate that webinars and online videos would be of particular importance to those who will play an important role in the future of our discipline: women and graduate students.

We also note that the importance of a resource is at least partially endogenous to availability. It is difficult to believe that webinars are as important as conferences when webinars are so much rarer. At the same time, webinars may not be offered if they are perceived as undervalued—a catch-22. The experience of the IMC indicates that if this cycle is broken, these resources are utilized at rates comparable to traditional methods (e.g., conference panels). Moreover, participants seem to like what they see: our survey respondents who participated in IMC talks rated them highly (the modal evaluation was “very good”) and were “very likely” to attend the IMC again.Footnote 18

As noted by Acord and Harley (2012, 381), “an understanding of sharing practices should be put in the context of the primary drivers of scholarly communication behavior, which, in competitive institutions, are career self-interest, advancing the field and receiving credit and attribution.” Ponte and Simon (2011, 153) likewise report that blogs and professional social networks are currently considered almost irrelevant when evaluating researchers in the disciplines they surveyed. If we want to encourage the provision of high-quality online tools of scholarship, we may have to be more generous in rewarding these types of activities when they appear on curricula vitae.


To view supplementary material for this article, please visit


This research was supported by the National Science Foundation (Grant #1423825). We thank Teka Miller of the APSA for providing demographic data from the 2015 APSA Membership Survey. We also thank Ahra Wu, Michelle Dion, Cassy Dorff, Jeffrey J. Harden, Dustin Tingley, Christopher Zorn, our discussants and audience members at the 2016 APSA Annual Meeting, and members of the International Relations Working Group at Rice University for helpful feedback on previous drafts and analyses.


2. We repeated all of our analyses on a sample that excludes IMC participants because their solicitation might result in oversampling people who are interested in using online tools and might include scholars from outside of the United States. The results, which do not change any of our fundamental conclusions, are presented in the online appendix.

3. We began with 912 observations. We excluded any respondent who did not answer any questions. We also excluded one respondent who stated that his current position was “Giant Possum.” Among the remaining participants, 14 answered only the first question (i.e., about how often they viewed IMC presentations).

4. Our survey asked respondents to identify as tenure-track assistant professors or as tenured associate/full professors. However, we consolidated these two categories into “tenure-track academics” to maximize the sample size in the new single category.

5. Note that these categories were asked non-exclusively; respondents could indicate as many subfields of interest and expertise as they wished.

6. The results are shown in figure 9 of the online appendix. We obtained results of the APSA survey from Teka Miller, who provided the demographic characteristics of this survey in response to our e-mailed request.

7. The blog is available at

8. A few political scientists from outside of the United States entered the sample because we included IMC participants in our survey. However, our qualitative results are robust to the exclusion of all IMC participants (see the results in the online appendix).

9. The five blog posts are “Making High-Resolution Graphics for Academic Publishing”; “What Courses Do I Need to Prepare for a PhD in Political Science?”; “Building and Maintaining R Packages with devtools and roxygen2”; “A Checklist Manifesto for Peer Review”; and “Student Advice: Should I Go to Graduate School? If So, Where Should I Go?”

10. Table 1 in the online appendix lists the titles of the 10 most-viewed posts between September 2013 and August 2016 alongside the page views that they accumulated during this period.

11. See figures 10, 11, and 12.

12. The model in the figure 6 inset uses a simple linear model, whereas the main model adds squared and cubed terms of webinar attendees to predict YouTube views. As shown in the figure, the predictions of both models are similar when the number of webinar attendees is <45.

13. Note that survey respondents could indicate more than one field of expertise or interest; therefore, these categories are not mutually exclusive.

14. These coefficients are listed in table 2.

15. The gender difference in the importance rating for blogs becomes statistically insignificant in the analysis excluding IMC participants. See table 3 in the online appendix.

16. For figures 7 and 8, the gender variable is set to “male”; all field variables are set to zero; position is set to tenure-track academic; and time teaching is set to 30% whenever the variable in question is not depicted in the graph.

17. They also differ from tenure-track faculty in several other ways—for instance, in rating small-group discussions as more important and journals as less important. We focused on findings that relate specifically to online tools.

18. See figure 13 in the online appendix for more details.



Acord, Sophia Krzys, and Harley, Diane. 2012. “Credit, Time, and Personality: The Human Challenges to Sharing Scholarly Work Using Web 2.0.” New Media and Society 15 (3): 379–97.
Carpenter, Charli, and Drezner, Daniel W. 2010. “International Relations 2.0: The Implications of New Media for an Old Profession.” International Studies Perspectives 11 (3): 255–72.
Dawson, Michael, and Rascoff, Matthew. 2006. “Scholarly Communications in the Economics Discipline: A Report Commissioned by JSTOR.” Available at Accessed August 18, 2016.
Esposito, Antonella. 2013. “Neither Digital or Open. Just Researchers: Views on Digital/Open Scholarship Practices in an Italian University.” First Monday 18 (1): 1–20.
Farley, Robert. 2013. “Complicating the Political Scientist as Blogger.” PS: Political Science & Politics 46 (2): 383–86.
Farrell, Henry, and Drezner, Daniel W. 2008. “The Power and Politics of Blogs.” Public Choice 134 (1/2): 15–30.
Farrell, Henry, and Sides, John. 2010. “Building a Political Science Public Sphere with Blogs.” The Forum 8 (3): Article 10.
GoToWebinar. 2016. “Performance Report.” Available at Accessed August 10, 2016.
Gruzd, Anatoliy, and Goertzen, Melissa. 2013. “Wired Academia: Why Social Science Scholars Are Using Social Media.” In Hawaii International Conference on System Sciences. Available at Accessed August 18, 2016.
Gruzd, Anatoliy, Staves, Kathleen, and Wilk, Amanda. 2012. “Connected Scholars: Examining the Role of Social Media in Research Practices of Faculty Using the UTAUT Model.” Computers in Human Behavior 28 (6): 2340–50.
International Methods Colloquium. 2016. “About the IMC.” Available at!about2/cq8t. Accessed August 10, 2016.
Klunk, Brian. 2012. “Does It Matter If Political Scientists Publish Great Blogs?” Available at Accessed August 14, 2016.
Leeper, Thomas J. 2013. “Making High-Resolution Graphics for Academic Publishing.” The Political Methodologist 21 (1): 2–5. Available at Accessed August 18, 2016.
Lynch, Marc. 2016. “Political Science in Real Time: Engaging the Middle East Policy Public.” Perspectives on Politics 14 (1): 121–31.
Maron, Nancy L., and Kirby Smith, K. 2008. “Current Models of Digital Scholarly Communication: Results of an Investigation Conducted by Ithaka for the Association of Research Libraries.” Association of Research Libraries. Available at Accessed August 16, 2016.
McKenna, Laura. 2007. “‘Getting the Word Out’: Policy Bloggers Use Their Soap Box to Make Change.” Review of Policy Research 24 (3): 209–29.
Nyhan, Brendan, Sides, John, and Tucker, Joshua A. 2015. “APSA as Amplifier: How to Encourage and Promote Public Voices within Political Science.” PS: Political Science & Politics 48 (S1): 90–93.
Papalexi, Marina, Cheng, Siu Yee, Dehe, Benjamin, and Bamford, David. 2014. “The Use of Social Media and its Potential in the Research Lifecycle.” Presented at the European Operations Management Association (EurOMA) Conference, June 20–25.
Ponte, Diego, and Simon, Judith. 2011. “Scholarly Communication 2.0: Exploring Researchers’ Opinions on Web 2.0 for Scientific Knowledge Creation, Evaluation, and Dissemination.” Serials Review 37 (3): 149–56.
Procter, Rob, Williams, Robin, Stewart, James, Poschen, Meik, Snee, Helene, Voss, Alex, and Targhi-Asgari, Marzieh. 2010. “Adoption and Use of Web 2.0 in Scholarly Communications.” Philosophical Transactions of the Royal Society, Series A 368 (1926): 4039–56.
Rowlands, Ian, Nicholas, David, Russell, Bill, Canty, Nicholas, and Watkinson, Anthony. 2011. “Social Media Use in the Research Workflow.” Learned Publishing 24 (3): 183–95.
Sides, John. 2011. “The Political Scientist as Blogger.” PS: Political Science & Politics 44 (2): 267–71.
Sjoberg, Laura. 2013. “Feminist IR 101: Teaching Through Blogs.” International Studies Perspectives 14 (4): 383–93.
Venables, William N., and Ripley, Brian D. 2002. Modern Applied Statistics with S. Fourth edition. New York: Springer. ISBN 0-387-95457-0.
Walt, Stephen M. 2010. “What to Do on Your Summer Vacation.” Foreign Policy (December 10). Available at Accessed August 14, 2016.
2016. “Support: Stats.” Available at Accessed August 10, 2016.
YouTube. 2016a. “YouTube Analytics Basics.” Available at Accessed August 10, 2016.
YouTube. 2016b. “YouTube Creator Studio App Basics.” Available at Accessed August 10, 2016.
Figure 1 Demographic Descriptors of Survey Respondents
Figure 2 Experience Working with Online Tools
Figure 3 Page Views from The Political Methodologist
Figure 4 Sources of New Ideas and Research Findings Rated by Importance
Figure 5 Interest in Video-Based Online Resources for Types of Scholarly Work
Figure 6 International Methods Colloquium, Attendance versus YouTube Views
Figure 7 Model Predicted Importance of Online Tools, by Gender
Figure 8 Model Predicted Importance of Online Tools, by Position

Supplementary material: Esarey and Wood supplementary material 1 (Online Appendix), PDF, 244.5 KB.