Weighting techniques are employed to generalize results from survey experiments to populations of theoretical and substantive interest. Although weighting is often viewed as a second-order methodological issue, these adjustment methods invoke untestable assumptions about the nature of sample selection and potential heterogeneity in the treatment effect. Therefore, although weighting is a useful technique in estimating population quantities, it can introduce bias and also be used as a researcher degree of freedom. We review survey experiments published in three major journals from 2000–2015 and find that there are no standard operating procedures for weighting survey experiments. We argue that all survey experiments should report the sample average treatment effect (SATE). Researchers seeking to generalize to a broader population can weight to estimate the population average treatment effect (PATE), but should discuss the construction and application of weights in a detailed and transparent manner given the possibility that weighting can introduce bias.
Political science researchers have flexibility in how to analyze data, how to report data, and whether to report on data. A review of examples of reporting flexibility from the race and sex discrimination literature illustrates how research design choices can influence estimates and inferences. This reporting flexibility—coupled with the political imbalance among political scientists—creates the potential for political bias in reported political science estimates. These biases can be reduced or eliminated through preregistration and preacceptance, with researchers committing to a research design before completing data collection. Removing the potential for reporting flexibility can raise the credibility of political science research.
Political science graduate students need to develop strong skills in drafting empirical research manuscripts. Yet, many graduate student manuscripts contain similar shortcomings, which require student peers, faculty advisors, and journal referees to produce the same comments for multiple manuscripts. This article compiles common comments on empirical research manuscripts as a reference to help students revise their work before submitting it for review. Addressing these recurring problems in advance allows reviewers to focus on a manuscript's more substantive elements, producing better manuscripts that are more likely to be published and to contribute to knowledge about political phenomena.
Large majorities in nearly every country support democracy, according to studies of cross-national surveys. But many of these reports have treated as missing data those respondents who did not provide a substantive response when asked whether democracy is a suitable regime type for their country, which has led to substantial overestimates of expressed support for democracy in some countries. This article discusses the consequences of excluding such nonsubstantive responses and offers suggestions to improve the study of popular support for democracy.
Many political science publications advance knowledge using previously collected data and an innovation or two in theory or methods. To encourage students embarking on a seminar paper project, I review some of these publications to illustrate that the understanding of political phenomena often advances in incremental steps.