Misinformation presents a significant societal problem. To measure individuals’ susceptibility to misinformation and study its predictors, researchers have used a broad variety of ad-hoc item sets, scales, question framings, and response modes. Because of this variety, it remains unknown whether results from different studies can be compared (e.g., in meta-analyses). In this preregistered study (US sample; N = 2,622), we compare five commonly used question framings (eliciting perceived headline accuracy, manipulativeness, reliability, trustworthiness, and whether a headline is real or fake) and three response modes (binary, 6-point, and 7-point scales), using the psychometrically validated Misinformation Susceptibility Test (MIST). We test 1) whether different question framings and response modes yield similar responses for the same item set, 2) whether people’s confidence in their primary judgments is affected by question framings and response modes, and 3) which key psychological factors (myside bias, political partisanship, cognitive reflection, and numeracy skills) best predict misinformation susceptibility across assessment methods. Different response modes and question framings yield similar (but not identical) responses for both primary ratings and confidence judgments. We also find a similar nomological net across conditions, suggesting cross-study comparability. Finally, myside bias and political conservatism were strongly positively correlated with misinformation susceptibility, whereas numeracy skills and especially cognitive reflection were less important (although we note potential ceiling effects for numeracy). We thus find more support for an “integrative” account than a “classical reasoning” account of misinformation belief.