Online panels have become an important resource for research in political science, but the compensation offered to panelists incentivizes them to become “survey professionals,” raising concerns about data quality. We provide evidence on survey professionalism using three US samples of subjects who donated their browsing data, recruited via Lucid, YouGov, and Facebook (total
$n = 3,886$). Survey professionalism is common but varies across samples: by our most conservative estimate, we find 1.7% of respondents on Facebook, 7.6% on YouGov, and 34.7% on Lucid to be professionals (under the assumption that professionals are as likely as non-professionals to donate data after conditioning on observable demographics available from all online survey takers). However, evidence that professionals lower data quality is limited: they do not differ systematically from non-professionals demographically or politically, and they do not exhibit more response instability. They are, however, somewhat more likely to speed, straightline, and attempt to take questionnaires repeatedly. To address potential selection issues in the donation of browsing data, we present sensitivity analyses that establish lower bounds for survey professionalism. While concerns about professionalism are warranted, we conclude that survey professionals do not, by and large, distort inferences drawn from research based on online panels.