Justin Grimmer presents "A Statistical Framework to Engage the Problem of Disengaged Survey Respondents"

Presentation Date: Wednesday, January 25, 2023
Researchers in academia, government, and industry increasingly rely on cheaper online surveys to measure public opinion. However, along with their lower cost, online surveys carry an increased risk of bias from inattentive or disengaged respondents entering the sample, a risk that remains even after survey firms and researchers apply well-developed filters and attention checks to exclude these respondents. In this paper, we introduce a statistical framework for surveys with disengaged respondents and tools to address the resulting bias. First, we develop a partial identification approach that clarifies the extent to which relevant estimands can be identified in the presence of disengaged respondents. These bounds apply regardless of how well attention checks uncover disengaged respondents. Second, we show that simply dropping respondents flagged as disengaged or inattentive can lead to selection bias if the scientific question concerns attitudes or beliefs in a general target population (e.g., adults in the US). To correct for this, we introduce partial and point identification approaches that adjust for this selection bias. We apply our estimators to study the prevalence of extreme anti-democratic attitudes and find that, despite alarming topline results, the survey data are consistent with there being effectively no respondents who support these views.
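To give a feel for the partial identification idea, here is a minimal sketch of Manski-style worst-case bounds, not the paper's actual estimator: if a known (or assumed) share of respondents is disengaged and their answers are uninformative, the observed proportion only brackets the proportion among engaged respondents. The function name, the parameter `p_diseng`, and the example numbers are all hypothetical.

```python
def worst_case_bounds(pi_obs, p_diseng):
    """Worst-case bounds on the proportion endorsing an item among
    engaged respondents.

    The observed proportion decomposes as
        pi_obs = (1 - p) * theta_engaged + p * theta_disengaged,
    where p = p_diseng and theta_disengaged is unknown in [0, 1].
    Solving for theta_engaged at the two extremes of theta_disengaged
    and clipping to [0, 1] gives the identified set.
    """
    lo = (pi_obs - p_diseng) / (1 - p_diseng)  # disengaged all answered "yes"
    hi = pi_obs / (1 - p_diseng)               # disengaged all answered "no"
    return max(0.0, lo), min(1.0, hi)

# Hypothetical illustration: 10% of the sample appears to endorse an
# extreme item, but 12% of respondents are flagged as disengaged.
lo, hi = worst_case_bounds(0.10, 0.12)
print(lo, round(hi, 3))  # lower bound is 0: consistent with no true support
```

With these (made-up) numbers the identified set includes zero, illustrating how an alarming topline proportion can be consistent with effectively no engaged respondents holding the view.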