Our efforts in developing and using misconception-based, multiple-choice items and assessments grew out of the need to probe students' understanding of science concepts, and any changes in those understandings, during the development of the Project STAR high school astronomy curriculum, SED's first project. The 47-item STAR Questionnaire was developed as a pre-/post-test instrument to probe the existing conceptual knowledge, and any changes in that knowledge, of the grades 9-12 students who participated in the curriculum's development in experimental and control classrooms nationwide. When the instrument was created, several published research studies had already identified the diverse ways in which students' understandings of science concepts differ from those held by scientists¹. Because these alternative ideas exert a documented, strong attraction on learners in the process of constructing accurate scientific understandings, we incorporated them as "distractor" choices in the questionnaire's items. The resulting data proved very useful in identifying the components of the STAR curriculum that had the greatest effect in helping students develop more scientifically accurate ideas. Similar pre-/post-tests have since been a standard component of all of SED's curriculum, professional development, and informal learning programs.

In 1998, SED chair Philip Sadler, writing about our research using these types of items², coined the term "distractor-driven multiple-choice" (DDMC) items to emphasize the powerful attraction such choices exert on learners whose conceptual understandings of science are not fully developed. Since then, we have used "DDMC" as a descriptor for all of our assessments.

Since 2001, SED has worked to build a battery of DDMC-based assessment instruments to aid in evaluating teacher professional development programs and conducting research in the nation's pre-college science classrooms. We were motivated by a desire to contribute to a corpus of knowledge concerning the knowledge and gains exhibited by both teachers and students of science³, and the kinds of activities and pedagogies that promote the greatest learning. We found that evaluators and researchers were often stymied by a lack of assessment instruments suitable for comparative studies. Drawing on a test bank of more than 1,800 items based on the now extensive peer-reviewed literature on student misconceptions, we have constructed 24 valid and reliable assessments that gauge mastery of the content in the National Research Council's National Science Education Standards. We, and other researchers and evaluators, have used these DDMC assessments to measure changes in teachers' subject matter knowledge (SMK) and in the pedagogical content knowledge (PCK) that relates to teachers' understanding of student misconceptions. Grade-band coverage in the content fields of physical science and of earth and space science is now complete (i.e., K-4, 5-8, and 9-12). Life science is covered at the K-4 and 5-8 grade bands; development of 9-12 items linked to the Next Generation Science Standards began in late 2013.


¹Researchers use different terms to label students' emerging ideas, one of the most common being "misconception," although this term has limited meaning. The term is used here as an identifier for all conceptual understandings of science that differ from those held by scientists.

²Sadler, Philip M. (1998). Psychometric models of student conceptions in science: Reconciling qualitative studies and distractor-driven assessment instruments. Journal of Research in Science Teaching, 35(3), 265-296. This article evaluates the psychometric properties of multiple-choice tests that characterize the conceptual frameworks of learners, and it won the Journal of Research in Science Teaching Award for 1999.

³Since all of our test items are written at the reading level of the grades they assess, the assessments can be used with both students and teachers.