Learning Progressions, Student Thinking, and Teacher Reasoning: Applying Scientific Model Criteria to Evaluate a Learning Progression for Force and Motion
Alicia Alonzo, Michigan State University, email@example.com
“Learning progressions” (LPs) have rapidly gained popularity in the science education community, with the promise of informing standards (e.g., Next Generation Science Standards), curriculum, instruction, and assessment. As “descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as [students] learn about and investigate a topic” (National Research Council, 2007, p. 219), LPs may also inform work in undergraduate courses – as students develop more advanced knowledge and practices. I am particularly interested in the ways that LPs may support and inform classroom instruction that is responsive to students’ learning needs.
In this talk, I share an argument that I have been developing with my colleague Andy Elby (at the University of Maryland). Like others, we suggest that LPs, as models of student thinking, should be evaluated using criteria applied to other scientific models. Because fit to empirical data is just one of several such criteria, we also consider utility (the extent to which the model serves its intended purpose—in this case, supporting instructional decision-making) and generativity (the extent to which the model gives rise to new understandings and/or conceptualizations). Using an LP for force and motion (at the level of high school physics), I first describe our evaluation of whether LPs accurately capture the nature of students’ thinking. I contrast assumptions of the “LP perspective” with other views of student thinking (e.g., knowledge-in-pieces; diSessa, 1993) and provide evidence (both theoretical and empirical) that LP assumptions may not be adequate for capturing student thinking. Next, I describe an empirical study used to address the remaining two criteria. In this study, we were interested in directly addressing the argument that even if LPs do not match student thinking exactly, they may still be useful for classroom applications. In particular, using think-aloud protocols with six high school physics teachers, we explored the utility of LPs for supporting formative assessment practices and for generating knowledge supportive of instructional decision-making more broadly. I conclude with implications of this work for further LP research and teacher professional development.