Session Information
01 SES 03 B, Classroom Practice
Paper Session
Contribution
Student questions have multiple benefits for teaching and learning (Chin & Osborne, 2008). However, for student questioning to be effective for learning, teachers must ensure that students actually find answers. A major obstacle for teachers in guiding questions toward answers is that many initial student questions are unfocused, poorly investigable, and therefore difficult to answer (Baranova, 2017). Guiding unclear initial questions toward more focused, investigable questions is a cognitively challenging and time-consuming process (Herranen & Aksela, 2019). Teachers would like more insight into how to guide learning questions to answers in the best possible way within the available time and resources (Kaya, 2018).
Our assumption was that teachers would first need a clear view of the aspired quality of a question in order to identify the potential quality in initial student questions and to be able to use appropriate instructional strategies to foster that quality. However, we found that the participating teachers in this study initially found it difficult to identify the potential quality of student questions. They needed to develop a deeper understanding of which types of student questions and research activities would lead to hands-on research that is feasible for students and which would lead to deeper understanding of the core concepts of the subject under study. Smith et al. (2013) showed that teachers who guide the process of student questioning need to develop Pedagogical Process Knowledge (PPK). PPK refers to teachers’ ability to diagnose the current state of students’ question quality and to choose and employ the most effective instructional strategies to foster students’ questioning process.
Smith et al. (2013) found that teachers could develop PPK in a community of practice by using conceptual models (cf. Bereiter, 2005). Unfortunately, a conceptual model for developing PPK about fostering question quality was not yet available. Therefore, we developed the conceptual model of Multiple Hypothetical Question-Related Learning Trajectories (MHQLTs) for this study. MHQLTs are based on the Hypothetical Learning Trajectories (HLTs) of Simon and Tzur (2004), a conceptual model that helps teachers explore possible pathways learners might take to reach a learning outcome. Simon and Tzur found that using HLTs to explore potential learning pathways helped teachers anticipate and use effective instructional strategies to support student learning. The HLT approach seemed promising for guiding student questioning, because it might help teachers think about, anticipate, and find effective ways to foster the quality of student questioning for hands-on research. The essential functionality of the MHQLTs model is: a) to explore the different learning trajectories of various question types based on research activities that are feasible for students, and b) to understand the patterns in the relationship between the formulation of different question types, types of hands-on research activities, and types of learning outcomes. To make working with the MHQLTs model more accessible for teachers, we visualized it and introduced it to them as the “Question Compass”.
The aim of this study was to determine the value of the Question Compass for teachers’ professional learning of effective diagnostic and instructional strategies (PPK) to guide the quality of student questions. To determine if and how the Question Compass contributed to teachers’ professional learning, the value of the conceptual model was operationalized with Odenbaugh’s (2005) and Alonzo and Elby’s (2019) criteria for the quality of conceptual tools: generativity, flexibility, and robustness. The main research question of the study was therefore: In what ways was working with the Question Compass as a conceptual model perceived as generative, flexible, and robust for teacher learning about effective diagnostic and instructional strategies to support students’ question quality?
Method
A multiple case study methodology was applied because it is particularly instrumental for evaluating phenomena in real-life contexts (Yazan, 2015). A broad sample of teachers from primary education was included because maximum variation sampling enables a comprehensive description of the phenomenon (Patton, 2015). To explore the value of the Question Compass for guiding student question quality, 32 teachers from six Dutch primary schools participated in four design teams, which worked independently in four iterative cycles of design, implementation, evaluation and reflection, and redesign over a period of two school years. The focus in the design teams was on the professional learning of the teachers, supporting them in developing their own ideas and concrete plans for guiding student question quality. At the start of each design cycle, the researcher first (re)introduced the Question Compass and the basic ideas underlying the conceptual tool. After this introduction, teachers used the Question Compass to collaboratively design professional experiments for topics of their own choosing, by brainstorming about possible desirable student questions, discussing how these types of questions might be prompted, and what kinds of guidance students would need to answer them. Then, teachers individually tested their lesson plans in practice. Upon completion of these professional experiments, teachers evaluated their experiences collaboratively in their design teams. The primary data source consisted of 36 hours of transcribed audio recordings of all sessions during the three completed design cycles and the worksheets that teachers used during these sessions. To triangulate teachers’ self-reports about the professional experiments, we made classroom observations and collected video recordings of classroom learning activities.
The basis for our analysis was the Interconnected Model of Teachers’ Professional Growth (IMTPG) of Clarke and Hollingsworth (2002) (Figure 3). The IMTPG was selected because it acknowledges both the complexity of teacher change and the importance of teacher agency in professional learning (Roehrig, 2023). We developed a coding scheme based on the four change sequences in the IMTPG model, which can be related to: generativity (CS1), flexibility (CS2), robustness: lessons learned (CS3), and robustness: salient outcomes (CS4), as shown in Figure 3. To ensure the quality of the coding scheme, two coders independently tested it on 10% of the data. The interrater agreement was 85%. Differences were discussed and resolved, further refining and clarifying the coding scheme. Then the rest of the data was coded.
Expected Outcomes
To determine the value of the Question Compass for teachers as a conceptual tool to foster student question quality, three criteria were identified: generativity, flexibility, and robustness. For generativity, findings show that the tool helped teachers develop conceptions of good inquiry learning questions by relating quality to feasible inquiry activities. It also made teachers more aware of how to develop epistemic agency by examining the relationship between question type, research method, and learning outcomes. Moreover, the tool was considered to support a more purposeful design of teacher guidance of student questions. For flexibility, findings show that teachers: 1) used the Question Compass in various explicit ways to diagnose question quality, 2) used the Question Compass in multiple explicit ways in their instructional strategies to support generating, formulating, and answering student questions, 3) were able to use the Question Compass to develop diagnostic and instructional strategies that fitted their own personal and their classroom’s needs, and 4) developed flexibility over time, leading them to combine and vary their instructional strategies as they deemed most appropriate. For robustness, findings show that: a) recognizing and categorizing question types supported diagnosing question quality, b) prompting students with purposely chosen activities and materials and modeling question types was effective for generating questions, c) anticipating the research methods associated with question types fostered support of the answering process, and d) discussing question types with students was effective for fostering learning outcomes. We conclude that the findings support our assumption that the Question Compass as a conceptual tool supported the collaborative professional learning of teachers when designing, implementing, and evaluating professional experiments, and in this way fostered teachers’ guidance of student question quality.
References
Alonzo, A. C., & Elby, A. (2019). Beyond empirical adequacy: Learning progressions as models and their value for teachers. Cognition and Instruction, 37(1), 1–37.
Baranova, E. A. (2017). Question-asking behavior as a form of cognitive activity in primary school children. Psychology in Russia, 10(1), 269.
Bereiter, C. (2005). Education and mind in the knowledge age. New York, NY: Routledge.
Chin, C., & Osborne, J. (2008). Students’ questions: A potential resource for teaching and learning science. Studies in Science Education, 44(1), 1–39.
Clarke, D., & Hollingsworth, H. (2002). Elaborating a model of teacher professional growth. Teaching and Teacher Education, 18(8), 947–967. https://doi.org/10.1016/S0742-051X(02)00053-7
Herranen, J., & Aksela, M. (2019). Student-question-based inquiry in science education. Studies in Science Education, 55(1), 1–36.
Kaya, S. (2018). Improving the quality of student questions in primary science classrooms. Journal of Baltic Science Education, 17(5), 800–811.
Odenbaugh, J. (2005). Idealized, inaccurate, but successful: A pragmatic approach to evaluating models in theoretical ecology. Biology and Philosophy, 20, 231–255.
Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). Thousand Oaks, CA: Sage.
Roehrig, G. (2023). Research on teacher professional development programs in science. In Handbook of Research on Science Education (pp. 1197–1220). Routledge.
Simon, M. A., & Tzur, R. (2004). Explicating the role of mathematical tasks in conceptual learning: An elaboration of the hypothetical learning trajectory. Mathematical Thinking and Learning, 6(2), 91–104.
Smith, C., Blake, A., Fearghal, K., Gray, P., & McKie, M. (2013). Adding pedagogical process knowledge to pedagogical content knowledge: Teachers’ professional learning and theories of practice in science education. Educational Research eJournal, 2(2), 132–159.
Yazan, B. (2015). Three approaches to case study methods in education: Yin, Merriam, and Stake. The Qualitative Report, 20(2), 134–152.