Session Information
27 SES 11 B, Diversity and the Science and Mathematics Classroom
Paper Session
Contribution
This paper reports on an exploratory secondary analysis of classroom observations that investigates patterns of instructional features. These features are explored within and between lesson segments to reveal patterns of instruction, which could provide further knowledge on didactical aspects of teaching and on the internal structure of lessons. Insights into these patterns can serve as grounds for further exploration, both between specific subject contexts and across different school subjects. The paper aims to answer the following research question: What is the relationship between instructional features within and between different lesson segments?
Observation systems focus on specific dimensions of teaching to deepen understanding and improve teaching (Bell et al., 2019). Although wording, conceptualisation and instrumentation differ between frameworks, common dimensions include aspects such as instructional clarity, cognitive activation, discourse features and supportive climate (Klette et al., 2017). Cognitive activation, a concept used in several frameworks (Bell et al., 2019; Klette et al., 2017; Praetorius & Charalambous, 2018), includes practices that “encourage students to engage in higher-level thinking” (Lipowsky et al., 2009, p. 529) by utilizing appropriately challenging tasks, activating previous knowledge, and expecting students to explain and challenge their reasoning (Praetorius et al., 2014). Instructional clarity relates to the explanation of subject matter and includes aspects of modelling strategies and ways of working (Bell et al., 2019; Klette et al., 2017; Praetorius & Charalambous, 2018). This aligns with the way Cohen (2018, p. 324) conceptualises explicit instruction as practices that “makes learning processes overt and clear with detailed models, strategies, and examples of the skills students are expected to demonstrate”.
The dimensions of cognitive activation and instructional clarity (or degree of explicit instruction) are of special interest for this paper, as they have been identified as important aspects of teaching quality, form part of core processes within and across school subjects, and are salient in several frameworks. Comparative as well as subject-specific didactic research has identified commonalities and differences within and between subjects regarding these dimensions (Cohen, 2018; Praetorius et al., 2014; Tengberg et al., 2021). For cognitive activation, Praetorius et al. (2014) identified challenges in measurement. They underline the importance of further understanding, since cognitive activation might differ depending on the stage of the instructional sequence, for example whether a segment lies at the start or the end of a lesson. Thus, it is important to further explore how instructional features are related within lessons.
The Linking Instruction and Student Achievement (LISA) project is based on the four dimensions mentioned above as a perspective on instructional quality (Klette et al., 2017). In the LISA project, instructional features in classrooms were observed and rated following the Protocol for Language Arts Teaching Observation (PLATO). PLATO revolves around four central domains, which are divided into elements (see Grossman et al., 2014, 2015), of which the following are of interest for this paper: Modelling (MOD), Strategy Use and Instruction (SUI), Feedback (FB), Intellectual Challenge (IC), Classroom Discourse (CD), Representations of Content (ROC), Connections to Prior Knowledge (CPK) and Purpose (PUR). Additionally, the main instructional format was observed, distinguishing between whole-class instruction, group work, pair work and individual seat work. Cognitive activation is mainly related to IC and CD, whereas instructional clarity is related to MOD, SUI and ROC.
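As an illustration, the sketch below shows one hypothetical way such ratings could be organised for analysis: one row per observed unit (here, a lesson segment, as described in the Method section below), one column per PLATO element rated 1–4, and one column for the main instructional format. All variable names and values are invented for illustration and are not taken from the LISA data.

```r
# Hypothetical layout of segment-level observation data (illustration only):
# one row per lesson segment, PLATO elements rated 1-4, plus the main
# instructional format. Names and values are invented, not LISA data.
segments <- data.frame(
  classroom = c("C01", "C01", "C02"),
  segment   = c(1, 2, 1),
  MOD = c(3, 1, 2), SUI = c(4, 2, 2), FB  = c(2, 1, 1), IC = c(3, 1, 2),
  CD  = c(3, 1, 1), ROC = c(4, 2, 2), CPK = c(2, 1, 1), PUR = c(3, 2, 2),
  format = c("whole_class", "individual_work", "group_work")
)
str(segments)
```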
Method
Latent class analysis (LCA) provides a probabilistic statistical approach to identifying subgroups, most often called classes, in observed data. The classes represent typologies that can help to understand similarities and differences across observations and variables (Weller et al., 2020). The latent classes stem from patterns in the observed data, and class membership is estimated and given a probability (Sinha et al., 2021). This provides a novel approach to observation data, which could identify groups of teaching segments and their characteristic instructional features. Thus, different types of segments can be characterised, and aspects of cognitive activation and instructional clarity can be explored together with other instructional features.

The data stem from the LISA project, and a subsample from the Swedish cohort was selected for this analysis. In this sample, 127 mathematics lessons from 16 schools and 31 grade 7 classrooms were videotaped. Each lesson was divided into 15-minute segments, giving a total of 403 segments and an average of 13 segments per classroom. Each segment was coded from 1 to 4 for each of the PLATO elements (see Tengberg et al., 2021). For this study, all analyses are performed with R (R Core Team, 2022) and the poLCA package (Linzer & Lewis, 2011), following the method outlined by Oberski (2016) and Sinha et al. (2021). Code levels with fewer than 10% of observations are collapsed into the corresponding low (1–2) or high (3–4) end of the scale for that specific PLATO element. The analysis is run for solutions with one up to, and including, five classes. Each model is estimated 500 times to find the global maximum of the log-likelihood and avoid local maxima. Using the Bayesian Information Criterion (BIC) and the Akaike Information Criterion (AIC), two solutions are identified for further inspection, with 2 and 3 classes respectively. The 3-class solution is pursued further, as it offers a separation of characteristic PLATO elements within and between the classes, and is presented in the results section.
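A minimal sketch of this analysis pipeline is given below, assuming a data frame of segment-level ratings in the layout sketched earlier (the three-row toy example only illustrates the layout; estimating these models would require the full set of 403 coded segments). The collapsing rule and model comparison follow the description in this section, but the helper function, variable names, and the inclusion of instructional format as a manifest variable are illustrative assumptions and do not reproduce the project's actual scripts.

```r
# Sketch of the LCA procedure described above (illustrative only).
# Assumes a data frame `segments` with PLATO elements coded 1-4 and a
# categorical instructional-format variable, one row per segment.
library(poLCA)

plato <- c("MOD", "SUI", "FB", "IC", "CD", "ROC", "CPK", "PUR")

# Collapse code levels observed in fewer than 10% of segments towards the
# corresponding low (1-2) or high (3-4) end of the scale, then recode to
# consecutive integers starting at 1, as poLCA expects.
collapse_and_recode <- function(x, threshold = 0.10) {
  p <- prop.table(table(factor(x, levels = 1:4)))
  if (p["1"] < threshold) x[x == 1] <- 2
  if (p["4"] < threshold) x[x == 4] <- 3
  if (p["2"] < threshold) x[x == 2] <- 1
  if (p["3"] < threshold) x[x == 3] <- 4
  as.integer(factor(x))
}

segments[plato] <- lapply(segments[plato], collapse_and_recode)
segments$format <- as.integer(factor(segments$format))  # categorical codes

# Fit 1- to 5-class solutions, each estimated 500 times from random starting
# values to find the global maximum log-likelihood, and compare AIC/BIC.
f <- as.formula(paste0("cbind(", paste(c(plato, "format"), collapse = ", "),
                       ") ~ 1"))
fits <- lapply(1:5, function(k) {
  poLCA(f, data = segments, nclass = k, nrep = 500, verbose = FALSE)
})
data.frame(classes = 1:5,
           AIC = sapply(fits, `[[`, "aic"),
           BIC = sapply(fits, `[[`, "bic"))

# Inspect the 3-class solution: estimated class proportions and the
# probability of each rating per element, per class.
m3 <- fits[[3]]
m3$P
m3$probs
```

Lower AIC/BIC values indicate better-fitting candidate solutions; as described above, the final choice between the 2- and 3-class candidates additionally rests on the interpretability of the class-specific PLATO profiles.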
Expected Outcomes
In conclusion, three different classes of lesson segments and their corresponding instructional features were identified. The first class of segments (proportion = 0.363) exhibited a high probability of instructional elements being rated at the high end, including ROC, MOD and SUI. Additionally, this class had a higher probability than the other classes of cognitive activation features, mainly related to CD, being rated at the higher end, with whole-class instruction as a dominant characteristic. The second class (proportion = 0.325) was characterized by a high probability of individual seat work, with no clear distinction between low and high ratings in IC. However, this class exhibited a high probability of high ratings in ROC and SUI, although lower than in the first class. This class also had the highest probability of CD being rated as 1, as well as a high probability of MOD being rated as 2, which could indicate a situation where the teacher only addresses a few students. Finally, the third class (proportion = 0.312) showed a high probability of low-end ratings in MOD and SUI, as well as a high probability of low ratings in ROC. Characteristic of this class of segments was that the teacher was not actively employing instructional features, as identified by PLATO. The elements CPK, FB and PUR did not exhibit distinctive patterns in terms of probabilities of low- or high-end ratings across the classes. This suggests that these elements may not be related to the instructional patterns of the three identified classes, but they may warrant further exploration in future studies. The results of the study provide valuable insights into instructional patterns within lessons, which can be extended to other contexts and subjects.
References
Bell, C. A., Dobbelaer, M. J., Klette, K., & Visscher, A. (2019). Qualities of classroom observation systems. School Effectiveness and School Improvement, 30(1), 3–29. https://doi.org/10.1080/09243453.2018.1539014
Cohen, J. (2018). Practices that cross disciplines?: Revisiting explicit instruction in elementary mathematics and English language arts. Teaching and Teacher Education, 69, 324–335. https://doi.org/10.1016/j.tate.2017.10.021
Grossman, P., Cohen, J., Ronfeldt, M., & Brown, L. (2014). The test matters: The relationship between classroom observation scores and teacher value added on multiple types of assessment. Educational Researcher, 43(6), 293–303. https://doi.org/10.3102/0013189X14544542
Grossman, P., Loeb, S., Cohen, J., & Wyckoff, J. (2015). Measure for measure: The relationship between measures of instructional practice in middle school English language arts and teachers’ value-added scores. American Journal of Education. https://doi.org/10.1086/669901
Klette, K., Blikstad-Balas, M., & Roe, A. (2017). Linking instruction and student achievement: A research design for a new generation of classroom studies. Acta Didactica Norge, 11(3), Article 3. https://doi.org/10.5617/adno.4729
Linzer, D. A., & Lewis, J. B. (2011). poLCA: An R package for polytomous variable latent class analysis. Journal of Statistical Software, 42(10). https://doi.org/10.18637/jss.v042.i10
Lipowsky, F., Rakoczy, K., Pauli, C., Drollinger-Vetter, B., Klieme, E., & Reusser, K. (2009). Quality of geometry instruction and its short-term impact on students’ understanding of the Pythagorean Theorem. Learning and Instruction, 19(6), 527–537. https://doi.org/10.1016/j.learninstruc.2008.11.001
Oberski, D. (2016). Mixture models: Latent profile and latent class analysis. In J. Robertson & M. Kaptein (Eds.), Modern Statistical Methods for HCI (pp. 275–287). Springer International Publishing. https://doi.org/10.1007/978-3-319-26633-6_12
Praetorius, A.-K., & Charalambous, C. Y. (2018). Classroom observation frameworks for studying instructional quality: Looking back and looking forward. ZDM, 50(3), 535–553. https://doi.org/10.1007/s11858-018-0946-0
Praetorius, A.-K., Pauli, C., Reusser, K., Rakoczy, K., & Klieme, E. (2014). One lesson is all you need? Stability of instructional quality across lessons. Learning and Instruction, 31, 2–12. https://doi.org/10.1016/j.learninstruc.2013.12.002
R Core Team. (2022). R: A language and environment for statistical computing [Manual]. https://www.R-project.org/
Sinha, P., Calfee, C. S., & Delucchi, K. L. (2021). Practitioner’s guide to latent class analysis: Methodological considerations and common pitfalls. Critical Care Medicine, 49(1), e63–e79. https://doi.org/10.1097/CCM.0000000000004710
Tengberg, M., van Bommel, J., Nilsberth, M., Walkert, M., & Nissen, A. (2021). The quality of instruction in Swedish lower secondary language arts and mathematics. Scandinavian Journal of Educational Research. https://doi.org/10.1080/00313831.2021.1910564
Weller, B. E., Bowen, N. K., & Faubert, S. J. (2020). Latent class analysis: A guide to best practice. Journal of Black Psychology, 46(4), 287–311. https://doi.org/10.1177/0095798420930932