Session Information
04 SES 12 F, Looking To The Future: Inclusion As Conjecture And Prediction
Paper Session
Contribution
The use of standardized testing approaches for the assessment of cognitive abilities is critically questioned in the context of special education. The criticism centers on doubts about whether recommendations for pedagogical practice can be derived from test results, and about the validity of test scores for the groups of learners on whom special education focuses. Dynamic assessment is discussed as a meaningful extension of diagnostic practice in special education and as a way to overcome these two main lines of critique. Dynamic assessment is described as the combination of measurement and intervention in the diagnostic process (Guthke, Beckmann, & Wiedl, 2003; Sternberg & Grigorenko, 2002). In contrast to classic static testing, in dynamic assessment the tester intervenes in the problem-solving process in order to support the student. The goal of dynamic assessment is thus to counteract the lack of test fairness outlined above: it aims to shed light not on learning outcomes but on learning potential and on the amount of support needed to realize that potential (Bosma, Hessels, & Resing, 2012; Bosma & Resing, 2012; Elliott, 2003; Guthke et al., 2003; Haywood & Lidz, 2007). Despite positive evidence on the application of dynamic assessment, to date there are only a few German-language procedures that capture cognitive abilities in a dynamic assessment context. [Anonymous] (2016, 2018a) developed a dynamic assessment approach focusing on facets of concrete-operational thinking. Concrete-operational thinking represents an important step in the development of thinking (Piaget, 2015). It is characterized by the mastery of so-called concrete-operational concepts (e.g., conservation, classification, or seriation tasks). Mastery of these concepts is associated with a range of learning outcomes, such as mathematical and reading achievement (Cartwright, 2002; Wubbena, 2013). The assessment of concrete-operational thinking therefore seems relevant in educational contexts. So far, however, evidence on the usefulness of the dynamic assessment of concrete-operational thinking in academic contexts is lacking. In particular, little is known about whether dynamic measures can predict school-related outcomes. Such predictive validity seems particularly relevant to the discussion of what dynamic assessment can contribute to educational assessment. In light of this research gap, the following research questions arise:
Question 1: Is there a correlation between dynamic test measures (number of hints, test score after the intervention) and measures of academic learning (amount of support in school and academic achievement)? Assuming that dynamic test measures indicate the extent to which a person can benefit from learning stimuli, these measures are expected to predict the amount of support needed in school as well as academic achievement.
Question 2: Do dynamic test measures (number of hints, test score after the intervention) explain additional variance with regard to the amount of support needed in school and academic achievement, compared to static measures of cognitive abilities? Assuming that dynamic test measures relate to a learning process and thus match the character of academic learning more closely, they can be expected to explain additional variance in measures of academic learning beyond static measures of cognitive abilities.
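For clarity, the notion of "additional variance" can be made explicit with the standard definition of incremental explained variance in a hierarchical regression (this formulation is added here for illustration and is not part of the original abstract):

```latex
\Delta R^2_{k} = R^2_{\text{Step } k} - R^2_{\text{Step } k-1}
```

A positive ΔR² at the steps in which the dynamic measures enter the model would indicate incremental validity over the static measure.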
Method
To address these questions, 40 first- and second-grade students completed both a static test (TEKO) and a dynamic test (DKTK) of concrete-operational thinking. To examine the relationship between measures of dynamic assessment and the amount of support in school as well as academic performance (RQ1), Spearman correlations were calculated. Two measures served as indicators of the dynamic assessment procedure: the number of hints and the test score after the intervention. To examine the second research question on the incremental validity of these dynamic assessment measures (RQ2), hierarchical regression models were used. Separate models were calculated for each cognitive operation (conservation, classification, and sequences), because the individual operations are discussed as independent constructs. In total, 3 × 3 = 9 regression models were calculated (3 cognitive operations × 3 criteria). In the first step, the respective learning measure (math grade, German grade, or amount of support) was defined as the criterion and the static TEKO score as the predictor. In the second and third steps, the dynamic measures were added to the regression as further predictors (Step 2: number of hints; Step 3: test score after the intervention). The adjusted R² is reported to describe the explained variance, and ΔR² is reported as an indicator of the incremental variance explained by each model extension. To estimate the effects of the individual dynamic test measures (number of hints, test score after the intervention), standardized regression coefficients were calculated.
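Purely as an illustration, the analysis pipeline described above could look as follows in Python, using scipy and statsmodels. All file and column names (dynamic_assessment.csv, hints, post_score, teko_static, support, math_grade, german_grade) are hypothetical placeholders; the original study does not report its software or variable names, so this is a minimal sketch rather than the authors' actual code.

```python
import pandas as pd
import statsmodels.api as sm
from scipy.stats import spearmanr

# Hypothetical data set; column names are assumptions, not from the paper.
df = pd.read_csv("dynamic_assessment.csv")

dynamic = ["hints", "post_score"]              # number of hints, score after intervention
criteria = ["support", "math_grade", "german_grade"]

# RQ1: Spearman rank correlations between dynamic measures and learning criteria.
for dyn in dynamic:
    for crit in criteria:
        rho, p = spearmanr(df[dyn], df[crit])
        print(f"{dyn} vs. {crit}: rho = {rho:.2f}, p = {p:.3f}")

# RQ2: hierarchical regression, shown for one operation and one criterion;
# in the study this is repeated for each of the 3 operations x 3 criteria.
def fit_model(predictors, criterion):
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[criterion], X).fit()

steps = [
    ["teko_static"],                           # Step 1: static TEKO score
    ["teko_static", "hints"],                  # Step 2: + number of hints
    ["teko_static", "hints", "post_score"],    # Step 3: + post-intervention score
]

prev_r2 = 0.0
for i, predictors in enumerate(steps, start=1):
    model = fit_model(predictors, "math_grade")
    delta_r2 = model.rsquared - prev_r2        # incremental variance explained
    print(f"Step {i}: adj. R2 = {model.rsquared_adj:.3f}, Delta-R2 = {delta_r2:.3f}")
    prev_r2 = model.rsquared

# Standardized regression coefficients can be obtained by z-standardizing all
# variables first, e.g. df_z = (df - df.mean()) / df.std(), and refitting.
```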
Expected Outcomes
The results do not yield a clear picture. For the subscales number conservation and sequences, dynamic test scores predicted the criteria (amount of support, math grade, and German grade) better than static scores. At the same time, the pattern differs across subscales: for conservation tasks it is the number of hints that explains additional variance, whereas for sequence tasks it is the test score after the intervention. This indicates that both measures of dynamic testing are relevant. In summary, dynamic test scores might be better suited to describing current academic performance and the amount of support needed in academic learning, which is in line with previous work on dynamic testing (Caffrey, Fuchs, & Fuchs, 2008; Resing, Stevenson, & Bosma, 2012). The results may provide initial insights into the possibilities of the dynamic assessment of cognitive abilities. At the same time, especially given the small sample, they are only a first approximation. Still, dynamic testing can be seen as a potentially important extension of educational diagnostics.
References
Börnert-Ringleb, M., & Wilbert, J. (2018a). The Association of Strategy Use and Concrete-Operational Thinking in Primary School. Frontiers in Education, 3, 38. https://doi.org/10.3389/feduc.2018.00038
Bosma, T., Hessels, M. G. P., & Resing, W. C. M. (2012). Teachers' preferences for educational planning: Dynamic testing, teaching experience and teachers' sense of efficacy. Teaching and Teacher Education, 28(4), 560–567. https://doi.org/10.1016/j.tate.2012.01.007
Bosma, T., & Resing, W. C. M. (2012). Need for instruction: dynamic testing in special education. European Journal of Special Needs Education, 27(1), 1–19. https://doi.org/10.1080/08856257.2011.613599
Caffrey, E., Fuchs, D., & Fuchs, L. S. (2008). The Predictive Validity of Dynamic Assessment: A Review. The Journal of Special Education, 41(4), 254–270. https://doi.org/10.1177/0022466907310366
Cartwright, K. B. (2002). Cognitive development and reading: The relation of reading-specific multiple classification skill to reading comprehension in elementary school children. Journal of Educational Psychology, 94(1), 56–63. https://doi.org/10.1037//0022-0663.94.1.56
Guthke, J., Beckmann, J. F., & Wiedl, K. H. (2003). Dynamik im dynamischen Testen [Dynamics in dynamic testing]. Psychologische Rundschau, 54(4), 225–232. https://doi.org/10.1026//0033-3042.54.4.225
Haywood, H. C., & Lidz, C. S. (2007). Dynamic Assessment in Practice: Clinical and Educational Applications. Cambridge University Press.
Resing, W. C. M., Stevenson, C. E., & Bosma, T. (2012). Dynamic Testing: Measuring Inductive Reasoning in Children With Developmental Disabilities and Mild Cognitive Impairments. Journal of Cognitive Education and Psychology, 11(2), 159–178. https://doi.org/10.1891/1945-8959.11.2.159
Sternberg, R. J., & Grigorenko, E. L. (2002). Dynamic Testing: The Nature and Measurement of Learning Potential. Cambridge University Press.
Wubbena, Z. C. (2013). Mathematical fluency as a function of conservation ability in young children. Learning and Individual Differences, 26, 153–155. https://doi.org/10.1016/j.lindif.2013.01.013