09 SES 01 B, Measuring Cognitive Processes and Domain-General Competencies
An elaborated understanding of science facilitates a general understanding of the world, since we live in a knowledge-based society (Carey & Smith, 1993; Kuhn & Weinstock, 2002). Therefore, the promotion of scientific thinking has become a core issue for science educators, researchers, and policy-makers (Lawson, 2004; Webb, 2010). It is also a normative goal of international science education frameworks that students be well prepared for science-related challenges both in their academic careers and in everyday life (OECD, 2007). In this respect, meta-scientific reflection has been proposed as an important competence for students to acquire (Huber, 1997). Although the concept of meta-scientific reflection is rooted in the German tradition of educational theory, it applies well to international demands on science education, since it comprises fundamental skills such as contextualizing and juxtaposing scientific evidence or evaluating the significance and consequences of science (Müsche, 2009). This complex construct draws on various lines of research, such as science literacy (cf. Feinstein, 2011) and epistemic beliefs, that is, beliefs about “the nature of knowledge and the nature or process of knowing” (Hofer & Pintrich, 1997, p. 112).
However, the assessment of the cognitive processes involved when students are required to reflect on scientific issues remains a critical issue. Qualitative measures, as favored within the nature of science framework, provide in-depth analyses of individual belief systems, but they are also time-consuming and hard to generalize. Quantitative measures, on the other hand, provide comparable test scores, but little is known about the processes by which these scores originate. Process measures such as eye tracking and verbal reports allow for valid interpretations of underlying cognitive processes (Just & Carpenter, 1980; van Gog, Paas, van Merriënboer, & Witte, 2005). Hence, there is growing research interest in such approaches. Using think-aloud protocols, for example, Ferguson, Bråten, and Strømsø (2012) were able to identify patterns of epistemic cognition when students read multiple scientific documents. In an eye-tracking study, Mason, Pluchino, and Ariasi (2014) showed that reading behavior on scientific websites was moderated by epistemic beliefs. These results demonstrate both the applicability of process measures in the study of scientific thinking and the role of epistemic beliefs when dealing with science. While the science-related measures in the above-mentioned studies were treated as covariates, the present study adds to this research by using the measurement instrument itself as a stimulus. For this purpose, we used a measure of meta-scientific reflection skills, namely the test for the evaluation of scientific contradictions (TEWI) by Oschatz, Kramer, Thomm, and Bromme (2014), which was also an element of the National Educational Panel Study (NEPS).
Our aim is to investigate the cognitive processes at work while students complete the TEWI, and to examine whether these processes are differentially related to the test scores. We triangulate product and process measures using questionnaires as well as eye tracking and verbal reports. In doing so, we target a longstanding problem in educational measurement, because “although the collaboration between educational measurement specialists and cognitive psychologists should work easily in principle, in practice the collaboration has not been as productive as once anticipated” (Leighton, 2004, p. 13). Furthermore, we aim to analyze the interrelations of process measures and individual difference variables. Specifically, we assume that both process measures and epistemic beliefs influence students’ meta-scientific reflection skills as measured by the TEWI. With our approach we also want to demonstrate the added value of process measures for test development by providing insight into students’ understanding of the test. To the best of our knowledge, no other study has investigated scientific thinking by analyzing test-taking behavior itself using cognitive process measures.
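The assumed structure, in which a process measure transmits part of the effect of epistemic beliefs on TEWI scores, corresponds to a simple mediation model whose indirect effect is the product of the a- and b-paths (cf. Hayes, 2015). The following is only a minimal illustrative sketch on simulated data, not the study's actual analysis; the variable names (epistemic beliefs score as predictor, fixation duration as process mediator, TEWI score as outcome) are hypothetical placeholders.

```python
import random

random.seed(42)
n = 200
# Hypothetical simulated variables (not data from the study):
# x = epistemic beliefs score, m = fixation duration (process measure),
# y = TEWI meta-scientific reflection score.
x = [random.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + random.gauss(0, 1) for xi in x]                    # true a-path = 0.5
y = [0.4 * mi + 0.3 * xi + random.gauss(0, 1)                      # true b = 0.4, c' = 0.3
     for xi, mi in zip(x, m)]

def slope(u, v):
    """Simple OLS slope of v regressed on u."""
    mu_u, mu_v = sum(u) / len(u), sum(v) / len(v)
    num = sum((ui - mu_u) * (vi - mu_v) for ui, vi in zip(u, v))
    den = sum((ui - mu_u) ** 2 for ui in u)
    return num / den

def two_pred_slopes(x, m, y):
    """OLS slopes of y on x and m via centered 2x2 normal equations."""
    cx = [xi - sum(x) / len(x) for xi in x]
    cm = [mi - sum(m) / len(m) for mi in m]
    cy = [yi - sum(y) / len(y) for yi in y]
    sxx = sum(a * a for a in cx)
    smm = sum(a * a for a in cm)
    sxm = sum(a * b for a, b in zip(cx, cm))
    sxy = sum(a * b for a, b in zip(cx, cy))
    smy = sum(a * b for a, b in zip(cm, cy))
    det = sxx * smm - sxm * sxm
    bx = (sxy * smm - smy * sxm) / det   # direct effect c'
    bm = (smy * sxx - sxy * sxm) / det   # b-path
    return bx, bm

a = slope(x, m)                          # a-path: beliefs -> process measure
c_prime, b = two_pred_slopes(x, m, y)    # direct effect and b-path
indirect = a * b                         # indirect (mediated) effect
print(f"a={a:.2f}, b={b:.2f}, c'={c_prime:.2f}, indirect={indirect:.2f}")
```

In an applied analysis one would add standard errors and a bootstrap test of the indirect effect rather than rely on point estimates; the sketch only shows where the product-of-paths logic enters.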
Conley, A. M., Pintrich, P. R., Vekiri, I., & Harrison, D. (2004). Changes in epistemological beliefs in elementary science students. Contemporary Educational Psychology, 29(2), 186–204.
Ferguson, L. E., Bråten, I., & Strømsø, H. I. (2012). Epistemic cognition when students read multiple documents containing conflicting scientific evidence: A think-aloud study. Learning and Instruction, 22(2), 103–120.
Hayes, A. F. (2015). An index and test of linear moderated mediation. Multivariate Behavioral Research, 50(1), 1–22.
Hofer, B. K., & Pintrich, P. R. (1997). The development of epistemological theories: Beliefs about knowledge and knowing and their relation to learning. Review of Educational Research, 67(1), 88–140.
Huber, L. (1997). Fähigkeit zum Studieren - Bildung durch Wissenschaft. In E. Liebau, W. Mack, & C. Scheilke (Eds.), Das Gymnasium. Alltag, Reform, Geschichte, Theorie (pp. 333–351). Weinheim: Juventa.
Just, M. A., & Carpenter, P. A. (1980). A theory of reading: From eye fixations to comprehension. Psychological Review, 87(4), 329–354.
Kaakinen, J. K., & Hyönä, J. (2005). Perspective effects on expository text comprehension: Evidence from think-aloud protocols, eyetracking, and recall. Discourse Processes, 40(3), 239–257.
Kuhn, D., & Weinstock, M. (2002). What is epistemological thinking and why does it matter? In B. K. Hofer & P. R. Pintrich (Eds.), Personal epistemology: The psychology of beliefs about knowledge and knowing (pp. 121–144). Mahwah, NJ: Erlbaum.
Leighton, J. P. (2004). Avoiding misconception, misuse, and missed opportunities: The collection of verbal reports in educational achievement testing. Educational Measurement: Issues and Practice, 23(4), 6–15.
Mason, L., Pluchino, P., & Ariasi, N. (2014). Reading information about a scientific phenomenon on webpages varying for reliability: An eye-movement analysis. Educational Technology Research and Development, 62(6), 663–685.
Müsche, H. (2009). Wissenschaftspropädeutik aus psychologischer Perspektive - Zur Dimensionierung und Konkretisierung eines bildungstheoretischen Konzeptes. TriOS – Forum für schulnahe Forschung, Schulentwicklung und Evaluation, 4(2), 61–109.
OECD. (2007). PISA 2006: Science competencies for tomorrow's world. Paris: OECD Publishing.
Oschatz, K., Kramer, J., Thomm, E., & Bromme, R. (2014). Entwicklung eines Testinstrumentes zur Erfassung von Wissenschaftspropädeutik in der gymnasialen Oberstufe (TEWI). Paper presented at the 2nd GEBF conference, Frankfurt am Main, Germany.
van Gog, T., Paas, F., van Merriënboer, J. J. G., & Witte, P. (2005). Uncovering the problem-solving process: Cued retrospective reporting versus concurrent and retrospective reporting. Journal of Experimental Psychology: Applied, 11(4), 237–244.