Session Information
09 SES 07 B, Findings from International Comparative Achievement Studies: Issues in Study Design and Reporting
Parallel Paper Session
Contribution
This paper addresses the assessment of students’ scientific competencies by examining how students justify their choices in multiple choice questions (MCQ), compared with the choices themselves. It is part of a wider study on the development of competencies in laboratory contexts.
International assessments such as PISA contribute to a greater understanding of science education by helping to understand potential changes in policies, programmes, and practices in science teaching (Bybee, 2009). PISA assesses the extent to which students have acquired essential knowledge and skills, focusing on student competencies in reading, mathematics and science (OECD, 2007).
The relevance of PISA reports and their impact in emphasizing the application of knowledge over the memorization of facts have been acknowledged. PISA reports have been analysed from a diversity of perspectives: the methodology has been criticized in terms of validity (Lau, 2009), adequacy of the data (Goldstein, 2004) and the interest scales (Drechsel, Carstensen & Prenzel, 2011), while others, such as Sadler and Zeidler (2009), have analysed the PISA approach to assessment through the lens of socioscientific issues.
To assess scientific competencies, the PISA 2006 survey includes multiple choice items that ask students to select one option.
Our focus is on the role of justification, which in our opinion is grounded in two dimensions, theoretical and methodological. From a theoretical perspective, PISA’s 2006 definition of scientific literacy includes three scientific competencies: identifying scientific issues, explaining phenomena scientifically and using scientific evidence. Using evidence is characterized as the ability to draw evidence-based conclusions about science-related issues. Being able to justify conclusions (or choices) on the basis of evidence is one of the operations involved in the use of evidence (Jiménez-Aleixandre & Puig, 2011). In other words, justifications play a central role in argumentation, which can be defined as the connection between claims and data through justifications (Jiménez-Aleixandre, 2008). Engaging in scientific argumentation is thus related to the acquisition of scientific competencies.
From a methodological perspective, we agree with authors such as Tamir (1990) and Treagust (1988) that justifications of choices in multiple choice items significantly increase the information that test results provide about students’ knowledge.
In the PISA survey, students’ performance in the scientific competencies is reported on a scale of six proficiency levels, with level 6 representing the highest scores and level 1 the lowest (OECD, 2007).
We seek to examine whether the MCQ format is adequate for assessing sophisticated aspects of students’ performance, for instance whether they choose an option through reasoning or at random.
The research objectives are: a) to examine students’ acquisition of scientific competencies by analysing their written justifications of their choices in a PISA item about tooth decay; and b) to compare students’ justifications with their choices in the MCQ.
Method
Expected Outcomes
References
Bybee, R. (2009). Scientific literacy and contexts in PISA 2006 Science. Journal of Research in Science Teaching, 46(8), 862-864.
Drechsel, B., Carstensen, C., & Prenzel, M. (2011). The role of content and context in PISA interest scales: A study of the embedded interest items in the PISA 2006 science assessment. International Journal of Science Education, 33(1), 73-95.
Goldstein, H. (2004). International comparisons of student attainment: Some issues arising from the PISA study. Assessment in Education, 11(3), 319-330.
Jiménez-Aleixandre, M. P. (2008). Designing argumentation learning environments. In S. Erduran & M. P. Jiménez-Aleixandre (Eds.), Argumentation in science education: Perspectives from classroom-based research. Dordrecht: Springer.
Jiménez-Aleixandre, M. P., & Puig, B. (2011). The role of justifications in integrating evidence in arguments: Making sense of gene expression. Paper presented at the ESERA conference, Lyon, France, September.
Lau, K. C. (2009). A critical examination of PISA’s assessment on scientific literacy. International Journal of Science and Mathematics Education, 7(6), 1061-1088.
Organisation for Economic Co-operation and Development (OECD). (2007). PISA 2006: Science competencies for tomorrow’s world. Paris: OECD.
Organisation for Economic Co-operation and Development (OECD). (2009). PISA take the test: Sample questions from OECD’s PISA assessments. Paris: OECD.
Sadler, T. D., & Zeidler, D. L. (2009). Scientific literacy, PISA, and socioscientific discourse: Assessment for progressive aims of science education. Journal of Research in Science Teaching, 46(8), 909-921.
Tamir, P. (1990). Justifying the selection of answers in multiple choice items. International Journal of Science Education, 12(5), 563-573.
Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students’ misconceptions in science. International Journal of Science Education, 10(2), 159-169.