Do PISA Questions Really Assess Students’ Acquisition of Scientific Competencies? A Proposal About Emphasis on Justifications.
Author(s):
Beatriz Crujeiras-Pérez (presenting / submitting)
María Pilar Jiménez-Aleixandre (presenting)
Conference:
ECER 2012
Format:
Paper

Session Information

09 SES 07 B, Findings from International Comparative Achievement Studies: Issues in Study Design and Reporting

Parallel Paper Session

Time:
2012-09-19
17:15-18:45
Room:
FCT - Aula 15
Chair:
Martin Goy

Contribution

This paper addresses the assessment of students’ scientific competency by examining how they justify their choices, compared with their actual choices in multiple choice questions (MCQ). It is part of a wider study on the development of competencies in laboratory contexts.

International assessments such as PISA contribute to a greater understanding of science education by helping to identify potential changes in policies, programmes, and practices in science teaching (Bybee, 2009). PISA assesses the extent to which students have acquired essential knowledge and skills, focusing on student competencies in reading, mathematics and science (OECD, 2007).

The relevance of PISA reports and their impact in emphasizing the application of knowledge over the memorization of facts has been acknowledged. PISA reports have been analysed from a diversity of perspectives: their methodology has been criticized in terms of validity (Lau, 2009), the adequacy of the data (Goldstein, 2004), and the interest scales (Drechsel, Carstensen & Prenzel, 2011). Others, such as Sadler and Zeidler (2009), analysed the PISA approach to assessment through the lens of socioscientific issues.

This paper addresses the assessment of scientific competencies. The PISA 2006 survey assesses them partly through multiple choice items, which ask students to select one option.

Our focus is on the role of justification, which in our opinion is grounded in two dimensions, theoretical and methodological. From a theoretical perspective, PISA’s 2006 definition of scientific literacy includes three scientific competencies: identifying scientific issues, explaining phenomena scientifically, and using scientific evidence. Using evidence is characterized as the ability to draw evidence-based conclusions about science-related issues. Being able to justify conclusions (or choices) on the basis of evidence is one of the operations involved in the use of evidence (Jiménez-Aleixandre & Puig, 2011). In other words, justifications play a central role in argumentation, which can be defined as the connection between claims and data through justifications (Jiménez-Aleixandre, 2008). Engaging in scientific argumentation is related to the acquisition of scientific competencies.

From a methodological perspective, we agree with authors such as Tamir (1990) and Treagust (1988) in considering that justifications of choices in multiple choice items significantly increase the information that test results provide about students’ knowledge.

In the PISA survey, students’ performance of scientific competencies is reported on a scale of six proficiency levels, with level 6 representing the highest scores and level 1 the lowest (OECD, 2007).

We seek to examine whether the MCQ format is adequate for assessing sophisticated aspects of students’ performance, for instance whether they choose an option through reasoning or at random.

The research objectives are: a) to examine students’ acquisition of scientific competencies by analyzing their written justifications of their choices in a PISA item about tooth decay; b) to compare students’ justifications with their choices in the MCQ.

Method

The methodological approach combines quantitative and qualitative research. The participants are 59 students in grades 7 and 8 from three high schools. Their written responses to a PISA item comprising three questions about tooth decay (OECD, 2009) were collected. The item was modified by asking students to justify their choice for each question. Students’ written justifications were analyzed by means of three rubrics based on the scale of proficiency levels for scientific competencies. We developed each rubric according to the competency assessed in each question. For questions 1 (the role of bacteria) and 2 (identifying the trend in a graph) the rubrics focus on using scientific evidence, and for question 3 on identifying scientific issues that could be explored through experiments. More details about how each rubric was developed will be provided in the extended paper.

Expected Outcomes

Question 1 required students to identify the role of bacteria in tooth decay. 89.5% of them chose the correct option, but only 47.4% were able to justify their choice on the basis of identifying the relevant information in the text (level 3 of proficiency). Question 2 required students to identify which of four statements was supported by the data given in a graph. 47.4% of them chose the correct option, but only 10.5% were able to establish a justification based on the trend observed in the graph (level 2 of proficiency); the remaining 36.9% did not include the information from the graph in their justifications. Question 3 required students to identify which of two questions could be answered by scientific experiments (level 3 of proficiency). 63.1% of them chose the correct answer, but only 31.6% were able to justify their choices; 57.9% answered the items of the question instead of justifying whether they could be answered scientifically, and the remaining 10.5% did not justify their choice. We suggest, as an implication, that it would be valuable to require students to justify their choices in PISA MCQ, because this would provide more reliable information about students’ acquisition of scientific competencies.

References

Bybee, R. (2009). Scientific literacy and contexts in PISA 2006 Science. Journal of Research in Science Teaching, 46(8), 862-864.

Drechsel, B., Carstensen, C., & Prenzel, M. (2011). The role of content and context in PISA interest scales: A study of the embedded interest items in the PISA 2006 science assessment. International Journal of Science Education, 33(1), 73-95.

Goldstein, H. (2004). International comparisons of student attainment: some issues arising from the PISA study. Assessment in Education, 11(3), 319-330.

Jiménez-Aleixandre, M. P. (2008). Designing argumentation learning environments. In S. Erduran & M. P. Jiménez-Aleixandre (Eds.), Argumentation in Science Education: Perspectives from classroom-based research. Dordrecht: Springer.

Jiménez-Aleixandre, M. P., & Puig, B. (2011). The role of justifications in integrating evidence in arguments: Making sense of gene expression. Paper presented at the ESERA conference, Lyon (France), September.

Lau, K. C. (2009). A critical examination of PISA’s assessment on scientific literacy. International Journal of Science and Mathematics Education, 7(6), 1061-1088.

Organisation for Economic Co-operation and Development (OECD). (2007). PISA 2006: Science competencies for tomorrow’s world. Paris: OECD.

Organisation for Economic Co-operation and Development (OECD). (2009). PISA take the test: Sample questions from OECD’s PISA assessments. Paris: OECD.

Sadler, T. D., & Zeidler, D. L. (2009). Scientific literacy, PISA, and socioscientific discourse: assessment for progressive aims of science education. Journal of Research in Science Teaching, 46(8), 909-921.

Tamir, P. (1990). Justifying the selection of answers in multiple choice items. International Journal of Science Education, 12(5), 563-573.

Treagust, D. F. (1988). Development and use of diagnostic tests to evaluate students’ misconceptions in science. International Journal of Science Education, 10(2), 159-169.

Author Information

Beatriz Crujeiras-Pérez (presenting / submitting)
University of Santiago de Compostela
Didáctica das ciencias experimentais
Santiago de Compostela
University of Santiago de Compostela, Spain
