Students’ Ability to Think Scientifically: Findings from PISA 2015 and an In-depth Case Study
Author(s):
Nani Teig (presenting / submitting), Ronny Scherer
Conference:
ECER 2016
Format:
Paper

Session Information

ERG SES H 08, Studies on Education

Paper Session

Time:
2016-08-23
11:00-12:30
Room:
OB-E0.01
Chair:
Joanna Madalinska-Michalak

Contribution

One of the major goals in science education is to help students become scientifically literate. Although not all students will ultimately pursue a career in science, the thinking skills they use when engaged in inquiry activities are relevant to their everyday life decision-making (Kuhn, 2007). In order to successfully participate in the process of inquiry, students are expected to demonstrate fundamental scientific thinking skills (Kuhn & Pease, 2008).

Kuhn (2007) defined scientific thinking as purposefully seeking knowledge that drives the conceptual change process and leads to the goal of constructing scientific understanding. Scientific thinking encompasses the abilities needed for the application of scientific inquiry, such as designing and evaluating scientific investigations, evaluating evidence, and making causal inferences for forming and modifying theories related to the phenomenon under investigation (Zimmerman, 2007).

Scientific thinking skills comprise a number of key subskills; yet, existing research has mostly focused on a single subskill rather than taking an integrated and much broader perspective. The present study therefore takes an integrated approach to investigating scientific thinking. Specifically, it studies the skills that students apply when they solve problems that require the coordination of the effects of multiple variables and the coordination of theory and evidence. An analysis of the PISA 2015 science data will provide insight into both the quality and the strategies of students’ scientific thinking. An in-depth case study will also be used to further explore the difficulties students encounter in performing scientific thinking. The research questions that guide this study are:

(1)   What do the 2015 PISA science data tell us about the level of students’ ability to coordinate the effects of multiple variables?

(2)   Which strategies do high and low achievers use to coordinate the effects of multiple variables?

(3)   What is the quality of the students’ reasoning as they solve problems that require the coordination of theory and evidence?

(4)   What are the challenges that students face in coordinating the effects of multiple variables and in coordinating theory and evidence?

 

The ability to predict outcomes based on the simultaneous effects of multiple variables is essential for producing experiments that yield interpretable evidence and for facilitating inferential skills (Zimmerman, 2007). One approach to examining this ability is to identify the strategies students use to create desired outcomes. Research on scientific thinking has focused heavily on a specific strategy called the control-of-variables strategy (CVS). This strategy creates an experimental situation in which the effects of independent variables on dependent variables can be disentangled. There is a growing body of research analysing CVS on a small-scale basis (for a review, see Schwichow, Croker, Zimmerman, Höffler, & Härtig, 2016). A standardized assessment such as PISA, however, offers an opportunity to study students’ strategic approaches on a large-scale basis, with representative samples and the potential for generalizability.
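To make the logic of CVS concrete, it can be expressed as a simple check: a pair of experimental trials isolates the effect of one variable only if that variable differs between the trials while every other variable is held constant. The sketch below is purely illustrative and is not part of the study’s analysis; the trial representation (variables as dictionary keys) is an assumption of this example.

```python
def isolates_variable(trial_a, trial_b, target):
    """Return True if the two trials form a controlled comparison for `target`:
    the target variable differs while all other variables are held constant."""
    others_equal = all(trial_a[v] == trial_b[v]
                       for v in trial_a if v != target)
    return trial_a[target] != trial_b[target] and others_equal

# A controlled comparison: only `mass` varies, so its effect is interpretable.
a = {"mass": 1, "length": 50, "angle": 30}
b = {"mass": 2, "length": 50, "angle": 30}
print(isolates_variable(a, b, "mass"))   # True

# A confounded comparison: `mass` and `angle` both vary.
c = {"mass": 2, "length": 50, "angle": 45}
print(isolates_variable(a, c, "mass"))   # False
```

In this framing, a student who consistently changes one variable at a time between trials is applying CVS; a student whose consecutive trials differ in several variables produces confounded, uninterpretable comparisons.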

Coordinating theory and evidence is another important inquiry skill. Kuhn and Pearsall (2000) claim that it is the essence of advanced scientific thinking because it requires an openness to revise or completely reformulate one’s initial theory or belief in response to a pattern of evidence. Interpreting evidence from first-hand data is crucial in assessing participants’ scientific reasoning (Zimmerman, 2007). Delen and Krajcik (2015) highlighted the importance of collecting data through first-hand investigation for constructing high-quality reasoned explanations. Research also shows no significant difference in participants’ reasoning quality when they interpret data from physical versus simulated systems (Klahr, Triona, & Williams, 2007). Because simulated hands-on environments can be incorporated into computer-based science tests such as the one administered in PISA, novel opportunities exist to assess a range of reasoning skills rather than a single skill.

Method

a. PISA 2015 secondary data analysis

In 2015, the entire PISA test was, for the first time and for most countries, administered through computer-based assessment, with science as the main subject. The data set analysed in this study is the Norwegian PISA sample, which comprises approximately 5400 students from 230 schools. Since generating first-hand data through scientific experiments is considered an important aspect of scientific thinking, the current research concentrates only on the PISA science items that contain an interactive simulation. These items required students to generate evidence through the simulation on the one hand, and to explain how that evidence fits with a proposed theory on the other (OECD, 2013).

Students’ responses on the PISA items. The responses will be evaluated with two different scoring rubrics. To answer Research Question 1, the rubric for assessing students’ ability to coordinate the effects of multiple variables focuses on the consideration of relevant variables (“none”, “some variables”, “all variables”), goal-directed actions (“ineffective actions with insignificant outcome”, “ineffective actions with correct outcome”, “effective actions with significant outcome”), and quality of evidence (“no evidence”, “inadequate data”, “adequate and relevant data”). For evaluating students’ ability to coordinate theory and evidence (Research Question 3), the rubric emphasizes evidence interpretation and theory application (“none”, “limited interpretation of evidence and/or relevant theory”, “accurate interpretation of evidence and relevant theory application”) and conclusion (“none”, “partially accurate”, “accurate and supported conclusion”).

Log-file data. All the steps and actions students performed while solving the scientific literacy items in PISA 2015 were recorded and stored as log-file data. To answer Research Question 2, the analysis of log-file data will focus on identifying differences between high and low science achievers’ problem-solving strategies, including elements of their experimental design strategy (e.g., CVS, response times, and trial repetition).

b. In-depth case study

While the PISA secondary data analysis can provide quality data on students’ ability to think scientifically, it cannot offer in-depth, realistic insights into the challenges students encounter while solving the PISA items. Consequently, an in-depth case study using multiple sources of data (think-aloud protocols, computer screen recording, video recording, and interviews) is needed to address Research Question 4. The participants will be six to eight 15-year-olds (comparable in age to the PISA sample) who are equally divided by gender and have varying ability levels.
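As an illustration of how strategy indicators such as CVS use, response times, and trial repetition could be derived from timestamped simulation trials, consider the sketch below. The event format, field names, and indicators are hypothetical assumptions of this example; the actual PISA log-file structure and the study’s coding scheme may differ.

```python
from statistics import mean

def trial_indicators(events):
    """Compute simple strategy indicators from a student's ordered list of
    (timestamp_seconds, variable_settings) simulation trials.

    Returns the number of trials, the mean time between consecutive trials,
    the number of repeated (identical) trials, and the share of consecutive
    trial pairs that change exactly one variable (a rough CVS indicator)."""
    n = len(events)
    gaps = [t2 - t1 for (t1, _), (t2, _) in zip(events, events[1:])]
    settings = [s for _, s in events]
    repeats = sum(1 for s1, s2 in zip(settings, settings[1:]) if s1 == s2)
    one_var_changes = sum(
        1 for s1, s2 in zip(settings, settings[1:])
        if sum(s1[v] != s2[v] for v in s1) == 1
    )
    pairs = max(n - 1, 1)
    return {
        "n_trials": n,
        "mean_gap": mean(gaps) if gaps else 0.0,
        "n_repeats": repeats,
        "cvs_share": one_var_changes / pairs,
    }

# Hypothetical log of one student's four simulation trials.
log = [
    (0.0,  {"mass": 1, "length": 50}),
    (12.5, {"mass": 2, "length": 50}),   # one variable changed (CVS-like)
    (20.0, {"mass": 2, "length": 50}),   # repeated trial
    (31.0, {"mass": 1, "length": 80}),   # two variables changed (confounded)
]
print(trial_indicators(log))
```

Aggregating such indicators per student would allow comparisons between high and low achievers, for example testing whether high achievers show a larger share of one-variable-at-a-time trial transitions.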

Expected Outcomes

By examining students’ responses to the PISA 2015 science items (Research Questions 1 and 3), we can identify common features that further our understanding of students’ ability to think scientifically, especially of the ways they coordinate the effects of multiple variables and coordinate theory and evidence. With respect to Research Question 2, we expect differences between low and high achievers in reasoning about multiple variables. Specifically, following the expert-novice paradigm in the context of problem solving, high achievers may analyse the connections between variables more deeply, think and reason faster, and perform CVS more efficiently than low achievers. Hence, we expect differences in the efficiency of CVS use and in response times.

Through the in-depth case study, we expect to identify the difficulties students encounter when they solve items that require coordinating the effects of multiple variables and coordinating theory and evidence (Research Question 4). Identifying these challenges can make a meaningful contribution to science teaching and learning. As science educators strive to help students become scientifically literate, it is crucial for them to understand the challenges that hinder students from performing scientific thinking properly.

References

Delen, I., & Krajcik, J. (2015). What do students’ explanations look like when they use second-hand data? International Journal of Science Education, 37(12), 1953-1973.

Klahr, D., Triona, L. M., & Williams, C. (2007). Hands on what? The relative effectiveness of physical versus virtual materials in an engineering design project by middle school children. Journal of Research in Science Teaching, 44(1), 183-203.

Kuhn, D. (2007). What is scientific thinking and how does it develop? In Blackwell Handbook of Childhood Cognitive Development (pp. 371-393). Blackwell Publishers Ltd.

Kuhn, D., & Pearsall, S. (2000). Developmental origins of scientific thinking. Journal of Cognition and Development, 1(1), 113-129.

Kuhn, D., & Pease, M. (2008). What needs to develop in the development of inquiry skills? Cognition and Instruction, 26(4), 512-559.

OECD. (2013). PISA 2015 released field trial cognitive items. Paris: OECD.

Schwichow, M., Croker, S., Zimmerman, C., Höffler, T., & Härtig, H. (2016). Teaching the control-of-variables strategy: A meta-analysis. Developmental Review, 39, 37-63.

Zimmerman, C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, 27(2), 172-223.

Author Information

Nani Teig (presenting / submitting)
University of Oslo
Department of Teacher Education and School Research
OSLO
Centre for Educational Measurement at University of Oslo (CEMO), Norway
