Session Information
09 ONLINE 00 PS, General Poster Session (online) - NW 09
Contribution
In recent years, interest in integrating scientific inquiry into computer-based assessments has expanded rapidly (Smetana & Bell, 2012). Compared to paper-and-pencil tests, computer-based assessments can measure complex inquiry skills more effectively across a wider range of science contexts (Pellegrino & Quellmalz, 2010). A further advantage of the shift towards dynamic, interactive, and even adaptive computer-based assessments of inquiry is the availability of "log files": records of all the actions students performed during an assessment, together with their timestamps. Log-file data hold great potential for generating a deeper understanding of students' inquiry performance because they provide information beyond the correctness of responses (Gobert et al., 2015; Scalise & Clarke-Midura, 2018). Rather than merely indicating what has been achieved, log files capture the behaviours that occur during the inquiry process, thereby offering insights into how the responses were produced (Gobert et al., 2015; Greiff et al., 2018). By analysing log-file data, researchers can characterise student inquiry performance not only from the correctness of responses but also from the underlying behaviours during the assessment (Baker et al., 2016).
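To make the idea of log-file data concrete, the record structure described above (student actions paired with timestamps) can be sketched as follows. The field names and events here are hypothetical illustrations, not the actual PISA 2015 log format.

```python
from collections import defaultdict

# Hypothetical log records: (student_id, action, timestamp_in_seconds).
# Real assessment log files use a richer event schema; this is an
# illustrative assumption only.
log = [
    ("S1", "start_item", 0.0),
    ("S1", "run_experiment", 12.5),
    ("S1", "select_evidence", 40.0),
    ("S1", "submit_answer", 55.0),
    ("S2", "start_item", 0.0),
    ("S2", "submit_answer", 8.0),
]

def sequences_and_times(events):
    """Group events by student; return each student's ordered action
    sequence and total time-on-task (last minus first timestamp)."""
    by_student = defaultdict(list)
    for student, action, t in events:
        by_student[student].append((t, action))
    result = {}
    for student, evts in by_student.items():
        evts.sort()
        actions = [a for _, a in evts]
        time_on_task = evts[-1][0] - evts[0][0]
        result[student] = (actions, time_on_task)
    return result

print(sequences_and_times(log))
```

Derived features such as these (action sequences and time-on-task) are the kind of process indicators the study combines with response correctness.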
While previous research has examined the frequency and pattern of actions underlying responses to inquiry tasks to explain student performance (e.g., Greiff et al., 2018; Li et al., 2017; Scalise & Clarke-Midura, 2018), few studies have combined these behavioural indicators with response times to identify students' inquiry performance, especially during the investigative and inferential phases of inquiry. Students who received similar scores on an inquiry task might nevertheless have interacted with the computer-based environment in very different ways. Some students may have applied an optimal strategy directly, while others may have explored various strategies before completing the task. Even among students who demonstrated an identical strategy, there may be differences in how much time they spent exploring the task and how systematically and efficiently they explored it. Although students' final responses provide useful information about what has been accomplished (product data), they say very little about how students engaged with the inquiry tasks to produce those results (process data).
The present study provides a glimpse into future research drawing on the wealth of information in students' log-file data by addressing the following research question: How can students' strategies in solving scientific inquiry tasks be mapped onto different performance levels? It opens a new research frontier by demonstrating how log-file data can be used to identify the types of strategies that underlie successful and unsuccessful performance, and it provides teachers with relevant suggestions for improving their instruction.
Method
To address the research question, this study examined log-file data from the Programme for International Student Assessment (PISA), a computer-based assessment of scientific literacy. PISA 2015 offers large-scale, publicly available log-file data from 57 countries and economies, providing the potential for generalizability. These data can help explain science performance within a country while also enabling cross-country validation studies. PISA holds great promise for capturing challenges related to scientific inquiry, both in the investigative phase (e.g., formulating hypotheses, designing experiments, and collecting evidence from the experiments) and in the inferential phase (e.g., interpreting evidence, drawing conclusions, and developing explanations; Teig et al., 2020). This study showcases how log-file data can be analysed to identify students' strategies in solving inquiry tasks using two approaches. The first, process mining, is widely used to improve workflows in business, management, and computer science; this study crosses disciplinary boundaries by applying process mining to discover student strategies from event logs in computer-based assessment. The second, latent profile analysis, is a model-based statistical approach for identifying subgroups of students with similar patterns of behaviour.
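The process-mining step can be illustrated with a minimal sketch. Many process-discovery algorithms start from the "directly-follows" relation: counting how often one action immediately follows another across students' event traces. The traces below are invented for illustration and are not PISA data.

```python
from collections import Counter

def directly_follows(traces):
    """Count how often action b immediately follows action a across
    all traces. This directly-follows relation is the starting point
    of many process-discovery algorithms."""
    pairs = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    return pairs

# Hypothetical action traces, one per student.
traces = [
    ["hypothesis", "run_experiment", "run_experiment", "conclude"],
    ["hypothesis", "run_experiment", "conclude"],
]
dfg = directly_follows(traces)
```

Edges with high counts in such a graph indicate common transitions between inquiry actions; in practice, dedicated toolkits (e.g., pm4py or ProM) build and visualise these models from full event logs.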
Expected Outcomes
Findings from the process mining approach indicate that almost all students could identify the phenomenon under investigation and design an experiment during the investigative phase. However, although most students generated relevant data, only two-thirds of them selected these data as appropriate evidence to support their explanations in the inferential phase. Results from the latent profile approach show unique patterns of performance based on students' inquiry exploration behaviour, inquiry strategy, time-on-task, and item accuracy. The findings indicate three distinct profiles of inquiry performance with varying problem-solving patterns: strategic, emergent, and disengaged. The present study highlights the potential of log-file data for gaining a deeper understanding of student performance. Rather than relying solely on what has been achieved or the accuracy of student responses, log files offer additional insights into how the responses were produced. The existence of, and the differences between, students' inquiry profiles suggest that their final responses on the simulated inquiry tasks do not necessarily provide a clear picture of what they can accomplish or of the challenges they encounter in solving the tasks effectively. Given the richness of the reasoning skills involved in scientific inquiry, assessing both the products and the processes of inquiry is essential to elicit what students know and how they apply their knowledge in real-life situations. This study contributes to understanding how students interact with complex simulated inquiry tasks and showcases how log-file data from PISA 2015 can aid this understanding through process mining and latent profile approaches. It discusses the implications of both approaches for analysing student process data and improving the assessment of scientific inquiry, as well as how teachers can adapt their instruction to support student understanding of scientific inquiry.
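Latent profile analysis itself is estimated with specialized software (e.g., Mplus or the R package tidyLPA). As a rough illustration of the classification idea only, the sketch below assigns a student's standardized indicator values to the nearest of three hypothetical profile centroids; all numbers are invented and do not reflect the study's estimates.

```python
import math

# Hypothetical centroids on four standardized indicators:
# (exploration behaviour, inquiry strategy, time-on-task, accuracy).
# Invented for illustration, not estimated from PISA data.
profiles = {
    "strategic":  (0.8, 0.9, 0.2, 0.9),
    "emergent":   (0.6, 0.4, 0.8, 0.5),
    "disengaged": (0.1, 0.1, 0.1, 0.1),
}

def assign_profile(indicators):
    """Assign a student to the nearest profile centroid, a crude
    stand-in for the posterior classification step of LPA."""
    return min(profiles, key=lambda p: math.dist(profiles[p], indicators))

print(assign_profile((0.7, 0.8, 0.3, 0.85)))  # → "strategic"
```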
References
Baker, R., Clarke-Midura, J., & Ocumpaugh, J. (2016). Towards general models of effective science inquiry in virtual performance assessments. Journal of Computer Assisted Learning, 32(3), 267–280.
Gobert, J. D., Kim, Y. J., Sao Pedro, M. A., Kennedy, M., & Betts, C. G. (2015). Using educational data mining to assess students’ skills at designing and conducting experiments within a complex systems microworld. Thinking Skills and Creativity, 18, 81–90.
Greiff, S., Molnár, G., Martin, R., Zimmermann, J., & Csapó, B. (2018). Students' exploration strategies in computer-simulated complex problem environments: A latent class approach. Computers & Education, 126, 248–263.
Li, H., Gobert, J., & Dickler, R. (2017). Dusting off the messy middle: Assessing students’ inquiry skills through doing and writing. In E. André, R. Baker, X. Hu, M. M. T. Rodrigo, & B. du Boulay (Eds.), Artificial intelligence in education (Vol. 10331, pp. 175–187). Springer.
Pellegrino, J. W., & Quellmalz, E. S. (2010). Perspectives on the integration of technology and assessment. Journal of Research on Technology in Education, 43(2), 119–134.
Scalise, K., & Clarke-Midura, J. (2018). The many faces of scientific inquiry: Effectively measuring what students do and not only what they say. Journal of Research in Science Teaching, 55(10), 1469–1496.
Smetana, L. K., & Bell, R. L. (2012). Computer simulations to support science instruction and learning: A critical review of the literature. International Journal of Science Education, 34(9), 1337–1370.
Teig, N., Scherer, R., & Kjærnsli, M. (2020). Identifying patterns of students' performance on simulated inquiry tasks using PISA 2015 log-file data. Journal of Research in Science Teaching, 57(9), 1400–1429.