Session Information
09 SES 03 B, Challenges in Educational Measurement Practices
Paper Session
Contribution
Reading literacy is considered an essential factor for learning and personal development (Mullis & Martin, 2015). International assessments such as PIRLS track trends in reading achievement and shape literacy policies, offering crucial insights into educational performance that inform policy decisions. Given ongoing technological expansion and innovation, a shift in delivery mode became an inevitable progression (Jerrim et al., 2018). PIRLS has adapted to these changes, introducing a digital format in 2016 (ePIRLS) and reaching a significant milestone in 2021 with a partial transition to digital assessment through a web-based delivery system. Digital PIRLS included a variety of reading texts presented in an engaging, visually attractive format designed to motivate students to read, interact with the texts, and answer comprehension questions. While considerable effort has been invested in ensuring content similarity between the two formats, variations persist due to the distinct modes of administration (Almaskut et al., 2023). This creates the need for further analysis to better understand the impact of these differences on the outcomes and effectiveness of the two administration modes.
Previous research has documented a mode effect of varying magnitude when comparing paper-based and digital assessments (Jerrim et al., 2018; Kingston, 2009). Jerrim et al.'s (2018) analysis of PISA 2015 field trial data from Germany, Ireland, and Sweden indicates a consistent trend of students scoring lower on digital assessments than their counterparts assessed on paper. Kingston's (2009) meta-analysis likewise indicates that, on average, elementary students score higher on paper, with small effect sizes for the transition from paper-based to digital reading assessments. On the other hand, PIRLS 2016 was administered both on paper and digitally in 14 countries; students in nine countries performed better on the digital assessment, while students in only five countries performed better on paper (Grammatikopoulou et al., 2024).
Additionally, research underscores the distinct consequences of printed and digital text for memory, concentration, and comprehension (Delgado et al., 2018; Baron, 2021). Previous findings also suggest that the factors influencing performance differ between the two modes: time spent on internet and computer use for school was found to be a significant predictor of performance on digital assessments, but not on paper-based ones (Gilleece & Eivers, 2018).
The present study
Sweden was among the 26 of 57 participating countries that administered the digital format in PIRLS 2021. A paper-based test, replicated from PIRLS 2016, was also administered to a ‘bridge’ sample. To maintain consistency across formats, digital PIRLS and paper PIRLS share identical content in terms of reading passages and questions. However, digital PIRLS utilizes certain features and item types that are not available in the traditional paper-and-pencil mode. The digital version offered advantages such as operational efficiency and enhanced features while maintaining content consistency with the paper format. The primary aim of the present study is to investigate a potential mode effect between the digital and paper formats and to explore any variations in reading achievement between them. Despite advancements in digital assessment, there remains a gap in our understanding of how the shift from traditional paper-based assessments to digital formats may affect reading literacy outcomes. By examining these potential differences, we aim to contribute insights into the evolving landscape of educational assessment, informing educators, policymakers, and researchers about the effectiveness and potential challenges of integrating digital modes into literacy evaluation.
Method
The present study uses PIRLS 2021 data for Sweden. Sweden participated in digital PIRLS 2021 with 5175 students; a separate, equivalent bridge sample of 1863 students was administered the paper version (Almaskut et al., 2023). The study explores the potential mode effect using item data from both digital PIRLS and paper PIRLS. To assess and compare the two formats as measures, we will employ a bifactor structural equation model with a general reading achievement factor and specific factors representing the digital and paper formats, capturing the unique aspects of each assessment mode alongside the shared construct. Notably, PIRLS categorizes reading into two broad purposes: reading for literary experience and reading to acquire and use information. Building on this categorization, we will construct two variables based on the stated purposes of reading: literary and informational. We will explore how these variables contribute to reading achievement and whether reading achievement varies between the digital and paper formats. The model will incorporate paths from ‘Literary’ and ‘Informational’ to both the general factor and the specific factors, allowing us to examine how each observed variable influences overall reading achievement and its specific manifestations in the digital and paper contexts. Observed indicators for each variable are included to ensure a comprehensive representation of the constructs. Finally, the analysis will control for socio-economic status (SES), immigrant background, and gender while exploring mode effects or bias in either mode.
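The bifactor specification described above can be summarized formally. The notation below is a minimal sketch under standard bifactor assumptions (orthogonal general and specific factors), not the study's exact model specification:

```latex
% Measurement model: indicator i, administered in mode m(i) \in \{D, P\}
% loads on the general reading factor and on its mode-specific factor
y_i = \lambda_{iG}\,\eta_G + \lambda_{i\,m(i)}\,\eta_{m(i)} + \varepsilon_i

% Orthogonality constraints typical of bifactor models
\operatorname{Cov}(\eta_G, \eta_D) = \operatorname{Cov}(\eta_G, \eta_P)
  = \operatorname{Cov}(\eta_D, \eta_P) = 0

% Structural part: covariates entering as controls
\eta_G = \gamma_1\,\mathrm{SES} + \gamma_2\,\mathrm{Immigrant}
  + \gamma_3\,\mathrm{Gender} + \zeta_G
```

Under this parameterization, a mode effect would appear as substantial loadings on (or variance in) the specific factors η_D and η_P beyond what the general factor η_G accounts for.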
Expected Outcomes
The study will employ a bifactor model with PIRLS 2021 data for Sweden to elucidate the multifaceted construct of reading literacy and potential mode effects between the digital and paper formats. While the empirical results are pending, we anticipate several key outcomes. We expect to observe variations in the relationships between our latent constructs and observed indicators depending on the mode of assessment. Based on previous findings, we tentatively expect to discern both general and specific factors, indicating that unique aspects of digital and paper reading processes impact reading achievement beyond the shared aspects captured by the general factor. This expectation is grounded in the understanding that different areas and processes of reading may exhibit varied patterns. For instance, we speculate that while informational reading might predominantly contribute to the general reading achievement factor, fictional or longer-text reading may load on the specific factors. This differentiation aims to provide a more nuanced understanding of the relationships within the reading achievement construct, considering the diverse reading activities and processes associated with digital and paper formats. The complexities revealed by our analyses may prompt inquiries into additional contextual factors, the stability of mode effects across different populations, and the longitudinal impact on reading outcomes. In conclusion, our study's expected outcomes encompass a comprehensive exploration of mode effects, the unique contributions of latent factors, the significance of specific indicators, implications for educational practice, and the identification of future research directions.
References
Almaskut, A., LaRoche, S., & Foy, P. (2023). Sample design in PIRLS 2021. TIMSS & PIRLS International Study Center. https://doi.org/10.6017/lse.tpisc.tr2103.kb9560
Baron, N. S. (2021). Know what? How digital technologies undermine learning and remembering. Journal of Pragmatics, 175, 27–37. https://doi.org/10.1016/j.pragma.2021.01.011
Cheung, K., Mak, S., & Sit, P. (2013). Online reading activities and ICT use as mediating variables in explaining the gender difference in digital reading literacy: Comparing Hong Kong and Korea. The Asia-Pacific Education Researcher, 22(4), 709–720. https://doi.org/10.1007/s40299-013-0077-x
Cho, B.-Y., Hwang, H., & Jang, B. G. (2021). Predicting fourth grade digital reading comprehension: A secondary data analysis of (e)PIRLS 2016. International Journal of Educational Research, 105, 101696. https://doi.org/10.1016/j.ijer.2020.101696
Delgado, P., Vargas, C., Ackerman, R., & Salmerón, L. (2018). Don’t throw away your printed books: A meta-analysis on the effects of reading media on reading comprehension. Educational Research Review, 25, 23–38. https://doi.org/10.1016/j.edurev.2018.09.003
Gilleece, L., & Eivers, E. (2018). Characteristics associated with paper-based and online reading in Ireland: Findings from PIRLS and ePIRLS 2016. International Journal of Educational Research, 91, 16–27. https://doi.org/10.1016/j.ijer.2018.07.004
Grammatikopoulou, E., Johansson, S., & Rosén, M. (2024). Paper-based and digital reading in 14 countries: Exploring cross-country variation in mode effects. Unpublished manuscript.
Jerrim, J., Micklewright, J., Heine, J.-H., Salzer, C., & McKeown, C. (2018). PISA 2015: How big is the ‘mode effect’ and what has been done about it? Oxford Review of Education, 44(4), 476–493. https://doi.org/10.1080/03054985.2018.1430025
Kingston, N. M. (2009). Comparability of computer- and paper-administered multiple-choice tests for K–12 populations: A synthesis. Applied Measurement in Education, 22(1), 22–37. https://doi.org/10.1080/08957340802558326
Krull, J. L., & MacKinnon, D. P. (2001). Multilevel modeling of individual and group level mediated effects. Multivariate Behavioral Research, 36(2), 249–277. https://doi.org/10.1207/S15327906MBR3602_06
Mullis, I. V. S., & Martin, M. O. (Eds.). (2015). PIRLS 2016 assessment framework (2nd ed.). Boston College, TIMSS & PIRLS International Study Center. http://timssandpirls.bc.edu/pirls2016/framework.html
Rasmusson, M., & Åberg-Bengtsson, L. (2015). Does performance in digital reading relate to computer game playing? A study of factor structure and gender patterns in 15-year-olds’ reading literacy performance. Scandinavian Journal of Educational Research, 59(6), 691–709. https://doi.org/10.1080/00313831.2014.965795