Session Information
09 SES 07 A, Innovations, Challenges, and Insights from International Large-Scale Assessments (Part 2): Methodological Challenges
Symposium
Contribution
Recently, educational testing has been transitioning from paper-based assessment (PBA) to computer-based assessment (CBA) at the international and national levels. For instance, the Progress in International Reading Literacy Study (PIRLS) started this transition in 2016, and the Trends in International Mathematics and Science Study (TIMSS) in 2019. This change might introduce unintended comparability and fairness issues. Such issues can arise from, for example, students’ access to and familiarity with digital devices, which is potentially related to their socioeconomic background. The extent of these influences may also vary considerably across countries. The present study investigates mode effects in PIRLS 2021 and TIMSS 2019 for all participating educational systems in the context of test fairness. The Standards for Educational and Psychological Testing (American Educational Research Association [AERA] et al., 2014) state that fairness has no single technical meaning. Four general aspects of fairness to consider are equivalent opportunity for test-takers in the testing process, the lack of measurement bias, access to the construct(s) as measured, and the validity of individual test score interpretations for the intended uses (AERA et al., 2014). Mode effects were defined by Kroehne et al. (2019) as “differences in measurement caused by unequal properties of different test administrations, holding test items (e.g., the stimulus, the reading text, and questions) constant” (p. 99). This study explores the extent of mode effects while controlling for individual covariates, such as gender, language spoken at home, home resources, and access to digital devices. We also investigate whether unobserved heterogeneity remains at the country level. The analyses are based on the publicly available TIMSS 2019 and PIRLS 2021 data, collected using representative samples in grade four.
The applied method is mixed-effects Rasch modeling, using the item response data of the educational systems that administered the new CBA test and also PBA to a randomly equivalent sample of students, i.e., 25 systems in PIRLS 2021 and 28 in TIMSS 2019 (Martin et al., 2020; Mullis & Martin, 2019). The preliminary results indicate statistically significant mode effects after controlling for individual characteristics. The mode effects and the influence of individual characteristics differ across subject domains. Our preliminary results are in line with previous empirical evidence, but to our knowledge, this study is the first to explore several educational systems and multiple subject domains while also accounting for individual characteristics and the nested structure of the data.
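A mixed-effects Rasch model of this kind can be sketched as follows; the notation and the exact specification below are an illustrative assumption on our part, not the authors' stated model. For person p answering item i in country c, with a mode indicator and a vector of individual covariates:

```latex
\operatorname{logit}\Pr(X_{pi}=1)
  \;=\; \theta_p - b_i
  \;+\; \gamma\,\mathrm{mode}_p
  \;+\; \boldsymbol{\beta}^{\top}\mathbf{z}_p
  \;+\; u_{c(p)},
\qquad u_c \sim \mathcal{N}(0,\sigma_u^2)
```

Here θ_p is person ability, b_i is item difficulty, γ captures the average CBA-versus-PBA mode effect, z_p collects the individual covariates (gender, language spoken at home, home resources, access to digital devices), and the random intercept u_c absorbs remaining country-level heterogeneity; a nonzero estimated σ_u² would indicate unobserved heterogeneity across educational systems.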
References
AERA, APA, & NCME. (2014). Standards for educational and psychological testing. American Educational Research Association.
Kroehne, U., Buerger, S., Hahnel, C., & Goldhammer, F. (2019). Construct equivalence of PISA reading comprehension measured with paper-based and computer-based assessments. Educational Measurement: Issues and Practice, 38(3), 97–111. https://doi.org/10.1111/emip.12280
Martin, M. O., von Davier, M., & Mullis, I. V. S. (Eds.). (2020). Methods and procedures: TIMSS 2019 technical report. TIMSS & PIRLS International Study Center, Boston College.
Mullis, I. V. S., & Martin, M. O. (Eds.). (2019). PIRLS 2021 assessment frameworks. TIMSS & PIRLS International Study Center, Boston College.