09 SES 03 A, Comparing Computer- and Paper-Based Assessment
The Programme for International Student Assessment (PISA) assesses the competencies of 15-year-old students in reading, mathematics and science. In 2012 the main focus of PISA was on mathematics. Although PISA has traditionally been conducted using pen-and-paper tests, in 2012 it included for the first time an optional computer-based assessment of mathematics (CBAM). Of the 65 economies participating in the 2012 round, 32 also conducted the computer-based mathematics test, among them 17 European countries. The rationale underlying the introduction of the CBAM was that “a level of competency in mathematical literacy in the twenty-first century includes usage of computers” (OECD, 2013a). The CBAM units therefore comprised both items that do not depend on the mode of assessment (pen-and-paper versus computer) and items that require knowledge of doing mathematics with the assistance of a computer. The latter category included items asking students to make charts from data, produce graphs, use on-screen calculators or rulers, sort information and plan efficient sorting strategies, or rotate and/or translate an image using a mouse, drawing on calculators, statistical software, geometric construction and visualisation utilities, and virtual measurement instruments (OECD, 2013a).
Although technology has the potential to modify all aspects of the assessment process (Almond, Steinberg, & Mislevy, 2003) and to allow for the assessment of a wider “bandwidth” of mathematical proficiency (Stacey & Wiliam, 2013), there are several reasons why a change in assessment mode may influence students’ results across countries (Jerrim, 2016). For example, schools may be less accustomed to conducting computer-based assessments, different cognitive processes may be needed for reading on paper and on screen (Hou, Rashid, & Lee, 2017), and students’ basic computer skills still vary widely across countries (OECD, 2015).
Motivation has long been regarded as the driving force behind students’ learning (Wigfield, Tonks, & Klauda, 2009), while mathematics self-related beliefs capture students’ subjective views of the domain. PISA 2012 data point to significant variance across countries and economies in students’ intrinsic interest in learning mathematics (OECD, 2013b). Students with low levels of mathematics self-efficacy (one’s belief that one can produce desired effects through one’s own actions) are at high risk of underperforming in mathematics, regardless of their actual abilities (Schunk & Pajares, 2009). Correspondingly, PISA 2012 results indicate that mathematics self-efficacy is associated with a difference of 49 score points in mathematics – the equivalent of more than one school year. Similarly, longitudinal studies of self-concept (perceived competence in mathematics) and achievement show that the two are reciprocally related over time (Marsh, Xu, & Martin, 2012), while across the PISA-participating countries 43% of students reported perceiving themselves as not good at mathematics (OECD, 2013b).
In view of these premises, this paper examines patterns in students’ mathematics achievement in the European countries that participated in both assessment modes – pen-and-paper and computer. In particular, we explore (a) what the predictors of achievement are for both assessment modes at the country level; (b) whether particular groups of students can be distinguished across European countries on the basis of their mathematics self-related beliefs and their ICT practices at school; and (c) whether students in any of these groups score higher or lower in mathematics, taking into account their achievement on the pen-and-paper and computer mathematics tests. We are especially interested in the typical mathematics problems that students in particular groups can solve in the computer-based assessment. We focus on the entire student body in each of the countries.
Almond, R. G., Steinberg, L. S., & Mislevy, R. J. (2003). A four-process architecture for assessment delivery, with connections to assessment design. Los Angeles, CA: University of California.
Hou, J., Rashid, J., & Lee, K. M. (2017). Cognitive map or medium materiality? Reading on paper and screen. Computers in Human Behavior, 67, 84–94.
Jerrim, J. (2016). PISA 2012: How do results for the paper and computer tests compare? Assessment in Education: Principles, Policy & Practice. DOI: 10.1080/0969594X.2016.1147420
Marsh, H. W., Xu, K., & Martin, A. J. (2012). Self-concept: A synergy of theory, method, and application. In K. Harris, S. Graham, & T. Urdan (Eds.), APA Educational Psychology Handbook, Vol. 1: Theories, Constructs, and Critical Issues (pp. 427–458). Washington, DC: American Psychological Association.
OECD (2013a). PISA 2012 Assessment and Analytical Framework: Mathematics, Reading, Science, Problem Solving and Financial Literacy. Paris: OECD Publishing.
OECD (2013b). Ready to Learn: Students’ Engagement, Drive and Self-Beliefs – Volume III. Paris: OECD Publishing.
OECD (2015). Students, Computers and Learning: Making the Connection. Paris: OECD Publishing.
Schunk, D. H., & Pajares, F. (2009). Self-efficacy theory. In K. R. Wentzel & A. Wigfield (Eds.), Handbook of Motivation at School (pp. 35–53). New York: Routledge.
Stacey, K., & Wiliam, D. (2013). Technology and assessment in mathematics. In M. A. Clements et al. (Eds.), Third International Handbook of Mathematics Education (pp. 721–751). New York: Springer.
Wigfield, A., Tonks, S., & Klauda, S. L. (2009). Expectancy–value theory. In K. R. Wentzel & A. Wigfield (Eds.), Handbook of Motivation at School (pp. 55–75). New York: Routledge.