09 SES 01 B, E-Assessment and Model-Test Critically Discussed
In the New Media Age, when digital devices that can readily replace printed books and newspapers have come into common use in many societies, research increasingly focuses on the constantly changing modes of reading and claims that new reading devices carry innovative types of texts requiring novel reading skills and methodologies. We claim that the shift from print to digital is not merely a change of instrument or data medium (like the earlier shifts from papyrus to paper, from scroll to newspaper and book, or from book to e-book), but also a significant change in the nature of text comprehension itself. Hence we can speak of Digital Literacy as a new and rapidly evolving field, and in order to understand it better we need to conduct reading assessments that involve the digital as well. This makes the authentic assessment of young people's reading extraordinarily complex.
Starting from this fundamental position, we discuss the problem of assessing digital reading literacy, and especially the key role of the Organisation for Economic Co-operation and Development (OECD)'s Programme for International Student Assessment (PISA) series of reading literacy assessments. OECD/PISA surveys are certainly the most globally famous and politically influential tests, and are widely accepted internationally. However, they are also the most controversial measurements of 15-year-old students' 'levels of competence': they generate very significant debates about their validity, and they trigger the greatest debates, often political, about educational practices all over the world, with the implication that teachers are failures or successes according to PISA rankings. In the era of digital reading, our previous knowledge about reading (e.g. its processes, functions and strategies) is shifting or conceptually challenged. Researchers make different, often contradictory, claims about the nature of digital reading, sustaining disputes about 'old school' versus 'new' reading, printed versus online materials, linear versus non-linear reading and so on, and OECD/PISA assessments necessarily become a kind of 'battlefield'. Scientific debate should help to develop common knowledge; however, if research results rest on flawed conceptual frameworks, false standards and misunderstandings, they will not lead to sensible debate and sound conclusions. The widespread disunity concerning digital reading theories and strategies hinders the main purpose of reading assessment, namely developing children's reading skills, and so provides little useful assistance for teachers in classroom practice; teachers are often the casualties on these high-stakes battlefields.
This was the case with OECD/PISA even in those cycles (2009, 2012, 2015 and 2018) when digital literacy was formally recognised and the assessment claimed to involve digital reading: it was not authentically included. We claim that (1) the theoretical background of the PISA assessments did not make sufficient allowance for the significant role of digital reading, and neither did the tasks and answer sheets; (2) PISA took digital writing skills and digital reading fluency among 15-year-old students for granted, an attitude that contradicted its own theoretical background; (3) these are significant theoretical mistakes with major effects on the assessments' results; (4) the results of the PISA surveys between 2009 and 2018 should therefore be re-evaluated and reinterpreted with proper inclusion of factors such as digital reading, writing skills and reading fluency; and (5) since this cannot be achieved retrospectively, we have strong grounds to be critical of the reading literacy assessments and their results across the period in question.
Our conceptual framework is a theoretical one, drawing on two aspects. The first is a concept of digital literacy: a literacy related to, but not limited by, traditional notions of literacy. Digital literacy involves reading and writing multimodal texts on electronic devices, using their affordances in text production and reception. The second aspect is phenomenology. We treat the PISA tests and their global impact as a real-world phenomenon that deserves study in its own distinctive form. Our research is grounded in the documents and artefacts produced by OECD/PISA and in the observable reactions to its tests and outcomes. The specific key resources are the official OECD/PISA reading literacy assessment reports and analytical frameworks published by the OECD itself since 2009. These documents are essential because they substantiate, or seek to substantiate, the whole PISA assessment system: the theoretical and methodological background of each survey and the developments that PISA experts claim to have made from one iteration to the next. The phenomenon that is PISA continues, in spite of much profound criticism and challenges to its practice, to be very popular, and governments tend to take its standards and analyses into account in their policymaking. We consider it a problematic phenomenon, criticise its methods and provide important insights into its problematic intervention in the field of contemporary education, with its many negative effects on students and teachers.
We believe that our work will help to clarify some misunderstandings concerning the nature of digital reading in the OECD/PISA reading literacy framework, drawing attention to the importance of authentic digital reading and showing that reading from a screen is more than a simple platform shift. We offer an alternative approach in order to argue for the need to develop the OECD/PISA digital reading literacy survey system, so as to come closer to meeting the highly difficult challenge of assessing contemporary digital reading. Developing our theories of digital reading and text comprehension could help to create and design not just better surveys but, in the long run, better educational materials. We evaluate the nature of PISA as a phenomenon, acknowledging its positive intentions but revealing some of its negative consequences and arguing for its reform in relation to digital reading assessment.
References
Chard, D. J., Pikulski, J. J. and McDonagh, S. H. (2006). Fluency: The link between decoding and comprehension for struggling readers. In T. Rasinski, C. Blachowicz and K. Lems (Eds.), Fluency Instruction: Research-Based Best Practices (pp. 39–61). New York: Guilford Press.
Coiro, J. and Dobler, E. (2007). Exploring the online reading comprehension strategies used by sixth-grade skilled readers to search for and locate information on the Internet. Reading Research Quarterly, 42(2), 214–257.
Goodwyn, A. (2013). Machines to think with? E-books, Kindles and English teachers, the much prophesied death of the book revisited. Changing English, 20(2), 148–159.
Goodwyn, A. (2013). E-readers and the future of reading in schools. In A. Goodwyn, C. Durrant and L. Reid (Eds.), International Perspectives on the Teaching of English (pp. 65–78). London: Routledge.
Goodwyn, A. (2014). Reading is now 'cool': A study of English teachers' perspectives on e-reading devices as a challenge and an opportunity. Educational Review, 66(3), 263–275.
Goodwyn, A. (2015). Is it still King Lear? The e-reader: The phenomenon of the Kindle and other reading devices. In T. Bin Lin, V. Chen and C. S. Chai (Eds.), New Media and Learning in the 21st Century: A Socio-Cultural Perspective (pp. 145–161). London: Springer.
Murnane, R., Sawhill, I. and Snow, C. (2012). Literacy challenges for the twenty-first century: Introducing the issue. The Future of Children, 22(2), 3–15.
OECD (2009). PISA 2009 Assessment Framework: Key Competencies in Reading, Mathematics and Science. Paris: OECD Publication Service.
OECD (2013). PISA 2012 Assessment and Analytical Framework: Mathematics, Reading, Science, Problem Solving and Financial Literacy. Paris: OECD Publication Service.
OECD (2016). PISA 2018 Draft Analytical Frameworks May 2016. Paris: OECD Publication Service.
OECD (2017). PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic [sic], Financial Literacy and Collaborative Problem Solving (Revised edition). Paris: OECD Publication Service.
Smolin, L. I. and Lawless, K. A. (2003). Becoming literate in the technological age: New responsibilities and tools for teachers. The Reading Teacher, 56(6), 570–577.
Szabó, K. (2016b). Digital and visual literacy: The role of visuality in contemporary online reading. In In the Beginning was the Image: The Omnipresence of Pictures: Time, Truth, Tradition (Visual Learning 6, pp. 103–112). Frankfurt am Main: Peter Lang.
Walsh, M. (2010). Multimodal literacy: What does it mean for classroom practice? Australian Journal of Language and Literacy, 33(3), 211–239.