Session Information
09 SES 01 A, Relating Reading Motivation and Behaviors to Reading Achievement – Findings from PIRLS and PISA
Paper Session
Contribution
Research on adolescents' literacy development, including cross-sectional and experimental studies, points to the key roles played by students' engagement in reading and their use of reading strategies in supporting literacy development (e.g., Artelt, Schiefele & Schneider, 2001; Brozo, Shiel & Topping, 2007/08; Cantrell et al., 2010; OECD, 2010). However, much of the earlier work on associations between engagement, strategy use and performance is based on how students engage with paper-based texts, and how well they read such texts.
Since 2015, reading literacy (and other domains) has been assessed on computer in most countries participating in the OECD Programme for International Student Assessment (PISA). PISA 2018 marked the first cycle since this transition in which reading was a major assessment domain. This meant that PISA assessed reading literacy with a broad range of texts and questions, including, for the first time, some suited to a computer-based environment, such as simulated websites and multiple texts. Furthermore, many of the questions on the PISA student questionnaire, which is administered in conjunction with the reading assessment, were adapted from earlier cycles to reflect students' engagement with digital texts and their endorsement of strategies required to understand digital texts.
While the initial international report on PISA 2018 (OECD, 2019) provided an overview of performance across participating countries, it did not provide a detailed analysis of students' reading engagement, their perceptions of themselves as readers or their endorsement of different reading strategies. The current paper seeks to provide a detailed consideration of relationships among students' reading habits and practices, their perceptions of themselves as readers, their endorsement of key reading comprehension strategies, the frequency with which they implement online reading strategies, and their performance on a computer-based test of reading literacy. The paper uses both descriptive statistics and multi-level models of reading literacy to examine these relationships.
The paper draws on data for six European countries: Ireland, the UK, Finland, Estonia, Poland and Sweden. It seeks to address the following questions:
a) How do students' engagement in reading and their endorsement of key reading strategies compare across the six countries, how do they relate to performance on a computer-based assessment of reading literacy, and how have student scores on these variables changed since PISA 2009?
b) What proportions of school- and student-level variance are explained by student engagement and strategy variables in each country, when other variables, including school- and student-level socioeconomic status, are held constant?
c) What are the implications of the outcomes for the development of reading in adolescents across and within countries, in the context of a transition to digital texts?
The paper is especially relevant to adolescents' literacy development, given substantial changes in students' reading habits even since PISA 2009 (when reading literacy was also a major assessment domain), with some educators worried that students' reading practices may no longer provide them with the skills required to achieve a deep understanding of both literary and informational texts, regardless of the format in which they read such texts (Shiel et al., in press).
Method
This paper uses both descriptive statistics (percentages, mean scores, correlations) and multi-level (hierarchical linear) models to examine associations between students' engagement in reading, their perceptions of themselves as readers, their endorsement and use of reading strategies, and their reading performance. Descriptive statistics are computed using the IDB Analyzer, a tool developed by the International Association for the Evaluation of Educational Achievement that takes into account the complex nature of the PISA dataset, which includes 10 plausible values (estimates of reading achievement) for each student. In particular, the IDB Analyzer facilitates the estimation of correct standard errors around percentages and scale scores. The Analyzer also allows for the computation of accurate standard errors of the difference, enabling an examination of differences between countries and between 2009 and 2018. Mplus Version 8 (with the multilevel add-on; Muthén & Muthén, 2018) was used to model performance on reading literacy by simultaneously taking school- and student-level variables into account.
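The pooling of estimates across plausible values described above can be sketched in a few lines. This is an illustrative outline only, using Rubin's rules with made-up numbers: a real analysis would use the 10 plausible values, the final student weights, and the balanced repeated replication weights in the PISA dataset, as the IDB Analyzer does.

```python
# Sketch: combining statistics computed on each plausible value (PV)
# into a single estimate and standard error via Rubin's rules.
# All numbers below are invented for illustration.

import statistics

def combine_plausible_values(pv_estimates, pv_sampling_vars):
    """Pool one statistic (e.g. a mean score) computed per plausible value.

    pv_estimates: the statistic computed separately on each PV.
    pv_sampling_vars: the sampling variance of each of those statistics
    (in PISA these come from balanced repeated replication).
    Returns (pooled_estimate, standard_error).
    """
    m = len(pv_estimates)
    pooled = sum(pv_estimates) / m                       # average over PVs
    sampling_var = sum(pv_sampling_vars) / m             # within-imputation variance
    imputation_var = statistics.variance(pv_estimates)   # between-PV variance
    total_var = sampling_var + (1 + 1 / m) * imputation_var
    return pooled, total_var ** 0.5

# Ten illustrative country-mean estimates, one per plausible value,
# with their (invented) sampling variances.
pvs = [518.2, 519.1, 517.6, 518.9, 518.4, 517.9, 519.3, 518.0, 518.7, 518.5]
svars = [4.1, 4.3, 4.0, 4.2, 4.1, 4.4, 4.0, 4.2, 4.3, 4.1]
mean, se = combine_plausible_values(pvs, svars)
```

The pooled standard error is always at least as large as the average sampling error alone, because the between-PV term captures the measurement uncertainty in the achievement estimates.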
Variables examined for inclusion in the models included school ESCS (economic, social and cultural status, a PISA-specific measure of socioeconomic status), student ESCS, student gender, student grade level, student immigrant status, student engagement in reading (a categorical variable, based on the frequency with which students reported engaging in reading various texts), student enjoyment of reading (a continuous variable based on students' responses to five questions about the extent to which they enjoy reading), students' perceptions of themselves as readers (a continuous variable, based on six questions about how students view themselves as readers), students' endorsement of reading strategies (continuous variables for understanding and remembering information, summarising information, and assessing the credibility of sources), and students' online reading strategies (categorical variables such as the frequency of evaluating two websites to identify which is more relevant to their work). Variables that were significant when entered into the model on their own were carried forward to the second stage of modelling, where variables were entered in clusters: background variables (including gender and ESCS), engagement and enjoyment variables, strategy variables, and variables related to the use of online strategies. Interactions among student-level variables and cross-level interactions were also examined.
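The partitioning of reading-score variance into school- and student-level components (research question b) can be illustrated with a minimal sketch of a null (empty) multilevel model. The data below are synthetic and the decomposition is the simple ANOVA-style one for a balanced design; the actual analyses use Mplus with survey weights and plausible values.

```python
# Sketch: decomposing score variance into between-school and
# within-school components, as in a null multilevel model.
# Synthetic, balanced data (20 schools x 30 students) for illustration.

import random
import statistics

random.seed(1)

schools = []
for _ in range(20):
    school_mean = random.gauss(500, 30)   # between-school variation
    # student-level residual variation around the school mean
    schools.append([random.gauss(school_mean, 80) for _ in range(30)])

grand_mean = statistics.mean(s for school in schools for s in school)
n = 30  # students per school

# Mean squared deviation of school means (estimates tau^2 + sigma^2 / n)
between = statistics.mean((statistics.mean(sch) - grand_mean) ** 2
                          for sch in schools)
# Average within-school variance (estimates sigma^2)
within = statistics.mean(statistics.variance(sch) for sch in schools)

# Between-school variance corrected for sampling error in school means
school_var = max(between - within / n, 0.0)
icc = school_var / (school_var + within)  # share of variance between schools
```

The intraclass correlation (`icc`) is the baseline against which the explained school- and student-level variance of the fitted models is judged: adding school ESCS, for example, typically reduces the between-school component.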
Expected Outcomes
Data for Ireland have already been examined. They show less frequent reading for enjoyment reported by boys, compared with girls, and a drop in frequency since 2009. While reading emails declined since 2009, chatting online and searching for information to learn about a topic increased. Strategies that were associated with paper-based reading in 2009 (understanding and remembering, summarising) were also associated with computer-based reading in 2018, as was a strategy examined for the first time: assessing the credibility of sources. Despite poorer overall performance, significantly more boys in 2018 strongly agreed or agreed that they were good at reading difficult texts, compared with girls, while significantly more girls agreed that they were good readers. The multi-level model showed the ongoing relevance of strategies such as understanding and remembering and summarising, despite the shift in outcome measure to computer-based texts in 2018, while the parameter for the new assessing credibility of sources scale was also statistically significant. Frequency of reading for enjoyment and enjoyment of reading were statistically significant while controlling for other variables in the model. The outcomes of the model support the ongoing relevance of engagement in reading and enjoyment of reading, as well as knowledge of reading strategies, for success in computer-based reading.
References
Artelt, C., Schiefele, U., & Schneider, W. (2001). Predictors of reading literacy. European Journal of Psychology of Education, 16(3), 363-383. http://dx.doi.org/10.1007/BF03173188
Brozo, W., Shiel, G., & Topping, K. (2007/08). Engagement in reading: Lessons learned from three PISA countries. Journal of Adolescent and Adult Literacy, 51(4), 301-315. doi:10.1598/JAAL.51.4.2
Cantrell, S. C., Almasi, J. F., Carter, J. C., Rintamaa, M., & Madden, A. (2010). The impact of a strategy-based intervention on the comprehension and strategy use of struggling adolescent readers. Journal of Educational Psychology, 102(2), 257-280. https://doi.org/10.1037/a0018212
OECD. (2010). PISA 2009 results: Learning to learn – Student engagement, strategies and practices (Volume III). Paris: OECD Publishing. http://dx.doi.org/10.1787/9789264083943-en
OECD. (2019). PISA 2018 results, Volume I: What students know and can do. Paris: OECD Publishing.
Shiel, G., McHugh, G., Denner, S., Delaney, M., & McKeown, C. (in press). Reading literacy in Ireland in PISA 2018: Performance, policy and practice. Dublin: Educational Research Centre.