Session Information
31 SES 14 A, Enhancing Learners’ Reading and Writing Skills via Intervention and Assessment
Paper Session
Contribution
This paper presents the research design and first results of a study that evaluates the implementation and effectiveness of a comprehensive reading and writing program. The results contribute to the internationally growing field of implementation research. A recent literature review shows that many implementation studies focus only on the (science-based) intervention itself but do not adequately take into account other factors that might influence the implementation process and its outcomes (Schrader et al., 2020). By analyzing the implementation of the program at multiple levels, we expect to contribute valuable knowledge to this field. The target group of the program is fifth graders at non-academic track secondary schools in Germany.
Being literate is a central prerequisite for school education and for participating in society. If learners’ competences in reading and writing do not keep up with what is required at an age-appropriate level, individuals are likely to be negatively affected in their socialization. Results of several international school achievement studies show that the reading and writing skills of students in Germany are average, but vary widely (Mullis et al., 2017; OECD, 2019; Stanat et al., 2017): 20.7% of the German students who took part in PISA 2018 are struggling readers (OECD, 2019). Students at non-academic track schools in Germany show lower reading competence than students at academic track schools (a difference of 120 points; OECD, 2019). National achievement studies also confirm these deficits (Stanat et al., 2016; 2017).
Given these trends, the Ministry of Education, Youth and Sports in Baden-Wuerttemberg commissioned the conception, implementation and evaluation of a training program to promote basic reading and writing skills in the fifth grade at 59 non-academic track secondary schools. The training program is a pilot project with the goal of scaling the program up if it proves effective in the evaluation.
The Mercator-Institute for Literacy and Language Education at the University of Cologne developed learning materials for the program based on scientific findings on linguistic competence development. The aim is to promote competence development in four fields: reading fluency, reading strategies, writing fluency and writing strategies. Research has shown that they are essential for the further development of reading and writing skills (e.g. Graham & Harris, 2018; National Reading Panel, 2000).
The program started in September 2020 with a training by the Mercator-Institute for multipliers on the promotion of reading fluency. The Centre for School Quality and Teacher Education Baden-Wuerttemberg carried out the training, and the multipliers then instructed the teachers at the participating schools with the Centre’s support. From February 2021, the students will work on their reading fluency with the material. Each student receives an audio pen and a workbook to use in class over 7 weeks (4 days per week, 20 minutes per day). The multipliers continuously support the participating schools and teachers during the implementation period. The promotion of reading strategies, writing fluency and writing strategies follows the same pattern. Such a stepwise procedure to promote skills at school proved effective in previous projects (e.g. Titz et al., 2017).
The Institute for Educational Analysis Baden-Wuerttemberg is in charge of evaluating the effectiveness of the training intervention in cooperation with the University of Muenster. The evaluation also aims at identifying necessary adaptations to the program and at gaining general knowledge on factors supporting its implementation. The study presented here refers to the evaluation of the reading skills training and addresses two research questions (RQ):
RQ1 How effective is the training program regarding the learning outcomes?
RQ2 What are the main factors influencing the successful implementation of a science-based intervention in practice?
Method
To assess the effectiveness of the training intervention (RQ1), we chose a quasi-experimental control group design. The students in the intervention group (N = approx. 1700) receive the training and take part in several standardized tests and questionnaires. The control group (N = approx. 1000) solely participates in the tests and questionnaires but does not receive the training. Several performance indicators will be used to analyze and compare the reading skills development of the students between groups. One indicator is the result of an annual statewide standardized reading competence test (Lernstand 5; Institute for Educational Analysis Baden-Wuerttemberg, 2020). Prior to the implementation of the training material, the students of both groups will also participate in a standardized reading comprehension/fluency test (ELFE II; Lenhard et al., 2020). Until June 2021 all students will complete 6 standardized web-based reading-diagnostic tests (quop; Förster & Souvignier, 2011), assessing the development of their reading ability every 3-4 weeks. This design allows us to compare the learning gains of the students between groups and across time. In order to guarantee the comparability of the students within and between groups and to account for context factors, additional data will be collected: sociodemographic indicators, general cognitive abilities (matrices sub-scale of the CFT 20-R; Weiß, 2019), instructional quality, COVID-19 related school closures or distance learning settings, as well as teacher characteristics and school context factors. To identify supporting factors for the implementation and to assess the impact of the accompanying measures of the program on its success (RQ2), data on these measures will be collected via questionnaires. This includes the evaluation of the trainings of the multipliers and teachers as well as the transfer of the training content to the classroom.
More specifically, multipliers, teachers and students will be asked about implementation fidelity, the quality and quantity of the specific training received, and supplementary measures. Regular informal meetings with all parties involved are expected to serve as an additional data source for insights into the implementation process.
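The comparison of learning gains between groups across the repeated measurement points can be sketched as a growth-model analysis. The following is a minimal illustration, not the study’s actual analysis plan: all column names, simulated values and the assumed effect size (an extra 0.8 points of gain per wave for the intervention group) are hypothetical.

```python
# Hedged sketch: a random-intercept growth model on repeated reading scores,
# assuming long-format data with columns student, group, time, score.
# All names and numbers are illustrative, not the study's actual schema.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

n_students, n_waves = 200, 6                      # e.g. 6 quop measurement points
students = np.repeat(np.arange(n_students), n_waves)
time = np.tile(np.arange(n_waves), n_students)
group = (students < n_students // 2).astype(int)  # 1 = intervention, 0 = control

# Simulate scores: individual baselines, a common gain per wave,
# plus an assumed extra gain of 0.8 per wave in the intervention group.
baseline = rng.normal(50, 8, n_students)[students]
score = baseline + 1.5 * time + 0.8 * group * time + rng.normal(0, 3, len(time))

df = pd.DataFrame({"student": students, "group": group,
                   "time": time, "score": score})

# The time:group interaction estimates the difference in learning gains
# between intervention and control, net of baseline differences.
model = smf.mixedlm("score ~ time * group", df, groups=df["student"]).fit()
print(model.params["time:group"])  # estimated extra gain per wave
```

In a real evaluation one would additionally adjust for the covariates named above (cognitive abilities, instructional quality, school context), but the interaction term carries the same interpretation.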
Expected Outcomes
The program and the data collection started in autumn 2020, and we will be able to present results on the reading skills training at the conference (data collection is planned to be finalized by the beginning of June 2021). By comparing the reading development between the two groups, we expect results that give insight into the effectiveness of the training program. Via questionnaires, we hope to gain better knowledge about supporting and challenging factors when implementing a science-based intervention in practice. In many European countries, the development of educational programs for the improvement of education is under way. We expect to contribute relevant and valuable knowledge to the internationally growing field of implementation research. More precisely, our study adds to the perspective of evaluating educational programs ‘as conducted’ when applied under ecologically valid conditions (Century & Cassata, 2016). A particular strength of the pilot project is that different institutions, policymakers and practitioners work closely together – not only while designing the intervention and its evaluation, but also during the whole implementation period. On the one hand, we expect that a close exchange between practitioners, policymakers and scientists from different disciplines fosters a deeper understanding of the results. On the other hand, we hope to be able to adjust certain parts of the intervention and evaluation during their realization. We are optimistic that this might increase the acceptance of scientific interventions on the part of practitioners and policymakers. At the same time, working collaboratively might help scientists to better address the needs of practitioners when planning and conducting interventions, and thus enable them to better address the needs of students. These aspects are important for educational scientists who want their research to contribute to society and towards an evidence-based education.
References
Century, J., & Cassata, A. (2016). Implementation research: Finding common ground on what, how, why, where, and who. Review of Research in Education, 40(1), 169–215.
Förster, N., & Souvignier, E. (2011). Curriculum-based measurement: Developing a computer-based assessment instrument for monitoring student reading progress on multiple indicators. Learning Disabilities: A Contemporary Journal, 9(2), 65–88.
Graham, S., & Harris, K. R. (2018). Evidence-based writing practices: A meta-analysis of existing meta-analyses. In R. Fidalgo, K. R. Harris, & M. Braaksma (Eds.), Design principles for teaching effective writing (pp. 13–37). Brill. https://doi.org/10.1163/9789004270480_003
Institute for Educational Analysis Baden-Wuerttemberg (2020). Lernstand 5. https://ibbw.kultus-bw.de/,Lde/Startseite/Kompetenzmessung/Lernstand+5
Lenhard, W., Lenhard, A., & Schneider, W. (2020). ELFE II – Ein Leseverständnistest für Erst- bis Siebtklässler – Version II [A reading comprehension test for first to seventh graders] (4th ed.). Hogrefe.
Mullis, I. V. S., Martin, M. O., Foy, P., & Hooper, M. (2017). PIRLS 2016 international results in reading. TIMSS & PIRLS International Study Center, Lynch School of Education, Boston College.
National Reading Panel (2000). Report of the National Reading Panel: Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Reports of the subgroups. National Institute of Child Health and Human Development.
OECD (2019). PISA 2018 results (Volume I): What students know and can do. OECD Publishing. https://doi.org/10.1787/5f07c754-en
Schrader, J., Hasselhorn, M., Hetfleisch, P., & Goeze, A. (2020). Implementation research: How science can contribute to improvements in the education system. Zeitschrift für Erziehungswissenschaft, 23(2), 9–59. https://doi.org/10.1007/s11618-020-00927-z
Stanat, P., Böhme, K., Schipolowski, S., & Haag, N. (Eds.) (2016). IQB Trends in Student Achievement 2015: The second national assessment of German and mathematics at the end of the ninth grade. Waxmann.
Stanat, P., Schipolowski, S., Rjosk, C., Weirich, S., & Haag, N. (Eds.) (2017). IQB Trends in Student Achievement 2016: The second national assessment of German and mathematics at the end of fourth grade. Waxmann.
Titz, C., Geyer, S., Ropeter, A., Wagner, H., Weber, S., & Hasselhorn, M. (2017). Konzepte zur Sprach- und Schriftsprachförderung entwickeln [Developing concepts to promote oral and written language]. Kohlhammer.
Weiß, R. H. (2019). CFT 20-R mit WS/ZF. Grundintelligenz Skala 2 – Revision mit Wortschatztest und Zahlenfolgentest – Revision [Cattell’s fluid intelligence test, Scale 2] (2nd ed.). Hogrefe.