Session Information
10 SES 01 B, Field Experience in Teacher Education
Paper Session
Contribution
The aim of International Large-Scale Assessments (ILSAs) is to provide countries with comparable, anchored information as indicators of systemic health. Over time, however, it is not only the number of participating countries that has expanded, but also the scope of the constructs, the kinds of populations, and the methods used to assess these as accurately as possible. Kirsch, Von Davier, Gonzalez and Yamamoto (2013) make the important point that the basis of international large-scale assessments is to collect reliable, valid, and comparable information about the skills possessed by specific populations, together with an understanding of how those skills relate to educational, economic, and social outcomes. In this way, different perspectives on the role of ILSAs are accommodated, whether as agents of change (Ritzen, 2013) or as catalysts for educational effectiveness and development (Klieme, 2013). It may therefore not be desirable to draw comparisons on context-specific issues that vary vastly across countries (Klemenčič & Mirazchiyski, 2018).
For this reason, the current study uses the Progress in International Reading Literacy Study (PIRLS) 2016 data to illustrate the importance of national analysis strategies that relate the intended, implemented and attained curricula to national contexts, in the interest of informing policy decisions, practice and future directions. The study uses Austria's PIRLS 2016 Grade 4 data and South Africa's PIRLS Literacy 2016 data and firstly attempts to create a scale from the PIRLS 2016 teacher questionnaire reading instruction items. Secondly, this scale is used to determine its possible relationship to achievement of the Low and Advanced International Benchmarks for each country respectively, while controlling for school socio-economic status. Thirdly, differences in results between Austria and South Africa are related to curriculum intentions and expectations to provide a national context against which results can be interpreted. The study's objective is to show differences in teachers' reported patterns of reading instruction practice between two countries with differing education landscapes in order to make an argument for the necessity of in-depth national analyses to make meaning of contextual large-scale data.
As its theoretical framework, the study utilises the IEA's tripartite model of the intended, implemented and attained curricula (Robitaille, 1993). In using this model, the current study aims to interpret the attained curriculum beyond overall league-table standings. While overall scores are often driven by aptitude, student motivation and class (Schmidt et al., 2008), differences in teachers' reading instruction practices may provide a more nuanced reflection of contexts with equity differences and of the kinds of opportunities afforded to students to learn.
The main research question guiding the current study asks to what extent differences in benchmark achievement and in a scale of teachers' reported reading instructional activities between Austria and South Africa can be explained.
Sub-questions include:
- How do reported frequencies of specific reading instruction practices differ between Austria and South Africa?
- How is a scale of teachers’ reading instructional practices associated with the Low and Advanced International benchmarks for reading achievement for Austria and South Africa respectively?
- To what extent does the current study provide evidence of the importance of engagement with ILSA data at national levels between cycles of testing to ensure meaningful engagement beyond league table standings?
Method
This study will firstly examine questions from the PIRLS 2016 Teacher Questionnaire that deal with reading instruction practices (items R8, R9, R10 and R11). These questions asked teachers to rate the frequency with which they create reading instruction opportunities by:
1. Reading aloud to students, asking students to read aloud, asking students to read silently on their own, teaching students strategies for decoding sounds, words, and new vocabulary systematically, and providing opportunities for developing fluency (R8).
2. Providing reading materials that match students' reading interests and are appropriate for their reading levels, linking new concepts to prior knowledge, deepening understanding of text, discussing text, challenging opinions stated in text, reading text with multiple perspectives, reading books of students' own choosing, and giving individualized feedback (R9).
Additionally, questions 10 and 11 asked teachers to rate the frequency with which they asked their students to engage in certain reading instruction activities, such as:
3. Locating information in text, identifying main ideas, explaining and supporting understanding, comparing what was read to experiences or other readings, making predictions, making generalizations, evaluating and critiquing text style or structure, determining the author's perspective, self-monitoring reading, and teaching skimming or scanning strategies (R10).
4. Writing something, answering oral questions, talking with each other, or taking a written quiz or test after students have read something (R11).
Using SPSS, these questions will be scaled to provide scale scores for the Austrian and South African data respectively, as an indicator of the reading instructional opportunities teachers create or the activities they ask their students to do. Secondly, multiple regression using the IDB Analyzer will establish possible relationships between the reading instructional activities scale and achievement of the Low and Advanced International Benchmarks respectively for Austria and South Africa. The study takes into account the socio-economic status of schools in the two countries because of vast differences in their equity profiles. A developed, central European country like Austria stands in stark contrast to a developing context such as South Africa, a country increasingly characterised by inequitable distribution, where the gap between economically affluent segments of society and those living in extreme poverty is widening (Masipa, 2018). The rationale for using Low and Advanced International Benchmark performance as outcome measures is to show possible effects at the extreme ends of benchmark performance, where differences in students' reading skills are most pronounced.
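To make the analysis logic concrete, the minimal sketch below uses synthetic data to show how a summative reading instruction scale could be formed from the frequency items and related to benchmark attainment while controlling for school socio-economic status. It is an illustration only, under assumed variable names (item_1 to item_12, school_ses, reached_low_benchmark are invented for the example); the actual analyses are run in SPSS and the IEA IDB Analyzer, which additionally handle plausible values, sampling weights and jackknife variance estimation.
```python
# Hypothetical sketch of the analysis logic on synthetic data; not the study's
# actual SPSS / IDB Analyzer procedure.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_teachers = 500
n_items = 12  # stand-in for the R8-R11 frequency items (1 = never ... 4 = every day or almost every day)

# Synthetic teacher questionnaire responses and school-level variables (invented for illustration)
items = pd.DataFrame(
    rng.integers(1, 5, size=(n_teachers, n_items)),
    columns=[f"item_{i + 1}" for i in range(n_items)],
)
school_ses = rng.normal(0, 1, n_teachers)                  # school socio-economic composite
reached_low_benchmark = rng.binomial(1, 0.7, n_teachers)   # benchmark attainment indicator (illustrative)

# Step 1: simple summative scale of reported reading instruction frequency,
# standardised so that coefficients are comparable across the two country datasets
instruction_scale = items.sum(axis=1)
instruction_scale = (instruction_scale - instruction_scale.mean()) / instruction_scale.std()

# Step 2: regression of benchmark attainment on the instruction scale, controlling
# for school SES (the real analysis uses plausible values, weights and replication
# variance estimation via the IDB Analyzer)
X = sm.add_constant(pd.DataFrame({
    "instruction_scale": instruction_scale,
    "school_ses": school_ses,
}))
model = sm.Logit(reached_low_benchmark, X).fit(disp=0)
print(model.summary())
```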
Expected Outcomes
It is expected that the study will show different patterns in teachers' answers regarding reading instruction between Austria and South Africa, which will be discussed in the context of teacher education in the two countries respectively. Furthermore, the study is expected to provide evidence of differences in selected international benchmark results between Austria and South Africa and of how these might be explained by their possible association with the reading instruction scale. Findings will be related to curriculum intentions and expectations for each country respectively to provide a national context against which results can be interpreted. The study's objective is to show differences in teachers' reported patterns of reading instruction practice between two countries with differing education landscapes in order to make an argument for the necessity of in-depth national analyses to make meaning of international large-scale data. Given the difference in equity profiles between Austria and South Africa, the findings are expected to illustrate the importance of the social and contextual circumstances in which a country's education system operates. These issues cannot be isolated from, or ignored in, overall results for the sake of league-table comparisons as the overriding consideration in educational decision making. The current study does not dispute the value of overall results and international comparisons from ILSA studies, but it is hoped that it will provide a tangible illustration of how ILSA data should foremost be interrogated and analysed for national planning and development purposes.
References
Kirsch, I., Von Davier, M., Gonzalez, E., & Yamamoto, K. (Eds.). (2013). The role of international large-scale assessments: Perspectives from technology, economy, and educational research. Dordrecht: Springer.
Klemenčič, E., & Mirazchiyski, P. V. (2018). League tables in educational evidence-based policy-making: Can we stop the horse race, please? Comparative Education, 54(3), 309–324.
Klieme, E. (2013). The role of large-scale assessments in research on educational effectiveness and school development. In I. Kirsch, M. Von Davier, E. Gonzalez, & K. Yamamoto (Eds.), The role of international large-scale assessments: Perspectives from technology, economy, and educational research (pp. 115–147). Dordrecht: Springer.
Masipa, T. (2018). South Africa's transition to democracy and democratic consolidation: A reflection on socio-economic challenges. Journal of Public Affairs, 18(4), e1713.
Ritzen, J. (2013). International large-scale assessments as change agents. In I. Kirsch, M. Von Davier, E. Gonzalez, & K. Yamamoto (Eds.), The role of international large-scale assessments: Perspectives from technology, economy, and educational research (pp. 13–24). Dordrecht: Springer.
Robitaille, D. F. (1993). Curriculum frameworks for mathematics and science (TIMSS Monograph No. 1). Vancouver: Pacific Educational Press, Faculty of Education, University of British Columbia.
Schmidt, W. H., Houang, R. T., Cogan, L., Blömeke, S., Tatto, M. T., Hsieh, F. J., Santillan, M., Bankov, K., Han, S. I., Cedillo, T., & Schwille, J. (2008). Opportunity to learn in the preparation of mathematics teachers: Its structure and how it varies across six countries. ZDM, 40(5), 735–747. https://doi.org/10.1007/s11858-008-0115-y