Session Information
09 ONLINE 28 A, Impact of COVID-19 on Teaching and Learning
Paper Session
Contribution
The educational disruption caused by COVID-19 has been widespread across the world. Education policy makers and practitioners have turned to research to estimate the impact of the pandemic on learning outcomes. Such estimates are needed to make recovery plans and prepare for future disruptions. The research reported here is the result of transnational organisations cooperating with six African governments to implement the COVID-19: Monitoring Impacts on Learning Outcomes (MILO) project (UIS & ACER, 2022). Contextual data and data on reading and mathematics learning outcomes were collected and analysed to estimate the effects of the pandemic and to inform education policy and practice. The project made innovative use of Item Response Theory (IRT) methods to equate different baseline assessment data collected prior to COVID-19 onto a common metric to measure the magnitude of the effect.
The MILO project integrated global best practice with local needs. On the transnational side, the project was organised by the UNESCO Institute for Statistics (UIS) and funded by the Global Partnership for Education (GPE). The Australian Council for Educational Research (ACER) was the technical partner, and implementation support was provided by the Conference of Ministers of Education of French-Speaking Countries (CONFEMEN). At the national level, there were six participating countries: Burkina Faso, Burundi, Côte d’Ivoire, Kenya, Senegal and Zambia. One of the project goals was to build capacity in large-scale assessment in the participating countries.
Sustainable Development Goal (SDG) 4.1 provides a global target for countries around the world. However, not all countries have equal access to valid and reliable tools and methods (Gustafsson, 2021) to measure their progress towards SDG indicator 4.1.1b, “the proportion of children...at the end of primary ... achieving at least a minimum proficiency level in (i) reading and (ii) mathematics, by sex” (United Nations, 2015).
The pandemic has exacerbated the need for countries to monitor learning, with concerns about the impact of disruptions to education on the SDGs (United Nations Department of Economic and Social Affairs, 2020; UNESCO, 2020). Simulations on the impact of school closures predicted significant learning loss, which could continue even after schools re-opened and lead to reduced life outcomes (Azevedo et al., 2020; Kaffenberger, 2021).
Three research questions for the MILO project are addressed in this paper:
- What is the impact of COVID-19 on reading learning outcomes?
- What is the impact of COVID-19 on mathematics learning outcomes?
- What school- and student-level factors contributed to the efficacy of the educational response to the disruption caused by COVID-19?
An Assessment for Minimum Proficiency Levels (AMPL) was developed to assess reading and mathematics. The AMPL was based on the MILO Assessment Blueprint (ACER, 2021a) and was strongly aligned to the Global Proficiency Frameworks (USAID et al., 2020a, 2020b).
The MILO Conceptual Framework (ACER, 2021b) underpinned the design of the contextual questionnaires which focused on the following themes: the COVID-19 disruption; student characteristics; home environment; school environment; teaching and learning; and assessment and monitoring. The instruments were designed to collect comparable data across countries, while allowing countries to make the adaptations necessary to ensure they reflected their local contexts.
In the MILO project, the AMPL and contextual questionnaires provided policy makers from the six countries with data on their country’s unique responses to COVID-19 and the impact of the pandemic on learning. The AMPL also has application beyond the pandemic – it is an efficient and reliable global tool that can be used by countries and the international development community to monitor progress towards SDG 4.1.1b.
Method
The MILO study and instrument design, sampling, field operations and analytic approaches were all based on best practice (ACER & UIS, 2017).

Participants
A total of 31,590 Grade 5-7 students participated across the six countries. Students were sampled using a two-stage approach: 250-300 schools were sampled in each country, and then 20-25 students were sampled from the target grade within each school. School participation, including replacement schools, was 99-100% across countries. Student participation ranged from 84% to 98%.

Instrumentation
The AMPL used a rotated assessment design comprising 29 reading and 29 mathematics items sourced from the UIS’s Global Item Bank. A questionnaire was administered to each student. A country-specific subset of the reading and mathematics baseline assessments was administered to every student to establish the psychometric link to the pre-pandemic data. Students completed the AMPL (two hours) and the questionnaire (35 minutes) on day one, and the subset of the baseline assessments (two hours) on day two. School questionnaires were also administered to principals of the sampled schools, and a national-level representative completed the system questionnaire.

Analytic approach
The project used established analytic techniques from international large-scale assessments (von Davier & Sinharay, 2013). Using ACER ConQuest software (Adams et al., 2020), the Mixed Coefficients Multinomial Logit Model (MCMLM) described by Adams et al. (1997) was used to scale the AMPL data. After calibration, unidimensional scaling was conducted in each domain to yield plausible values for secondary analysis. An empirical link was made between the pre-pandemic baseline data and the AMPL using common item equating, thereby placing the baseline data on the AMPL scale. Sampling weights and replicate weight methods were used to account for the sampling design.
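The common item equating step can be illustrated with a minimal sketch. Assuming the Rasch special case of the MCMLM (slope fixed at 1), a mean-mean equating constant is the difference between the average difficulties of the common items as estimated in each calibration; all values below are hypothetical, and the project's actual equating was carried out in ACER ConQuest, not with this code.

```python
import numpy as np

def mean_mean_shift(b_baseline, b_ampl):
    """Mean-mean equating constant: the shift that places baseline
    estimates on the AMPL scale (Rasch case, common slope of 1)."""
    return float(np.mean(np.asarray(b_ampl)) - np.mean(np.asarray(b_baseline)))

# Hypothetical difficulties (logits) for five common items,
# estimated separately in the baseline and AMPL calibrations.
b_base = [-0.8, -0.2, 0.1, 0.6, 1.3]
b_ampl = [-0.6, 0.0, 0.3, 0.8, 1.5]

shift = mean_mean_shift(b_base, b_ampl)  # 0.2 with these numbers

# Baseline person abilities move onto the AMPL scale by the same shift.
theta_base = np.array([-1.0, 0.2, 0.9])
theta_on_ampl = theta_base + shift
```

Once the baseline abilities sit on the AMPL scale, the same cut-points can be applied to both the pre-pandemic and 2021 data.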
Standard setting
A participatory standard setting exercise was undertaken to establish cut-points for reading and mathematics on the AMPL. The cut-points refer to the Minimum Proficiency Level (MPL) referenced in SDG 4.1.1b. The MPL was set collaboratively, involving experts at the national level, and endorsed by the MILO countries and a broader group of international stakeholders. Once endorsed, the final cut-points were applied to report the proportion of students at or above the MPL for reading and mathematics in 2021. The same AMPL cut-points were then applied to report the proportion of students at or above the MPL prior to the pandemic. Estimates of the effect of COVID-19 were calculated by comparing the proportion of each population meeting or exceeding the MPL.
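The effect estimate reduces to comparing weighted proportions at or above the cut-point, averaged over plausible values. A minimal sketch with entirely hypothetical numbers (four students, five plausible-value draws, equal weights, an assumed cut-point of 500):

```python
import numpy as np

def pct_at_or_above_mpl(plausible_values, weights, cut_point):
    """Weighted percentage of students at or above the MPL,
    averaged over plausible values (rows = PV draws, cols = students)."""
    pv = np.asarray(plausible_values, dtype=float)
    w = np.asarray(weights, dtype=float)
    shares = [(w * (draw >= cut_point)).sum() / w.sum() for draw in pv]
    return 100 * float(np.mean(shares))

# Hypothetical plausible values on the AMPL scale.
pvs = np.array([[480, 510, 530, 470],
                [490, 505, 525, 465],
                [475, 515, 540, 480],
                [485, 500, 535, 460],
                [470, 520, 545, 475]])
w = np.ones(4)  # real analyses would use the sampling weights

p_2021 = pct_at_or_above_mpl(pvs, w, 500)  # 50.0 with these numbers
# The COVID-19 effect estimate is then p_2021 minus the same
# quantity computed on the equated pre-pandemic data.
```

Standard errors for such proportions would come from the replicate weight methods mentioned above, which this sketch omits.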
Expected Outcomes
Of the five countries that had reading data from before the pandemic, none experienced learning losses or gains. For mathematics, none of the six countries experienced learning losses for the end-of-primary population overall. Burkina Faso had learning gains in mathematics: 17.9% of the population met the MPL in 2019 compared to 23.7% in 2021. In Kenya, learning losses in mathematics were found for boys between 2019 and 2021.

Overall, the proportions of students that met the MPLs were small. For example, in 2021 fewer than 1% of students met the reading MPL in Burundi, and just over 2% met the mathematics MPL in Zambia. This may indicate a floor effect, making it unlikely for any learning loss to be observed as measured by the proportion of students meeting the MPL. Had the pandemic not occurred, learning gains may also have been observed.

Access to remote learning and technology were common barriers. However, one factor that was crucial across all countries was the family and community support provided to students. Students who received higher levels of family support for learning during the COVID-19 disruption had higher levels of proficiency in reading and mathematics. This finding aligns with other research that has highlighted the importance of parental involvement in academic achievement (e.g., UNICEF & SEAMEO, 2020). Principals recognised the importance of family and community support: the most commonly reported approach to minimising the impact of the pandemic on teaching and learning was engaging the broader community. While school and teacher support had a small association with student achievement in some countries, the relationship with achievement was not as strong or consistent as with family support.
References
Adams, R. J., Wilson, M., & Wang, W. C. (1997). The multidimensional random coefficients multinomial logit model. Applied Psychological Measurement, 21(1), 1–23.
Adams, R. J., Wu, M. L., Cloney, D., & Wilson, M. R. (2020). ACER ConQuest: Generalised Item Response Modelling Software (Version 5) [Computer software]. ACER.
Australian Council for Educational Research & UNESCO Institute for Statistics. (2017). Principles of Good Practice in Learning Assessment. ACER. https://research.acer.edu.au/cgi/viewcontent.cgi?article=1035&context=monitoring_learning
Australian Council for Educational Research. (2021a). Reading and Mathematics Assessment Blueprint: MILO. UIS. https://milo.uis.unesco.org/wp-content/uploads/sites/17/2022/01/COVID-19-MILO_Reading-and-Mathematics-Assessment-Blueprint-v1.0_31-Jan-2021.pdf
Australian Council for Educational Research. (2021b). Contextual Framework: MILO. UIS. https://milo.uis.unesco.org/wp-content/uploads/sites/17/2022/01/COVID-19-MILO_Contextual-Framework-v1.0_16-Jun-2021.pdf
Azevedo, J. P., Hasan, A., Goldemberg, D., Geven, K., & Iqbal, S. A. (2020). Simulating the potential impacts of COVID-19 school closures on schooling and learning outcomes: A set of global estimates [Policy research working paper]. The World Bank Research Observer, 36(1), 1–40.
Gustafsson, M. (2021). Assessing Learning Proficiency Levels and Trends for Sustainable Development Goal 4.1: A Focus on Africa. UNESCO. https://tcg.uis.unesco.org/wp-content/uploads/sites/4/2021/11/Measuring-Learning-Proficiency-SDG-4-1_Oct-2021.pdf
Kaffenberger, M. (2021). Modelling the long-run learning impact of the Covid-19 learning shock: Actions to (more than) mitigate loss. International Journal of Educational Development, 81, 102326. https://doi.org/10.1016/j.ijedudev.2020.102326
UNESCO Institute for Statistics & Australian Council for Educational Research. (2022). COVID-19 in Sub-Saharan Africa: Monitoring Impacts on Learning Outcomes. Main report. http://milo.uis.unesco.org/wp-content/uploads/sites/17/2022/01/MILO-Main-Report-SSA-Jan-2022_EN.pdf
UNESCO. (2020). Education: From disruption to recovery. https://en.unesco.org/covid19/educationresponse
UNICEF & Southeast Asian Ministers of Education Organization. (2020). SEA-PLM 2019 Main Regional Report: Children’s Learning in 6 Southeast Asian Countries. https://research.acer.edu.au/ar_misc/52/
United Nations Department of Economic and Social Affairs. (2020). Responding to COVID-19 and recovering better. UN Publishing. https://www.un.org/en/desa/covid-19
United Nations. (2015). Transforming our world: The 2030 Agenda for Sustainable Development. UN Publishing. https://sdgs.un.org/goals/goal4
United States Agency for International Development, UNESCO Institute for Statistics, The World Bank Group, Foreign, Commonwealth & Development Office, Australian Council for Educational Research, & Bill and Melinda Gates Foundation. (2020a). Global proficiency framework for reading. USAID. https://www.edu-links.org/sites/default/files/media/file/GPF-Reading-Final.pdf
United States Agency for International Development, UNESCO Institute for Statistics, The World Bank Group, Foreign, Commonwealth & Development Office, Australian Council for Educational Research, & Bill and Melinda Gates Foundation. (2020b). Global proficiency framework for mathematics: Grades 1 to 9. USAID. https://www.edu-links.org/sites/default/files/media/file/GPF-Math-Final.pdf
von Davier, M., & Sinharay, S. (2013). Analytics in international large-scale assessments: Item Response Theory and population models. In L. Rutkowski, M. von Davier, & D. Rutkowski (Eds.), Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis (1st ed.). Chapman and Hall/CRC. https://doi.org/10.1201/b16061