Session Information
11 SES 13 A, School Financing and School Policy
Paper Session
Contribution
The education system is a key social institution of any modern state, critical to the socio-economic and cultural development of society. For this institution to work effectively in a complex and uncertain world, its management must be data-driven (Burns et al., 2016). Informed transformation of educational content and the systematic use of new technologies in pedagogical practice are important characteristics of effective educational systems (Nelson & Campbell, 2017; Wiseman, 2010). Only research can determine "what works", much as evidence-based practice has been successfully implemented in, for example, health care (Davies, 1999).
The lack of effective use of research in educational policy and practice is regularly highlighted, and mechanisms are sought to strengthen the impact of research on decision-making in education systems (OECD, 2022). The importance of, and need for, evidence-based decision-making became particularly vivid during the pandemic, which highlighted how the existing gap between research and policy decisions can have dramatic consequences (Stuart & Dowdy, 2021).
A fairly wide pool of interventions designed to improve educational outcomes exists and is well documented in international research. Prominent examples of cataloguing this type of information include the following electronic resources:
- the U.S. Department of Education's Institute of Education Sciences What Works Clearinghouse (WWC) open repository[1];
- Evidence for ESSA, a repository of practices supporting the Every Student Succeeds Act[2];
- the Best Evidence Encyclopedia (BEE) of educational practices, created by Johns Hopkins University[3].
Repositories such as these exist precisely because the view that any educational project is useful and effective is a fallacy. Moreover, evaluations of effectiveness are often highly contradictory, and it is not uncommon to find that proposed interventions produce no worthwhile outcome (Lortie-Forgues & Inglis, 2019). Nevertheless, the very fact that educational programs are evaluated for their effectiveness is extremely important for optimizing the resources devoted to their implementation and scaling.
The movement toward evidence-based education is most evident in English-speaking countries (Dekker & Meeter, 2022), although interest is growing in China (Slavin et al., 2021), Sweden (Segerholm et al., 2022), the Netherlands (Wubbels & van Tartwijk, 2017), Italy (Mincu & Romiti, 2022), and France (Bressoux et al., 2019)[4]. The movement is far less developed in the post-Soviet states. The importance of such studies has only increased since these countries began to participate in PISA, the first waves of which opened up possibilities for analyzing the connection between differences in the quality of educational outcomes and the reforms implemented (Khavenson & Carnoy, 2016).
Of particular interest is countries' ability to generate and use their own "contextualized" research data about "what works" in education. Here one encounters a problem: in general, there are few works characterizing the landscape of educational research in the post-Soviet space (Chankseliani, 2017; Hernández-Torrano et al., 2021). We found no systematic review or meta-analysis summarizing research on the impact of educational interventions on school students' academic outcomes that could allow policymakers and practitioners to construct informed educational policies.
The purpose of our paper is to evaluate the effectiveness of educational improvement programs in the post-Soviet states, since such information has not been systematized despite a clear demand for data-driven reforms. Our paper addresses two tasks leading to this overall goal: a meta-analysis of studies summarizing the experience of post-Soviet countries with interventions aimed at improving educational outcomes, and a search for a basis on which to build educational policies that take the current state of affairs into account.
[1] https://ies.ed.gov/ncee/wwc/
[2] https://www.evidenceforessa.org/page/about
[3] https://bestevidence.org/
[4] These references are omitted from the reference list due to lack of space.
Method
To answer the research question, we used meta-analysis methodology, carefully following the PRISMA statement guidelines (Page et al., 2021). To create the database, we applied the following selection criteria:
- the study must evaluate the effectiveness of a program aimed at improving schoolchildren's academic achievement or at developing skills that facilitate successful learning;
- the study must be conducted as an experiment (the presence of a randomization procedure was coded, but was not a strict exclusion criterion);
- schoolchildren must be the general population of the study;
- the study must be conducted on a sample of schoolchildren from one of the 15 post-Soviet countries;
- the text must be published in Russian or English;
- the research must be a published article in a peer-reviewed journal or a defended PhD thesis;
- the year of publication must be from 1992 to 2023.
To achieve the research goal, we systematically searched four scientific databases: Scopus, Web of Science, ProQuest, and Google Scholar (the search was conducted in September 2022). We searched for keywords characterizing the study design (experiment, control, RCT, etc.) and the dependent variable (achievement, performance, learning outcomes, etc.). A total of 262 publications were suitable for analysis; all of them were screened using Rayyan software. At the last step, we selected 27 papers for full-text analysis. Three members of the research group carried out the coding. The key parameters in the coding scheme were: authors, year, type of publication, country, sample characteristics, sample size, dependent variable, type of intervention, duration of intervention, and presence of randomization. We chose Cohen's d as the key statistic for effect size calculation and used an online calculator to convert the statistics published in the studies to d. Afterwards, we carried out a classical statistical meta-analysis with the random-effects model in JASP. In addition, we assessed the homogeneity of the studies and checked whether the results demonstrate publication bias; to estimate the bias, we used the graphical method and Egger's test, as well as selection models.
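The paper reports using an online calculator for the conversion to Cohen's d. As a minimal sketch of the standard meta-analytic formulas such calculators apply (the function names and example numbers below are ours, purely illustrative, not from the paper), the following Python snippet derives d and its sampling variance from commonly reported statistics:

```python
import math

def d_from_means(m1, m2, sd1, sd2, n1, n2):
    """Cohen's d from group means and SDs, using the pooled SD."""
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / sd_pooled

def d_from_t(t, n1, n2):
    """Cohen's d from an independent-samples t statistic."""
    return t * math.sqrt(1 / n1 + 1 / n2)

def d_variance(d, n1, n2):
    """Approximate large-sample variance of d, needed for meta-analytic weighting."""
    return (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

# Hypothetical study: treatment M=80, SD=12, n=50 vs control M=74, SD=12, n=48
d = d_from_means(80, 74, 12, 12, 50, 48)
v = d_variance(d, 50, 48)
print(round(d, 2), round(v, 3))  # d = 0.5, v = 0.042
```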
Expected Outcomes
Our final results are based on 28 effect sizes from 27 publications, with a total sample of 14,853 schoolchildren. The studies covered samples from grades 1 to 9; notably, there were no papers with high school students as the general population. Regarding program type, the vast majority were classified as pedagogical technologies. The overall mean effect size is 0.48, with a 95% confidence interval of 0.36 to 0.60 (and a range of 0.02 to 1.55). At the same time, the effect size varies greatly across studies: the heterogeneity indicator I² equals 82%. If we consider only the three studies in which proper randomization was carried out, the mean effect size of the interventions drops to 0.07 and becomes statistically insignificant.
We will build our discussion around the particular limitations and general barriers on the way to carrying out high-quality research. The available literature suggests that these countries have long faced, and still face: difficulties in accessing data when conducting research (Jonbekova, 2020); a specific research culture and methodology, especially regarding experimental research (Gromyko & Davydov, 1998); problems with reviewing standards, publishing, and academic integrity (Kuzhabekova & Mukhamejanova, 2017); and a generally low level of integration into international science. It is important to note that the idea of "what works" is only meaningful when the goals of the education system are clear (Hammersley, 2005), yet many countries have been dealing with far more severe issues since the collapse of the USSR. Politicians' statements about the need for research often amount to blindly following a fashion: the claim "we need data-driven policy" in a situation where there are almost no data is, at the very least, disingenuous.
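For readers who wish to reproduce this kind of pooled estimate, confidence interval, and I² outside JASP, here is a minimal sketch of the DerSimonian-Laird random-effects computation that underlies such results (the inputs are toy values, not the study's actual data):

```python
import math

def random_effects(d, v):
    """DerSimonian-Laird random-effects pooling.
    d: list of effect sizes; v: list of their sampling variances."""
    w = [1 / vi for vi in v]                      # inverse-variance (fixed-effect) weights
    sw = sum(w)
    d_fe = sum(wi * di for wi, di in zip(w, d)) / sw
    # Cochran's Q and the I^2 heterogeneity statistic
    q = sum(wi * (di - d_fe) ** 2 for wi, di in zip(w, d))
    df = len(d) - 1
    i2 = max(0.0, 100 * (q - df) / q) if q > 0 else 0.0
    # Between-study variance tau^2 (method of moments)
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights, pooled estimate, and 95% CI
    w_re = [1 / (vi + tau2) for vi in v]
    d_re = sum(wi * di for wi, di in zip(w_re, d)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return d_re, (d_re - 1.96 * se, d_re + 1.96 * se), i2

# Toy input: five hypothetical (effect size, variance) pairs
ds = [0.10, 0.35, 0.50, 0.80, 1.20]
vs = [0.04, 0.05, 0.03, 0.06, 0.05]
pooled, ci, i2 = random_effects(ds, vs)
print(f"d = {pooled:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}], I2 = {i2:.0f}%")
```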
References
Burns, T., Köster, F., & Fuster, M. (2016). Education Governance in Action. OECD. https://doi.org/10.1787/9789264262829-en
Chankseliani, M. (2017). Charting the development of knowledge on Soviet and post-Soviet education through the pages of comparative and international education journals. Comparative Education, 53(2), 265–283. https://doi.org/10.1080/03050068.2017.1293407
Davies, P. (1999). What is Evidence-based Education? British Journal of Educational Studies, 47(2), 108–121. https://doi.org/10.1111/1467-8527.00106
Dekker, I., & Meeter, M. (2022). Evidence-based education: Objections and future directions. Frontiers in Education, 7. https://doi.org/10.3389/feduc.2022.941410
Gromyko, Iu. V., & Davydov, V. V. (1998). The Conception of Experimental Work in Education: Ideas for a formative experiment. Journal of Russian & East European Psychology, 36(4), 72–82. https://doi.org/10.2753/RPO1061-0405360472
Hernández-Torrano, D., Karabassova, L., Izekenova, Z., & Courtney, M. G. R. (2021). Mapping education research in post-Soviet countries: A bibliometric analysis. International Journal of Educational Development, 87, 102502. https://doi.org/10.1016/j.ijedudev.2021.102502
Jonbekova, D. (2020). Educational research in Central Asia: methodological and ethical dilemmas in Kazakhstan, Kyrgyzstan and Tajikistan. Compare: A Journal of Comparative and International Education, 50(3), 352–370. https://doi.org/10.1080/03057925.2018.1511371
Khavenson, T., & Carnoy, M. (2016). The unintended and intended academic consequences of educational reforms: the cases of Post-Soviet Estonia, Latvia and Russia. Oxford Review of Education, 42(2), 178–199. https://doi.org/10.1080/03054985.2016.1157063
Kuzhabekova, A., & Mukhamejanova, D. (2017). Productive researchers in countries with limited research capacity. Studies in Graduate and Postdoctoral Education, 8(1), 30–47. https://doi.org/10.1108/SGPE-08-2016-0018
Lortie-Forgues, H., & Inglis, M. (2019). Rigorous Large-Scale Educational RCTs Are Often Uninformative: Should We Be Concerned? Educational Researcher, 48(3), 158–166. https://doi.org/10.3102/0013189X19832850
Nelson, J., & Campbell, C. (2017). Evidence-informed practice in education: meanings and applications. Educational Research, 59(2), 127–135. https://doi.org/10.1080/00131881.2017.1314115
OECD. (2022). Who Cares about Using Education Research in Policy and Practice? OECD. https://doi.org/10.1787/d7ff793d-en
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ, n71. https://doi.org/10.1136/bmj.n71
Stuart, E. A., & Dowdy, D. W. (2021). Evidence-based COVID-19 policy-making in schools. Nature Medicine, 27(12), 2078–2079. https://doi.org/10.1038/s41591-021-01585-2
Wiseman, A. W. (2010). The Uses of Evidence for Educational Policymaking: Global Contexts and International Trends. Review of Research in Education, 34(1), 1–24. https://doi.org/10.3102/0091732X09350472