26 SES 01 A, New Insights into Leadership Practices
Decision-making responsibilities at the school level have grown substantially in most education systems over the last few decades (e.g., Cheng, Ko, and Lee 2016; Woessmann et al. 2009; OECD 2012). School leaders, as the final arbiters in school-level decision-making processes (Spillane and Lee 2014), are autonomous with respect to several functional domains. The large educational effectiveness research (EER) knowledge base demonstrates that schools and school leaders can make a difference in terms of increasing student achievement, but that certain interventions are more effective than others (e.g., Scheerens 2016; Hattie 2009; Robinson, Hohepa, and Lloyd 2009). At the heart of EER are two foundational questions: ‘what makes a “good” school?’ and ‘how do we make more schools “good”?’ (Reynolds et al. 2014). Given the mission of EER, there should ideally be considerable congruence between the school interventions seen in practice and the factors that EER researchers have studied and found to be associated with high(er) effect sizes.
Whether school leaders act in accordance with the EER knowledge base is an interesting question about which little is known. Do the interventions that take place in school practice reflect the findings of the EER literature? Where analogies are present, the question arises whether the resemblance is close, with a high degree of likeness between practice and theory, or whether the parallel is more general. Likewise, whether the school interventions introduced in practice are those with high(er) effect sizes is another relevant question. A thematic identification of the analogies between current school interventions and EER factors might provide valuable information for educational researchers, policymakers, practitioners, and training institutes internationally. Comparative analyses might reveal that school leaders address issues that have received only limited research attention; conversely, the findings might suggest that researchers have put substantial effort into determining the effectiveness of interventions that are rarely seen in practice. If so, new questions would arise for all those involved in school improvement.
The aim of this paper is to present an analysis of how current school interventions relate to the effectiveness factors presented in three internationally authoritative EER syntheses. To this end, we conducted a comparative analysis contrasting a dataset of nearly 600 actual school interventions, identified through a questionnaire distributed among school leaders, with the effectiveness factors presented by Robinson, Hohepa, and Lloyd (2009); Scheerens (2016); and Hattie (2009). We additionally conducted an in-depth analysis of the three most frequently reported school interventions. The paper, moreover, reports the mean effect sizes and ranks of all school interventions with an analogous effectiveness factor.
The research context is Dutch secondary education, which, by international standards, features a very high level of school autonomy (OECD 2016b). Within a framework of learning objectives, standardized examinations, and block grants established by the national government, the administration of Dutch schools is highly decentralized, implying a broad range of decision-making responsibilities at the school level.
The dataset of current Dutch secondary school interventions was derived from a digital questionnaire with open-ended questions, distributed in January 2015 among 543 school leaders with ultimate decision-making responsibility at the school level. In total, 196 school leaders completed the questionnaire, corresponding to approximately 14% of the total population. Even though the group that received an email invitation was, by design, not completely random, the distribution of school characteristics (e.g., education type, available educational tracks, and province) across the response group did not differ substantially from national figures (DUO 2015).

Despite the recognized limitations of meta-analyses, we followed Higgins’s (2016, 40) reasoning that ‘the data from meta-analysis offer the best source of information to address cumulative questions about effects in different areas of educational research.’ For this reason, only synthesis studies were incorporated into this explorative comparative analysis. Since no syntheses exist that are based exclusively on Dutch secondary education research, we relied on three internationally authoritative studies. To analyze the interventions from both a school perspective and a school leadership perspective, we used syntheses from both effectiveness traditions. We then compared each school intervention in the dataset to each individual factor presented in each synthesis study, according to the available definitions or descriptions. Each analogy between a school intervention and an effectiveness factor was categorized into one of four analogy types, based primarily on the abstraction level of the effectiveness factor in question.
Some effectiveness factors were described in a fairly specific and clear manner; examples include bilingual programs, career interventions, and ability grouping initiatives. Other factors had a more general or comprehensive character, e.g., school climate, curriculum quality, and educational leadership. These differing levels of abstraction, and the effect sizes that accompanied them, influenced both the room for interpretation and the applicability of the factors. To manage these differences, we differentiated among four analogy types: three were based on the specificity of the effectiveness factor and the presence of an effect size, and a fourth indicated the lack of an analogous effectiveness factor. A fifth option, of course, also existed: effectiveness factors that demonstrated no parallel with any school intervention. These, however, were not included in our comparative analysis.
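The classification procedure described above (each reported intervention compared with each effectiveness factor and assigned one of four analogy types) can be sketched schematically as follows. This is a minimal, hypothetical illustration: the factor names, effect sizes, and exact type labels are invented, and the study itself performed this classification qualitatively against the definitions in the three syntheses, not by automated keyword matching.

```python
# Hypothetical sketch of the analogy-classification step. All names,
# labels, and numbers below are illustrative assumptions, not data or
# categories from the study.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Factor:
    name: str                     # effectiveness factor as named in a synthesis
    specific: bool                # specifically described vs. general/comprehensive
    effect_size: Optional[float]  # aggregated effect size, if the synthesis reports one

def classify_analogy(intervention: str, factors: List[Factor]) -> str:
    """Assign one of four analogy types to a reported school intervention."""
    matches = [f for f in factors if f.name in intervention.lower()]
    if not matches:
        return "no analogous factor"                   # type 4
    factor = matches[0]
    if factor.specific and factor.effect_size is not None:
        return "specific analogy with effect size"     # type 1
    if factor.specific:
        return "specific analogy without effect size"  # type 2
    return "general analogy"                           # type 3

factors = [
    Factor("ability grouping", specific=True, effect_size=0.12),
    Factor("school climate", specific=False, effect_size=None),
]
print(classify_analogy("introduce ability grouping in mathematics", factors))
# -> specific analogy with effect size
```

The fifth option mentioned above, effectiveness factors with no matching intervention at all, falls outside this function because it is a property of the factor set, not of any single intervention.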
This study’s comparative analysis demonstrated that a wide range of current Dutch secondary school interventions lacked an analogous factor in at least one of the examined syntheses, despite the relatively inclusive stance we adopted in identifying analogies. In other words, the EER meta-analyses offered little to no support for a substantial share of the actual practices of Dutch secondary schools and school leaders. The general analogies and, where applicable, the corresponding aggregated effect sizes may serve as a broad indication for school leaders, researchers, policymakers, and training institutes of the thematic direction of current school interventions and their expected overall effectiveness. These analogies are, however, less informative for school leaders seeking EER-based evidence to use in decisions about specific school interventions, or for researchers and policymakers aiming to study the effectiveness of current school interventions.

Additionally, the mean effect sizes and ranks, along with the findings regarding the three most frequently implemented interventions, showed that the three syntheses offered little evidence that the vast majority of interventions substantially improve student achievement. Where present, the corresponding effect sizes were relatively low, or even negative. Still, it is these interventions that shape current Dutch secondary school practice. Our findings raise questions in light of the ongoing debate about the gap between educational research and practice. In a world in which school leaders in particular, and practitioners in general, are increasingly urged to use the accumulating supply of evidence in their daily practice, it is critical to identify whether school leaders are generally reluctant to use evidence or whether they consider the available evidence insufficiently applicable to their school practice.
Insights like these are highly relevant for the EER domain as it seeks to increase its ‘impact for good upon children’ (Reynolds et al. 2014, 198).
Cheng, Yin Cheong, James Ko, and Theodore Tai Hoi Lee. 2016. "School autonomy, leadership and learning: a reconceptualisation." International Journal of Educational Management 30 (2): 177-196. doi:10.1108/IJEM-08-2015-0108.
DUO. 2015. School Locations in Dutch Secondary Education.
Hattie, John. 2009. Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. New York, NY: Routledge.
Higgins, Steve. 2016. "Meta-synthesis and comparative meta-analysis of education research findings: some risks and benefits." Review of Education 4 (1): 31-53. doi:10.1002/rev3.3067.
OECD. 2012. Education at a Glance 2012: OECD Indicators. Paris: OECD Publishing.
OECD. 2016a. Netherlands 2016: Foundations for the Future. Reviews of National Policies for Education. Paris: OECD Publishing.
OECD. 2016b. PISA 2015 Results (Volume II): Policies and Practices for Successful Schools. Paris: OECD Publishing.
Reynolds, David, Pam Sammons, Bieke De Fraine, Jan Van Damme, Tony Townsend, Charles Teddlie, and Sam Stringfield. 2014. "Educational effectiveness research (EER): a state-of-the-art review." School Effectiveness and School Improvement 25 (2): 197-230. doi:10.1080/09243453.2014.885450.
Robinson, Viviane, Margie Hohepa, and Claire Lloyd. 2009. School Leadership and Student Outcomes: Identifying What Works and Why. Best Evidence Synthesis Iteration (BES). Wellington, New Zealand: Ministry of Education.
Scheerens, Jaap. 2016. Educational Effectiveness and Ineffectiveness: A Critical Review of the Knowledge Base. Dordrecht, the Netherlands: Springer.
Spillane, James P., and Linda C. Lee. 2014. "Novice school principals' sense of ultimate responsibility: Problems of practice in transitioning to the principal's office." Educational Administration Quarterly 50 (3): 431-465. doi:10.1177/0013161X13505290.
Woessmann, Ludger, Elke Luedemann, Gabriela Schuetz, and Martin R. West. 2009. School Accountability, Autonomy and Choice Around the World. Cheltenham, UK & Northampton, MA: Edward Elgar.