09 SES 14 A, School Innovation, Accountability and Effectiveness: Findings from large scale assessments
Paper/Pecha Kucha Session
The study of the factors related to school effectiveness is a relevant topic within the field of educational research, since it can provide insightful information on how to improve education systems and can inform educational policies at many levels. This research field has gained relevance in the past few decades (Gamazo, Olmos-Migueláñez & Martínez-Abad, 2016). The availability of data on both student performance and the socioeconomic, demographic, organisational and educational characteristics of students and schools has allowed for the proliferation of studies on the relationship of all kinds of variables with student performance, and on the practices necessary to provide quality, equitable education. Large-scale international assessments, such as PISA, are an essential source of these kinds of educational data.
The concepts of school effectiveness and academic performance are related, but the term “school effectiveness” implies an exploration that goes beyond the mere study of the factors related to academic achievement. A school is defined as effective when it “achieves a comprehensive and integral development of each and every one of its students, even higher than would be expected taking into consideration their previous performance and the social, financial and cultural situation of their families” (Murillo, 2005, p. 25). Thus, this line of research is interested in the value that schools add to student performance beyond what is expected of them based on their socioeconomic background and resources.
PISA provides researchers with many school variables of different kinds, which several authors, such as Murillo (2007) or Jornet, González-Such and Perales (2012), have classified into three categories: input (gender, grade repetition, socioeconomic level, school resources, rate of migrant students, etc.), process (study habits, academic expectations, teaching methodology, leadership, teacher training, etc.) and product (academic performance). In turn, these factors can also be divided into two levels: student and school.
However, the validity of PISA context measurements has been widely debated among researchers, with some authors advising caution when using and interpreting PISA data on account of low scale reliability and possible misunderstanding among respondents (Rutkowski & Rutkowski, 2010), indicators poorly grounded in theory (Caro, Sandoval-Hernández & Lüdtke, 2013), or poor design and lack of statistical control of the scales (De la Orden & Jornet, 2012).
At the other end of the methodological spectrum, some authors, such as Lizasoain and Angulo (2014) or Murillo (2007), have favoured qualitative approaches to study the factors related to school effectiveness, having found that variables such as teacher involvement, student assessment, attention to diversity or leadership styles have a clear impact on the effectiveness of schools.
Thus, considering that the relationship between school-level process factors and school effectiveness has been established through qualitative methodologies, and based on the calculation of the effectiveness of each of the schools that participated in PISA 2015 in Spain, our research aims to answer the following question: Which school-level process factors from the PISA questionnaires can be linked to school effectiveness?
This secondary analysis of PISA 2015 data is of a non-experimental, ex post facto nature, given the lack of experimental control over data collection. The sample comprised all 15-year-old students (born between January and December 1999) who took the 2015 PISA test in Spain: 31,273 students, of whom 49.4% (15,437) were female and 50.6% (15,836) were male, enrolled in 897 schools.
The instruments were those created specifically for the PISA test, of two main kinds: the competence assessment tests used to measure student performance in reading, mathematics and science, and the context questionnaires administered to students, parents and schools. The latter provide a large amount of information on socioeconomic, cultural and demographic questions, as well as on other topics of educational interest, such as school climate, student motivation, teacher training and school assessment practices.
To obtain information on the effectiveness of schools, we used hierarchical linear models (Snijders & Bosker, 2012). A model was defined for each competence assessed, keeping only the significant predictor variables at both levels. This allowed us to calculate the difference between each school’s actual score and the score expected given its socioeconomic and cultural variables, also referred to as the school “residual”, obtained through empirical Bayes estimators (Raudenbush, Bryk, Cheong, Congdon, & Du Toit, 2011). With this information, we established three groups of schools: high-residual (residual in the top 33% in all three competences), low-residual (residual in the bottom 33% in all three competences), and the rest of the schools.
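The residual-based school selection described above can be sketched as follows. This is a minimal illustration with simulated (hypothetical) data for a single competence: a school-level OLS regression stands in for the two-level hierarchical model, and empirical Bayes shrinkage is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical data: 60 schools, each with a mean score and a mean
# socioeconomic index (ESCS). Values are simulated, not PISA data.
n_schools = 60
escs = rng.normal(0.0, 1.0, n_schools)                   # school mean ESCS
score = 490 + 30 * escs + rng.normal(0, 15, n_schools)   # school mean score

# Regress observed school scores on ESCS to obtain the "expected" score
# (in the study this step is a two-level HLM, not a school-level OLS).
X = np.column_stack([np.ones(n_schools), escs])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
expected = X @ beta

# The school "residual" is the gap between actual and expected performance.
residual = score - expected

# Classify: top third = high-residual, bottom third = low-residual.
lo, hi = np.quantile(residual, [1 / 3, 2 / 3])
group = np.where(residual >= hi, "high",
                 np.where(residual <= lo, "low", "middle"))
print({k: int(v) for k, v in zip(*np.unique(group, return_counts=True))})
```

In the study itself, a school only enters the high- (or low-) residual group if its residual falls in the top (or bottom) third in all three competences; the sketch shows the tercile cut for one competence only.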
We then calculated the point-biserial correlation between the main school-level process variables (leadership, curriculum development, professional development, school responsibility for resources and curriculum, teacher participation, school autonomy, etc.) and the dichotomous criterion variable generated (high- vs. low-residual school).
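The point-biserial coefficient is algebraically equivalent to a Pearson correlation in which one variable is dichotomous. A minimal sketch with simulated (hypothetical) data; the variable names are illustrative, not PISA indices:

```python
import numpy as np

def point_biserial(binary, x):
    """r_pb = (M1 - M0) / s_x * sqrt(p * q), with s_x the population SD,
    M1/M0 the means of x in each group, and p/q the group proportions."""
    binary = np.asarray(binary, dtype=float)
    x = np.asarray(x, dtype=float)
    p = binary.mean()
    q = 1.0 - p
    m1 = x[binary == 1].mean()
    m0 = x[binary == 0].mean()
    return (m1 - m0) / x.std() * np.sqrt(p * q)

rng = np.random.default_rng(0)
group = rng.integers(0, 2, 200)        # 1 = high-residual, 0 = low-residual
leadership = rng.normal(0, 1, 200)     # a process variable, unrelated here

r_pb = point_biserial(group, leadership)
r_pearson = np.corrcoef(group, leadership)[0, 1]  # same value, by identity
print(round(r_pb, 6), round(r_pearson, 6))
```

Because the data are simulated with no group difference, the coefficient is close to zero, mirroring the null pattern reported below.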
This analysis revealed no statistically significant differences between high- and low-residual schools in the scores of the process variables. This finding differs from qualitative studies which concluded that many process variables are substantively related to school effectiveness, such as the research by Lizasoain and Angulo (2014) or the work of Murillo (2007). The causes of this discrepancy may vary in nature. De la Orden and Jornet (2012) highlight the shortcomings of the PISA questionnaires in properly measuring contextual factors, which could hinder a correct analysis of the educational reality and lead to wrong conclusions. This issue might point to the inadequacy of the PISA context measurements for accurately portraying the reality of the participating schools, and to the need to search for alternative sources of contextual data that could give the research a higher degree of internal validity.
The results obtained in this study suggest the need to encourage in-depth research on the factors related to school effectiveness. Such research should be based on the comparison between high- and low-residual schools, making it possible to discard those factors that occur in both types of school and are therefore irrelevant. We therefore propose new lines of research which, starting from the school selection presented in this study, make use of alternative sources of non-contextual data, such as qualitative research techniques, in order to determine the school factors that are relevant to the study of school effectiveness, and thus draw relevant conclusions for educational policies and practices.
Caro, D. H., Sandoval-Hernández, A., & Lüdtke, O. (2013). Cultural, social, and economic capital constructs in international assessments: An evaluation using exploratory structural equation modeling. School Effectiveness and School Improvement, 25(3), 433–450. doi:10.1080/09243453.2013.812568
De la Orden, A., & Jornet, J. M. (2012). La utilidad de las evaluaciones de sistemas educativos: el valor de la consideración del contexto. Bordón, 64(2), 69–88.
Joaristi, L., Lizasoain, L., & Azpillaga, V. (2014). Detección y caracterización de los centros escolares de alta eficacia de la Comunidad Autónoma del País Vasco mediante Modelos Transversales Contextualizados y Modelos Jerárquicos. Estudios sobre Educación, 27, 37–61.
Lizasoain, L., & Angulo, A. (2014). Buenas prácticas de escuelas eficaces del País Vasco. Metodología y primeros resultados. Participación Educativa, 3(4), 17–27.
Martínez-Abad, F., Lizasoain, L., Castro, M., & Joaristi, L. (2017). Selección de escuelas de alta y baja eficacia en Baja California (México). REDIE. Revista Electrónica de Investigación Educativa, 19(2), 38–53.
Meunier, M. (2011). Immigration and student achievement: Evidence from Switzerland. Economics of Education Review, 30(1), 16–38.
Murillo, F. J. (Coord.) (2007). Investigación Iberoamericana sobre Eficacia Escolar. Bogotá: Convenio Andrés Bello.
Organisation for Economic Co-operation and Development (OECD) (2014). PISA 2012 Technical Report. Paris: OECD Publishing.
Organisation for Economic Co-operation and Development (OECD) (2016). PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematics and Financial Literacy. Paris: OECD Publishing.
Organisation for Economic Co-operation and Development (OECD) (2017). PISA: Programme for International Student Assessment, OECD Education Statistics (database). doi:10.1787/data-00365-en
Raudenbush, S. W., Bryk, A. S., Cheong, Y. F., Congdon, R., & Du Toit, M. (2011). Hierarchical linear and nonlinear modeling (HLM7). Lincolnwood, IL: Scientific Software International.
Rutkowski, L., & Rutkowski, D. (2010). Getting it “better”: The importance of improving background questionnaires in international large-scale assessment. Journal of Curriculum Studies, 42(3), 411–430.
Snijders, T., & Bosker, R. J. (2012). Multilevel analysis: An introduction to basic and advanced multilevel modeling (2nd ed.). London: Sage Publications.