23 SES 03 A, Governance and Monitoring
While teacher evaluation methods have attracted researchers’ interest, primarily in the US context, as part of value-added models for accountability (e.g. Harris, Ingle, & Rutledge, 2014; Carnoy & Loeb, 2002), no prior research has examined trends in teacher monitoring and evaluation methods and their association with student performance from a comparative international perspective. In this study, we explore the Programme for International Student Assessment (PISA) 2009, 2012, and 2015 data from the curriculum/Didaktik point of view, as well as from the perspective of PISA as a normative tool. The study addresses two main research questions: (1) How do the curriculum and Didaktik traditions compare across teacher monitoring methods (TMMs) in PISA 2009, 2012, and 2015 data? and (2) What is the association of TMMs with student science performance in PISA 2009, 2012, and 2015? The first objective of the study is to examine trends in teacher monitoring methods across schools within individual countries representing the curriculum (primarily English-speaking) and Didaktik (Continental and Nordic Europe) traditions. The second objective is to examine the association of TMM measures with student science performance within countries across the three PISA waves. For theoretical framing, we first use Didaktik/Bildung theory to place the discussion within Continental and Nordic European educational thinking. Klette (2007) describes this tradition “as a relation between teachers and learners (the who), subject matter (the what) and instructional methods (the how)” (p. 147). Hopmann (2007) states that Didaktik is a matter of order, sequence, and choice; features that align well with contemporary thinking regarding US standards revision and attention to Learning Progressions. Within this frame of order, sequence, and choice, Hopmann states that “Didaktik became the main tool for creating space for local teaching by providing interpretative tools for dealing with state guidelines on a local basis” (p. 113).
The German concept Bildung is a noun meaning something like “being educated, educatedness.” It also carries the connotations of the German word bilden, which means “to form, to shape”. In Bildung, whatever is done or learned is done or learned to develop one’s own individuality, to unfold the capabilities of the I. Overall, the Didaktik tradition is defined as being more teacher-oriented and content-focused, with higher professional teacher autonomy (Author et al., 2015; Westbury, 2000; Deng & Luke, 2008). Next, we draw on the Curriculum tradition, which, despite being categorized into several perspectives, such as humanistic, scholar, reconstructionist, and social efficiency, among others, has been dominated by the social efficiency tradition (Deng & Luke, 2008; Author et al., 2015; Schiro, 2013; Kliebard, 2004; Schubert, 2008). Overall, the curriculum tradition is defined as being more institution-oriented and teaching-methods-focused, while also being more evaluation-intensive (Westbury, 2000). We then turn to trans-national policy flows in education, where it has been argued that powerful knowledge producers and organizations, such as the Organisation for Economic Co-operation and Development (OECD) and the World Bank, influence national education policies in various forms – from curriculum designs to teacher education to accountability models, to name a few (Drori et al., 2003; Non-author & Author, 2012). From this perspective, it is relevant to examine trends in teacher monitoring methods as defined by the OECD in PISA school questionnaires, because what gets measured has the potential to become mainstream in national education systems – especially in those that regularly participate in PISA. Lastly, the literature on teacher monitoring distinguishes between two purposes behind teacher monitoring and evaluation systems, namely summative and formative.
Summative assessment of teachers relies on TMMs that aim to recognize and reward teachers who perform better and to sanction those who do not, while formative assessment is aimed at the continuous professional development of teachers (Isoré, 2009).
The study utilized data from the PISA surveys administered in 2009, 2012, and 2015 and employed quantitative research methods. To address the first research question, on how the curriculum and Didaktik traditions compare across teacher monitoring methods in PISA 2009, 2012, and 2015 data, a two-sample difference-of-proportions test was used to compare the means of TMMs for the curriculum and Didaktik samples. This descriptive procedure was helpful for testing the hypothesis of whether Curriculum or Didaktik countries show greater use of any of the TMMs, namely tests, teachers, principals, or inspectors. To address the second research question, on the association of individual TMM variables with student science performance in PISA 2009, 2012, and 2015, inferential statistical analysis relying on Hierarchical Linear Modeling (HLM) was utilized, in order to examine the effectiveness of TMMs for student performance and to capture the nested nature of the PISA data (Raudenbush & Bryk, 2002). The second question is exploratory in nature, and no hypothesis was developed. The OECD initiated its work on developing the PISA surveys in the mid-1990s and administered the first PISA survey in 2000 (OECD, 2002). PISA tests 15-year-old students’ skills in three cognitive domains: mathematics, science, and reading. The countries representing Didaktik include Denmark, Finland, Norway, Sweden, Austria, and Germany, while the Curriculum grouping includes Australia, Canada, Ireland, New Zealand, the United Kingdom, and the United States. Historical, geographical, empirical, and practical criteria are considered in categorizing the countries into the two respective groups. Teacher monitoring methods data derive from four items included in the school survey completed by school principals. PISA collected data on TMMs for the first time in 2009. Principals provided their responses to the following question: During
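The two-sample difference-of-proportions test described above can be sketched as follows. This is a minimal illustration of the standard pooled z-test, not the study's actual code, and the counts are invented:

```python
# Illustrative two-sample difference-of-proportions (pooled z) test,
# e.g. comparing the share of schools using tests as a TMM in
# curriculum vs. Didaktik countries. All counts below are invented.
from math import erf, sqrt

def two_proportion_ztest(x1, n1, x2, n2):
    """Return (z, two-sided p-value) for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                        # pooled proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))  # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2)))) # 2 * P(Z > |z|)
    return z, p_value

# Invented example: 820 of 1000 curriculum-country schools vs.
# 610 of 1000 Didaktik-country schools use tests as a TMM.
z, p = two_proportion_ztest(820, 1000, 610, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A positive z with a small p-value here would indicate significantly higher use in the first group, which is the direction of comparison reported in the results.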
The results indicate that, within countries, the use of all four TMMs is becoming more widespread from one PISA wave to the next. In almost all 12 countries in the sample, the proportion of students in schools where any of the four TMMs is used is lowest in 2009 and increases progressively in 2012 and 2015. The increase is more dramatic for some countries than for others, especially in the use of tests as a TMM among Didaktik countries such as Germany and Sweden. Across countries and PISA waves, and for each TMM specifically, we find that curriculum countries show higher use of all four TMMs than Didaktik countries. The tests of proportions showed that the differences in the use of TMMs between curriculum and Didaktik countries are statistically significant, with higher use in curriculum countries for all four TMMs. The results validate the theoretical claims that curriculum countries are more evaluation-intensive, both in evaluating teachers through teacher monitoring and in evaluating students through performance tests. Associations of TMMs with student science performance in PISA 2009, 2012, and 2015 showed mixed estimates: some were relatively large and significant for a number of countries, but most were not statistically significant. There were more significant estimates in the PISA 2012 data, with all estimates being relatively large and, in most cases, negative; the UK is an example where teachers, principals, and inspectors had a negative association with student science performance. By contrast, TMM estimates were relatively large and positive in four cases in four Didaktik countries in PISA 2012 (Austria, Germany, Denmark, and Finland). Overall, the evidence from the HLM models suggests that different TMMs have different effects on science performance in different countries; however, the effects had diminished by PISA 2015.
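The kind of two-level model the HLM analysis estimates can be sketched with a random-intercept specification: students (level 1) nested in schools (level 2), with science performance regressed on a school-level TMM indicator. The sketch below uses statsmodels' mixed-effects estimator as a stand-in for dedicated HLM software; the data and variable names (science, tmm_tests, school) are invented for illustration, not PISA's:

```python
# Two-level random-intercept sketch: students nested in schools,
# science score regressed on an invented school-level TMM indicator.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_schools, n_students = 100, 30
school = np.repeat(np.arange(n_schools), n_students)
# Whether the school uses tests as a TMM (0/1), constant within school.
tmm_tests = np.repeat(rng.integers(0, 2, n_schools), n_students)
# School-level random intercept plus student-level noise; true TMM effect = 15.
u = np.repeat(rng.normal(0, 20, n_schools), n_students)
science = 500 + 15 * tmm_tests + u + rng.normal(0, 50, n_schools * n_students)

df = pd.DataFrame({"science": science, "tmm_tests": tmm_tests, "school": school})
# Random-intercept model: fixed effect of tmm_tests, intercepts vary by school.
fit = smf.mixedlm("science ~ tmm_tests", df, groups=df["school"]).fit()
print(fit.params["tmm_tests"])
```

In the study's per-country models, the sign and significance of the TMM coefficient correspond to the positive and negative associations reported above.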
Carnoy, M., & Loeb, S. (2002). Does external accountability affect student outcomes? A cross-state analysis. Educational Evaluation and Policy Analysis, 24(4), 305–331.
Deng, Z., & Luke, A. (2008). Subject matter: Defining and theorizing school subjects. In F. M. Connelly, M. F. He, & J. Phillion (Eds.), The Sage handbook of curriculum and instruction (pp. 66–87). Sage.
Drori, G. S., Meyer, J. W., Ramirez, F. O., et al. (2003). Science in the modern world polity: Institutionalization and globalization. Stanford, CA: Stanford University Press.
Harris, D. N., Ingle, W. K., & Rutledge, S. A. (2014). How teacher evaluation methods matter for accountability: A comparative analysis of teacher effectiveness ratings by principals and teacher value-added measures. American Educational Research Journal, 51(1), 73–112.
Hopmann, S. (2007). Restrained teaching: The common core of Didaktik. European Educational Research Journal, 6(2), 109–124.
Isoré, M. (2009). Teacher evaluation: Current practices in OECD countries and a literature review (OECD Education Working Papers No. 23). OECD Publishing. http://dx.doi.org/10.1787/223283631428
Klette, K. (2007). Trends in research on teaching and learning in schools: Didactics meets classroom studies. European Educational Research Journal, 6(2), 147–160.
Kliebard, H. M. (2004). The struggle for the American curriculum, 1893–1958. Routledge.
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (Vol. 1). Sage Publications.
Schiro, M. S. (2013). Curriculum theory: Conflicting visions and enduring concerns. Sage Publications.
Schubert, W. H. (2008). Curriculum inquiry. In F. M. Connelly, M. F. He, & J. Phillion (Eds.), The Sage handbook of curriculum and instruction (pp. 399–419). Sage.
Non-author, & Author. (2012).
Author, Non-author, & Non-author. (2015).
Westbury, I. (2000). Teaching as a reflective practice: What might Didaktik teach curriculum? In I. Westbury, S. Hopmann, & K. Riquarts (Eds.), Teaching as a reflective practice: The German Didaktik tradition (pp. 15–39). Mahwah, NJ: Lawrence Erlbaum Associates.