A Thorn in the Side. Does the Medical Model of Evidence fit for (current) Comparative Educational Research?
Author(s):
Conference:
ECER 2009
Format:
Paper

Session Information

23 SES 06 D, Research Politics and the Knowledge-Policy Relationship II

Paper Session. Continued from 23 SES 05 D

Time:
2009-09-29
10:30-12:00
Room:
HG, HS 21
Chair:
Lisbeth Lundahl

Contribution

There is little doubt that current educational policy is marked by a growing demand for evidence-focused research. This movement is closely linked to outcome-based instruction and standards-based educational reform (Hopmann et al. 2007, Koretz 2008). Behind it lie arguments that public policies must be informed by sound data in order to decide and to navigate in uncertain waters (OECD 2000). Experimental research (Oakley 2000, Slavin 2002, 2004) and the gathering of cumulative data (e.g. Hanushek & Wößmann 2005, CEC 2007) are seen as the means to this end. Both research strategies build explicitly or implicitly on comparisons. The comparative element producing such claimed evidence lies in relating results to an a priori benchmark: evidence appears as better (or worse) results. Among the functions of comparative educational research (Hörner 1993, Schriewer 1982), this corresponds to the melioristic one. But this function does not point to evidence in the classical sense (Sackett et al. 1996, 1997), insofar as the politically motivated research process asks for the better system. This becomes evident since one objective of such research is to map (inter)national performance distributions. The resulting data are to be used for (system) improvement (OECD 2001, Peterson & West 2003, Fitzner 2004). It is therefore astonishing that neither the experimental nor the evolutionistic function of comparative methodology has been applied. Comparative methodology is concerned with the unity of similarity and difference; a comparison answers the question of how two or more objects or results relate to each other. Accordingly, the question of whether one system or pedagogical treatment works better than another cannot be answered by medicine-based experiments or assessment studies. According to their inner research logic, such studies show only effects; they cannot identify causes.
On the basis of effects alone, one cannot “lend” or “borrow” solutions. Consequently, a nation’s relative position in a league table cannot be used as “evidence” to reflect on or to justify educational reform efforts. The paper will therefore explore how comparative methodology was used in recent assessment and experimental studies and ask whether they are able to produce the requested evidence. Furthermore, this paper seeks to discuss whether these research strategies amount to a government-funded initiative designed to take forward the challenge of systematic educational research.

Method

To do so, the medicine-based concept of evidence will first be elaborated (Evans & Benefield 2001). The paper will then make use of the function matrix of comparison: experiment, ideography, meliorism, evolution (Hörner 1993, Werler 2009). To produce empirical data on the comparative functions actually employed, the paper will apply a combination of qualitative content analysis and frequency analysis (Mostyn 1985; Mayring 2000) to a sample of recent assessment and experimental studies (1996–2008) published in European educational research journals. Whilst the frequency analysis will produce keywords in context, the content analysis will reveal latent content in its context (Krippendorff 1980). By comparing the theoretically developed functions of comparison with those found in the data, it can be shown whether such evidence is possible.
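The frequency-analysis step, which produces keywords in context (KWIC), can be illustrated with a minimal sketch. This is not the study's actual instrument; the function name, window size, and sample texts below are illustrative assumptions only.

```python
from collections import Counter
import re


def kwic(texts, keyword, window=4):
    """Return keyword-in-context lines and keyword frequency counts.

    texts   -- list of document strings
    keyword -- lower-case term to locate
    window  -- number of context words kept on each side (assumed default)
    """
    lines = []
    freq = Counter()
    for doc in texts:
        # Tokenize crudely into lower-case word tokens.
        words = re.findall(r"\w+", doc.lower())
        for i, w in enumerate(words):
            if w == keyword:
                freq[keyword] += 1
                left = " ".join(words[max(0, i - window):i])
                right = " ".join(words[i + 1:i + 1 + window])
                lines.append(f"{left} [{w}] {right}")
    return lines, freq


# Hypothetical sample documents, standing in for journal abstracts.
sample = [
    "Evidence based policy requires evidence from comparative studies.",
    "The study provides evidence for system improvement.",
]
lines, freq = kwic(sample, "evidence")
```

Each concordance line shows the keyword bracketed within its surrounding words, which is the raw material the qualitative content analysis would then interpret for latent meaning.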

Expected Outcomes

The analysis of the data thus obtained will answer the question of to what extent a real comparison is applied, or whether the comparison merely serves to legitimize (political) expectations. Furthermore, it will be shown to what extent comparisons are constructed in light of the relation between expectation and result. Thirdly, the study will show that it might be useful to compare how countries or institutions create the relation between expectations and outcomes, since this is the basis of evidence-focused research.

References

CEC – Commission of the European Communities (2007). Towards more knowledge-based policy and practice in education and training: Commission Staff Working Document. URL: http://ec.europa.eu/education/policies/2010/doc/sec1098_en.pdf [14.02.2009].
Conee, E. & Feldman, R. (1985). Evidentialism. Philosophical Studies 48, pp. 15–34.
Evans, J. & Benefield, P. (2001). Systematic Reviews of Educational Research: Does the Medical Model Fit? British Educational Research Journal 27, 5, pp. 527–541.
Fitzner, Th. (Ed.) (2004). Bildungsstandards. Internationale Erfahrungen – Schulentwicklung – Bildungsreform. Bad Boll: Evangelische Akademie.
Hanushek, E. A. & Wößmann, L. (2005). Does Educational Tracking Affect Performance and Inequality? Differences-in-Differences Evidence across Countries. Economic Journal 116, pp. C63–C76.
Hopmann, S. T., Brinek, G. et al. (2007). PISA According to PISA – Does PISA Keep What It Promises? In: S. T. Hopmann, G. Brinek & M. Retzel (Eds.), PISA zufolge PISA – PISA According to PISA. Hält PISA, was es verspricht? – Does PISA Keep What It Promises? (pp. 9–15). Münster: Lit.
Hörner, W. (1993). Technische Bildung und Schule. Eine Problemanalyse im internationalen Vergleich. Köln: Böhlau.
Koretz, D. (2008). Measuring Up: What Educational Testing Really Tells Us. Cambridge, MA: Harvard University Press.
Krippendorff, K. (1980). Content Analysis: An Introduction to Its Methodology. Beverly Hills: Sage.
Mayring, Ph. (2000). Qualitative Content Analysis. Forum: Qualitative Social Research 1, 2. URL: http://www.qualitative-research.net/index.php/fqs/article/view/1089/2385 [14.02.2009].
Oakley, A. (2000). Experiments in Knowing: Gender and Method in the Social Sciences. Cambridge: Polity Press.
OECD (2000). Knowledge Management in the Learning Society. Centre for Educational Research and Innovation. Paris: OECD.
OECD (2001). Knowledge and Skills for Life: First Results from PISA. Paris: OECD.
Peterson, P. & West, M. (Eds.) (2003). No Child Left Behind? The Politics and Practice of School Accountability. Washington: Brookings Institution Press.
Sackett, D. et al. (1997). Evidence-Based Medicine: How to Practice and Teach EBM. London: Churchill Livingstone.
Sackett, D. et al. (1996). Evidence Based Medicine: What It Is and What It Isn’t. British Medical Journal 312, pp. 71–72.
Schriewer, J. (1982). "Erziehung" und "Kultur". Zur Theorie und Methodik vergleichender Erziehungswissenschaft. In: Brinkmann, W. & Renner, K. (Eds.), Die Pädagogik und ihre Bereiche (pp. 185–236). Paderborn: Schöningh.
Slavin, R. (2002). Evidence-Based Educational Policies: Transforming Educational Practice and Research. Educational Researcher 31, 7, pp. 15–21.
Slavin, R. (2004). Education Research Can and Must Address ‘What Works’ Questions. Educational Researcher 33, 1, pp. 27–28.
Werler, T. (2009). Norden …made by Education. In: Werler, T. & Midtsundstad, J. (Eds.), Nordisk Didaktikk. Bergen: Fagbokforlaget. (in print)

Author Information

University of Agder
Institute of Education
Kristiansand
