Session Information
03 SES 12, 21st Century Skills and the Curriculum (Part 2)
Paper Session continued from 03 SES 11
Contribution
Currently, evidence-based educational practice is increasingly demanded (ECER call for papers, p. 1). According to Sackett et al. (1996), evidence-based practice "means integrating individual […] expertise with the best available external […] evidence from systematic research" (p. 71). In this context, empirical research on education is expected to systematically generate practice-oriented knowledge on "what works (when and how)" in order to improve the quality of educational practice by supporting educational practitioners in making better-informed, well-founded, and less speculative decisions for and with their clients (e.g., Burkhardt & Schoenfeld, 2003; Dede, 2006; Nelson et al., 2009; OECD, 2007).
A crucial point in this matter is how educational practitioners deal with research evidence – for instance, when confronted with an evidence-based educational intervention that they do (or do not) choose to implement – because the intended real effectiveness of evidence-based knowledge is decided at the micro-level, i.e. by the educational practitioner, who judges and acts autonomously (e.g., Cordingley, 2004; Dede, 2006). In this paper, implementation is understood and exemplified as the realisation of a research-based educational intervention by practitioners themselves, through their individual selective perception and reception as well as their personal use of the intervention (e.g., Fullan & Pomfret, 1977) – in our case, an evidence-based teacher training. Research on implementation and innovation (e.g., Fullan & Pomfret, 1977), school improvement (e.g., Cordingley, 2004), teacher development (e.g., Millar et al., 2006), and curriculum implementation (e.g., Remillard, 2005) points to various "fractures" during implementation. However, "there is little systematic understanding of how […] practitioners use research evidence, much less how they acquire or interpret it" (Nelson et al., 2009, p. iii).
Findings from the research discourses mentioned above indicate that the following major aspects might influence the implementation of an evidence-based intervention: intervention features (e.g., content; evidence of effectiveness), characteristics of the individuals involved (e.g., prior knowledge), and conditions at the macro-, meso-, and micro-levels (e.g., structural, financial, or personal circumstances) (e.g., Thurlings et al., 2015). Additionally, in line with a profession-theory perspective and the understanding of evidence-based practice outlined above (e.g., Millar & Osborne, 2009), it might be expected that practitioners judge research-based educational interventions – and how to integrate them into their everyday educational practice – by considering what this research-based knowledge means for their own learners and educational objectives at this particular point in time (e.g., Cordingley, 2004, p. 79). Despite these findings and assumptions, it is still largely unknown which of these potentially influencing factors educational practitioners (in our case, teacher educators) actually take up, and how strongly these factors guide and intertwine with selective perception and reception, thus causing implementation "fractures" that might undermine the original effectiveness of an evidence-based intervention.
Using an evidence-based teacher training as an educational intervention, the research questions are: (1) Which features of an evidence-based training concept do teacher educators freely mention when asked why they agreed to implement it? (2) Which potentially influencing factors (known from the research literature) do teacher educators base their judgement on when asked why they agreed to implement this specific training concept in their own training courses? (3) What role do their own reference frameworks – their learners, educational objectives, etc. – play compared to research-based evidence?
Method
Expected Outcomes
References
Burkhardt, H., & Schoenfeld, A. H. (2003). Improving educational research: Toward a more useful, more influential, and better-funded enterprise. Educational Researcher, 32(9), 3–14.
Cordingley, P. (2004). Teachers using evidence: Using what we know about teaching and learning to reconceptualize evidence-based practice. In G. Thomas & R. Pring (Eds.), Evidence-based practice in education (pp. 77–87). Maidenhead: Open University Press.
Dede, C. (2006). Scaling up: Evolving innovations beyond ideal settings to challenging contexts of practice. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (pp. 551–566). Cambridge: Cambridge University Press.
Fullan, M., & Pomfret, A. (1977). Research on curriculum and instruction implementation. Review of Educational Research, 47(1), 335–397.
Millar, R., & Osborne, J. (2009). Research and practice: A complex relationship? In M. C. Shelley II, L. D. Lore, & B. Hand (Eds.), Quality research in literacy and science education: International perspectives and gold standards (pp. 41–61). Dordrecht: Springer.
Millar, R., Leach, J., Osborne, J., & Ratcliffe, M. (2006). Research and practice in education. In R. Millar, J. Leach, J. Osborne, & M. Ratcliffe (Eds.), Improving subject teaching: Lessons from research in science education (pp. 3–23). London: Routledge.
Nelson, S. R., Leffler, J. C., & Hansen, B. A. (2009). Toward a research agenda for understanding and improving the use of research evidence. Portland, OR: Northwest Regional Educational Laboratory.
Organisation for Economic Co-operation and Development (OECD). (2007). Evidence in education: Linking research to policy. Paris: OECD Publishing.
Ratcliffe, M., Bartholomew, H., Hames, V., Hind, A., Leach, J., Millar, R., & Osborne, J. (2006). From evidence to impact: Users' perceptions of research and its influence on their practices. In R. Millar, J. Leach, J. Osborne, & M. Ratcliffe (Eds.), Improving subject teaching: Lessons from research in science education (pp. 134–151). London: Routledge.
Remillard, J. T. (2005). Examining key concepts in research on teachers' use of mathematics curricula. Review of Educational Research, 75(2), 211–246.
Sackett, D. L., Rosenberg, W. M. C., Gray, J. A. M., Haynes, R. B., & Richardson, W. S. (1996). Evidence based medicine: What it is and what it isn't. BMJ: British Medical Journal, 312(7023), 71–72.
Thurlings, M., Evers, A. T., & Vermeulen, M. (2015). Toward a model of explaining teachers' innovative behavior: A literature review. Review of Educational Research, 85(3), 430–471.