Session Information
09 SES 17 A, Assessments and Evaluation Culture in (Higher) Education
Paper Session
Contribution
Overview and paper focus
The use of logic models (Rogers, 2008) to lay out the steps in the process from inputs to outcomes of programmes has become widespread in evaluations of social policy and interventions internationally. In recent years, such models have attracted growing interest in education, promoted by policy makers and funders including, for example, the Education Endowment Foundation (EEF), which funds large-scale intervention evaluations in England. Yet the use of logic models has been questioned within the programme evaluation community on the grounds that such models can give a veneer of theoretical sophistication to a simplistic, instrumentalist perspective on the social world (Astbury and Leeuw, 2010). This paper addresses the question: how can logic models be used to frame and implement mixed methods designs in educational evaluation?
The paper provides, for the first time, a critical review of the use of logic models in the education field, considering their place, benefits and limitations. It frames the issues in relation to the under-theorisation of complex change processes, drawing on 'Theory-based' evaluation perspectives (Weiss, 1995) and on those building on systems and complexity-based approaches (Walton, 2014; Rogers, 2008).
We develop a novel approach to the use of logic models that aims to overcome the limitations identified in the critical review. We then present how we have applied this approach to create and implement mixed methods evaluation designs, in particular in two recent studies for the EEF.
The paper concludes by reflecting on the practical and theoretical implications of this approach, laying out a set of key issues to address in future evaluations for which a logic model-framed mixed methods design may be appropriate.
Theoretical frameworks
The paper is framed by two related evaluation frameworks. The first is what is generally referred to as 'Theory-based' evaluation (Weiss, 1995), including theory-driven (Chen, 1990) and Realist (Pawson and Tilley, 1997) evaluation. These terms refer to a set of evaluation frameworks which "involves some attempt to 'unpack' the black box so that the inner components or logic of a program can be inspected" (Astbury and Leeuw, 2010, p. 364). Secondly, the study draws on complexity and systems-based evaluation frameworks, particularly Walton's (2014) features of complexity and Rogers' (2008) consideration of how evaluation designs can deal with complexity in relation to programme theory. These frameworks are used to develop a new approach to the use of logic models that both addresses the limitations identified above and supports mixed methods design.
Our starting point for the mixed methods design framework is to identify the approaches that can best answer the research questions, as advocated by Gorard and Taylor (2004), but, unlike many mixed methods researchers, we do not believe that this implies a pragmatic stance in which philosophical assumptions are ignored. Our approach does not fit neatly into classifications of paradigmatic approaches to mixed methods, such as those advocated by Teddlie and Tashakkori (2003). Rather, we argue that answering the research questions requires the use of methods with different underlying philosophical assumptions, which need to be acknowledged at all stages of the research. We are therefore explicit that the research generates socially constructed, interpretivist perspectives from, for example, interview data, and general trends and patterns from, for example, survey data, and that these are different ways of understanding the evaluation topic. This approach aligns to some extent with the complementary strengths approach advocated by Johnson and Turner (2003, p. 299) and with aspects of a dialectical approach (Greene and Caracelli, 2003, pp. 96-97).
Method
The paper draws on two linked EEF studies, selected as exemplars of how logic models can be used to take account of the critique presented in the theoretical framing section. While the exemplars are drawn from England, they focus on areas that are frequently the subject of large- and small-scale evaluations in many countries, namely teacher professional development and scaling up the use of research evidence. The evaluation of the South and West Yorkshire (S&WY) 'TA Campaign', which encouraged schools to adopt practices that align with EEF guidance on the best use of Teaching Assistants (Sharples, Webster and Blatchford, 2015), utilised pre- and post-campaign surveys of all schools in S&WY, a post-campaign survey of comparison schools, case studies of participating schools and analysis of attendance data. The 'RETAIN' early career teacher (ECT) CPD programme evaluation had a longitudinal design utilising repeated surveys of ECTs; semi-structured interviews with participating teachers, their in-school champions, head teachers and the delivery team; and observations of the programme. For each study, we demonstrate how we created a logic model that takes into account the complex, situated nature of change processes and the perceived mechanisms and sequences that lead to that change, laying out both the final visual representation and the methods used, and showing how such a model helps in practice to sequence appropriate mixed methods to meet the research aims. We also illustrate the ways in which constructing an evaluation design in this way increases the plausibility of claims that can be made from the study.
Expected Outcomes
The use of the models described above in these two studies leads to a set of findings as follows [to be more fully developed in the paper]. The logic model approach allows sequenced change processes to be fully theorised in the design phase of mixed methods studies if it takes into account the complexity and situated nature of educational change processes (e.g. framing the complex paths from RETAIN CPD, via teacher self-efficacy outcomes, to intentions to stay in the profession). Such an approach also enables appropriate methods to be used to examine both the change process and its relationship to programme outcomes (e.g. utilising teacher survey and case study approaches in both studies, with findings presented to demonstrate this claimed utility). In addition, the approach allows these methods to test the contextual factors that enable the change process to be enacted, taking into account the complexity of the social world (e.g. showing that changing teachers' use of research findings requires a focus on the research object, the change process and the school environment). The paper concludes by drawing out the issues that need to be addressed in developing, articulating and enacting a well-theorised, situated logic model approach in mixed methods evaluation studies, to support future evaluation in education.
References
Astbury, B. and Leeuw, F.L., 2010. Unpacking black boxes: mechanisms and theory building in evaluation. American Journal of Evaluation, 31(3), pp. 363-381.
Chen, H.-T., 1990. Theory-driven evaluations. Newbury Park, CA: Sage.
Gorard, S. and Taylor, C., 2004. Combining methods in educational and social research. Maidenhead, Berkshire: Open University Press.
Greene, J.C. and Caracelli, V.J., 1997. Defining and describing the paradigm issue in mixed methods evaluation. In: J.C. Greene and V.J. Caracelli, eds. Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms. San Francisco: Jossey-Bass, pp. 5-17.
Johnson, B. and Turner, L.A., 2003. Data collection strategies in mixed methods research. In: A. Tashakkori and C. Teddlie, eds. Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage, pp. 297-319.
Pawson, R. and Tilley, N., 1997. Realistic evaluation. London: Sage.
Rogers, P.J., 2008. Using programme theory to evaluate complicated and complex aspects of interventions. Evaluation, 14(1), pp. 29-48.
Sharples, J., Webster, R. and Blatchford, P., 2015. Making the best use of Teaching Assistants: Guidance Report. Available at: https://educationendowmentfoundation.org.uk/public/files/Publications/Campaigns/TA_Guidance_Report_MakingBestUseOfTeachingAssisstants-Printable.pdf
Tashakkori, A. and Teddlie, C., eds., 2003. Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.
Walton, M., 2014. Applying complexity theory: a review to inform evaluation design. Evaluation and Program Planning, 45, pp. 119-126.
Weiss, C.H., 1995. Nothing as practical as good theory: Exploring theory-based evaluation for comprehensive community initiatives for children and families. New approaches to evaluating community initiatives: Concepts, methods, and contexts, 1, pp. 65-92.