Does support matter in interpreting and using school feedback? Findings from a quasi-experimental study

Session Information

MC_POST, Main Conference Poster Session and Lunch Break

Posters will be displayed throughout the conference and submitters are asked to be present in both Poster Sessions to answer questions. Poster Session I: Tuesday, 12.15 - 13.30 Poster Session II: Wednesday 12.15 - 13.30

Time:
2009-09-29
12:15-13:15
Room:
Otkogon
Chair:

Contribution

Feedback delivered by school performance feedback systems (SPFSs) is regarded as a promising tool to engage schools in internal quality assurance activities (Visscher & Coe, 2002, 2003). Receiving feedback as such, however, is not a sufficient condition for stimulating self-evaluation and systematic reflection in schools. Several school, contextual and SPFS conditions determine if and how schools will use the feedback, but the knowledge base on this theme is rather limited (Earl & Fullan, 2003; Kerr, Marsh, Ikemoto, Darilek & Barney, 2006; Saunders, 2000; Williams & Coles, 2007). A crucial factor in determining feedback use is the amount of support schools need and the extent to which these needs are fulfilled (Schildkamp & Teddlie, 2008). For instance, school staff members who were invited to in-service training were more likely to read the reports and make use of them, and had a more positive attitude towards the feedback (Tymms, 1995). Feedback support can be supplied by internal or external supporters, formally or informally, and in diverse forms, such as extra information in the report to aid interpretation of the results, a helpdesk, in- and on-service training, or extra resources and funding. This study focuses on the effects of such support.

The context of this study is formed by school feedback reports delivered in spring 2008 to 196 schools in Flanders (Belgium). These reports informed schools about the performance of the cohort under study in the first four years of primary education (2003-2007). Results were reported for maths, reading fluency and orthography, supplemented with information on pupil mobility and input characteristics (child factors, home factors and Dutch language skills at the start of grade 1).
The research questions (based on Kirkpatrick's four levels of evaluating training programs (Kirkpatrick, 1998)) that may be answered by using this design are the following: What is the effect of different kinds of support for the interpretation and use of school feedback reports on…
1. …satisfaction concerning the support?
2. …knowledge, skills and attitudes concerning interpretation and use of the school feedback?
3. …interpretation and use of the school feedback?
4. …effects of school feedback?

Method

In this study a quasi-experimental approach is used. To this end, the 196 schools that received the school reports were randomly assigned to one of three conditions (set up according to Gardner (1995)):
1. Control condition (N=162): these schools received the school reports without participating in a training initiative. They only had an online helpdesk at their disposal (as did the two experimental conditions).
2. Experimental condition 1 (N=24): these schools participated in a study day.
3. Experimental condition 2 (N=6): these schools were visited by members of the feedback team. During the session the schools' own data were used (instead of the data of a hypothetical school, as in experimental condition 1).
A multi-method approach was used in gathering the data. This paper focuses on the results of a questionnaire addressing the (effects of) support, the use and effects of school feedback, and the interpretation capabilities of respondents.

Expected Outcomes

The results section starts by describing the use (and effects) of school feedback, the attitude towards school feedback and the interpretation capabilities of the participating schools (n = 120). The results show that head teachers (still) attribute few effects to the use of school feedback reports. However, there is large variation between schools. One of the factors that explains these differences is the experimental condition in which schools participated. Schools that took part in an experimental condition report more intensive feedback use and have more confidence in their knowledge and skills to interpret the feedback than those in the control condition. Support did not, however, seem to have an effect on attitudes towards school feedback. Based on these results, suggestions for supporting schools in the use of school feedback will be presented.

References

Earl, L., & Fullan, M. (2003). Using data in leadership for learning. Cambridge Journal of Education, 33(3), 383-394.
Gardner, R. (1995). Onservice teacher education. In L. W. Anderson (Ed.), International encyclopedia of teaching and teacher education (2nd ed.). London: Pergamon Press.
Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112, 496-520.
Kirkpatrick, D. L. (1998). Another look at evaluating training programs. Alexandria, VA: American Society for Training & Development.
Saunders, L. (2000). Understanding schools' use of 'value added' data: The psychology and sociology of numbers. Research Papers in Education, 15(3), 241-258.
Schildkamp, K., & Teddlie, C. (2008). School performance feedback systems in the USA and in The Netherlands: A comparison. Educational Research and Evaluation, 14(3), 255-282.
Tymms, P. (1995). Influencing educational practice through performance indicators. School Effectiveness and School Improvement, 6, 123-145.
Visscher, A. J. (2002). A framework for studying school performance feedback systems. In A. J. Visscher & R. Coe (Eds.), School improvement through performance feedback. Lisse: Swets & Zeitlinger.
Williams, D., & Coles, L. (2007). Teachers' approaches to finding and using research evidence: An information literacy perspective. Educational Research, 49(2), 185-206.

Author Information

University of Antwerp
IOIW
Antwerp
Ghent University, Belgium
University of Antwerp, Belgium
Ghent University, Belgium
