Schools’ External Evaluation Process: analysis focused on schools’ satisfaction
Author(s):
Conference:
ECER 2014
Format:
Paper

Session Information

09 SES 12 JS, Systemic Approaches in Educational Monitoring

Paper Session, Joint Session NW 09 and NW 11

Time:
2014-09-05
09:00-10:30
Room:
B012 Anfiteatro
Chair:
Samuel Gento

Contribution

This proposal, focusing on the analysis of the effects of the external evaluation of schools, provides insight into school managers’ degree of satisfaction with this process in Portugal.

In educational policies, evaluation, particularly schools’ self-evaluation, is associated with concepts such as efficacy, efficiency and quality. The recommendations of several governments and international organizations express the idea that the crisis of education is the main cause of national economic and social crises, given that students’ poor school results are correlated with cases of exclusion. With the excellence and competitive edge so characteristic of economic rationality in mind, it is assumed that the efficacy and efficiency of educational systems are achieved by monitoring the results obtained by schools.

In this light, the change in the regulation modes of public educational policies in Europe is hardly surprising. Owing to the social, political and economic modernization of the past decades, European states have felt the need to create and/or strengthen their inspection systems. From the 1990s onward, there has been a significant rise in inspection activity in several European countries wherever the public sector is at stake.

International research on public policies of educational inspection stresses that the debate on school evaluation has become a central point. As a clear example of inspection activity, Devos and Verhoeven (2003: 403) report that in Belgium policy makers feared that greater autonomy could lead to a deterioration of schools’ quality, hence the need to verify the achievement of the goals laid out – a task undertaken through inspection.

In addition, due to issues related to school management, or to give substance to the responsibilities assigned to schools, many EU countries have tried to implement self-evaluation systems. Schools are therefore provided with indicators or guidelines for undertaking this self-evaluation process, though often without the desired consequences. For example, Meuret and Morlaix (2003: 54) stress that although French schools are encouraged to develop an “evaluation culture”, only about 5% of them actually use the tools provided.

In Portugal, the publication of Law No. 31/2002 of December 20th brought political and social visibility to schools’ external evaluation, and in 2007 the General Inspectorate of Education and Science, IGEC (Inspeção Geral da Educação e Ciência), launched the Schools’ External Evaluation Program (SEEP), based on processes intended to contribute to the improvement of school operations and education, in articulation with self-evaluation and the regulation of the educational system.

If we consider the typology of evaluation models proposed by Alaiz (2007), which reduces the multiplicity of existing reference frames, practices and procedures to two major types of external evaluation model, “structured” and “open”, we can posit that the Portuguese political-legal framework is closer to the first (“structured”) type, starting with its explicit reference to the EFQM model as the basis of IGEC’s reference frame. Indeed, the SEEP draws on School Integrated Evaluation (a previous Portuguese program that ran from 1999 to 2002), on the EFQM model and on the Scottish model “How Good is our School”.

Method

From a methodological standpoint, data were collected through a questionnaire: an opinion survey aimed at gathering school managers’ perspectives. The survey items were defined on the basis of IGEC’s theoretical reference frame, which encompasses the legislation regarding the Schools’ External Evaluation Program (SEEP) (specifying the latter’s objectives), the SEEP’s reference frame, which specifies the domains and factors to be evaluated, and the methodology adopted by IGEC. A pilot study was first conducted, comprising several validation procedures: inter-judge agreement; selection of a school for a preliminary test of the survey; think-aloud sessions (two participants from the selected school read and answered the survey, reporting doubts and requests for clarification, which led to small changes to the questionnaire); and administration of the survey to 25 participants, followed by analysis of Cronbach’s alpha (0.929). Once the consistency of the questionnaire and the absence of item redundancy had been confirmed, the 83 schools evaluated in 2011/2012 were contacted, 25 of which responded positively to the request. The validity, reliability and sensitivity of the survey (fundamental qualities of an evaluation instrument) were then analyzed. Validity was assessed through factor analysis, with component extraction and varimax rotation. To analyze internal consistency, Cronbach’s alpha was determined again, both for the questionnaire as a whole and for the components validated by the factor analysis. The item-total correlations for the full scale were also analyzed (using the Pearson correlation coefficient).
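The internal-consistency checks described above follow standard formulas. The sketch below is a minimal illustration in Python, not the authors’ actual analysis (which was presumably run in a statistics package); the response matrix and all variable names are hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) response matrix:
    k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def item_total_correlations(items: np.ndarray) -> np.ndarray:
    """Pearson correlation of each item with the full-scale total."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([np.corrcoef(items[:, j], total)[0, 1]
                     for j in range(items.shape[1])])

# Hypothetical pilot data: 25 respondents answering 10 related items on a
# 1-4 agreement scale (a shared underlying trait plus item-specific noise).
rng = np.random.default_rng(42)
trait = rng.normal(size=(25, 1))
responses = np.clip(np.round(2.5 + trait + 0.5 * rng.normal(size=(25, 10))), 1, 4)

alpha = cronbach_alpha(responses)          # high when items measure one trait
r_it = item_total_correlations(responses)  # one coefficient per item
```

A high alpha together with uniformly positive item-total correlations is what supports the “consistency and absence of item redundancy” judgment reported above.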
To verify the questionnaire’s sensitivity, i.e. its capacity to discriminate among respondents through the way results are distributed, the Kolmogorov–Smirnov test (against the normal distribution) was carried out, and the asymmetry and kurtosis values were also considered. Statistical analysis of the data made it possible to determine how satisfied the respondents were on three dimensions: I) IGEC’s reference frame; II) the external evaluation process (“Process”); and III) the external evaluation report (“Report”). To assess the agreement level on each dimension, a global analysis of all the answers (372) was conducted. A separate analysis taking into account the positions held by the participants was also carried out.
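The sensitivity check likewise amounts to standard distributional diagnostics. A sketch, again with hypothetical data: a one-sample Kolmogorov–Smirnov statistic against a normal distribution fitted to the sample, plus skewness and excess kurtosis.

```python
import math
import numpy as np

def ks_statistic_normal(scores: np.ndarray) -> float:
    """One-sample Kolmogorov-Smirnov statistic D against a normal
    distribution fitted with the sample mean and standard deviation."""
    x = np.sort(np.asarray(scores, dtype=float))
    n = x.size
    mu, sigma = x.mean(), x.std(ddof=1)
    # Normal CDF via the standard-library error function math.erf.
    cdf = 0.5 * (1.0 + np.array([math.erf((v - mu) / (sigma * math.sqrt(2.0)))
                                 for v in x]))
    d_plus = np.max(np.arange(1, n + 1) / n - cdf)
    d_minus = np.max(cdf - np.arange(0, n) / n)
    return float(max(d_plus, d_minus))

def skewness(scores: np.ndarray) -> float:
    x = np.asarray(scores, dtype=float)
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 3))          # 0 for a symmetric distribution

def excess_kurtosis(scores: np.ndarray) -> float:
    x = np.asarray(scores, dtype=float)
    z = (x - x.mean()) / x.std()
    return float(np.mean(z ** 4) - 3.0)    # 0 for a normal distribution

# Hypothetical satisfaction scores for 372 answers; D can be compared with
# the approximate 5% critical value 1.36 / sqrt(n) for the KS test.
rng = np.random.default_rng(7)
totals = rng.normal(loc=2.94, scale=0.4, size=372)
d = ks_statistic_normal(totals)
critical = 1.36 / math.sqrt(totals.size)
```

Small D together with skewness and kurtosis values near zero indicates an approximately normal spread of scores, i.e. that the instrument discriminates among respondents rather than piling answers at one end of the scale.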

Expected Outcomes

On the global analysis, results show an average of 2.94 points (on a scale of 1 to 4) for general satisfaction, and of 3.03 points for satisfaction with the dimension “Process”. This is, in fact, the dimension with the highest agreement levels. Even so, a more thorough analysis of this dimension is called for, given that some of its variables show significant disagreement levels, and because it is pertinent to observe how the results differ as a function of the positions held by the participants. When relating the data to the type of position held, an almost invariably direct relationship emerged between higher satisfaction levels and holding a Top Management position. Globally speaking, most respondents approve of the SEEP, and school Top Management members (principals and board members) consistently show higher agreement averages than Middle Management members (Curricular Department Coordinators and Class Directors). The variables with the largest disagreement percentages differ according to management type: Top Management disagrees more on variables of the dimension “Process”, and Middle Management disagrees more on variables of the dimension “Report”. It is also worth stressing that total disagreement is low on all dimensions, showing that there are few extreme positions regarding the SEEP. However, on a global analysis, the percentage of total agreement is low on all variables (<15%).

References

ALAIZ, Vítor (2007). «Auto-avaliação das escolas? Há um modelo recomendável?», Correio da Educação, nº 301. Porto: Edições ASA.
BRAVO, Mª Pilar Colás; CATALÁN, Mª Angeles Rebollo (1994). Evaluación de Programas. Una guía práctica. Sevilla: Editorial KRONOS.
BARROS, Adil; LEHFELD, Neide (1986). Fundamentos de Metodologia. McGraw-Hill.
CLÍMACO, Maria do Carmo (2002). A IGE e a Avaliação Integrada das Escolas, in CNE (Ed.), Qualidade e Avaliação da Educação. Lisboa: Conselho Nacional da Educação, pp. 35-46.
COMISSÃO EUROPEIA, Eurydice (2004). L’évaluation des établissements d’enseignement obligatoire en Europe. Bruxelas: Eurydice.
COMISSÃO EUROPEIA (2001). European Report on the Quality of School Education – Sixteen Quality Indicators. Luxemburgo: Office for Official Publications of the EC.
DEVOS, Geert; VERHOEVEN, Jef (2003). School self-evaluation – Conditions and Caveats. The case of secondary schools. Educational Management & Administration, 31 (4), 403-420. London: Sage.
GHIGLIONE, Rodolphe; MATALON, Benjamin (2001). O Inquérito: Teoria e Prática. Oeiras: Celta Editora.
HILL, Manuela Magalhães; HILL, Andrew (2008). Investigação por questionário. Lisboa: Edições Sílabo.
LIMA, Licínio (2002). Avaliação e concepções organizacionais de escola: para uma hermenêutica organizacional, in Costa, Jorge Adelino; Neto-Mendes, António; Ventura, Alexandre (org.), Avaliação de Organizações Educativas. Aveiro: Universidade de Aveiro, pp. 17-29.
MEURET, Denis; MORLAIX, Sophie (2003). Conditions of Success of a School’s Self-Evaluation: Some Lessons of an European Experience. School Effectiveness and School Improvement, 14 (1), 53-71.
NEWMAN, Isadore; BENZ, Carolyn R. (1998). Qualitative-Quantitative Research Methodology. Exploring the Interactive Continuum. USA: Southern Illinois University Press.
OLIVEIRA, Pedro Guedes; CLÍMACO, Maria do Carmo; CARRAVILLA, Maria Antónia; SARRICO, Cláudia; AZEVEDO, José Maria; OLIVEIRA, José Fernando (2006). Relatório final da actividade do Grupo de Trabalho para Avaliação de Escolas. Ministério da Educação.
PALLANT, Julie (2001). SPSS – Survival Manual. Buckingham-Philadelphia: Open University Press.
SAINT-GEORGES, Pierre (1997). «Pesquisa e crítica das fontes de documentação nos domínios económico, social e político», in Albarello, L. et al., Práticas e Métodos de Investigação em Ciências Sociais. Lisboa: Gradiva Publicações, pp. 15-47.
TASHAKKORI, Abbas; TEDDLIE, Charles (1998). Mixed Methodology. Combining Qualitative and Quantitative Approaches. California: Sage Publications.
WEILER, Hans N. (1999). «Perspectivas Comparadas em Descentralização Educativa», in Sarmento, M. J. (org.), Autonomia da escola. Porto: Edições ASA.
Others: Law No. 31/2002 of December 20th – approves the evaluation system for education and non-higher education, developing the scheme established by Law No. 46/86 of October 14th (Law on the Education System).

Author Information

Elisabete Gonçalves (presenting / submitting)
Faculdade de Psicologia e de Ciências da Educação da Universidade do Porto
Centro de Investigação e Intervenção Educativa
Porto
Faculdade de Psicologia e de Ciências da Educação da Universidade do Porto, Portugal
