Session Information
09 SES 06 A, Assessment and Feedback in Higher and Adult Education
Paper Session
Contribution
In Spain, official university degrees were restructured as part of the Bologna Process, implemented in 2007 through Royal Decree 1393/2007 of 29 October and later amended by Royal Decree 861/2010 of 2 July. As a participating country of the European Higher Education Area, the Spanish higher education system implemented (a) a three-cycle structure (i.e., bachelor's, master's, and doctoral studies); (b) a set of procedures for the mutual recognition of qualifications across EU Member States; and (c) a system of quality assurance. When the Bologna Process came into effect in Spain, the official catalogue of university programs was abolished, and higher education institutions were given the autonomy to design and implement their own bachelor's and master's degrees. Nonetheless, this autonomy came with critical requirements: each institution's degree had to be officially recognised by the Spanish government and authorised by the regional one. To this end, the Spanish education ministries and quality agencies developed an extensive set of regulations and programs establishing the procedures and standards that these qualifications must meet in their design, approval, monitoring, and accreditation.
As a result, when a higher education institution designs and implements a qualification, it must undergo intensive external quality assurance by one of the internationally recognised Spanish agencies. One of the best known is ANECA (National Agency for Quality Assessment and Accreditation), which is also responsible for monitoring the performance of higher education institutions in communities that do not have their own regional agency.
But do these good intentions and efforts have a positive impact on the quality of Spanish higher education degrees? That is, do they have a positive effect on areas such as organization and administration, the adequacy of supporting resources, or teaching practices? And is this impact perceived by all educational actors (i.e., students, teaching staff, management team, and administrative members)? Almost fifteen years after the European standards of quality assurance and their diverse procedures and instruments were introduced, the extent to which they have had a positive impact on the quality of higher education degree programs remains a topic of much interest and debate (San Roque, 2021). This study focuses on the design and validation of a comprehensive model of criteria and indicators to analyse the effect these regulations and programs have on a given university, institution, or qualification. Given the complexity of the model, this work concentrates on one particular dimension: the teaching and learning process. This is a critical element, as the transformation of teaching and learning methodologies in European higher education was one of the main axes of the reform propelled by the Bologna Process (Reichert, 2010).
The present paper is thus part of a research project to evaluate the impact that accreditation systems have on different areas of undergraduate degrees. The main objective of this study is to analyse the psychometric characteristics of an ad hoc instrument designed to evaluate this impact on the Teaching-Learning Process.
Method
The methodology used in this study is quantitative, ex post facto, and descriptive. An ad hoc instrument (twenty items on a 5-point Likert scale) was designed to assess the impact of the implementation of accreditation systems in higher education on the Teaching-Learning Process across four dimensions: Planning, Evaluation, Didactic Methodology, and Didactic Resources. This impact is assessed through the perceptions of different audiences within the university community. A total of 855 participants from different degrees of the Complutense University of Madrid (Spain) took part in the study from October to December 2021: Biology (24.6%), Nursing (36.5%), and Primary Education Teacher (38.9%). The sample includes students (86.6%), teachers (2.7%), members of the dean's team (2%), quality managers (0.3%), degree coordinators (0.6%), practicum coordinators (0.3%), and administration and services staff (7.5%). Data were collected online and face-to-face, and statistical analyses were carried out with SPSS 25.
To analyse the psychometric characteristics of the instrument, an Exploratory Factor Analysis (EFA) was carried out (to evaluate construct validity) and Cronbach's alpha was calculated (for reliability analysis). Taking into consideration the scales established by Frías-Navarro (2021), Oviedo and Campo-Arias (2005), and Hair, Tatham, and Black (2008), the results show a high total internal consistency for the instrument (excellent reliability for the 20 items: Cronbach's alpha = 0.938). Very satisfactory values were also obtained for each dimension:
- Planning (items 1-8): Cronbach's alpha = 0.909, excellent reliability.
- Evaluation (items 9-13): 0.861, very good.
- Didactic Methodology (items 14-17): 0.817, very good.
- Didactic Resources (items 18-20): 0.774, moderate.
For the EFA, the determinant of the correlation matrix (= 5.20E-006, valid because it is close to 0), the KMO measure (= 0.943, above the 0.6 threshold), and Bartlett's test of sphericity (significant, p < 0.01) were first calculated. These results confirm the appropriateness of performing a factor analysis. Three final factors were extracted by the Principal Components method, explaining 62.24% of the variance (acceptable for being greater than 60%, according to Merenda, 1997, and Hair et al., 2008). All items present communalities greater than 0.30 (the minimum value established by Hair et al., 2008). Oblique rotations were used (given the relationship between the items), with Oblimin and Promax producing similar results, which indicates high robustness. The resulting factors are:
- Factor 1: Planning (items 1-8).
- Factor 2: Methodology and Evaluation (items 9-17).
- Factor 3: Teaching Resources (items 18-20).
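As an illustration only, the sketch below reproduces the main steps of this analysis pipeline (Cronbach's alpha per scale, determinant of the correlation matrix, KMO, Bartlett's test, and a principal-components EFA with oblique rotation) in Python with the factor_analyzer library. The authors used SPSS 25; the file name and item column names (item01-item20) are hypothetical placeholders for the 855 x 20 Likert responses.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data file: 855 respondents x 20 Likert items (columns item01..item20).
responses = pd.read_csv("questionnaire_responses.csv")

# Reliability for the full scale and for each theoretical dimension.
print("Total alpha:", round(cronbach_alpha(responses), 3))
dimensions = {
    "Planning": [f"item{i:02d}" for i in range(1, 9)],
    "Evaluation": [f"item{i:02d}" for i in range(9, 14)],
    "Didactic Methodology": [f"item{i:02d}" for i in range(14, 18)],
    "Didactic Resources": [f"item{i:02d}" for i in range(18, 21)],
}
for name, cols in dimensions.items():
    print(name, round(cronbach_alpha(responses[cols]), 3))

# Suitability checks for factor analysis.
print("Determinant of R:", np.linalg.det(responses.corr()))
chi_square, p_value = calculate_bartlett_sphericity(responses)  # Bartlett's test of sphericity
kmo_per_item, kmo_model = calculate_kmo(responses)              # Kaiser-Meyer-Olkin measure
print(f"Bartlett p = {p_value:.4f}, KMO = {kmo_model:.3f}")

# EFA: principal-components extraction, 3 factors, oblique (Oblimin) rotation.
efa = FactorAnalyzer(n_factors=3, method="principal", rotation="oblimin")
efa.fit(responses)
loadings = pd.DataFrame(efa.loadings_, index=responses.columns)
communalities = pd.Series(efa.get_communalities(), index=responses.columns)
print(loadings.round(2))
print("Cumulative variance explained:", efa.get_factor_variance()[2][-1])
```

Rotation can be switched to "promax" to check, as reported above, that both oblique rotations yield a similar factor structure.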
Expected Outcomes
A reliable and valid questionnaire is presented to evaluate the impact of accreditation systems on Higher Education degrees. The results of the Exploratory Factor Analysis show that Factor 1 coincides with Dimension 1 of the initial theoretical model (Planning). Factor 2 includes Dimensions 2 and 3 defined at the theoretical level (Methodology and Evaluation), which is consistent and justified given that both dimensions are closely related within teaching practice (Rodríguez Cavanerio & Coelho de Torres, 2018). Factor 3, in turn, coincides with the items proposed at the theoretical level for the Didactic Resources dimension. These results, together with the excellent reliability obtained, corroborate the high consistency of the initially proposed model. It is therefore concluded that the designed questionnaire is a reliable and valid tool to assess the degree of impact that the implementation of accreditation systems in higher education has on the Teaching-Learning Process.
References
Frías-Navarro, D. (2021). Apuntes de consistencia interna de las puntuaciones de un instrumento de medida. Universidad de Valencia, Spain. https://www.uv.es/friasnav/AlfaCronbach.pdf
Hair, J. A., Tatham, R. R., & Black, W. (2008). Análisis multivariante. Madrid: Prentice Hall.
Merenda, P. (1997). A guide to the proper use of factor analysis in the conduct and reporting of research: Pitfalls to avoid. Measurement and Evaluation in Counseling and Development, 30, 156-163.
Oviedo, H. C., & Campo-Arias, A. (2005). Aproximación al uso del coeficiente alfa de Cronbach [An approach to the use of Cronbach's alpha]. Revista Colombiana de Psiquiatría, 34(4), 572-580. http://www.scielo.org.co/scielo.php?script=sci_abstract&pid=S0034-74502005000400009
Real Decreto 1393/2007, de 29 de octubre, por el que se establece la ordenación de las enseñanzas universitarias oficiales. Boletín Oficial del Estado, 260, de 30 de octubre de 2007, pp. 44037-44048. https://www.boe.es/eli/es/rd/2007/10/29/1393
Real Decreto 861/2010, de 2 de julio, por el que se modifica el Real Decreto 1393/2007, de 29 de octubre, por el que se establece la ordenación de las enseñanzas universitarias oficiales. Boletín Oficial del Estado, 161, de 8 de julio de 2010, pp. 58454-58468. https://www.boe.es/eli/es/rd/2010/07/02/861
Reichert, S. (2010). The intended and unintended effects of the Bologna reforms. Higher Education Management and Policy, 22(1), 1-20. https://doi.org/10.1787/hemp-v22-art6-en
Rodríguez Cavanerio, L. V., & Coelho de Torres, B. G. (2018). Evaluación del desempeño de los docentes de la asignatura Histología y Embriología. Educación Médica Superior, 32(3), 181-194.
San Roque, I. M. (2021). Nuevos modelos de docencia, desde la declaración de Bolonia a la era de la COVID. Miscelánea Comillas. Revista de Ciencias Humanas y Sociales, 79(154), 225-253. http://dx.doi.org/10.14422/mis.v79.i154.y2021.007