Session Information
11 SES 12 A, Quality of Higher Education Institutions
Paper Session
Contribution
Standardisation aimed at streamlining educational practices has become one of the most common strategies for managing education systems (Rear, 2019; Mohan, 2023). Its main task is to guide the actions of educational process participants so as to provide quality education and results that satisfy the demands of employers and the state. The emergence of quality standards has, in turn, created the need to check their implementation by educational organisations.
However, there is no unambiguous interpretation of the concept of ‘quality of education’, even though it has been discussed by researchers for decades (Harvey, Green, 1993; Brennan, Shah, 2000; Elvira et al., 2017; Bespalko, 2017; Beerkens, 2020; Harvey, 2024). We agree that a unified interpretation is lacking because ‘quality of education’ is a relative concept (Tam, 2001). On the one hand, it is defined by a set of process and outcome characteristics; on the other hand, by how these are perceived and interpreted by the various subjects of the educational process: state bodies of education quality control, university management, teachers, students, graduates, employers, etc. Thus, the study of approaches to assessing the quality of education remains relevant.
Moreover, modern interpretations of the concept of ‘expertise’ are diverse. The terms ‘expertise’ and ‘inspection’ are often mistakenly treated as synonyms. This inconsistency prompted the authors to clarify the concept and then develop a tool that would allow programme supervisors to obtain data both for establishing a programme’s compliance with the standard and for defining how to develop it further.
Based on the literature review, two main types of expertise are differentiated: normative (evaluative) expertise, which establishes a programme’s compliance with current standards, and prognostic (research) expertise, which establishes facts in order to formulate a prediction. We focus mainly on the second type.
There are various models for the expert evaluation of programmes (Tam, 2001; Stufflebeam, 2008; Iriste et al., 2018; Kayyali, 2023). Yet none of them fully supports the expertise of a higher education programme, because each focuses on a particular aspect. Summarising them and drawing on our own experience, we have developed a three-component model of programme expertise (declaration, view, reality).
The need for a multi-aspect and independent quality assessment of programmes led to the development of the authors’ model of educational programme expertise in 2018 by Isaeva N., Kobtseva A., and Kasprzhak A. The model was initially designed for the expert evaluation of general education programmes. It should be emphasised that the model was tested on 100 educational programmes and confirmed its viability (Isaeva et al., 2018). In 2023, an expanded team, which also included Davlatova M. and Ozerova M., adapted this model for the expert evaluation of higher education programmes.
We admit that the use of this tool includes an element of subjective evaluation, but it is based on triangulation, which provides more reliable data.
Thus, the aim of this study was to adapt and validate the model of expertise within the framework of higher education programmes and to determine to what extent it can be adopted for the expert evaluation of any type of educational programme. We tested the model on five Master's degree programmes in 2024.
We had the following research questions:
1. What conclusions about the educational programme can be formulated as a result of using the model of expertise?
2. What management decisions can be made based on the conclusions of the educational programme expertise?
3. What adjustments should be made to the model of expertise based on the approbation results?
Method
In 2018, Isaeva N., Kobtseva A., and Kasprzhak A. developed a model for the expert evaluation of general education programmes, which was successfully adapted for the expert evaluation of higher education programmes in 2023 by the same authors together with Davlatova M. and Ozerova M. The model consists of three key components and assumes that conclusions about a programme should be formulated by comparing three parallel evaluations:
1. Declaration - assessment by invited experts of the public presentation of the programme, expressed in text, video, and other programme materials, including the website, syllabi, regulations, etc. (i.e. what is stated about the programme).
2. View - the opinions of the programme supervisor, lecturers, and administrative and management staff about how the programme is implemented.
3. Reality - the evaluation of processes and outcomes by the consumers of the educational service, i.e. students.
We proceeded from the premise that opinions and ideas about the purpose of the programme, the planned and attained educational outcomes, and the ways of achieving them may differ among the different subjects of the educational process. Comparing these perspectives should allow experts to ‘diagnose’ the programme, draw conclusions, and provide recommendations. During the expertise, the following tools and methods were developed and used: a template of expert opinion, filled out on the basis of the analysis of programme materials (declaration), and semi-structured interviews with the supervisors, teachers, and students of the programmes. The interviews can be supplemented by an online survey to increase the coverage of respondents for management decision-making. The next stage involves transcribing, analysing, and interpreting the interview results and summarising the data from the expert opinions and the interviews with teachers and students.
Expected Outcomes
The piloting of the adapted model on the Master's programmes confirmed its validity. The model of expertise involves obtaining and comparing information about a programme from three sources: its public presentation by the authors (declaration), the perceptions of the programme supervisor and lecturers (view), and students' experience (reality). This allows experts to determine the current state of the programme, the transparency of its name and purpose, its flexibility, the teaching staff, growth points, etc. The following conclusions were drawn from the expertise of the five Master's programmes. Firstly, the university's own educational standards allow the development of programmes with a unique set of educational outcomes; the programmes are aimed at training specialists who are in demand. Secondly, the programmes are regularly updated to remain relevant and meet modern requirements through the active participation of lecturers and external specialists. Thirdly, lecturers take an evidence-based approach to teaching and to developing students' critical thinking skills and ability to make informed decisions. In addition, several important issues were identified. These include a discrepancy between the programme name and its content, which can lead to misunderstandings among students and employers. There is also a gap between students' expectations and the educational practices on offer, which indicates the need to find a balance between traditional and modern teaching methods. The expertise also highlighted the importance of developing the university's own standards of programme and teaching quality based on the findings. Thus, this study has provided valuable insights for management decision-making and the improvement of educational programmes. However, there are some limitations: the model requires considerable time, human, and financial resources, i.e. hiring qualified experts to analyse documents; specialists to conduct, transcribe, and analyse interviews; to prepare, organise, and analyse student questionnaires; and to summarise data from different sources and present it to stakeholders.
References
Beerkens M. Evidence-based policy and higher education quality assurance: progress, pitfalls and promise // Impact Evaluation of Quality Management in Higher Education. Routledge, 2020. pp. 38-53. DOI: 10.1080/21568235.2018.1475248
Bespalko V. P. Kachestvo obrazovanija i kachestvo obuchenija [Quality of Education and Quality of Learning] // Narodnoe obrazovanie - National Education. 2017. no. 3-4 (1461). pp. 105-113. Available at: https://cyberleninka.ru/article/n/kachestvo-obrazovaniya-i-kachestvo-obucheniya/viewer (accessed: 25.10.2024). (In Russ.)
Brennan J., Shah T. Quality assessment and institutional change: Experiences from 14 countries // Higher Education. 2000. Vol. 40. no. 3. pp. 331-349. DOI: 10.1023/A:1004159425182
Elvira Q., Imants J., Dankbaar B., Segers M. Designing education for professional expertise development // Scandinavian Journal of Educational Research. 2017. Vol. 61. no. 2. pp. 187-204. DOI: 10.1080/00313831.2015.1119729
Harvey L. What have we learned from 30 years of Quality in Higher Education: academics’ views of quality assurance // Quality in Higher Education. 2024. Vol. 30. no. 3. pp. 360-375. DOI: 10.1080/13538322.2024.2385793
Harvey L., Green D. Defining quality // Assessment & Evaluation in Higher Education. 1993. Vol. 18. no. 1. pp. 9-34. DOI: 10.1080/0260293930180102
Iriste S. et al. Expertise as a research method in education // Rural Environ. Educ. Personal. 2018. Vol. 11. pp. 74-80. DOI: 10.22616/REEP.2018.008
Isaeva N. V., Kasprzhak A. G., Kobceva A. A. Jekspertiza obrazovatel'nyh programm: konceptual'naja ramka, metodika realizacii [Examining Education Programs: Conceptual Framework and Realization Methods] // Pedagogika - Pedagogy. 2018. no. 12. pp. 11-22. Available at: https://www.elibrary.ru/item.asp?id=36693942 (accessed: 20.10.2024). (In Russ.)
Kayyali M. An overview of quality assurance in higher education: Concepts and frameworks // International Journal of Management, Sciences, Innovation, and Technology (IJMSIT). 2023. Vol. 4. no. 2. pp. 01-04.
Mohan R. Measurement, evaluation and assessment in education. PHI Learning Pvt. Ltd., 2023. Available at: https://books.google.com/books?hl=ru&lr=&id=I8fAEAAAQBAJ&oi=fnd&pg=PP1&dq=standardisation+as+a+set+of+actions+of+states+aimed+at+streamlining+educational+practices+became+one+of+the+most+common+strategies+for+managing+education+systems+&ots=GK1qvUwMOu&sig=dEBTQ9IjWc-cF36TR3a3Y1IZEfE (accessed: 20.10.2024).
Rear D. One size fits all? The limitations of standardised assessment in critical thinking // Assessment & Evaluation in Higher Education. 2019. Vol. 44. no. 5. pp. 664-675. DOI: 10.1080/02602938.2018.1526255
Stufflebeam D. L. The CIPP model for evaluation // Evaluation Models. 2002. DOI: 10.1007/0-306-47559-6_16
Tam M. Measuring quality and performance in higher education // Quality in Higher Education. 2001. Vol. 7. no. 1. pp. 47-54. DOI: 10.1080/13538320120045076