Session Information
22 SES 02 C, Reforms, Rankings & Quality Assurance
Paper Session
Contribution
Evidence on how university rankings shape the ways higher education institutions (HEIs) represent themselves to their stakeholders is needed. It is commonly accepted that rankings and performance metrics developed for academic evaluation are used to signal the “high quality” of the education HEIs provide. How institutions present world ranking results to the public is yet to be uncovered, and doing so is a complex task: a large portion of the underlying data is produced by outsiders, namely the ranking agencies.
Within this context, and given that HEIs are described as loosely coupled systems, the links between their subsystems are weak, so HEIs respond slowly to events around them. Loosely coupled systems are harder to communicate within, to predict, to control, and to steer toward their goals (Lutz, 1982). Little is known about the use of evaluation metrics provided by university rankings in university governance, particularly in controlling these loosely coupled work systems of academia (Gläser et al., 2002).
Before ranking systems emerged, HEIs faced fewer side effects in communicating their performance and quality to the public. Traditional institutional communication about university performance drew on the reputation of a university’s alumni and faculty members and was closer to word of mouth than today’s practice. With developments and innovations in information and communication systems, this traditional approach may have lost its power with the public. Today’s world ranking systems interfere with this linear, traditional communication: in addition to the institutional data HEIs provide for the ranking indicators, the rankings build their own data sets from sources such as WoS or Scopus.
Ranking systems now sit in the middle of the communication network between HEIs and the public, and HEIs have even less control over how their performance on different matters is evaluated. This new form of communication, which we call the Inevitable Cycle of Higher Education (ICoHE), places serious restrictions and concerns on HEIs in presenting their performance.
De Rijcke et al. (2016), in their literature review on how evaluation practices and indicator use affect HEIs, identified four major implications: strategic behavior, goal displacement, task reduction, and potential biases concerning interdisciplinarity. The Leiden Manifesto, with its ten principles (Leiden Manifesto Official Website, n.d.), clearly states that rankings are spreading fast but are often not well informed and frequently not applied correctly.
It is therefore critical to elicit how higher education leaders and administrators perceive world ranking results and reflect them in their practices and decision making. Even though most higher education institutions are against the idea of ranking, they cannot avoid its results and impact, which creates an inevitable cycle for higher education. Thus, how higher education institutions integrate or respond to world rankings in communicating their performance to the public needs to be examined.
Method
In this qualitative study, face-to-face structured interviews will be conducted with higher education leaders and administrators in Turkey. Convenience sampling will be used, and data will be collected with an interview protocol for which the necessary ethical permissions have been obtained. For the thematic analysis, the interviews will be transcribed and open coded by two different researchers to strengthen the validity of the results. Common themes and differences will be identified to explore the reflections and practices of higher education leaders and administrators in Turkey.
Expected Outcomes
The expected outcome of this research is insight into the daily impact of world rankings on higher education institutions and on their communication with the public about university performance. Developing a set of suggestions for world ranking agencies would be another concrete outcome, positioning higher education researchers within this inevitable cycle of higher education so that they have a voice in the matter.
References
De Rijcke, S., Wouters, P. F., Rushforth, A. D., Franssen, T. P., & Hammarfelt, B. (2016). Evaluation practices and effects of indicator use – a literature review. Research Evaluation, 25(2), 161-169.
Gläser, J., Laudel, G., Hinze, S., & Butler, L. (2002). Impact of evaluation-based funding on the production of scientific knowledge: What to worry about, and how to find out. Expertise for the German Ministry for Education and Research.
Leiden Manifesto Official Website (n.d.). Retrieved January 31, 2018, from http://www.leidenmanifesto.org/
Lutz, F. W. (1982). Tightening up loose coupling in organizations of higher education. Administrative Science Quarterly, 27(4), 653-669.