Research building versus research auditing: the ERA intervention in Australia
Author(s):
Conference:
ECER 2010
Format:
Paper

Session Information

23 SES 06 A, Research Politics and the Knowledge-Policy Relationship I

Paper Session

Time:
2010-08-26
10:30-12:00
Room:
M.B. SALI 5, Päärakennus / Main Building
Chair:
Anja Sinikka Heikkinen

Contribution

Research assessment exercises and research ranking tables are of high contemporary significance in higher education, with considerable activity taking place in Europe (including the symposium at the 2009 ECER conference on European Educational Research Quality Indicators, a cooperative research project in the 7th Framework Programme of the EU (2496)). The European agenda is particularly, and rightly, concerned to mitigate some of the bias inherent in previous international comparative rankings, and to produce ways of assessing that do less violence to the inherent research agendas of different fields (Lindblad, 2008). This paper analyses the approach being taken in Australia's first national research assessment exercise, Excellence in Research for Australia (ERA), which will take place between February and July 2010, as a way of contributing comparative empirical data to the ongoing European research assessment projects, in which I also have a strong interest. It is also a way of developing further my ongoing sociological analysis of knowledge-building, education policy effects, and the work of researchers in current times, including the impact and differential effects of global trends in different national cultures (Yates 2004, 2007, 2009; Yates and Young 2010). The Australian ERA works with Australian Bureau of Statistics disciplinary classifications as the unit of measure, in contrast to the NZ PBRF, which assesses individuals, and the UK RAE, which assesses university departments. It uses its own unique metric base for classification and separates research outputs, such as a publication, from the researchers who produce them. Central to its new contribution to the metrics profession is a huge classification and ranking of journals by discipline; but this exercise has been highly contentious within disciplines in Australia, and appears likely to downgrade specifically European journals.
ERA also uses discipline classifications while the same government's research policy encourages interdisciplinary work. Overall, ERA produces practical and technical dilemmas for university activity, which the paper will discuss, and it produces an incoherent sense of how research or knowledge-building is being conceptualized. The issues I will address in the paper, by reference to the guidelines and practices of ERA, are: (1) the dilemma of the unit of measurement in research quality exercises, and the problems this poses in relation to knowledge-building in research; (2) the fraught issue of metrics and the developing professionalization of a research assessment community whose interests may not be co-terminous with those of the researchers who are the subject of their activity; and (3) a pilot analysis of how, on the ground, this is re-shaping the activities of Australian researchers. The literature from which the analysis builds includes analyses of 'the audit society' and of research assessment impact on higher education (e.g. Power 2000; Marginson 2007; Minelli et al. 2006; Hodkinson 2008); debates about knowledge and the 21st century (e.g. Gibbons et al. 1994; Young 2008; and the forthcoming special issue of EJE (45(1), 2010) on 'Globalization, Knowledge and Curriculum'); and work on researcher identity and its relationship to research assessment practices (Yates 2004; Lamont 2009).

Method

The approach taken in the paper is situated within the sociology of knowledge and policy sociology. The evidence on which the paper focuses includes policy documents; other documents such as detailed submission guidelines; participant observation at events, such as national colloquia held to discuss and promote the metrics base, in which the writer has taken part; and observation of activities at different levels within the very large, research-intensive university where the writer holds a senior research management role. The interpretation and analysis will build on the strands of literature indicated above.

Expected Outcomes

In the context of the burgeoning research assessment and ranking field in higher education, new developments in approach circulate quickly to other countries. The paper will provide one of the first independent analyses of the specific approach being developed in Australia. Its analysis of the conceptually and technically problematic elements of that approach is relevant to other contexts and national settings. At the same time, by using this new policy development as a focus, the paper will address important conceptual issues about the form in which knowledge-building is now being understood and actually practised in universities.

References

Gibbons, M., Limoges, C., Nowotny, H., Schwartzman, S., Scott, P. & Trow, M. (1994). The New Production of Knowledge. London: Sage.
Hodkinson, P. (2008). Scientific research, education policy, and educational practice in the United Kingdom: the impact of the audit culture on Further Education. Cultural Studies ↔ Critical Methodologies, 8(2), 302-324.
Lamont, M. (2009). How Professors Think. Cambridge, MA: Harvard University Press.
Lindblad, S. (2008). Navigating in the field of university positioning: on international ranking lists, quality indicators and higher education governing. European Educational Research Journal, 7(4), 438-450.
Marginson, S. (2007). Global university rankings: implications in general and for Australia. Journal of Higher Education Policy and Management, 29(2), 131-142.
Minelli, E., Rebora, G., Turri, M. & Huisman, J. (2006). The impact of research and teaching evaluation in universities: comparing an Italian and a Dutch case. Quality in Higher Education, 12(2), 109-124.
Power, M. (2000). The audit society - second thoughts. International Journal of Auditing, 4, 111-119.
Yates, L. (2004). What Does Good Education Research Look Like? Situating a Field and Its Practices. Maidenhead: Open University Press.
Yates, L. (2007). 'Who counts as well as what counts: the desire to be "world class" in Australia' [part of EERJ roundtable on 'Knowledge and Policy: research - information - intervention']. European Educational Research Journal, 6(3), 298-302.
Yates, L. (2009). The quality and impact agenda in Australia: developing a 'Research Quality Framework'. In T. Besley (Ed.), Assessing the Quality of Research in Higher Education. Rotterdam: Sense Publishers, pp. 210-224.
Yates, L. & Young, M. (forthcoming, 2010). Globalization, knowledge and curriculum. European Journal of Education, 45(1).

Author Information

University of Melbourne
Melbourne Graduate School of Education
Melbourne
