What Are We Learning From The Secondary Analysis of PISA? A Preliminary Analysis
Author(s):
Radhika Gorur (presenting / submitting)
Conference:
ECER 2013
Format:
Paper

Session Information

23 SES 03 A, International Comparative Assessments

Paper Session

Time:
2013-09-10
17:15-18:45
Room:
D-506
Chair:
Palle Rasmussen

Contribution

In the 12 years since its inception, the OECD’s Programme for International Student Assessment (PISA) has become a major player in education policy in many countries (Gorur, 2011a; Grek, 2009). Its rankings have sent some countries into shock, whilst others have emerged as models to be emulated (Ertl, 2006). However, while PISA rankings have been enormously influential, my previous research on PISA and policy indicated a level of dissatisfaction and anxiety among PISA officials that the PISA database was being under-utilised to inform policies and practices (Gorur, 2011b). One interviewee, for example, said:

I actually think the PISA data is very rich, but the kind of public reports they [OECD] produce – the international report and the country reports - only scrape the surface of the database. There are a lot more stories that can be found even at the classroom level. … [M]y sense is that there haven’t been enough people to actually do that secondary analysis....There are some attempts – and you get to hear them at some of the IEA conferences – but the big reports can only tell limited stories. (Interview transcript: PISA analyst)

Discussions involving the secondary analysis of PISA, and the lessons that might be learned from it, appear to be confined to measurement conferences or to commissioned government reports that are not available to educators at large.

Following these findings, I became curious about the nature and extent of secondary analysis of PISA. This paper presents the preliminary results of a funded project which examines questions such as: Who is using these databases? What kinds of questions are they asking? Who is funding these projects? Who are the audiences and consumers of these studies? Allied to these is a set of broader questions: What are the features of secondary analysis as a knowledge practice? What are its affordances and constraints? What are the actual practices of secondary analysis, and to what extent do these practices mediate the usefulness of the knowledge they produce?

Drawing on resources from science and technology studies, this paper takes an ethnographic approach to examining the nature, scale and scope of the PISA database, what it affords in terms of secondary analysis, and some aspects of secondary analysis as a contemporary form of knowledge practice.

Given PISA's widespread influence and the interest it attracts, this paper has global relevance.

Method

Data sources included published papers based on the secondary analysis of PISA and interviews with analysts involved in secondary analysis, as well as with OECD analysts. Initially, an internet search was conducted using appropriate keywords to find published secondary analyses of PISA. About 100 papers were analysed to map the nature of the questions asked, the field in which each was published (psychology, assessment, education, policy etc.) and the assumed audience and purpose of the research. A framework was developed to organise the analysis of the documents. Following this, a number of analysts engaged in the secondary analysis of PISA were interviewed, using semi-structured, conversational interviews, about the nature, scope, boundaries and affordances of this knowledge practice.

Expected Outcomes

This research is expected to shed some light on secondary analysis as a knowledge practice and an epistemic culture (Knorr Cetina, 1999) and to develop a preliminary mapping of the nature and scope of current analyses. Such a mapping could lead to a better understanding of the potential of the PISA database to inform policy and practice. The results from this study have the potential to bring together practitioners in measurement, policy, sociology and education to enable useful questions to be raised and answered.

References

Ertl, H. (2006). Educational Standards and the Changing Discourse on Education: The reception and consequences of the PISA study in Germany. Oxford Review of Education, 32(5), 619-634.

Gorur, R. (2011a). ANT on the PISA Trail: Following the Statistical Pursuit of Certainty. Educational Philosophy & Theory, 43(5-6), 76-93.

Gorur, R. (2011b). Policy Assemblage in Education. (PhD), University of Melbourne.

Grek, S. (2009). Governing by Numbers: the PISA 'effect' in Europe. Journal of Education Policy, 24(1), 23-37.

Knorr Cetina, K. (1999). Epistemic Cultures: How the Sciences Make Knowledge. Harvard University Press.

Author Information

Radhika Gorur (presenting / submitting)
The Victoria Institute, Victoria University, Australia
