09 SES 12 D JS, Intellectualizing and Organizing Knowledge: The Construction of Educational Facts
Joint Symposium NW 09 and NW 23
In recent years, educational testing has gained importance for policy initiatives, in curriculum change, as a research activity, and as a subject in the media (Forsberg & Román, 2015). One of the most active international agencies performing educational tests is the International Association for the Evaluation of Educational Achievement. Since 1995, its Trends in International Mathematics and Science Study (TIMSS) has been carried out at regular intervals. TIMSS results, together with those of other tests, have gradually been transformed into reference points for social policies (Pettersson, 2014). In a wider context, the larger phenomenon of International Large-Scale Assessments (ILSA) has arisen, creating a framework of global governance manifested through changes in styles of reasoning (Hacking, 1992). Given the importance of this discourse as a social practice, we argue that it is crucial to investigate how ILSA research is constructed through scholarly formats. One important way of illuminating how scientific knowledge is constructed is to study its organisation, dissemination and legitimisation in terms of publishing patterns in the scientific literature. Studies (Domínguez, Vieira, & Vidal, 2012; Owens, 2013) have shown that investigating research literature can be a viable and important way of uncovering patterns in how research is communicated and organised at the intersection of different scientific fields. We analyse publication patterns in a specific corpus of research made up of articles reporting or discussing TIMSS data. The analysis is based on visualisations of bibliometric networks describing articles published in international peer-reviewed journals; the corpus is limited to the most cited articles using or discussing TIMSS data. The corpus is analysed using the VOSviewer software (van Eck & Waltman, 2014), which has been specifically designed to uncover bibliometric couplings.
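The coupling measure referred to above can be illustrated with a small sketch. Bibliographic coupling counts the references two articles share: the more shared references, the more strongly the articles are coupled, and these strengths form the weighted network that software such as VOSviewer visualises. The corpus below is hypothetical, chosen only to make the computation concrete; it is not data from the study.

```python
# Minimal sketch of bibliographic coupling, assuming a toy corpus
# (article id -> set of cited references). Hypothetical data only.
from itertools import combinations

corpus = {
    "Article A": {"ref1", "ref2", "ref3"},
    "Article B": {"ref2", "ref3", "ref4"},
    "Article C": {"ref5"},
}

def coupling_strength(refs_a, refs_b):
    """Coupling strength = number of references the two articles share."""
    return len(refs_a & refs_b)

# Build the weighted edge list of the coupling network, keeping only
# article pairs that share at least one reference.
edges = {
    (a, b): coupling_strength(corpus[a], corpus[b])
    for a, b in combinations(sorted(corpus), 2)
    if coupling_strength(corpus[a], corpus[b]) > 0
}
print(edges)  # {('Article A', 'Article B'): 2}
```

Articles A and B are coupled with strength 2 (they share two references), while Article C cites nothing in common with the others and so has no edges in the network.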
Through this analytical strategy we hope to illuminate how knowledge is legitimised within TIMSS research. Attention is paid to significant actors and, specifically, to the journals, authors, articles, universities and countries participating in the constitution of TIMSS research. Preliminary analyses indicate publishing patterns in which the USA is prominent. However, there is also a wide network of dissemination within the corpus, indicating that TIMSS research may rest on a varied base of research studies. We hope to contribute to a discussion of how educational knowledge is legitimised as it develops within the field of TIMSS research, and how, through this process, educational assessment has become part of the construction of scientific knowledge.
Domínguez, M., Vieira, M. J., & Vidal, J. (2012). The impact of the Programme for International Student Assessment on academic journals. Assessment in Education: Principles, Policy & Practice, 19(3), 393-409.
Forsberg, E., & Román, H. (2014). The art of borrowing in Swedish assessment policies. In Nordin & Sundberg (Eds.), Transnational policy-flows in European education: Conceptualizing and governing knowledge (Oxford Studies in Comparative Education). East Greenwich: Symposium Books.
Hacking, I. (1992). "Style" for historians and philosophers. Studies in History and Philosophy of Science, 23(1), 1-20.
Luzón, A., & Torres, M. (2011). Visualizing PISA scientific literature versus PISA public usage. In M. A. Pereyra, R. Cowen & H.-G. Kotthoff (Eds.), PISA under examination: Changing knowledge, changing tests and changing schools. Rotterdam: Sense Publishers.
Owens, T. L. (2013). Thinking beyond league tables: A review of key PISA research questions. In H.-D. Meyer & A. Benavot (Eds.), PISA, power and policy: The emergence of global educational governance. Oxford: Symposium Books.
Pettersson, D. (2014). Three narratives: National interpretations of PISA. Knowledge Cultures, 2(4).
van Eck, N. J., & Waltman, L. (2014). Visualizing bibliometric networks. In Y. Ding, R. Rousseau & D. Wolfram (Eds.), Measuring scholarly impact: Methods and practice. New York: Springer.