Developing Translation Procedures in International Reading Literacy Assessment
Conference:
ECER 2010
Format:
Poster

Session Information

MC_Poster, Poster Session Main Conference

Main Conference Poster Session

Time:
2010-08-27
12:15-13:15
Room:
2nd Floor Hall (M.B) / II KRS AULA, Päärakennus / Main Building
Contribution

International reading literacy assessments have attracted wide interest in recent years, and their results often serve as the basis on which nations make educational decisions. These assessments necessitate extensive translation work, because all the texts and items used in them have to be translated into the languages of the participating countries. Moreover, as measuring instruments the translations have to satisfy exceptionally high standards: they have to be equivalent, or comparable, to each other – they have to measure the same construct at a comparable level of difficulty. If this is not the case, the entire assessment and its results risk being invalid.

Over the years, criticism has been expressed and suspicions aroused concerning the quality and equivalence of the instruments translated and used in international assessments. It has been claimed that, because of linguistic and cultural differences alone, it will never be possible to attain full equivalence in these assessments (e.g. Bonnet, 2002; Bonnet et al., 2003). Moreover, research on these translations has shown deficiencies in them, suggesting that equivalence may not always have been attained (e.g. Arffman, 2007; Bechger et al., 1998). All in all, the prevailing view seems to be that while the instruments translated and used in international reading literacy assessments have improved, remaining deficiencies may still endanger the validity of the assessments (e.g. Bechger et al., 1998; Bonnet, 2002).

Translations are always the end result of, and therefore dependent on, the translation process followed. This is also true of the instruments translated in international reading literacy assessments. Thus, if there are deficiencies in these instruments, they are largely a function of deficiencies in the translation procedures. Therefore, if we want to improve the quality and increase the equivalence of the instruments used in international reading literacy assessments, it is important to explore and develop the translation procedures behind them.

The purpose of the study was to examine the translation procedures followed when translating instruments in international reading literacy assessments and to identify factors in them which complicate the translation work, making it difficult to ensure equivalence between the different-language test versions. Knowledge of such factors helps in developing translation procedures better able to yield equivalent instruments. Ultimately, the study thus aimed at increasing the validity of international reading literacy assessments. However, since the translation procedures followed in other types of international studies are often very similar to those followed in international reading literacy assessments, the study can also inform the development of translation procedures in those other studies.

 

Method

The data of the study consisted of semi-structured face-to-face and email discussions with the five translators who took part in translating the PISA 2009 reading literacy units (the stimulus texts and question items) from English and French into Finnish. The translators – the experts in the translation process (Vermeer, 1989), those actually implementing the procedures and producing the translations – were asked what difficulties they had encountered when translating the units and seeking to ensure equivalence, and what they considered the main difficulties and obstacles to equivalence when translating instruments in international reading literacy assessments. The discussions were analyzed using content analysis, drawing on findings from Translation Studies (e.g. Danks & Griffin, 1997; cf. Nord, 1991) and test translation (e.g. Arffman, 2007; Grisay, 2002; Hambleton, 2005).

Expected Outcomes

The study pointed to the following factors as complicating translation and threatening equivalence in international reading literacy assessments: the source texts varying greatly in e.g. text type and topic, the complex wording of question items, and differences between languages; deficiencies in the competences of the translators; the unusual and elusive skopos, or purpose, of the translation task (equivalence in difficulty); the translation instructions focusing on formal equivalence to the source text and not being suited to all languages, together with the restrictive translation notes; the two source versions (English and French) not being fully equivalent to each other and not being used in every country; a lack of time and time pressure; the use of the Word format and working on screen; and the verification concentrating on micro-level factors and being carried out against only one source version. Altogether the results show that while the translation procedures in international reading literacy assessments have developed over the years, there are still deficiencies in them which may jeopardise equivalence between the different-language instruments. Suggestions are made as to how to develop the translation procedures in order to ensure more equivalent instruments and to add to the validity of the assessments.

References

Arffman, I. (2007). The problem of equivalence in translating texts in international reading literacy studies. A text analytic study of three English and Finnish texts used in the PISA 2000 reading test. Jyväskylä: University of Jyväskylä, Institute for Educational Research.

Bechger, T., van Schooten, E., de Glopper, C., & Hox, J. (1998). The validity of international surveys of reading literacy: The case of the Reading Literacy Study. Studies in Educational Evaluation, 24, 99-125.

Bonnet, G. (2002). Reflections in a critical eye: On the pitfalls of international assessment [Review of the book Knowledge and skills for life: First results from PISA 2000]. Assessment in Education, 9, 387-399.

Bonnet, G., Daems, F., de Glopper, C., Horner, S., Lappalainen, H.-P., Nardi, E., Remond, M., Robin, I., Rosen, M., Solheim, R., Tonnessen, F.-E., Vertecchi, B., Vrignaud, P., Wagner, A., & White, J. (2003). Culturally balanced assessment of reading. C-bar. Available: http://cisad.adc.education.fr/reva/pdf/cbarfinalreport.pdf [2009, May 12].

Danks, J., & Griffin, J. (1997). Reading and translation. A psycholinguistic perspective. In J. Danks, G. Shreve, S. Fountain & M. McBeath (Eds.), Cognitive processes in translation and interpreting (pp. 161-175). Thousand Oaks, CA: Sage.

Grisay, A. (2002). Translation and cultural appropriateness of the test and survey material. In R. Adams & M. Wu (Eds.), PISA 2000 technical report (pp. 57-70). Paris: OECD.

Hambleton, R. (2005). Issues, designs, and technical guidelines for adapting tests into multiple languages and cultures. In R. Hambleton, P. Merenda & C. Spielberger (Eds.), Adapting educational and psychological tests for cross-cultural assessment (pp. 3-38). Mahwah, NJ: Erlbaum.

Nord, C. (1991). Text analysis in translation. Theory, methodology, and didactic application of a model for translation-oriented text analysis. Amsterdam: Rodopi.

Vermeer, H. J. (1989). Skopos and commission in translational action. In A. Chesterman (Ed.), Readings in translation theory (pp. 173-187). Helsinki: Finn Lectura.

Author Information

University of Jyväskylä
Finnish Institute for Educational Research
Jyväskylä
