Session Information
09 SES 16 A, Improving Measure: Innovations in Indicators and Models
Paper Session
Contribution
International large-scale assessment studies, such as the Programme for International Student Assessment (PISA), are an invaluable source of data in education. Apart from testing students across the world, these studies are accompanied by questionnaires which provide rich information that puts the test data into context. Data from such questionnaires, typically collected via Likert-scale items, are commonly used to analyze different groups of students, whether grouped by gender or socioeconomic status, or compared across countries. However, conclusions drawn from these analyses may sometimes be inaccurate, as they may be biased by differences in reporting behavior (e.g., Buckley, 2009; He & Van de Vijver, 2016). For example, students may differ in how they use answering scales or in the amount of effort they put into answering the questionnaires. In the literature, the latter reporting behavior is described as careless or insufficient effort responding (Curran, 2016).
One tool which can help detect students who respond carelessly is the analysis of response times, as the amount of time spent responding provides information on the effort put into answering questions. These analyses rest on the assumption that there is a minimum time needed to properly read and answer questionnaire items (Goldammer et al., 2020). Students with very fast response times may not have invested effort into reading and answering the items, yielding data that could bias the results of subsequent analyses. In the literature, such behavior in questionnaires is often labeled “speeding” (Zhang & Conrad, 2014).
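The minimum-time idea can be sketched as a simple threshold rule. This is an illustrative sketch, not the study's actual procedure; the threshold value, the data, and the function name are hypothetical assumptions.

```python
# Illustrative sketch (hypothetical data and threshold): flagging potential
# "speeding" respondents whose response time falls below a minimum plausible
# reading-and-answering time for a questionnaire item.

def flag_speeders(response_times, threshold_seconds=5.0):
    """Return indices of respondents whose response time is below the threshold."""
    return [i for i, t in enumerate(response_times) if t < threshold_seconds]

# Hypothetical response times in seconds for one questionnaire item
times = [12.4, 3.1, 8.7, 2.0, 15.3]
print(flag_speeders(times))  # indices 1 and 3 fall below the 5-second threshold
```

As the abstract notes, the hard part is choosing the threshold, which is why a single fixed cutoff is unlikely to transfer across languages and countries.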
However, determining a response time threshold which would reliably identify speeding respondents has proven difficult, especially in cross-cultural analyses involving respondents from a variety of nations and cultures who speak different languages. Previous research has suggested that the respondents’ country (e.g., Vonkova et al., 2018) as well as the language used (e.g., Harzing, 2004) can affect students’ reporting behavior, and such characteristics should be taken into consideration in the analyses.
We aim to contribute to this area of research by examining the data through the lens of the language of the questionnaire combined with the student’s country of birth. Specifically, we examine response times to statements concerning the student’s attitude to the three core subjects of PISA, in the context of the language of the questionnaire and the student’s country of birth across European PISA 2022 participating countries. Our research question is: How do the language of the questionnaire and the student’s country of birth affect the response time to the analyzed statements about the student’s attitude to the three subjects across European PISA 2022 participating countries?
Method
We employ data from the PISA 2022 questionnaire timing and student questionnaire data sets (OECD, n.d.-a; n.d.-b), focusing on 212,121 respondents from 35 European countries who were administered the questionnaire via computer. In these countries, the student questionnaire was administered in a total of 35 languages.
We focus on the response time to question ST268, which concerns the student’s attitude to the three core subjects of PISA (PISA timing variable ST268_TT). In the question, students were asked to report their agreement with statements on a scale of (1) Strongly disagree, (2) Disagree, (3) Agree, and (4) Strongly agree. The subjects in question were the test language, mathematics, and science.
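The core of such an analysis can be sketched as grouping response times by questionnaire language and by whether the student was born in the country of data collection. This is a minimal sketch with hypothetical records and field names (the actual PISA files use different variable layouts); only the variable name ST268_TT comes from the source.

```python
# Illustrative sketch: mean response time to ST268 (variable ST268_TT),
# grouped by questionnaire language and country-of-birth status.
# The records and field names below are hypothetical.
from collections import defaultdict
from statistics import mean

records = [
    {"lang": "English", "born_in_country": True,  "st268_tt": 41.2},
    {"lang": "English", "born_in_country": False, "st268_tt": 44.0},
    {"lang": "Czech",   "born_in_country": True,  "st268_tt": 52.5},
    {"lang": "Czech",   "born_in_country": True,  "st268_tt": 49.5},
]

# Collect response times per (language, born-in-country) group
groups = defaultdict(list)
for r in records:
    groups[(r["lang"], r["born_in_country"])].append(r["st268_tt"])

# Mean response time per group
means = {key: mean(values) for key, values in groups.items()}
```

Comparing such group means across languages and countries is what motivates the country- and language-sensitive thresholds discussed above.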
Expected Outcomes
Our analysis shows notable differences in response times based on the language of the questionnaire, with English, Albanian, and Icelandic among the fastest languages and Greek, Ukrainian, and Czech among the slowest. Differences in response times were also found within languages administered in multiple countries (e.g., country-born respondents answering in Albanian in Macedonia were slower than country-born respondents answering in Albanian in Kosovo). Further, we found differences within countries which administered the questionnaire in multiple languages (e.g., respondents answering in French in Switzerland were slower than respondents answering in Italian in Switzerland).

Response times for respondents who were born in a country different from the country of the data collection were on average less than a second slower than those of respondents born in the country of the data collection. In some countries, however, the differences were much larger. In countries like Finland, France, or Sweden, respondents born outside the country of the data collection responded more than three seconds slower than those born in the country. In Montenegro, Kosovo, or Macedonia, this effect was reversed, with foreign-born students being about two seconds faster than those born in the country of the data collection.

These results indicate that the language of the questionnaire as well as the country of birth may affect response times and should be taken into consideration when performing response time analyses, especially when analyzing speeding of respondents in cross-cultural studies. Further research could investigate the effect of other student characteristics (e.g., socioeconomic status or parents’ country of birth) on response times.
References
Buckley, J. (2009). Cross-national response styles in international educational assessments: Evidence from PISA 2006. New York University. http://edsurveys.rti.org/PISA/documents/Buckley_PISAresponsestyle.pdf

Curran, P. G. (2016). Methods for the detection of carelessly invalid responses in survey data. Journal of Experimental Social Psychology, 66, 4–19. https://doi.org/10.1016/j.jesp.2015.07.006

Goldammer, P., Annen, H., Stöckli, P. L., & Jonas, K. (2020). Careless responding in questionnaire measures: Detection, impact, and remedies. The Leadership Quarterly, 31(4), Article 101384. https://doi.org/10.1016/j.leaqua.2020.101384

Harzing, A.-W. (2004). Does language influence response styles? A test of the cultural accommodation hypothesis in fourteen countries. In B. N. Setiadi, A. Supratiknya, W. J. Lonner, & Y. H. Poortinga (Eds.), Ongoing themes in psychology and culture: Proceedings from the 16th International Congress of the International Association for Cross-Cultural Psychology. https://scholarworks.gvsu.edu/iaccp_papers/260

He, J., & Van de Vijver, F. J. R. (2016). The motivation-achievement paradox in international educational achievement tests: Toward a better understanding. In R. B. King & A. B. I. Bernardo (Eds.), The psychology of Asian learners: A festschrift in honor of David Watkins (pp. 253–268). Springer Science. https://doi.org/10.1007/978-981-287-576-1

OECD. (n.d.-a). PISA 2022 questionnaire timing data file [Data set]. Retrieved January 26, 2025, from https://www.oecd.org/en/data/datasets/pisa-2022-database.html

OECD. (n.d.-b). PISA 2022 student questionnaire data file [Data set]. Retrieved January 26, 2025, from https://www.oecd.org/en/data/datasets/pisa-2022-database.html

Vonkova, H., Papajoanu, O., & Stipek, J. (2018). Enhancing the cross-cultural comparability of self-reports using the overclaiming technique: An analysis of accuracy and exaggeration in 64 cultures. Journal of Cross-Cultural Psychology, 49(8), 1247–1268. https://doi.org/10.1177/0022022118787042

Zhang, C., & Conrad, F. (2014). Speeding in web surveys: The tendency to answer very fast and its association with straightlining. Survey Research Methods, 8(2), 127–135. https://doi.org/10.18148/srm/2014.v8i2.5453