Session Information
09 SES 01 B, Insights into Learning and Assessment
Paper Session
Contribution
Student questionnaire data, typically collected via Likert-scale items, are commonly used to compare groups of students, whether across countries or based on student characteristics such as gender and socioeconomic status (e.g., OECD, 2017). However, such analyses can lead to inaccurate conclusions because the data may be biased by differences in reporting behavior between groups of students (e.g., He & van de Vijver, 2016; Kyllonen & Bertling, 2013). Students can, for example, differ in the amount of effort they put into filling in the questionnaire.
Our theoretical framework relates to reporting behavior in surveys. The terms “careless responding” and “insufficient effort responding” have been used to describe response patterns in which respondents lack the motivation to answer accurately and do not pay attention to the content of items and survey instructions (Goldammer et al., 2020). A number of approaches have been suggested to identify such careless responding, the analysis of response time (to the whole survey or parts of it) being one of them (Curran, 2016; Goldammer et al., 2020). This approach rests on the assumption that there is a minimum time needed to read and answer a questionnaire item (Goldammer et al., 2020). The term “speeding” has been used for responding to questionnaire items too quickly to give much thought to the answers (Zhang & Conrad, 2014).
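As a minimal, hypothetical illustration of this idea, the sketch below flags respondents whose screen-level response time falls below an assumed minimum reading-and-answering time. The function name and the two-seconds-per-item threshold are placeholders for illustration, not values taken from the study or the cited literature.

```python
# Illustrative sketch: flagging potential "speeders" by per-screen response time.
# The threshold of 2 seconds per item is an assumed, arbitrary cutoff; in
# practice a minimum time would need to be justified for the instrument used.

def flag_speeders(response_times_ms, n_items=5, min_seconds_per_item=2.0):
    """Return True for respondents whose screen time falls below the
    assumed minimum time needed to read and answer the items."""
    threshold_ms = n_items * min_seconds_per_item * 1000
    return [t < threshold_ms for t in response_times_ms]

times = [4500, 31000, 9000, 60500]  # screen times in milliseconds
print(flag_speeders(times))  # [True, False, True, False]
```

With five items and the assumed cutoff, any screen time under 10 seconds is flagged; varying `min_seconds_per_item` shows how sensitive such classifications are to the chosen threshold.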
The analysis of response times is a promising tool for identifying differences in the amount of effort that different groups of students put into filling in questionnaire surveys. It could help identify (a) differences in careless responding between groups of students within a single wave of measurement as well as (b) changes in the careless responding of particular groups of students across waves of measurement. This could be exploited, for example, in longitudinal questionnaire studies (e.g., foreign language learning motivation studies) as well as in international large-scale assessment (ILSA) studies such as the Programme for International Student Assessment (PISA). So far, however, knowledge concerning differences in questionnaire item response times between groups of students in the context of ILSA studies is rather limited.
Previous research has suggested that students’ reporting behavior may differ across socioeconomic groups (Vonkova et al., 2017), encouraging further exploration of the relationship between reporting behavior and socioeconomic status. In this contribution, we address this research area. Our aim is to analyze the relationship between students’ response times to achievement motivation questionnaire items and their socioeconomic status in the European countries participating in PISA 2015. Our research question is: How do response times to achievement motivation questionnaire items differ between students whose parents attained different education levels in the European PISA 2015 participating countries?
Method
We analyze data from the PISA 2015 questionnaire, focusing on 171,762 respondents from 29 European countries who were administered the questionnaire via computer. Specifically, we look at the response time to question ST119 (Achievement motivation), and we use the highest education achieved by parents (PISA variable HISCED) as an indicator of students’ socioeconomic status. Only respondents with complete information on all analyzed variables were included in the analysis. In question ST119, respondents rated five statements on achievement motivation using the responses Strongly disagree (1), Disagree (2), Agree (3), and Strongly agree (4). The five statements were: (1) I want top grades in most or all of my courses, (2) I want to be able to select from among the best opportunities available when I graduate, (3) I want to be the best, whatever I do, (4) I see myself as an ambitious person, and (5) I want to be one of the best students in my class (OECD, 2014). Response times were taken from the PISA 2015 response time dataset, specifically the variable ST119_TT. They were logged in milliseconds for each screen (here, the screen containing the five achievement motivation items). For the purposes of our analysis, we excluded students whose response time exceeded two minutes: the vast majority of respondents answered within this interval, and the 406 respondents who took longer (one spent nearly an hour on the screen) displayed non-standard response behavior that would distort the analysis. Information on parental education levels (HISCED) was extracted from the PISA 2015 dataset, which uses the ISCED (International Standard Classification of Education) 1997 classification. HISCED categories range from 0 to 6, representing various levels of educational attainment.
HISCED0 represents an unfinished ISCED level 1; HISCED1 and HISCED2 represent ISCED levels 1 and 2, respectively; HISCED3 represents ISCED 3B and 3C; HISCED4 represents ISCED 3A and 4; HISCED5 represents ISCED 5B; and HISCED6 represents ISCED 5A and 6. Due to the low number of observations, we combined the HISCED 0-2 categories into a single category for the purposes of our analysis.
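The preprocessing steps described above can be sketched as follows. Only the variable names (ST119_TT, HISCED) come from the PISA codebook; the record structure, function name, and sample values are illustrative assumptions.

```python
# Hypothetical preprocessing sketch: convert ST119_TT from milliseconds to
# seconds, drop respondents over the two-minute cap, and collapse HISCED 0-2
# into a single category, as described in the Method section.

def preprocess(records, max_seconds=120):
    """records: list of dicts with keys 'ST119_TT' (ms) and 'HISCED' (0-6)."""
    out = []
    for r in records:
        seconds = r["ST119_TT"] / 1000.0
        if seconds > max_seconds:
            continue  # exclude extreme response times (406 cases in the study)
        hisced = r["HISCED"]
        group = "0-2" if hisced <= 2 else str(hisced)
        out.append({"time_s": seconds, "hisced_group": group})
    return out

sample = [
    {"ST119_TT": 45000, "HISCED": 1},
    {"ST119_TT": 130000, "HISCED": 5},  # over two minutes -> excluded
    {"ST119_TT": 30000, "HISCED": 6},
]
print(preprocess(sample))
```

In the sample above, the second respondent exceeds the two-minute cap and is dropped, and the first respondent's HISCED value of 1 is mapped into the combined 0-2 group.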
Expected Outcomes
Our initial analysis showed a notable inverse relationship between mean response times to question ST119 and HISCED across the European countries: respondents from families with a lower educational background took longer to answer. The same trend appears in the median response times and in a linear regression with country fixed effects (response time as the explained variable; HISCED levels and country dummies as the explanatory variables). However, when examining the variation within each HISCED group, the data showed that the HISCED0-2 group had considerably higher variation in response times than all other HISCED groups, with the lowest variation in the HISCED5 group. This suggests greater heterogeneity in response time within the HISCED0-2 group, which appears to consist of a mix of respondents with low and high response times to question ST119. Our results show that response times need to be taken into consideration when comparing groups of respondents, as they can affect the analysis. Further research may focus on the relationship between response times and home possessions or other indicators of socioeconomic status, and may analyze other world regions and compare them with the European results.
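The fixed-effects specification described above can be illustrated on synthetic data. The simulated pattern (longer response times for lower HISCED groups, plus country-level shifts) mirrors the reported trend, but all numbers below are artificial; this is a sketch of the model form, not a reproduction of the study's results.

```python
# Minimal sketch of the fixed-effects regression described above:
# response time regressed on HISCED-group dummies plus country dummies.
# Data are synthetic; in the study, times come from ST119_TT (PISA 2015).

import numpy as np

rng = np.random.default_rng(0)
n = 300
country = rng.integers(0, 3, n)   # 3 synthetic countries
hisced = rng.integers(0, 5, n)    # 5 combined HISCED groups (0-2, 3, 4, 5, 6)
# Simulate the observed pattern: lower HISCED -> longer response time
time_s = 60 - 4 * hisced + 3 * country + rng.normal(0, 5, n)

# Design matrix: intercept + HISCED dummies (reference: lowest group)
#                + country dummies (reference: country 0)
X = np.column_stack(
    [np.ones(n)]
    + [(hisced == g).astype(float) for g in range(1, 5)]
    + [(country == c).astype(float) for c in range(1, 3)]
)
beta, *_ = np.linalg.lstsq(X, time_s, rcond=None)
print("HISCED coefficients (vs. lowest group):", beta[1:5])
```

With this simulated data-generating process, the HISCED coefficients are negative and increasingly so for higher groups, reproducing the inverse relationship in sign only.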
References
Curran, P. G. (2016). Methods for the detection of carelessly invalid responses in survey data. Journal of Experimental Social Psychology, 66, 4–19. https://doi.org/10.1016/j.jesp.2015.07.006
Goldammer, P., Annen, H., Stöckli, P. L., & Jonas, K. (2020). Careless responding in questionnaire measures: Detection, impact, and remedies. The Leadership Quarterly, 31(4), Article 101384. https://doi.org/10.1016/j.leaqua.2020.101384
He, J., & van de Vijver, F. J. R. (2016). The motivation-achievement paradox in international educational achievement tests: Toward a better understanding. In R. B. King & A. B. I. Bernardo (Eds.), The psychology of Asian learners: A festschrift in honor of David Watkins (pp. 253–268). Springer Science. https://doi.org/10.1007/978-981-287-576-1
Kyllonen, P. C., & Bertling, J. (2013). Innovative questionnaire assessment methods to increase cross-country comparability. In L. Rutkowski, M. von Davier, & D. Rutkowski (Eds.), A handbook of international large-scale assessment data analysis: Background, technical issues, and methods of data analysis (pp. 277–285). Chapman Hall/CRC Press.
OECD. (2014). PISA 2015 student questionnaire (computer-based version). https://www.oecd.org/pisa/data/CY6_QST_MS_STQ_CBA_Final.pdf
OECD. (2017). PISA 2015 results (volume III): Students' well-being. https://doi.org/10.1787/9789264273856-en
Vonkova, H., Bendl, S., & Papajoanu, O. (2017). How students report dishonest behavior in school: Self-assessment and anchoring vignettes. The Journal of Experimental Education, 85(1), 36–53. https://doi.org/10.1080/00220973.2015.1094438
Zhang, C., & Conrad, F. (2014). Speeding in web surveys: The tendency to answer very fast and its association with straightlining. Survey Research Methods, 8(2), 127–135. https://doi.org/10.18148/srm/2014.v8i2.5453