Session Information
09 SES 16 A, Improving Measure: Innovations in Indicators and Models
Paper Session
Contribution
Students’ performance in international large-scale assessments, such as the Trends in International Mathematics and Science Study (TIMSS), garners significant attention in public discourse. Typically, student test scores are viewed as straightforward indicators of proficiency, categorising educational systems as either high or low performers. Nevertheless, various factors influence student performance, with motivation being among the most critical. Furthermore, since these assessments have no direct consequences for the students themselves, they are considered low-stakes. Therefore, understanding the motivation and behaviour behind test-taking becomes increasingly crucial, particularly when the tests impact the educational system rather than the individual student.
Recently, there has been a strong emphasis on response time data, largely driven by the growth of computer-based assessments (CBA), especially in international large-scale testing. CBA allows for the automatic gathering of process data from response activities. This information has attracted considerable attention, as it offers vital insights into the complexities of test-taking behaviours as well as cognitive and non-cognitive processes. It allows researchers and educators to analyse students' understanding and their interactions with the assessment process.
Research indicates that many students experience a decline in motivation as they advance in their education (Muenks et al., 2018), which can impact their future decisions (Wang, 2012; Watt et al., 2012). Additionally, current motivation levels are associated with engagement and effort, particularly during test-taking (Wise & DeMars, 2005). Concerns about the implications of motivation are supported by findings that demonstrate a positive correlation between test-taking motivation and both test performance and students' response behaviours to these assessments (Hofverberg et al., 2022; Ivanova et al., 2020; Silm et al., 2020). In Norway, trends in TIMSS data indicate a steady decline in students' motivation and a growing gender gap favouring boys, particularly in terms of interest and confidence (Kaarstein et al., 2024).
Building on prior research, this study examines the relationship between students' response times, motivation, and performance. It draws on data from TIMSS 2023, focusing on ninth-grade mathematics in Norway. Using a person-centred approach (Ferguson et al., 2020), it integrates profile and pattern analysis to determine whether students' response times, particularly on challenging items, offer valuable insights into their motivation and performance. With this in mind, we concentrate on: (a) What is the link between students' performance and their mathematics-related motivation profiles? (b) What is the link between students' response time and their mathematics-related motivation profiles? (c) What is the link between students' response time and performance? and (d) Can we distinguish particular patterns in response time when we account for item difficulty, overall performance, and students' motivation profiles?
Method
The study participants comprise 6,324 ninth-grade students from Norway (47.2% girls) who took part in TIMSS 2023. Following the TIMSS procedures, students first complete a 90-minute test covering mathematics and science, followed by a contextual questionnaire on constructs related to their backgrounds, attitudes, and beliefs about their school and learning environment. The achievement test employs a rotated block design, whereas all students respond to the same questions in the contextual questionnaire. This study uses the mathematics data.

Students' motivation towards mathematics is assessed using three distinct composite scales: interest, value, and confidence. The 'Students like learning mathematics' construct captures the intrinsic aspects of motivation, exploring whether students enjoy mathematics, like it, or find it boring. Conversely, the 'Students value mathematics' construct addresses extrinsic motivation, concentrating on the perceived benefits of studying mathematics or pursuing a career in the field. Lastly, the confidence construct captures students' self-assessment of their ability to tackle mathematics content. All plausible values are considered when examining students' overall performance in mathematics, covering both the content and cognitive domains. Two additional indicators were derived: the average time on task (AveToT) and the response time effort (RTE). AveToT represents a student's average deviation from the mean time used on each task, with positive values indicating a longer time than the mean and negative values a shorter time. RTE is the proportion of test items that a student responded to with solution behaviour.

The study used a two-step analytical approach. First, latent profile analysis (LPA) was conducted using the three motivational constructs. LPA is a latent variable mixture modelling method that allows testing the fit and significance of various latent profiles among individuals in a dataset (Ferguson et al., 2020). Models with two to nine latent classes (k = 2–9) were assessed to determine the number of profiles supported by the data. Each model was estimated using 5,000 sets of random start values across 100 iterations, retaining the 200 best solutions for the final optimisation phase. The selection of the final solution considered the entropy index cut-off (Geiser, 2013) alongside a combination of the bootstrapped likelihood ratio test (BLRT), the Vuong–Lo–Mendell–Rubin likelihood ratio test (VLMR-LRT), and the Lo–Mendell–Rubin adjusted LRT (LMR). In the second step, the R3STEP and BCH procedures were employed to examine gender differences and patterns among motivational profiles, performance, and response times, namely AveToT and RTE.
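To make the two derived indicators more concrete, the sketch below illustrates one way AveToT and RTE could be computed from item-level log data. The column names and the fixed 10% normative threshold used to approximate solution behaviour are illustrative assumptions, not the operational TIMSS definitions or the exact rule applied in this study.

```python
import pandas as pd

# Illustrative item-level log data: one row per student-item encounter.
# Column names are hypothetical; TIMSS process-data files use different labels.
log = pd.DataFrame({
    "student_id":    [1, 1, 1, 2, 2, 2],
    "item_id":       ["M1", "M2", "M3", "M1", "M2", "M3"],
    "response_time": [55.0, 120.0, 30.0, 20.0, 4.0, 90.0],  # seconds
})

# Item-level mean times, used both for AveToT and for the solution-behaviour threshold.
item_mean = log.groupby("item_id")["response_time"].transform("mean")

# AveToT: average deviation from the mean time per item
# (positive = slower than the average student, negative = faster).
log["time_deviation"] = log["response_time"] - item_mean
ave_tot = log.groupby("student_id")["time_deviation"].mean().rename("AveToT")

# RTE: share of items answered with solution behaviour. Solution behaviour is
# approximated here with a simple normative threshold (10% of the item's mean time),
# one common operationalisation in the response-time literature; the study may use another rule.
threshold = 0.10 * item_mean
log["solution_behaviour"] = log["response_time"] >= threshold
rte = log.groupby("student_id")["solution_behaviour"].mean().rename("RTE")

print(pd.concat([ave_tot, rte], axis=1))
```

The toy data only show the mechanics; in the actual analysis these indicators would be built from the full TIMSS process-data files and handled with the appropriate sampling design.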
Expected Outcomes
Considering the theoretical framework underpinning the profile analyses (e.g., Eccles & Wigfield, 2020), prior studies conducted in Norway using a similar approach (Radišić & Jensen, 2021), and the characteristics and interpretability of the profile solutions, a six-class model showed the best fit (entropy = 0.853). The six-class solution was further validated by following Geiser's (2013) guidelines on replication of the best log-likelihood value. The first profile shows low values across all three motivational constructs: interest, value, and confidence. The second shows mid-level values for the value and confidence aspects, whereas the third profile shows the highest scores for value and confidence but not interest. Profile four scores in the mid-range across all aspects, unlike profile five, which scores very low on interest and confidence while recognising some value in mathematics. The sixth profile scores high on all three aspects and is the exact opposite of the first. Gender differences across the profiles align with recognised trends in Norwegian data (Kaarstein et al., 2024). Consistent with previous studies, profiles scoring high on confidence, or on the combination of interest and confidence, exhibited more favourable performance patterns (Hofverberg et al., 2022; Radišić & Jensen, 2021). However, the AveToT and RTE patterns are more intricate and vary with task complexity and the different motivational aspects. Current analyses focus on untangling these patterns, especially when comparing high- and low-confidence profiles, particularly in light of the declining motivation observed among students in Norway.
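For readers less familiar with the entropy criterion used in the class enumeration above, the sketch below shows how the relative entropy index is commonly computed from the posterior class-membership probabilities of a fitted mixture model; values close to 1 indicate clearer profile separation. The posterior matrix here is a made-up illustration, not output from the TIMSS 2023 analysis.

```python
import numpy as np

def relative_entropy(posterior: np.ndarray) -> float:
    """Relative entropy for a fitted latent profile model.

    posterior: (n_students, k_profiles) matrix of posterior class-membership
    probabilities; each row sums to 1.
    """
    n, k = posterior.shape
    p = np.clip(posterior, 1e-12, 1.0)       # avoid log(0)
    shannon = -np.sum(p * np.log(p))         # total classification uncertainty
    return 1.0 - shannon / (n * np.log(k))   # 1 - E / (n * ln k)

# Toy posterior probabilities for five students and three profiles (illustrative only).
posterior = np.array([
    [0.95, 0.03, 0.02],
    [0.10, 0.85, 0.05],
    [0.02, 0.08, 0.90],
    [0.70, 0.20, 0.10],
    [0.05, 0.05, 0.90],
])
print(round(relative_entropy(posterior), 3))
```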
References
Eccles, J. S., & Wigfield, A. (2020). From expectancy-value theory to situated expectancy-value theory: A developmental, social cognitive, and sociocultural perspective on motivation. Contemporary Educational Psychology, 61, Article 101859.
Ferguson, S. L., Moore, E. W. G., & Hull, D. M. (2020). Finding latent groups in observed data: A primer on latent profile analysis in Mplus for applied researchers. International Journal of Behavioral Development, 44(5), 458–468.
Geiser, C. (2013). Data analysis with Mplus. Guilford Press.
Hofverberg, A., Eklöf, H., & Lindfors, M. (2022). Who makes an effort? A person-centered examination of motivation and beliefs as predictors of students' effort and performance on the PISA 2015 science assessment. Frontiers in Education, 6, 1–17.
Ivanova, M., Michaelides, M., & Eklöf, H. (2020). How does the number of actions on constructed-response items relate to test-taking effort and performance? Educational Research and Evaluation, 26(5–6), 252–274.
Kaarstein, H., Lehre, A.-C. W. G., Radišić, J., & Rohatgi, M. (2024). TIMSS 2023 – Kortrapport. Universitetet i Oslo.
Lundgren, E., & Eklöf, H. (2023). Questionnaire-taking motivation: Using response times to assess motivation to optimize on the PISA 2018 student questionnaire. International Journal of Testing, 23(4), 231–256.
Muenks, K., Wigfield, A., & Eccles, J. S. (2018). I can do this! The development and calibration of children's expectations for success and competence beliefs. Developmental Review, 48, 24–39.
Radišić, J., & Jensen, F. (2021). Norske 9.-trinnselevers motivasjon for naturfag og matematikk – en latent profilanalyse av TIMSS 2019. In T. Nilsen & H. Kaarstein (Eds.), Med blikket mot naturfag. Nye analyser av TIMSS-data og trender 2015–2021 (pp. 103–139). Universitetsforlaget.
Silm, G., Pedaste, M., & Täht, K. (2020). The relationship between performance and test-taking effort when measured with self-report or time-based instruments: A meta-analytic review. Educational Research Review, 31, Article 100335.
Wang, M.-T. (2012). Educational and career interests in math: A longitudinal examination of the links between classroom environment, motivational beliefs, and interests. Developmental Psychology, 48(6), 1643–1657.
Watt, H. M. G., Shapka, J. D., Morris, Z. A., Durik, A. M., Keating, D. P., & Eccles, J. S. (2012). Gendered motivational processes affecting high school mathematics participation, educational aspirations, and career plans: A comparison of samples from Australia, Canada, and the United States. Developmental Psychology, 48(6), 1594–1611.
Wise, S. L., & DeMars, C. E. (2005). Low examinee effort in low-stakes assessment: Problems and potential solutions. Educational Assessment, 10(1), 1–17.