Session Information
09 SES 14 A, Measuring Competencies and Skills in Computer-based Assessments
Paper Session
Contribution
This paper investigates the construct representation and multidimensionality of the PISA 2015 measure of collaborative problem solving (CPS) competence. Indicators of CPS behaviour are mapped to the three domains hypothesised in the PISA 2015 CPS framework (establishing and maintaining shared understanding; taking appropriate action to solve a problem; and establishing and maintaining team organisation; OECD, 2017), and item-level data from the PISA 2015 study for England are analysed by fitting Rasch item response models for a single latent dimension and for a three-dimensional construct.
Collaborative problem solving (CPS) skill has received increasing research attention in the educational literature across domains such as mathematics and science. The inclusion of CPS in the OECD's PISA assessment reflects the interest of the education sector, and methods for its measurement have also attracted attention (e.g., von Davier et al., 2017). Assessing CPS skills is challenging in several ways. First, because CPS is a complex interactive activity, individual performance is difficult to isolate and measure once individuals become entangled in process-oriented activities (Nouri et al., 2017). Second, as Chan and Clarke (2017) argue, CPS appears to have shifted from being seen as a method for learning (i.e., a teaching strategy aimed at improving student achievement) to being seen as "a critical and necessary skill used in education and in the workforce" (OECD, 2017, p. 132).
Although there have been numerous attempts to create innovative assessment methods, there appears to be little consensus on whether different assessment approaches measure the same construct (Scoular & Care, 2019), because CPS skills are often described in different ways, with different items linked to them. The predominant recent definitions of CPS in the literature underpin the OECD's 2015 PISA assessment (OECD, 2017) and the research reports from the Assessment and Teaching of 21st Century Skills (ATC21S) project (Care et al., 2018). Both identify two components within the construct (collaboration and problem solving, or social and cognitive skills, respectively). However, how these components and their sub-skills relate empirically, whether some act as enablers for others, and the degree to which some are interdependent or operate across levels remain open questions (Care et al., 2016). So far, only a limited number of studies have empirically investigated the structure of this construct; for example, Harding and Griffin (2016) found evidence supporting a two-dimensional structure for CPS.
A key question that emerges from this literature is therefore: to what extent do frameworks of CPS sample the same skillset, and how do the component skills relate? The different descriptions of the structure of CPS in the Hesse et al. (2015) framework, adopted by the ATC21S initiative, and in the OECD's PISA framework provide an opportunity to interrogate how the CPS skills and competencies in the PISA scenario relate to the social and cognitive dimensions in the ATC21S context. To elucidate the main components of CPS, a comparison of the two frameworks through analysis of task components and indicators is needed (Scoular & Care, 2019). The aim of this article is therefore to contribute to the existing evidence base: focusing on student responses to the newly developed PISA 2015 CPS tasks, we test the dimensionality of the CPS construct and, subsequently, the validity of the PISA CPS scale. Of particular interest are construct representation and the degree to which the Hesse et al. and PISA frameworks sample the same skillset. To investigate this, each PISA indicator is mapped to the strands and sub-skills of the Hesse et al. framework.
Method
The analysis focuses on pupils in England who answered the test items of the PISA 2015 CPS assessment tasks (n = 1,584; 829 males and 755 females). The assessment tasks were designed for 15-year-olds. There are 117 behaviours coded to the PISA framework as indicators of the CPS skills they are intended to reflect. Each indicator's mapping to the PISA framework was followed by mapping to the hypothesised dimension, strand, and sub-skill of the Hesse et al. framework. PISA summarises students' performance on one overall CPS proficiency scale, drawing on all questions in the CPS assessment. From a measurement point of view, each task contained one or more scoreable items. Students could select predefined messages from lists to communicate with other group members, or perform actions in the task space. Each item was coded in two (dichotomous: 0/1) or more (polytomous: 0, 1, ..., m) categories according to the item coding rubrics.

The data were analysed using the Rasch one-parameter simple logistic model (Rasch, 1960). Item response models apply a mathematical function to model the probability of a student's response to an item as a function of the student's "ability" level. The analysis was first performed on items representing a single dimension (collaborative problem solving), and the data were then explored as a multidimensional construct consisting of three hypothesised latent dimensions. The multidimensional random coefficients multinomial logit model (Adams et al., 1997), a generalisation of the simple logistic model, was used to analyse between-item dimensionality.

Both weighted and unweighted fit statistics were examined as evidence that the underpinning construct was represented by the indicators. Person-item maps, which place individuals and items at locations on the latent construct, and the item difficulty hierarchy provided evidence for substantive, content, and external validity (Bond & Fox, 2007). Goodness of fit of the data to each latent-trait model was evaluated using the deviance statistic. In addition, testing for measurement invariance using differential item functioning (DIF) allowed the measure's item performance to be assessed for individuals with differing characteristics, in our case gender.
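For reference, a sketch of the models involved: the dichotomous Rasch model gives the probability that person n with ability θ_n succeeds on item i with difficulty δ_i, and a partial-credit extension is one standard way of handling the polytomous items. The abstract does not state the exact parameterisation used in the analysis, so the step-parameter form below is an assumption, not the authors' specification.

```latex
% Dichotomous Rasch (simple logistic) model: probability that person n with
% ability \theta_n succeeds on item i with difficulty \delta_i.
\[
P(X_{ni} = 1 \mid \theta_n)
  = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}
\]

% Partial-credit form for a polytomous item with categories 0, 1, \dots, m_i
% and step parameters \delta_{ik}, using the convention
% \sum_{k=0}^{0} (\theta_n - \delta_{ik}) \equiv 0.
\[
P(X_{ni} = x \mid \theta_n)
  = \frac{\exp \sum_{k=0}^{x} (\theta_n - \delta_{ik})}
         {\sum_{h=0}^{m_i} \exp \sum_{k=0}^{h} (\theta_n - \delta_{ik})}
\]
```

In the three-dimensional analysis, θ_n becomes a vector and each item loads on exactly one of the three hypothesised dimensions (between-item multidimensionality), which is the case covered by the MRCML model of Adams et al. (1997).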
Expected Outcomes
Item fit statistics indicated that all items exhibited appropriate levels of fit for both models (within the range 0.77–1.30; Adams & Khoo, 1995). In addition, there was no evidence of DIF by gender, so it can be concluded that gender group membership does not substantively affect students' likelihood of endorsing an item. The multidimensional model fitted the student responses better than the unidimensional model; however, high correlations between the dimensions suggested that their relationship is strong enough to warrant treating the construct as unidimensional for this sample and for some purposes (e.g., setting cut-scores that indicate proficiency levels).

There are several possible interpretations of a unidimensional measurement model. From a psychometric perspective, it could be argued that all the items measure the same construct. However, the fact that the different dimensions of the CPS competence measure did not differentiate enough in this sample to be considered distinct does not necessarily mean that they are not tapping theoretically different constructs. Unidimensionality could be a result of the common context in which the construct is assessed. It is also worth noting the strong tendency for scales in the PISA study to be constructed to be one-dimensional, possibly at the cost of excluding potentially important discrepant information, which tends to narrow the diversity of what is assessed (Goldstein, 2004). As a final explanation, we consider a point raised by Care et al. (2016): collaboration and problem solving are not fully independent of each other, and the degree to which their interaction in a CPS environment modifies both social and cognitive functioning is not yet known.
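To illustrate how such a deviance-based comparison of nested models works (this is not the authors' pipeline, and the deviance values and parameter counts below are hypothetical placeholders), the difference in deviance can be referred to a chi-square distribution with degrees of freedom equal to the number of additional parameters, e.g. the extra variance and covariance terms of the three-dimensional model:

```python
from scipy.stats import chi2

# Hypothetical deviances for the two nested models -- placeholders only,
# not the values from this study.
deviance_1d = 41_250.0  # unidimensional Rasch model
deviance_3d = 41_180.0  # three-dimensional model (more parameters, lower deviance)

# A three-dimensional model estimates 3 variances + 3 covariances where the
# unidimensional model estimates a single variance: 6 - 1 = 5 extra parameters.
extra_params = 5

lr_statistic = deviance_1d - deviance_3d
p_value = chi2.sf(lr_statistic, df=extra_params)  # upper-tail chi-square probability

print(f"LR = {lr_statistic:.1f}, df = {extra_params}, p = {p_value:.3g}")
```

A significant result favours the three-dimensional model on fit alone; whether the dimensions are worth reporting separately then depends on the size of the latent correlations, as discussed above.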
References
Adams, R. J., Wilson, M., & Wang, W. (1997). The multidimensional random coefficients multinomial logit model. Applied Psychological Measurement, 21(1), 1–23. https://doi.org/10.1177/0146621697211001
Adams, R., & Khoo, S. (1995). Quest: An interactive item analysis program. Australian Council for Educational Research.
Bond, T. G., & Fox, C. M. (2007). Applying the Rasch model: Fundamental measurement in the human sciences (2nd ed.). Lawrence Erlbaum.
Care, E., Griffin, P., & Wilson, M. (Eds.). (2018). Assessment and Teaching of 21st Century Skills: Research and Applications. Springer, Cham.
Care, E., Scoular, C., & Griffin, P. (2016). Assessment of Collaborative Problem Solving in Education Environments. Applied Measurement in Education, 29(4), 250–264. https://doi.org/10.1080/08957347.2016.1209204
Chan, M. C. E., & Clarke, D. (2017). Structured affordances in the use of open-ended tasks to facilitate collaborative problem solving. ZDM, 49(6), 951–963. https://doi.org/10.1007/s11858-017-0876-2
Goldstein, H. (2004). International comparisons of student attainment: Some issues arising from the PISA study. Assessment in Education: Principles, Policy & Practice, 11(3), 319–330. https://doi.org/10.1080/0969594042000304618
Harding, S.-M. E., & Griffin, P. (2016). Rasch Measurement of Collaborative Problem Solving in an Online Environment. Journal of Applied Measurement, 17(1), 35–53.
Hesse, F., Care, E., Buder, J., Sassenberg, K., & Griffin, P. (2015). A Framework for Teachable Collaborative Problem Solving Skills. In P. Griffin & E. Care (Eds.), Assessment and Teaching of 21st Century Skills (pp. 37–56). Springer Netherlands. https://doi.org/10.1007/978-94-017-9395-7_2
Nouri, J., Åkerfeldt, A., Fors, U., & Selander, S. (2017). Assessing Collaborative Problem Solving Skills in Technology-Enhanced Learning Environments – The PISA Framework and Modes of Communication. International Journal of Emerging Technologies in Learning (IJET), 12(4), 163–174.
OECD. (2017). PISA 2015 Assessment and Analytical Framework: Science, Reading, Mathematic, Financial Literacy and Collaborative Problem Solving. OECD Publishing. https://doi.org/10.1787/9789264281820-en
Rasch, G. (1960). Probabilistic Models for Some Intelligence and Attainment Tests. Danmarks Pædagogiske Institut.
Scoular, C., & Care, E. (2019). Monitoring patterns of social and cognitive student behaviors in online collaborative problem solving assessments. Computers in Human Behavior, 105874. https://doi.org/10.1016/j.chb.2019.01.007
von Davier, A. A., Hao, J., Liu, L., & Kyllonen, P. (2017). Interdisciplinary research agenda in support of assessment of collaborative problem solving: Lessons learned from developing a Collaborative Science Assessment Prototype. Computers in Human Behavior, 76, 631–640. https://doi.org/10.1016/j.chb.2017.04.059