Session Information
09 SES 02 A, Assessing Students’ 21st Century Skills
Paper Session
Contribution
The rapidly growing literature on collaborative problem solving (CPS) suggests that it is becoming increasingly important in today’s education and workforce environments (Griffin et al., 2012; National Research Council, 2011; von Davier et al., 2017). As a competence, CPS is claimed to be necessary for students to succeed in group problem-solving activities and future employment. The increasing interest in this concept is also reflected in the growing number of initiatives to develop assessments that measure students’ CPS competence and thereby provide evidence that students are equipped with the skills to meet the CPS demands of their future careers. For example, the Organisation for Economic Co-operation and Development (OECD) developed a large-scale approach to measuring students’ CPS competence as part of its Programme for International Student Assessment (PISA) 2015 study, with the aim of informing education systems and policy makers seeking to develop programmes that improve students’ collaboration skills (OECD, 2017). Finally, the timeliness of the topic is highlighted by the recent publication of special issues in journals such as Applied Measurement in Education (Greiff & Kyllonen, 2016), Journal of Educational Measurement (von Davier, 2017) and Computers in Human Behavior (Graesser et al., 2020), specifically targeting developments in the assessment and measurement of CPS competence.
To evaluate whether students are equipped with this complex and multi-faceted CPS competence, new assessments have recently been developed (Author, 2020; Griffin et al., 2012; von Davier et al., 2017). However, the absence of consensus on how to operationalise CPS competence makes the development and evaluation of such assessments challenging. Traditional assessment formats, such as paper-and-pencil multiple-choice tests, are considered inappropriate for capturing the complexity of CPS competence (Andrews-Todd & Forsyth, 2020; Care & Griffin, 2014). As a result, recent advances in computer simulation are being used to develop scenario-based assessment tasks that capture students’ actions and discourse as they engage with a task. There is, therefore, a need to synthesise existing knowledge about these assessment instruments and to evaluate the validity evidence for the derived CPS competence measures.
Two existing literature reviews on CPS competence, conducted in recent years, focus primarily on the definition of the concept (Graesser et al., 2018; Oliveri et al., 2017). The first review (Oliveri et al., 2017) presents various theoretical frameworks of CPS competence but is limited to higher education and workplace contexts. Although valuable, this review does not include assessments comprising computer-simulated, scenario-based tasks, which are currently being developed to provide more realistic and engaging assessment environments. The second review (Graesser et al., 2018) presents two CPS frameworks for secondary school students, followed by examples of technological advances relevant to the assessment of CPS competence. However, the process of identifying and including studies in that review is not made explicit (Gough et al., 2016). An explicit perspective on the assessment of CPS competence across educational settings using computer-simulated, scenario-based tasks is therefore still lacking.
This paper provides an analysis of relevant, systematically selected empirical articles assessing students’ CPS competence with computer-simulated, scenario-based tasks. Following a systematic literature review methodology (Gough et al., 2016; Petticrew & Roberts, 2006), the paper contributes to knowledge about the assessment of students’ CPS competence through three aims: i) to describe the characteristics of existing CPS assessments, ii) to categorise the facets of CPS competence targeted for measurement, and iii) to evaluate the strategies adopted for validating the resulting CPS competence measures.
Method
The steps described by Gough et al. (2016) and Petticrew and Roberts (2006) for guiding systematic literature reviews were followed: developing research questions, determining the types of studies to be located, carrying out a comprehensive literature search, formulating inclusion criteria, appraising study quality, and extracting and synthesising data.

An electronic search was conducted in four scientific databases covering research in the social sciences (including educational research). The review focussed on the field of education, since students emerging from schools into the workforce and public life are expected to be able to work in teams to solve diverse problems (Rosen & Foltz, 2014). The search was conducted in December 2020. Boolean operators were employed to combine the key term “collaborative problem solving” with student/pupil/learner, making the search specific to student populations (an illustrative search string is given at the end of this section). The search terms were applied to the title, abstract, and keywords fields. To include timely and up-to-date articles, the period between 2010 and 2020 was chosen, mapping the literature published in the last decade.

To assess study quality, peer review was used as a criterion for inclusion, although it is recognised that this has its own limitations as a quality check. A publication was retained for review only when the assessment of students’ CPS competence formed part of the content and focus of the article; that is, CPS was not used merely as an example among other learning outcomes or as a term specific to a field other than education. The selection was further restricted to articles using computer-simulated, scenario-based tasks, excluding those relying solely on other task types such as self-assessments or teacher evaluations.

In total, the search yielded 665 articles. After screening, 26 articles were selected for review, representing 15 different assessments of students’ CPS competence. A coding template for extracting relevant information about the assessments was developed, covering general information and research design (i.e., authors, year of publication, country of data collection, instruments of data collection, sample size, educational level) and task design features (i.e., group size, group composition, subject domain, communication mode, scoring approach).
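For illustration, the combined query took approximately the following form, assuming standard Boolean database syntax (the exact syntax varied slightly across the four databases):

"collaborative problem solving" AND (student OR pupil OR learner)

applied to the title, abstract, and keywords fields and limited to publications from 2010 to 2020.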
Expected Outcomes
The number of assessments aiming to measure students’ CPS competence has increased over time, reflecting the demand for empirical evidence to inform policy and practice. Since existing literature reviews were found to be limited in their focus, this article offers a state-of-the-art overview of the assessment of CPS competence and can form a basis for future research. By revealing gaps in existing assessments, it contributes to a continued focus on developing assessments whose measures are more closely aligned with the full breadth of the construct.

The reviewed assessments targeted almost exclusively secondary school students, pointing to a lack of instruments measuring CPS competence in primary and higher education, which future assessments could address. Results also showed variation in the content of the problem-solving tasks. Computer-simulated, scenario-based tasks can capture an abundance of data, including students’ actions and messages; however, the content of messages has so far been largely ignored when student interaction is scored algorithmically. Furthermore, several skills, such as ‘active listening’, although widely considered important facets of CPS competence, were scarcely or not at all covered by the reviewed assessments.

The results also revealed that evidence for the external, substantive, and consequential aspects of validity is only rarely examined. This is problematic, as CPS competence measures are difficult to evaluate when such evidence is not provided. In light of this, it is recommended that educational policy be sensitive to the authenticity of assessments and to the social consequences of test score use and interpretation. It may be more informative and productive to study students in real-life situations in order to inform assessment development and thereby obtain measures that are better aligned with CPS as it occurs in authentic settings.
References
Author (2020).

Care, E., & Griffin, P. (2014). An approach to assessment of collaborative problem solving. Research and Practice in Technology Enhanced Learning, 9(3), 367–388.

Gough, D., Oliver, S., & Thomas, J. (2016). An introduction to systematic reviews (2nd ed.). SAGE Publications Ltd.

Graesser, A. C., Fiore, S. M., Greiff, S., Andrews-Todd, J., Foltz, P. W., & Hesse, F. W. (2018). Advancing the science of collaborative problem solving. Psychological Science in the Public Interest, 19(2), 59–92. https://doi.org/10.1177/1529100618808244

Graesser, A. C., Greiff, S., Stadler, M., & Shubeck, K. T. (2020). Collaboration in the 21st century: The theory, assessment, and teaching of collaborative problem solving. Computers in Human Behavior, 104, 106134. https://doi.org/10.1016/j.chb.2019.09.010

Greiff, S., & Kyllonen, P. (2016). Contemporary assessment challenges: The measurement of 21st century skills. Applied Measurement in Education, 29(4), 243–244. https://doi.org/10.1080/08957347.2016.1209209

Griffin, P., McGaw, B., & Care, E. (2012). Assessment and teaching of 21st century skills. Springer Netherlands. https://doi.org/10.1007/978-94-007-2324-5

National Research Council. (2011). Assessing 21st century skills: Summary of a workshop. National Academies Press. http://www.ncbi.nlm.nih.gov/books/NBK84218/

OECD. (2017). PISA 2015 results (Volume V): Collaborative problem solving. OECD Publishing. https://doi.org/10.1787/9789264285521-en

Oliveri, M., Lawless, R., & Molloy, H. (2017). A literature review on collaborative problem solving for college and workforce readiness. ETS Research Report Series, 2017(1), 1–27. https://doi.org/10.1002/ets2.12133

Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social sciences: A practical guide. Blackwell.

Rosen, Y., & Foltz, P. (2014). Assessing collaborative problem solving through automated technologies. Research and Practice in Technology Enhanced Learning, 9, 389–410.

von Davier, A. (2017). Computational psychometrics in support of collaborative educational assessments. Journal of Educational Measurement, 54(1), 3–11. https://doi.org/10.1111/jedm.12129

von Davier, A., Zhu, M., & Kyllonen, P. (Eds.). (2017). Innovative assessment of collaboration. Springer International Publishing. https://www.springer.com/gb/book/9783319332598