Session Information
22 SES 09 D, Discussing Curriculum and Skills
Paper Session
Contribution
Perspective(s) or theoretical framework
Research on generic skills has gained increasing importance in higher education over the last decade (Tuononen et al., 2022; Van Damme & Zahner, 2022). Generic skills are universal expert skills applied across different disciplines and job contexts (Tuononen et al., 2022). Together with domain-specific knowledge, these skills enable students to draw on their field-specific knowledge in a variety of situations (Ursin et al., 2021). There is no single definitive list of generic skills; rather, the term serves as an umbrella for related sets of skills. Although considerable variation in the conceptualization and operationalization of generic skills has been found (Braun et al., 2012; El Soufi & See, 2019; Tuononen et al., 2022), researchers agree on the importance of learning generic skills in higher education. For example, there is evidence that generic skills are related to adjustment and adaptation to higher education (Kleemola et al., 2022; Van der Zanden et al., 2019) as well as to study progress and study success (Tuononen & Parpala, 2021). In addition, earlier research suggests that a student's background, both educational and socioeconomic, strongly influences the level of generic skills (Arum & Roksa, 2011; Kleemola et al., 2022; Ursin et al., 2021).
Since research on generic skills has relied mainly on self-assessments, the field has stressed the need for more performance-based studies (Tuononen et al., 2022). Performance-based assessment aims to elicit authentic performance, covering aspects of generic skills through situations that resemble the real world (Hyytinen et al., 2023; Van Damme & Zahner, 2022). Previous research on performance-based assessment has shown that test-taking effort and engagement have a substantial impact on test performance (Hyytinen et al., 2023; Liu et al., 2016).
Objectives
Generic skills of higher education students have been assessed with the CLA+ (Collegiate Learning Assessment) in a number of countries, including the United States (where it is used extensively) and Finland. The CLA+ is a performance-based assessment of generic skills such as critical thinking and written communication, designed specifically for higher education students. The assessment is accompanied by a survey of students' demographic background, attitudes, and fields of study.
Comparisons of the assessment results of entering and exiting higher education students in Finland and the US have shown that while students from both countries exhibited learning gains in generic skills (i.e., there was a significant difference between entering and exiting students), the overall gain was clearly larger among the American students (Ursin et al., 2021). The purpose of this study is to investigate possible reasons behind this finding. We consider variables measuring students' effort and engagement in the CLA+ as well as sociodemographic variables: students' gender, parental level of education, and whether students' primary home language is the same as the instructional language of the institution.
This study addresses two research questions:
- How do the overall gains in generic skills between entering and exiting students differ between the two countries?
- Can background variables explain the possible difference in gains between the two countries?
Method
Measures
The CLA+ is a 90-minute performance-based assessment of critical-thinking and written-communication skills, comprising a 60-minute performance task (PT) and a 30-minute set of 25 selected-response questions (SRQs). The PT measures performance in three areas: Analysis and Problem Solving (making a logical decision and supporting it by analyzing, evaluating, and synthesizing the appropriate information); Writing Effectiveness (constructing an organized and cohesive essay with support for positions); and Writing Mechanics (demonstrating command of Standard Written English). The SRQ section is aligned to the same construct as the Analysis and Problem Solving subscore of the PT. Ten items measure Data Literacy (e.g., making an inference), ten measure Critical Reading and Evaluation (e.g., identifying assumptions), and five measure Critiquing Arguments (e.g., detecting logical fallacies). Both the PT and SRQ sections are document based; the supporting documents include a range of information sources, such as letters, memos, photographs, charts, and newspaper articles. After completing the CLA+, students answer a questionnaire about their background.
Sample
Since the participating Finnish institutions were all research universities, only competitive (Schmitt, 2009) higher education institutions in the United States were selected for this study. Approximately 51,000 students across 185 US institutions of higher education were included in the analyses. As in the US, the 18 participating Finnish higher education institutions tested entering first-year students in the fall semester and exiting third-year students in the spring semester. The Finnish sample consisted of 2,384 students (1,524 entering and 860 exiting) from the 2019–2020 academic year. Two translated and adapted versions of the CLA+, one in Finnish and one in Finland Swedish, were used for the Finnish students.
Data sources
The data presented in the results section are from 29,187 entering and 22,109 exiting American students, and 1,524 entering and 860 exiting Finnish students. All analyses were performed on the scaled and equated CLA+ Total score, a composite of the PT and SRQ subscores. The data were analyzed with descriptive statistics and two-level regression models. To avoid overly liberal inference, the clustering of students within institutions was taken into account in the standard errors and significance tests by introducing a random institution effect into the models. Measured with the intra-cluster correlation (ICC), the homogeneity of students within an institution was considerable: the ICC estimate was 0.17 in the Finnish data and 0.23 in the American data.
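To make the modeling approach concrete, the sketch below fits a two-level (random-intercept) regression of the kind described above with statsmodels and derives the ICC from its variance components. It is a minimal illustration under stated assumptions, not the study's actual analysis: the file name and column names (cla_total, cohort, country, institution_id) are hypothetical stand-ins for the real data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data layout: one row per student with the scaled CLA+
# Total score, a cohort indicator (entering/exiting), the country,
# and an institution identifier. All names here are illustrative.
df = pd.read_csv("cla_scores.csv")

# Two-level model: fixed effects for cohort, country, and their
# interaction (the cohort x country term tests whether the
# entering-to-exiting gain differs between countries); a random
# intercept per institution accounts for students being clustered
# within institutions.
model = smf.mixedlm(
    "cla_total ~ cohort * country",
    data=df,
    groups=df["institution_id"],
)
result = model.fit(reml=True)
print(result.summary())

# Intra-cluster correlation (ICC): the share of total variance that
# lies between institutions. cov_re holds the between-institution
# (random-intercept) variance; scale is the residual variance.
between_var = float(result.cov_re.iloc[0, 0])
within_var = result.scale
icc = between_var / (between_var + within_var)
print(f"ICC = {icc:.2f}")
```

The random intercept absorbs between-institution differences in average scores, which is what keeps the standard errors of the fixed effects from being overly liberal; the ICC then expresses how large that between-institution share of the total variance is.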
Expected Outcomes
Conclusions and Discussion
This study investigated entering and exiting higher education students' performance on the CLA+ in Finland and the United States. Overall, exiting students significantly outperformed entering students, but the overall learning gains were greater for the American students, even though entering Finnish students scored higher on average than their American counterparts. The literature suggests that effort and engagement may influence performance (e.g., Hyytinen et al., 2023; Liu et al., 2016). However, we found that test-taking effort and engagement did not explain the observed differences in learning gains. Other demographic variables, such as gender, primary home language, and parental level of education, did not explain the difference either. One possible explanation is that the exiting students in the United States are fourth-year students, whereas those in Finland are third-year students (Ursin et al., 2021). This could have been investigated further by comparing third-year students in the two countries. However, one limitation of this comparative study is that almost all participating US institutions assess learning gains by comparing entering and exiting students, so no dataset containing American third-year students was available. The results are puzzling because, within each country, the variables we investigated, such as effort and engagement and the other demographics, were predictive and explained variance in CLA+ performance. None of them, however, explains why the average difference on the CLA+ between entering and exiting students was larger for the American students than for the Finnish students. A second study, which includes a longitudinal component, is forthcoming. Future studies should assess students in the United States in their third year of studies and collect additional common demographic variables.
References
Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses. University of Chicago Press.
Braun, E., Woodley, A., Richardson, J. T. E., & Leidner, B. (2012). Self-rated competences questionnaires from a design perspective. Educational Research Review, 7(1), 1–18. https://doi.org/10.1016/j.edurev.2011.11.005
El Soufi, N., & See, B. H. (2019). Does explicit teaching of critical thinking improve critical thinking skills of English language learners in higher education? A critical review of causal evidence. Studies in Educational Evaluation, 60, 140–162. https://doi.org/10.1016/j.stueduc.2018.12.006
Hyytinen, H., Nissinen, K., Kleemola, K., Ursin, J., & Toom, A. (2023). How do self-regulation and effort in test-taking contribute to undergraduate students' critical thinking performance? Studies in Higher Education. https://doi.org/10.1080/03075079.2023.2227207
Kleemola, K., Hyytinen, H., & Toom, A. (2022). Critical thinking and writing in transition to higher education in Finland: Do prior academic performance and socioeconomic background matter? European Journal of Higher Education, 1–21. https://doi.org/10.1080/21568235.2022.2075417
Liu, O. L., Mao, L., Frankel, L., & Xu, J. (2016). Assessing critical thinking in higher education: The HEIghten™ approach and preliminary validity evidence. Assessment & Evaluation in Higher Education, 41(5), 677–694. https://doi.org/10.1080/02602938.2016.1168358
Schmitt, C. M. (2009). Documentation for the restricted-use NCES-Barron's Admissions Competitiveness Index data files: 1972, 1982, 1992, 2004, and 2008. National Center for Education Statistics, Institute of Education Sciences, US Department of Education.
Tuononen, T., Hyytinen, H., Kleemola, K., Hailikari, T., Männikkö, I., & Toom, A. (2022). Systematic review of learning generic skills in higher education: Enhancing and impeding factors. Frontiers in Education, 7. https://doi.org/10.3389/feduc.2022.885917
Tuononen, T., & Parpala, A. (2021). The role of academic competences and learning processes in predicting Bachelor's and Master's thesis grades. Studies in Educational Evaluation, 70. https://doi.org/10.1016/j.stueduc.2021.101001
Ursin, J. (2020). Assessment in higher education (Finland). In J. Kauko & J. W. James (Eds.), Bloomsbury Education and Childhood Studies. Bloomsbury Academic.
Ursin, J., Hyytinen, H., & Silvennoinen, K. (Eds.). (2021). Assessment of undergraduate students' generic skills in Finland: Findings of the Kappas! project (Report No. 2021:31). Finnish Ministry of Education and Culture.
Van Damme, D., & Zahner, D. (Eds.). (2022). Does higher education teach students to think critically? OECD Publishing. https://doi.org/10.1787/cc9fa6aa-en
Van der Zanden, P., Denessen, E., Cillessen, A., & Meijer, P. (2019). Patterns of success: First-year student success in multiple domains. Studies in Higher Education, 44(11), 2081–2095. https://doi.org/10.1080/03075079.2018.1493097