Session Information
Paper Session
Contribution
Generic skills are considered important during higher education studies. Previous studies have indicated that generic skills have an impact on students’ learning, study success and retention in higher education (Badcock et al. 2010; Arum & Roksa 2011; Tuononen et al. 2019). Previous studies have also shown that undergraduate students have challenges, for instance, in argumentation, in interpreting and evaluating information, and in drawing conclusions (Badcock, Pattison & Harris 2010; Arum & Roksa 2011; Evens, Verburgh & Elen 2013). Furthermore, there is some evidence that students’ educational and socioeconomic backgrounds have an impact on how well they master generic skills (Arum & Roksa 2011; Kleemola et al. 2022), and that the way in which generic skills are assessed differentiates students in their mastery of these skills (Hyytinen et al. 2015).
Numerous generic skills are needed in higher education and in working life. In higher education, the focus is typically on higher-order cognitive skills such as critical thinking and argumentation, analytic reasoning, decision making and writing (e.g. Zoller & Tsaparlis 1997; Arum & Roksa 2011; Lemons & Lemons 2017). The generic skills assessed in this study are analytic reasoning and evaluation (how well students can identify the strengths and weaknesses of alternative arguments and differentiate between trustworthy and untrustworthy sources), problem solving (how well students can recognise a problem and solve it using argumentation), writing effectiveness (how logically and clearly an answer is constructed), and writing mechanics (how well students use the conventions of standard written language and control language). The aim of the study is to examine
(1) how well Finnish undergraduate students master various generic skills, and
(2) what background factors relate to these skills.
Method
Instrument. The study used a performance-based instrument, the Collegiate Learning Assessment International (CLA+). The USA-based instrument was translated and adapted into Finnish and Swedish (the official languages of Finland) following the International Test Commission guidelines for translating and adapting tests (Bartram et al. 2018). The translated instrument was pre-tested in 20 cognitive laboratories with think-alouds and interviews to ensure that the construct and difficulty of the instrument were not altered in the translation and adaptation phase. The CLA+ comprised three sections: an open-ended written task (performance task, PT), 25 selected-response questions (SRQs), and a background survey of 37 questions. Students had 60 minutes to complete the PT, followed by 30 minutes for the SRQs, after which they filled in the background survey. In total, the computer-based, monitored test lasted 2 hours 15 minutes. The PT measured analysis and problem solving, writing effectiveness, and writing mechanics. To complete the PT successfully, students needed to familiarise themselves with materials available in an electronic document library and then write an answer to a question concerning differences in life expectancy between two cities. The SRQs measured critical reading and evaluation, scientific and quantitative reasoning, and critiquing an argument. The SRQs in each section were based on one or more documents, and the materials and questions covered the topics of brain protein, nanotechnology and women in combat.
Participants, data collection and analysis. The participants (n = 2402) were students at the initial and final stages of Bachelor’s degree programmes in seven universities of applied sciences (UASs) and eleven universities in Finland. The participants were selected by cluster sampling of programmes to obtain a nationally representative sample across the disciplines offered by Finnish higher education institutions. The data were collected between August 2019 and March 2020, and the participation rate was 25 per cent. The statistical methodology included linear and logistic regression and structural equation modelling; the analyses were conducted in a design-based framework, utilising survey weights and accounting for the clustered data. Distortions in the eventual sample were corrected with survey weights derived from the Finnish student registers.
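To make the design-based analysis concrete, the sketch below shows one way such a model could be fitted. It is an illustrative example only, not the study’s actual analysis code: the file name and all column names (total_score, prior_gpa, parental_education, institution_type, survey_weight, programme_id) are hypothetical, and Python’s statsmodels is used to combine survey weights with standard errors clustered by sampled degree programme.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data file and variable names for illustration only;
# the actual CLA+ variables and register-derived weights differ.
df = pd.read_csv("cla_plus_sample.csv")

# Survey-weighted linear regression: the weights enter as analytic
# weights, and standard errors are made robust to the clustering of
# students within the sampled degree programmes.
model = smf.wls(
    "total_score ~ prior_gpa + parental_education + institution_type",
    data=df,
    weights=df["survey_weight"],
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["programme_id"]})
print(result.summary())
```

A logistic model for a binary outcome could be handled analogously, for instance with smf.glm and a Binomial family plus the same cluster-robust covariance, though the study’s exact estimation choices are not specified here.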
Expected Outcomes
Initial findings show that the variation in students’ generic skills was explained mainly by factors pertaining to students’ educational and socioeconomic backgrounds. Profiling of students suggests three types of competence groups: high performers, who perform well regardless of task type (whether the PT or the SRQs); low performers, who perform poorly in both task types; and mixed performers, who perform well in one task type but poorly in the other. The mixed performers can be further divided into those who perform better in the PT than in the SRQs and those who perform better in the SRQs than in the PT. A detailed comparative analysis of the background characteristics of these competence groups is yet to be done.
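As a rough illustration of this profiling logic, the sketch below assigns students to the four groups. It is a minimal example under stated assumptions: the scores are invented, the median split is a placeholder cut-off, and the study’s actual profiling method (e.g. a latent profile approach) may differ.

```python
import pandas as pd

# Invented PT and SRQ scores; real cut-offs and profiling method may differ.
df = pd.DataFrame({
    "pt_score":  [1200, 900, 1150, 870],
    "srq_score": [1180, 880, 910, 1160],
})
pt_cut, srq_cut = df["pt_score"].median(), df["srq_score"].median()

def competence_group(row):
    high_pt = row["pt_score"] >= pt_cut
    high_srq = row["srq_score"] >= srq_cut
    if high_pt and high_srq:
        return "high performer"          # strong in both task types
    if not high_pt and not high_srq:
        return "low performer"           # weak in both task types
    return "mixed: stronger in PT" if high_pt else "mixed: stronger in SRQs"

df["group"] = df.apply(competence_group, axis=1)
print(df)
```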
References
Arum, R. & Roksa, J. 2011. Academically adrift: Limited learning on college campuses. Chicago: University of Chicago Press.
Badcock, P. B. T., Pattison, P. E. & Harris, K-L. 2010. Developing generic skills through university study: A study of arts, science and engineering in Australia. Higher Education 60 (4), 441–458.
Bartram, D. et al. 2018. ITC guidelines for translating and adapting tests (second edition). International Journal of Testing 18, 101–134.
Evens, M., Verburgh, A. & Elen, J. 2013. Critical thinking in college freshmen: The impact of secondary and higher education. International Journal of Higher Education 2 (3), 139–151.
Hyytinen, H., Nissinen, K., Ursin, J., Toom, A. & Lindblom-Ylänne, S. 2015. Problematising the equivalence of the test results of performance-based critical thinking tests for undergraduate students. Studies in Educational Evaluation 44, 1–8.
Kleemola, K., Hyytinen, H. & Toom, A. 2022. Critical thinking and writing in transition to higher education in Finland: Do prior academic performance and socioeconomic background matter? European Journal of Higher Education.
Lemons, P. P. & Lemons, J. D. 2017. Questions for assessing higher-order cognitive skills: It's not just Bloom’s. Life Sciences Education 12 (1), 47–58.
Tuononen, T., Parpala, A. & Lindblom-Ylänne, S. 2019. Graduates’ evaluations of usefulness of university education, and early career success – a longitudinal study of the transition to working life. Assessment & Evaluation in Higher Education, 1–14.
Zoller, U. & Tsaparlis, G. 1997. Higher and lower-order cognitive skills: The case of chemistry. Research in Science Education 27, 117–130.