Contribution
The efforts to create the European Higher Education Area have paved the way for the inclusion of generic skills in higher education studies. The European Union has, for example, highlighted that professional skills alone are not sufficient to tackle the changes in working life; they must be complemented by generic skills, which are essential competences in the era of digitalisation (European Commission 2013; 2019). Generic skills are also important during higher education studies. Previous studies have indicated that generic skills have an impact on students’ learning, study success and retention in higher education (Badcock, Pattison & Harris 2010; Arum & Roksa 2011; Tuononen et al. 2019). Nonetheless, previous studies have also shown that undergraduate students face challenges, for instance, in argumentation, in interpreting and evaluating information, and in drawing conclusions (Badcock et al. 2010; Arum & Roksa 2011; Evens, Verburgh & Elen 2013; Hyytinen et al. 2015).
Numerous generic skills are needed in higher education and in working life. In higher education, the focus is typically on skills such as critical thinking and argumentation, analytic reasoning, decision making and writing (e.g. Zoller & Tsaparlis 1997; Arum & Roksa 2011; Lemons & Lemons 2017). Accordingly, the generic skills assessed in this study were analytic reasoning and evaluation (how well students can identify the strengths and weaknesses of alternative arguments and differentiate between trustworthy and untrustworthy sources), problem-solving (how well students can recognise a problem and solve it through argumentation), writing effectiveness (how logically and clearly an answer is constructed), and writing mechanics (how well students command the conventions of standard written language). The aims of the study were (1) to identify the level of Finnish undergraduate students’ generic skills, and (2) to investigate which factors are connected with that level.
Method
Instrument. The study used a performance-based instrument, the Collegiate Learning Assessment International (CLA+). The US-based instrument was translated and adapted into Finnish and Swedish (the official languages of Finland). The translated instrument was then pre-tested in 20 cognitive laboratories with think-alouds and interviews to ensure that neither the construct nor the difficulty of the instrument was altered in the translation and adaptation phase. CLA+ comprised three sections: an open-ended written task (performance task, PT), 25 selected-response questions (SRQs), and a background information survey of 37 questions.

Participants and data collection. The participants (n = 2402) were students at the initial and final stages of their Bachelor’s degree programmes in seven universities of applied sciences (UASs) and eleven universities in Finland. The participants were selected so as to draw a sample as nationally representative as possible by field of study. The data were collected between August 2019 and March 2020. The computer-based, monitored test lasted 2 hours 15 minutes. The participation rate was 25 per cent.

Analysis. The approach to the statistical analyses (descriptive statistics, linear and logistic regression) was design-based, utilising survey weights and accounting for clustered data. Partly due to the sampling design and partly due to nonresponse, inclusion probabilities varied considerably between student subgroups (as defined by gender, field of study, institution and study programme). The resulting distortions in the sample were corrected using survey weights derived from the Finnish student registers. As individuals in a given study programme tend to be correlated, all variance estimates, and the resulting confidence intervals and significance tests, were computed with methods that take this intra-cluster correlation appropriately into account.
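The design-based estimation described above, survey-weighted regression with variance estimates that respect intra-cluster correlation, can be sketched as follows. This is a minimal illustration with synthetic data; the function name, the weights and the cluster structure are assumptions made for the example, not the study's actual specification (the original analyses may well have used dedicated survey-analysis software).

```python
import numpy as np

def weighted_ols_clustered(X, y, w, groups):
    """Survey-weighted least squares with a cluster-robust
    (sandwich) variance estimator.

    X: (n, k) design matrix; y: (n,) outcome;
    w: (n,) survey weights; groups: (n,) cluster labels
    (e.g. study programme).
    """
    XtWX = X.T @ (w[:, None] * X)
    bread = np.linalg.inv(XtWX)
    beta = bread @ (X.T @ (w * y))           # weighted point estimates
    resid = y - X @ beta
    # "Meat": sum over clusters of outer products of weighted score sums,
    # so that correlation within a cluster inflates the variance properly.
    k = X.shape[1]
    meat = np.zeros((k, k))
    for g in np.unique(groups):
        idx = groups == g
        score = X[idx].T @ (w[idx] * resid[idx])
        meat += np.outer(score, score)
    vcov = bread @ meat @ bread
    return beta, np.sqrt(np.diag(vcov))

# Illustrative use with synthetic data: 20 "programmes" of 30 students,
# a shared within-programme effect, and hypothetical survey weights.
rng = np.random.default_rng(42)
n = 600
groups = np.repeat(np.arange(20), 30)
cluster_effect = rng.normal(0, 0.5, 20)[groups]
x = rng.normal(0, 1, n)
y = 1.0 + 2.0 * x + cluster_effect + rng.normal(0, 0.5, n)
X = np.column_stack([np.ones(n), x])
w = rng.uniform(0.5, 2.0, n)
beta, se = weighted_ols_clustered(X, y, w, groups)
```

Because the cluster scores are summed before the outer product is taken, positively correlated residuals within a programme widen the standard errors, which is exactly why naive (independence-assuming) variance estimates would overstate significance for clustered samples like this one.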
Expected Outcomes
According to the results, for nearly 60 per cent of the Finnish undergraduate students the generic skills were at a satisfactory or lower level, while for the remaining 40 per cent or so they were at a good or higher level. Only a handful of Finnish students reached the highest level of mastery. There was a clear difference in the mastery of generic skills between UAS and university students: university students’ generic skills were at a higher level than those of their counterparts in UASs. In the skills measured by the PT, female students scored significantly higher than males; in the skills measured by the SRQs the result was the opposite, with male students outperforming females. The variation in students’ generic skills was explained mainly by factors pertaining to their educational and socioeconomic background. First, success in the mother-tongue test of the Finnish matriculation examination (a national examination generally taken at the end of Finnish upper secondary school) was the most important factor explaining the level of undergraduate students’ generic skills. Second, students who grew up with books at home tended to have a higher level of generic skills. The third important factor was the effort a student invested in taking the test: the more effort invested, the better the result. Based on the findings it can be concluded that attention should be paid to the learning of generic skills already at lower educational levels, and also in learning environments outside school contexts. Moreover, the role of generic skills in student selection should be investigated; in efforts to develop generic skills in higher education, the different goals of UAS and university education should be considered and the learning of generic skills supported in a goal-oriented fashion.
References
Arum, R. & Roksa, J. 2011. Academically adrift: Limited learning on college campuses. Chicago: University of Chicago Press.

Badcock, P. B. T., Pattison, P. E. & Harris, K.-L. 2010. Developing generic skills through university study: A study of arts, science and engineering in Australia. Higher Education 60 (4), 441–458.

European Commission. 2013. High Level Group on the Modernisation of Higher Education. Report to the European Commission on improving the quality of teaching and learning in Europe's higher education institutions. Luxembourg: Publications Office of the European Union.

European Commission. 2019. Key competences for lifelong learning. Luxembourg: Publications Office of the European Union.

Evens, M., Verburgh, A. & Elen, J. 2013. Critical thinking in college freshmen: The impact of secondary and higher education. International Journal of Higher Education 2 (3), 139–151.

Hyytinen, H., Nissinen, K., Ursin, J., Toom, A. & Lindblom-Ylänne, S. 2015. Problematising the equivalence of the test results of performance-based critical thinking tests for undergraduate students. Studies in Educational Evaluation 44, 1–8.

Lemons, P. P. & Lemons, J. D. 2017. Questions for assessing higher-order cognitive skills: It's not just Bloom's. Life Sciences Education 12 (1), 47–58.

Tuononen, T., Parpala, A. & Lindblom-Ylänne, S. 2019. Graduates’ evaluations of usefulness of university education, and early career success – a longitudinal study of the transition to working life. Assessment & Evaluation in Higher Education, 1–14.

Zoller, U. & Tsaparlis, G. 1997. Higher and lower-order cognitive skills: The case of chemistry. Research in Science Education 27, 117–130.