Session Information
09 SES 13 A, Exploring Students’ Civic Knowledge, ICT Competencies and (Further) 21st Century Skills
Paper Session
Contribution
In an age of rapid technological progress, the demands placed on educational systems and institutions stand in tension with the actual learning and teaching taking place in those institutions. Individual characteristics of learners and contextual factors play a particular role here. Consequently, it is necessary to adapt to the changing needs and demands of the social and learning community and to develop and investigate new competencies, such as computational thinking, with regard to individual characteristics and the school context. Acquiring advanced digital skills such as computational thinking is in high demand, as emphasised in the Digital Education Action Plan (2021-2027), which sets out the European Commission’s vision for the future of digital education in Europe and aims to adapt education systems to the digital age (European Commission, 2020).
Computational thinking has gained ground in educational research in recent years (e.g. Voogt et al., 2015) and was tested for the first time in the context of large-scale assessments and international comparisons in IEA-ICILS 2018 (International Computer and Information Literacy Study; Eickelmann et al., 2019; Fraillon et al., 2019). It involves a set of thought processes used to model problems and their solutions in a way that enables algorithmic processing (Aho, 2012). Since computational thinking is considered a key competence of the 21st century, it is something everyone should learn and practise (Wing, 2006). Consequently, every student should be able to acquire sufficient competencies in computational thinking during his or her school career to participate successfully in society. By 2016, computational thinking was already anchored in the curricula of eleven European educational systems in various ways (Bocconi et al., 2016), and other educational systems have begun to follow suit since then. However, there are also many countries in which computational thinking is not taught at all. The challenge of assessing students’ competencies in computational thinking and drawing international comparisons from the findings is therefore considerable.
IEA-ICILS 2018 addressed this point. For the first time, the IEA (International Association for the Evaluation of Educational Achievement) study ICILS 2018 included an additional module on computational thinking as an international option (Eickelmann, 2019; Fraillon et al., 2019). The competencies of eighth-graders were examined in international comparison on the basis of the representative student sample of ICILS 2018, using computer-based student tests developed specifically for this domain. The conditions for the acquisition of these competencies were surveyed using background questionnaires. Six European countries (Denmark, Finland, France, Germany, Luxembourg, and Portugal) took part in the computational thinking module (Eickelmann et al., 2019; Fraillon et al., 2019).
Regarding students’ competencies in computational thinking, it is widely agreed that there may be variance between students of different European educational systems owing to differing conditions of competence acquisition and differing anchorage in curricula. A closer look at potential predictors, however, reveals the relevance of students’ background, including their social background and their gender (e.g. Labusch & Eickelmann, 2020). Besides individual characteristics of students, the theoretical framework of ICILS 2018 also focuses on students’ classroom activities as a predictor of variance in computational thinking skills (Fraillon et al., 2019), which appear to be crucial for the acquisition of these skills in other contexts as well (Caeli & Bundsgaard, 2019).
These arguments lead to the following research question, which is addressed below: To what extent can differences in students’ average competencies in computational thinking be explained by students’ social background, by learning computational thinking tasks at school, and by students’ gender across different European countries?
Method
Data from the internationally comparative large-scale assessment ICILS 2018 serve to answer the research question. In this study, computer-based tests were used to measure students’ competencies in computational thinking. The theoretical framework of the study covers not only students’ competencies but also the conditions of their acquisition. Accordingly, information on school-level and individual prerequisites and processes was collected from the tested students, teachers, school principals, and ICT coordinators via background questionnaires (Eickelmann et al., 2019; Fraillon et al., 2019). For all European countries participating in the ICILS 2018 computational thinking module, in-depth regression analyses were carried out and are reported in the following, thus answering the research question. The regression analysis comprises four models, each with students’ competencies in computational thinking as the dependent variable. In model I, cultural capital is taken as an indicator of social background, operationalized by the number of books in students’ homes. In educational research, the number of books at home has proven to be a particularly effective indicator of students’ cultural capital (Hatlevik et al., 2018). Model II adds medium and high HISEI values (Highest International Socio-Economic Index of Occupational Status), which capture the economic resources of the parental home as a further indicator of students’ social background. Model III adds the internationally developed index (Fraillon et al., 2019) for learning computational thinking tasks at school (Cronbach’s α = .90) as an independent variable. Model IV adds students’ gender.
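The stepwise model-building described above can be sketched as a sequence of nested ordinary-least-squares regressions, adding one predictor block at a time and comparing explained variance. The sketch below uses purely synthetic, illustrative data: the variable names, effect sizes, and random grouping are assumptions, not the ICILS 2018 data or the IEA IDB Analyzer's weighted estimation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic predictors mirroring models I-IV (assumed, not ICILS data)
books_high = rng.integers(0, 2, n).astype(float)      # high cultural capital (books at home)
hisei_mid = rng.integers(0, 2, n).astype(float)       # medium HISEI
hisei_high = (1 - hisei_mid) * rng.integers(0, 2, n)  # high HISEI (mutually exclusive)
ct_learning = rng.normal(50.0, 10.0, n)               # index: learning CT tasks at school
boy = rng.integers(0, 2, n).astype(float)             # gender dummy

# Outcome loosely mirroring the reported direction of effects
score = (500 + 40 * books_high + 20 * hisei_mid + 45 * hisei_high
         - 1.0 * (ct_learning - 50) + 12 * boy + rng.normal(0, 70, n))

def ols_r2(predictors, y):
    """Fit OLS with an intercept and return (coefficients, R^2)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return beta, 1 - (resid @ resid) / np.sum((y - y.mean()) ** 2)

predictors = [books_high, hisei_mid, hisei_high, ct_learning, boy]
for k, label in [(1, "Model I"), (3, "Model II"), (4, "Model III"), (5, "Model IV")]:
    _, r2 = ols_r2(predictors[:k], score)
    print(f"{label}: R^2 = {r2:.3f}")
```

Because the models are nested, R² can only grow from model I to model IV; the increment at each step indicates how much additional variance the newly added predictor block explains.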
The coefficient of determination R² indicates how well the independent variables explain the variance of the dependent variable or predict its values. The sampling procedure in ICILS 2018 followed a two-stage cluster design, in which standard errors of estimated statistics would otherwise be underestimated. In ICILS 2018, the standard errors of the relevant statistics were therefore estimated using the Jackknife Repeated Replication technique (Johnson & Rust, 1992; Rust, 2014). The analyses were carried out with the IEA IDB Analyzer, an IEA-specific analysis software for data sets from international comparative large-scale school assessments (Rutkowski et al., 2010), used as an add-on to IBM SPSS Statistics 25; estimates were computed with the corresponding sample weights at student level.
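The jackknife idea behind this variance estimation can be illustrated with a minimal delete-one-group sketch: recompute the statistic with each group of cases removed in turn and derive the standard error from the spread of the replicate estimates. This is a simplified, unweighted stand-in under assumed random grouping; the actual JRR procedure in ICILS 2018 works with paired sampling zones and replicate weights.

```python
import numpy as np

def jackknife_se(values, n_groups=25, seed=1):
    """Delete-one-group jackknife standard error of the mean.

    Simplified illustration of the jackknife principle; ICILS 2018's
    JRR uses paired sampling zones and replicate weights instead of
    the random grouping assumed here.
    """
    rng = np.random.default_rng(seed)
    groups = rng.integers(0, n_groups, size=len(values))  # assumed random grouping
    theta = values.mean()                                 # full-sample estimate
    # One replicate estimate per deleted group
    reps = np.array([values[groups != g].mean() for g in range(n_groups)])
    # Delete-one-group jackknife variance of the estimate
    var = (n_groups - 1) / n_groups * np.sum((reps - theta) ** 2)
    return np.sqrt(var)

# Sanity check: for a simple mean, the jackknife SE should be close
# to the analytic standard error s / sqrt(n)
x = np.random.default_rng(2).normal(500.0, 100.0, 3000)
print(jackknife_se(x), x.std(ddof=1) / np.sqrt(len(x)))
```

The appeal of replication methods like this is that they yield valid standard errors for clustered samples even for statistics (such as regression coefficients) where no simple analytic formula applies.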
Expected Outcomes
The first model indicates that, on average, students with high cultural capital achieve scores that are between 34.1 points (Denmark) and 64.6 points (Luxembourg) higher than those of students with low cultural capital; these differences are statistically significant. Model I explains between 4 percent (Denmark and Finland) and 10 percent (Germany) of the variance in computational thinking competencies. Model II shows that in all countries students from economically privileged homes achieve significantly higher scores in computational thinking than less privileged students. The differences range between 17.2 points (Portugal) and 32.2 points (Luxembourg) for medium HISEI values and between 30.8 points (Denmark) and 65.7 points (Luxembourg) for high HISEI values. Statistically controlling for the study’s index of students’ learning of computational thinking tasks at school (Fraillon et al., 2019), model III shows significant negative relationships with students’ competencies, ranging between 0.4 points (Denmark) and 1.3 points (Portugal). Considering both students’ social background and the learning of computational thinking tasks at school, no significant performance difference remains between girls and boys in Denmark and Finland; in France (12.0 points), Germany (13.9 points), Luxembourg (14.0 points), and Portugal (20.4 points), however, boys perform better. The overall model explains between 6 percent (Denmark) and 16 percent (Luxembourg) of the variance. In summary, students’ competencies in computational thinking are strongly related to their social background. Surprisingly, in all countries there is a negative correlation between students’ competencies in computational thinking and the school learning of respective tasks. Although there are also performance differences between girls and boys in favour of boys, these do not exist in Denmark and Finland.
In all countries, the variance in computational thinking is explained mainly by students’ individual characteristics, but to some extent also by the learning tasks at their school.
References
Aho, A.V. (2012). Computation and computational thinking. Computer Journal, 55(7), 833–835.
Bocconi, S., Chioccariello, A., Dettori, G., Ferrari, A. & Engelhardt, K. (2016). Developing computational thinking in compulsory education – Implications for policy and practice. Luxembourg: Publications Office of the European Union.
Caeli, E. & Bundsgaard, J. (2019). Computational thinking in compulsory education: a survey study on initiatives and conceptions. Educational Technology Research and Development.
Eickelmann, B. (2019). Measuring secondary school students’ competence in computational thinking in ICILS 2018 – Challenges, concepts and potential implications for school systems around the world. In S.C. Kong & H. Abelson (Eds.), Computational Thinking Education (pp. 53–64). Singapore: Springer.
Eickelmann, B., Bos, W., Gerick, J., Goldhammer, F., Schaumburg, H., Schwippert, K. et al. (Eds.) (2019). ICILS 2018 #Deutschland. Computer- und informationsbezogene Kompetenzen von Schülerinnen und Schülern im zweiten internationalen Vergleich und Kompetenzen im Bereich Computational Thinking [ICILS 2018 #Germany – Students’ computer and information literacy in second international comparison and competences in computational thinking]. Münster, Germany: Waxmann.
European Commission (2020). Digital Education Action Plan 2021-2027. Resetting education and training for the digital age. Retrieved from https://ec.europa.eu/education/sites/default/files/document-library-docs/deap-communication-sept2020_en.pdf
Fraillon, J., Schulz, W., Friedman, T. & Duckworth, D. (2019). Assessment Framework of ICILS 2018. Amsterdam: IEA.
Hatlevik, O.E., Throndsen, I., Loi, M. & Gudmundsdottir, G.B. (2018). Students’ ICT self-efficacy and computer and information literacy: Determinants and relationships. Computers & Education, 118, 107–119.
Labusch, A. & Eickelmann, B. (2020). Computational thinking competences in countries from three different continents in the mirror of students’ characteristics and school learning. In S.C. Kong, H.U. Hoppe, T.C. Hsu, R.H. Huang, B.C. Kuo, K.Y. Li et al. (Eds.), Proceedings of International Conference on Computational Thinking Education 2020 (pp. 2–7). Hong Kong: The Education University of Hong Kong.
Rust, K.F. (2014). Sampling, weighting and variance estimation in international large-scale assessment. In L. Rutkowski, M. von Davier & D. Rutkowski (Eds.), Handbook of international large-scale assessment. Background, technical issues and methods of data analysis (pp. 117–153). London: Chapman & Hall/CRC Press.
Rutkowski, L., Gonzalez, E., Joncas, M. & von Davier, M. (2010). International large-scale assessment data: Issues in secondary analysis and reporting. Educational Researcher, 39(2), 142–151.
Voogt, J., Fisser, P., Good, J., Mishra, P. & Yadav, A. (2015). Computational thinking in compulsory education: Towards an agenda for research and practice. Education and Information Technologies, 20(4), 715–728.
Wing, J.M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35.