ERG SES C 01, ICT and Education
Information and communication technology (ICT) is a rapidly evolving field of growing significance. ICT knowledge and skills are necessary for performing an increasing number of jobs, not only those directly connected to ICT (such as programming, graphic design, or web design) but also seemingly distant ones such as doctor, teacher, or shop assistant. Likewise, ICT has been gaining an ever greater role in private life thanks to the development of social networks, smart electronics, and electronic communication devices; the Internet of Things, smart homes, and artificial intelligence are discussed more and more these days.
Schools have to respond to this development in ICT and integrate it into the curriculum. However, the quality of teaching and of the acquired knowledge and skills may vary considerably for a variety of reasons. It is therefore necessary to find out what results students have achieved in the field of ICT. This is, of course, very difficult. Nowadays, ICT is a multidimensional concept composed of many independent fields such as graphics, computer networks, programming, and many others. It is therefore difficult to determine objectively the general level of students' knowledge and skills using didactic tests or practical tasks. Nevertheless, this level must be determined to ensure the quality of teaching and an effective education policy.
The level of ICT knowledge and skills can be measured by different methods. One is testing, which was used in the International Computer and Information Literacy Study (ICILS 2013) (Fraillon, Ainley, Schulz, et al., 2014). Another is the use of self-assessment questions (Danner & Pessu, 1995; Hakkarainen, Ilomäki, Lipponen, Muukkonen, Rahikainen, Tuominen, et al., 2000; Ilomäki & Rantanen, 2007; Lau & Yuen, 2014). Self-assessment questions were also used, e.g., in ICILS 2013 (Fraillon, Ainley, Schulz, et al., 2014).
Researchers ask students about their level of ICT knowledge and skills, and students evaluate themselves on a scale from the lowest to the highest level. Similarly, potential employers try to find out applicants' level of ICT knowledge and skills, and jobseekers report this level in their CVs (self-assessment).
However, self-assessment (and assessment in general) on a given scale can be influenced by many factors: socio-economic status, peers and friends, parents and their occupations, a wide variety of personality characteristics, and so on. Moreover, everyone uses the scale differently. Self-assessment is therefore highly subjective, and comparison across respondents is difficult.
Researchers can use the Anchoring Vignette Method to increase the comparability of subjective assessments of the level of ICT knowledge and skills. In the current research, the author uses this method to correct subjective self-assessments. The method was introduced by G. King et al. (2004) and has been used in many surveys focused, for example, on health (Bago d’Uva, Lindeboom, O’Donnell, & van Doorslaer, 2008; Peracchi, & Rossetti, 2012; Vonkova, 2013; Vonkova, & Hullegie, 2011), life satisfaction (Angelini, Cavapozzi, Corazzini, & Paccagnella, 2012; Kapteyn, Smith, & van Soest, 2010), job satisfaction (Kristensen, & Johansson, 2008), etc. These studies have shown that the Anchoring Vignette Method can effectively help to objectify self-assessment, thereby increasing the validity and reliability of research in education and other fields.
The research problem of this survey is verifying the assumptions required for using the Anchoring Vignette Method in the self-assessment of ICT knowledge and skills (response consistency and vignette equivalence). Moreover, the author focuses on the suitability of the length of the anchoring vignettes and the clarity of the terms used.
The method used in this research is the Anchoring Vignette Method introduced by G. King et al. (2004). The method was proposed to avoid misleading results caused by differences in respondents' scale usage. Respondents assess short descriptions of hypothetical people (anchoring vignettes) who reach certain levels of the measured concept, in this case the level of ICT knowledge and skills. All respondents evaluate their own level as well as the same hypothetical people in the anchoring vignettes. This makes it possible to find out how each respondent uses the scale; their self-assessments can then be adjusted for scale usage and their comparability increased. A difficult part of research using the Anchoring Vignette Method is formulating the anchoring vignettes. The basic assumptions of the method must be met and the rules for formulation must be complied with. The basic assumptions of the Anchoring Vignette Method are response consistency and vignette equivalence. The first says that respondents assess hypothetical people in the same way as they assess themselves; the second, that all respondents interpret the vignettes in the same way. The formulation of the vignettes must be unambiguous, without indefinite or evaluative words, and adequate to the respondents (in both content and length of text). In the current research, the author verifies that the newly formulated vignettes fulfil the assumptions of the Anchoring Vignette Method. The new vignettes cover five basic ICT areas according to the international framework DigComp 2.0: Information and data literacy, Communication and collaboration, Digital content creation, Safety, and Problem solving (Vuorikari, Punie, Carretero, & Van den Brande, 2016). The research is administered in the Czech Republic in the first and fourth years of upper secondary schools of various types.
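The adjustment described above can be illustrated with the nonparametric variant of the method (King et al., 2004): a respondent's self-assessment is recoded relative to how that same respondent rated the vignettes, ordered from the lowest to the highest true level. The sketch below is a minimal illustration of this recoding, not the instrument used in this research; the function name and the example ratings are hypothetical.

```python
def anchored_score(self_rating, vignette_ratings):
    """Nonparametric anchoring-vignette recoding (the C-scale).

    self_rating: the respondent's rating of their own level.
    vignette_ratings: the same respondent's ratings of the vignettes,
        listed from the lowest to the highest true level; k vignettes
        yield a recoded score from 1 to 2k + 1.
    Assumes the respondent rated the vignettes in a consistent
    (non-decreasing) order; ties and order violations require the
    interval treatment described by King et al. (2004).
    """
    score = 1
    for z in vignette_ratings:
        if self_rating < z:
            return score       # below this vignette
        if self_rating == z:
            return score + 1   # tied with this vignette
        score += 2             # above it; skip past its tie slot
    return score               # above all vignettes

# Two respondents who both answered "3" about themselves but used
# the scale differently end up with different anchored scores:
print(anchored_score(3, [2, 3, 4]))  # tied with the middle vignette -> 4
print(anchored_score(3, [1, 1, 2]))  # above all three vignettes -> 7
```

The recoded score depends only on the self-assessment's position relative to the respondent's own vignette ratings, which is what removes individual differences in scale usage.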
In addition to the self-assessment, the assessment of the hypothetical people in the anchoring vignettes, and the completion of the questionnaire, our research has a qualitative part designed to verify the assumptions of the Anchoring Vignette Method and to assure the quality of the new anchoring vignettes for ICT knowledge and skills. Individual semi-structured interviews with students are administered.
The questionnaire is currently being piloted before the main research begins, and the author is also conducting interviews with students. On the basis of the data collected and processed so far, it can be assumed that the assumptions of response consistency and vignette equivalence have been met. It turned out that the vignettes for the five DigComp 2.0 domains and the vignettes for general ICT knowledge and skills are appropriate for students. In the area of Digital content creation, however, the formulated vignettes are more technical than in the other areas. The length of the vignettes seems challenging for students, apparently because of their length and total number (3 vignettes for each of the 5 areas plus 3 general vignettes). The terms used in the anchoring vignettes are generally known; only a few students identified some terms as unfamiliar (graphic editor, cloud, styles, or filters), and these were exceptional cases. The interviews have also shown that the vignettes lack themes such as hardware, computer programming, and computer management.
Angelini, V., Cavapozzi, D., Corazzini, L., & Paccagnella, O. (2012). Age, health and life satisfaction among older Europeans. Social Indicators Research, 105(2), 293-308.
Bago d’Uva, T., Lindeboom, M., O’Donnell, O., & van Doorslaer, E. (2008). Does reporting heterogeneity bias the measurement of health disparities? Health Economics, 17(3), 351-375.
Danner, R. B., & Pessu, C. O. A. (1995). A survey of ICT competencies among students in teacher preparation programmes at the University of Benin, Benin City, Nigeria. Journal of Information Technology Education, 12, 33-49.
Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Gebhardt, E. (2014). Preparing for life in a digital age: The IEA international computer and information literacy study, international report.
Hakkarainen, K., Ilomäki, L., Lipponen, L., Muukkonen, H., Rahikainen, M., Tuominen, T., et al. (2000). Students’ skills and practices of using ICT: Results of a national assessment in Finland. Computers & Education, 34, 103-117.
Ilomäki, L., & Rantanen, P. (2007). Intensive use of ICT in school: Developing differences in students’ ICT expertise. Computers & Education, 48, 119-136.
Kapteyn, A., Smith, J. P., & van Soest, A. (2010). Life satisfaction. In E. Diener, J. F. Helliwell, & D. Kahneman, International differences in well-being (pp. 70-104). Oxford: Oxford University Press.
King, G., Murray, C. J. L., Salomon, J. A., & Tandon, A. (2004). Enhancing the validity and cross-cultural comparability of measurement in survey research. American Political Science Review, 98(1), 567-583.
Kristensen, N., & Johansson, E. (2008). New evidence on cross-country differences in job satisfaction using anchoring vignettes. Labour Economics, 15(1), 96-117.
Lau, W. W. F., & Yuen, A. H. K. (2014). Developing and validating of a perceived ICT literacy scale for junior secondary school students: Pedagogical and educational contributions. Computers & Education, 78, 1-9.
Peracchi, F., & Rossetti, C. (2012). Heterogeneity in health responses and anchoring vignettes. Empirical Economics, 42(2), 513-538.
Vonkova, H. (2013). Subjektivní hodnocení problémů s pohybem: užití parametrického modelu metody ukotvujících vinět. [Subjective assessments of problems with moving around: Use of the parametric model of the anchoring vignette method]. Orbis Scholae, 7(1), 49-66.
Vonkova, H., & Hullegie, P. (2011). Is the anchoring vignette method sensitive to the domain and the choice of the vignette? Journal of the Royal Statistical Society A, 174, 597-620.
Vuorikari, R., Punie, Y., Carretero, S., & Van den Brande, L. (2016). DigComp 2.0: The Digital Competence Framework for Citizens. Joint Research Centre.