Session Information
22 SES 08 A, Digital Learning and Teaching
Paper Session
Contribution
Digital competences have become increasingly crucial in the educational sector. They refer to the safe, responsible, confident, and open-minded, yet critical use of digital technologies in different contexts, including work, learning, and teaching (Voogt et al., 2013). For educators, digital competences are necessary for instructional aims in all their facets, be they technological, communicative, collaborative, or ethical (Basilotta-Gómez-Pablos et al., 2022). In the classroom, they involve tasks such as providing both existing and self-created digital content, as well as monitoring students’ progress using data, among many others. Digital competences are multidimensional, encompassing skills, knowledge, and attitudes (Ferrari et al., 2012; Redecker, 2017), with behavior often considered a fourth dimension (Joint Research Centre, 2018).
Educational institutions, particularly in the tertiary sector, are responsible for equipping both current and future leaders and professionals with digital competences. To do so effectively, educators must first possess these competences themselves. However, recent research has highlighted general shortcomings in this area, suggesting that educators' digital competence levels need to be improved and emphasizing the need for strategies to enhance them (Basilotta-Gómez-Pablos et al., 2022). A key first step is to assess current competence levels in detail, mapping strengths and weaknesses and identifying areas for improvement.

Several frameworks and tools have already been established to assess digital competences in educators (e.g., Eichhorn et al., 2017; Hall et al., 2014; Müller et al., 2016), the most influential being the DigCompEdu framework (Redecker, 2017). It consists of six competence areas, namely Professional Engagement (using digital media to communicate with others and develop teaching skills), Digital Resources (using and creating digital resources), Teaching and Learning (using digital media in class), Assessment (evaluating and monitoring students’ progress through data), Empowering Learners (integrating students and creating personalized content), and Facilitating Learners’ Digital Competence (teaching students how to create and responsibly use digital content). In the DigCompEdu, participants select the most appropriate of six statements for each item, indicating their current level in a specific subskill within a competence area, ranging from Newcomer through Expert to Pioneer. The framework provides a valuable foundation for applied research but also has its limitations. For instance, the competence areas lack comparability due to the absence of a unified Likert-type rating scale. More importantly, it does not account for the multidimensional nature of digital competences, as mentioned above.
The present study aimed to design a new instrument for the self-assessment of digital competences among educators, one that enables the mapping of customized competence profiles. The instrument is closely based on the DigCompEdu framework (Redecker, 2017) and comprises the same six competence areas. In contrast to the original questionnaire, and to address the limitations mentioned above, each item is rated on multiple Likert-type scales covering four separate dimensions: Attitude, Skill, Knowledge, and Behavior. This allows for the computation of multidimensional scores on which customized competence profiles can be built. Such profiles can then trigger reflective processes that foster the development of new competences. Additionally, tailored feedback can target specific competence areas, helping educators build on their strengths while effectively addressing areas for improvement.
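To illustrate how such multidimensional scores could be aggregated into a competence profile, the following Python sketch averages hypothetical Likert ratings per competence area and dimension. The data layout, column names, and values are illustrative assumptions and do not reflect the study's actual data format.

```python
import pandas as pd

# Hypothetical long-format responses: one row per participant, item, and dimension.
# Columns and values are illustrative assumptions, not the study's actual data layout.
responses = pd.DataFrame({
    "participant": [1, 1, 1, 1, 2, 2, 2, 2],
    "area":        ["Professional Engagement"] * 4 + ["Assessment"] * 4,
    "item":        ["PE1", "PE1", "PE1", "PE1", "AS1", "AS1", "AS1", "AS1"],
    "dimension":   ["Attitude", "Skill", "Knowledge", "Behavior"] * 2,
    "rating":      [5, 4, 4, 3, 3, 2, 2, 2],   # Likert ratings from 1 (low) to 5 (high)
})

# Average the ratings per participant, competence area, and dimension.
# The resulting wide table is one possible representation of a competence profile.
profile = (
    responses
    .groupby(["participant", "area", "dimension"])["rating"]
    .mean()
    .unstack("dimension")
)
print(profile)
```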
Method
Our questionnaire covers six competence areas, identical to those in the DigCompEdu framework (Redecker, 2017). To make the areas more comparable, we restricted the number of items per area to three, in contrast to the original three to five items. Each item refers to a specific competence and is formulated as a first-person statement (e.g., ‘I actively develop my digital teaching skills’). Participants rate each item on four Likert-type scales, one for each dimension (Attitude, Skill, Knowledge, and Behavior), ranging from 1 (low level) to 5 (high level); the scale labels vary by dimension. The questionnaire was developed in French and German.

As part of the validation process, the questionnaire was administered to 252 teachers from three different higher education institutions. Confirmatory factor analysis (CFA) was used to determine factorial validity, that is, whether the new factor structure could be statistically confirmed. This was done separately for each of the four dimensions, resulting in four CFA models. To determine how well the data fit our models, four indicators were consulted: the comparative fit index (CFI), the Tucker-Lewis index (TLI), the root mean square error of approximation (RMSEA), and the standardized root mean square residual (SRMR). The results showed that the factor structure of the questionnaire is statistically viable.

To further assess the psychometric properties of the questionnaire, indicators of internal consistency, item difficulty, and item discrimination were computed. Cronbach’s α values (for each competence area in each model) ranged from acceptable to good. Item difficulty, ideally between 0.2 and 0.8 (Lüdecke et al., 2021), exceeded the upper threshold for some items, indicating a strong tendency to agree with them. Item discrimination was consistently above the minimum threshold of 0.2 (Lüdecke et al., 2021). Other notable findings include a very high correlation between Knowledge and Skill, a considerable gap between Attitude and Behavior, and comparatively low scores in Assessment, the latter being consistent with previous studies (Santos et al., 2021).

Finally, linear regression analysis was used to detect potential differences in average competence levels among institutions and to investigate the effects of personal and institutional variables on individual levels of digital competences. Significant differences in competence levels among institutions were found. Furthermore, several personal and institutional variables, such as gender, institutional support, and working hours, showed significant effects on individual levels of digital competences.
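As a rough illustration of the psychometric indicators reported above, the sketch below computes Cronbach’s α, item difficulty (scaled to 0–1), and item discrimination (corrected item-total correlation) for one simulated competence area. The data are simulated, and the exact computations in the study (which cites the performance R package, Lüdecke et al., 2021) may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated Likert responses (1-5) for one competence area: 252 participants x 3 items.
# A shared participant-level tendency makes the items correlate; purely illustrative data.
tendency = rng.normal(3.5, 0.8, size=(252, 1))
items = np.clip(np.rint(tendency + rng.normal(0, 0.7, size=(252, 3))), 1, 5)

def cronbach_alpha(x):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def item_difficulty(x, max_rating=5):
    """Mean item score divided by the maximum possible score (range 0-1)."""
    return x.mean(axis=0) / max_rating

def item_discrimination(x):
    """Corrected item-total correlation: each item vs. the sum of the remaining items."""
    total = x.sum(axis=1)
    return np.array([
        np.corrcoef(x[:, i], total - x[:, i])[0, 1] for i in range(x.shape[1])
    ])

print("alpha:         ", cronbach_alpha(items))
print("difficulty:    ", item_difficulty(items))
print("discrimination:", item_discrimination(items))
```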
Expected Outcomes
Overall, the results indicate that our questionnaire is a valid self-assessment tool for digital competences in educators. It demonstrates enough heterogeneity between competence areas and dimensions to enable the creation of customized competence profiles. By analyzing the feedback, educators can critically reflect on their current level, setting further development in motion, such as improving existing digital competences or acquiring new ones.

The present study also has its limitations. A larger, more diverse sample could improve the validation process, as this study focused solely on Swiss tertiary educators. However, the questionnaire’s theoretical basis (the DigCompEdu framework) suggests international applicability, and other levels of education, such as primary and secondary, could also be considered. Another limitation is a high approval tendency, indicated by high item difficulty (easiness) values; associated issues include limited predictive validity and reduced discriminatory power, among others.

Successfully applying competences in practice requires more than feedback alone; it also depends on self-regulated learning skills and reflective learning (Daumiller & Dresel, 2019; Fessl et al., 2023). One way to support this is through metacognitive prompts in the form of self-reflective questions, which encourage educators to think about their digital competences and analyze their strengths and weaknesses; this, in turn, can inspire the acquisition of new competences, for example through the recommendation of suitable further education programs. Such prompts could be delivered through automated feedback systems (e.g., integrated into an e-learning platform) or online workshops in which teachers analyze their profiles in supervised groups. Importantly, this contextualizes the questionnaire scores and allows for customized recommendations. Given the limited evidence on how various support strategies interact (Tondeur et al., 2021), we emphasize the need for further research before such frameworks can be implemented on a large scale.
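As a purely hypothetical illustration of how automated, tailored feedback on a competence profile might be generated (the study does not specify an implementation), the following sketch flags areas with low Behavior scores or a clear gap between Attitude and Behavior and attaches a reflective prompt. Area names follow DigCompEdu; the thresholds, rules, and prompt wording are assumptions for illustration only.

```python
# Hypothetical rule-based feedback on a single educator's competence profile.
# Thresholds and rules are illustrative assumptions, not the study's method.
profile = {
    "Professional Engagement": {"Attitude": 4.3, "Skill": 3.7, "Knowledge": 3.8, "Behavior": 3.0},
    "Assessment":              {"Attitude": 3.9, "Skill": 2.6, "Knowledge": 2.8, "Behavior": 2.2},
}

LOW_BEHAVIOR = 3.0   # flag areas where digital practices are rarely applied
ATTITUDE_GAP = 1.0   # flag areas where Attitude clearly outpaces Behavior

def feedback(profile):
    prompts = []
    for area, dims in profile.items():
        if dims["Behavior"] < LOW_BEHAVIOR:
            prompts.append(f"{area}: consider a further education programme focused on this area.")
        if dims["Attitude"] - dims["Behavior"] >= ATTITUDE_GAP:
            prompts.append(f"{area}: you value this area; what keeps you from applying it in class?")
    return prompts

for line in feedback(profile):
    print(line)
```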
References
Basilotta-Gómez-Pablos, V., Matarranz, M., Casado-Aranda, L. A., & Otto, A. (2022). Teachers’ digital competencies in higher education: A systematic literature review. International Journal of Educational Technology in Higher Education, 19(8), 1–16. https://doi.org/10.1186/s41239-021-00312-8
Daumiller, M., & Dresel, M. (2019). Supporting self-regulated learning with digital media using motivational regulation and metacognitive prompts. Journal of Experimental Education, 87(1), 161–176. https://doi.org/10.1080/00220973.2018.1448744
Eichhorn, M., Müller, R., & Tillmann, A. (2017). Entwicklung eines Kompetenzrasters zur Erfassung der "Digitalen Kompetenz" von Hochschullehrenden. In C. Igel (Ed.), Bildungsräume (pp. 209–219). Waxmann. https://doi.org/10.25656/01:16147
Ferrari, A., Punie, Y., & Redecker, C. (2012). Understanding digital competence in the 21st century: An analysis of current frameworks. In A. Ravenscroft, S. Lindstaedt, C. D. Kloos, & D. Hernández-Leo (Eds.), 21st century learning for 21st century skills (pp. 79–92). Springer. https://doi.org/10.1007/978-3-642-33263-0_7
Fessl, A., Divitini, M., & Maitz, K. (2023). Transferring digital competences for teaching from theory into practice through reflection. In O. Viberg, I. Jivet, P. J. Muñoz-Merino, M. Perifanou, & T. Papathoma (Eds.), Responsive and sustainable educational futures (pp. 554–559). Springer. https://doi.org/10.1007/978-3-031-42682-7_41
Hall, R., Atkins, L., & Fraser, J. (2014). Defining a self-evaluation digital literacy framework for secondary educators: The DigiLit Leicester project. Research in Learning Technology, 22, 1–17. https://doi.org/10.3402/rlt.v22.21440
Joint Research Centre. (2018). DigComp into action: Get inspired, make it happen. A user guide to the European Digital Competence Framework (JRC Publication No. JRC110624). Publications Office of the European Union. https://doi.org/10.2760/112945
Lüdecke, D., Ben-Shachar, M. S., Patil, I., Waggoner, P., & Makowski, D. (2021). performance: An R package for assessment, comparison and testing of statistical models. Journal of Open Source Software, 6(60), Article 3139. https://doi.org/10.21105/joss.03139
Müller, C., Di Giusto, F., Gross, S., & Koruna, S. (2016). Teaching in the digital age: Adaptation and competency development for academics. Zeitschrift für Hochschulentwicklung, 11(5), 187–203. https://doi.org/10.3217/zfhe-11-05/11
Redecker, C. (2017). European framework for the digital competence of educators: DigCompEdu. Publications Office of the European Union. https://doi.org/10.2760/159770
Santos, C., Pedro, N., & Mattar, J. (2021). Digital competence of higher education professors: Analysis of academic and institutional factors. Obra Digital, (21), 69–92. https://doi.org/10.25029/od.2021.311.21
Tondeur, J., Howard, S. K., & Yang, J. (2021). One-size does not fit all: Towards an adaptive model to develop preservice teachers’ digital competencies. Computers in Human Behavior, 116, Article 106659. https://doi.org/10.1016/j.chb.2020.106659
Voogt, J., Erstad, O., Dede, C., & Mishra, P. (2013). Challenges to learning and schooling in the digital networked world of the 21st century. Journal of Computer Assisted Learning, 29(5), 403–413. https://doi.org/10.1111/jcal.12029