Session Information
Paper Session
Contribution
As digital literacy is among the key outcomes of education in the 21st century, understanding the relationship between patterns of digital tool use and levels of digital literacy becomes crucial for designing and implementing effective educational interventions (Bundsgaard & Gerick, 2017; Gerick, 2018; Kastorff et al., 2023). Despite its importance, this relationship remains underexplored, especially across diverse school student populations. Moreover, while variable-oriented research fails to capture the relations between technology access and levels of digital literacy (Fraillon et al., 2020), person-oriented approaches are preferable because they can reveal distinct patterns of multifaceted use of digital technologies for learning.
A widespread belief holds that young people who grew up surrounded by digital devices master digital skills spontaneously, so that their development requires no special attention from schools. Teachers' observations and research results show that this is not the case (Kirschner & De Bruyckere, 2017). Most schoolchildren are unable to solve problems effectively in the digital environment, for example, to construct a correct search query or to ensure their information security (Avdeeva et al., 2022). The purpose of this study is to assess the relationship of digital literacy, including its individual components (for example, the ability to work with information in a digital environment), with different patterns of digital tool use among students and teachers. The research toolkit consisted of three specially developed questionnaires (a student questionnaire, a teacher questionnaire, and a digital school passport questionnaire) and a test to assess students' level of digital literacy (DL test). Participants voluntarily completed the questionnaires in a controlled online environment. The questionnaires for students and teachers, exploring various aspects of digital tool use, were developed on the basis of current research and validated through expert review and pilot testing. The DL test consists of scenario-based items that simulate the main features of students' real-life situations (De Klerk et al., 2016). These innovative items were developed according to the Evidence-Centered Design methodology (Mislevy, 2013; Ferrara et al., 2016). The DL assessment in our study was low-stakes: it simply informs test-takers and their teachers about students' actual digital literacy levels and the gaps found in their development.
We used multilevel latent class analysis (MLCA) to uncover distinct patterns across schools in teachers' and students' use of digital tools in teaching and learning. This allowed us to identify large groups of schools and highlight tendencies in the adoption of digital tools in school settings. In the largest group of schools (84.2%), some teachers experimented with digital tools, though this practice was not widespread. Similarly, in the largest group of schools, students primarily used digital tools for presentations and information search. Notably, 27% of schools had teachers who did not actively use digital tools, while their students were still engaging with them.
By identifying actionable patterns of digital tool use and their links to digital literacy, this study provides critical insights for education practitioners, policymakers, and researchers. The revealed disparities in digital literacy factors contribute to second digital divide theory and help to clarify the multifaceted nature of digital tool use. These findings highlight the need for targeted interventions in school settings and for families to address gaps in digital skills, fostering a more inclusive digital future.
The results of the study may be of interest for the implementation of the European Digital Education Action Plan 2021–2027, given its emphasis on the development of schoolchildren's digital literacy, the digital transformation of schools, and the active use of digital tools in the educational process.
Method
Participants and Procedure. We used data obtained from 125 schools. In each school, three types of surveys were distributed and response data collected: a survey on the school's ICT infrastructure, completed by an administrator; a survey of secondary school students (n = 4353), with questions about digital tool use for learning and for leisure; and a survey of teachers (n = 939) working in the classes of the surveyed students, with questions about different aspects of ICT integration in teaching. To ensure that teachers' answers reflected different subjects and a variety of teaching practices, 4 to 10 teachers were surveyed in each school. Each student's survey responses were merged with the digital literacy level assessed through the DL test. Data collection took place during the winter of 2023-2024. This study analyzed school profiling based on the results of multilevel latent class analysis (MLCA) built on the teacher and student surveys. Key variables for the MLCA were selected from the teacher and student questionnaires: seven questions from the student questionnaire on the use of different digital tools at the teacher's request for learning, and five questions from the teacher questionnaire on how often teachers ask students to use computers and gadgets for different purposes (in class, for homework, etc.). The questions were designed to comprehensively address how school students work with popular digital tools, including tools such as simulators and virtual laboratories.
Data Analysis. After removing missing cases, data from 3780 students and 787 teachers across 107 schools were included in the analysis. For the first research question, MLCA was performed to identify latent school profiles based on student-level and teacher-level indicators.
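The merging and case-removal step described above can be sketched as follows. This is a hypothetical illustration, not the authors' actual pipeline: the record layout and field names (student_id, school_id, uses_simulators, dl_score) are assumptions made for the example.

```python
# Hypothetical sketch of the data-preparation step: each student's
# survey record is joined with that student's DL test score, and
# incomplete cases are dropped before the MLCA. All field names and
# values are illustrative assumptions, not the study's data.
survey = [
    {"student_id": 1, "school_id": 10, "uses_simulators": 1},
    {"student_id": 2, "school_id": 10, "uses_simulators": 0},
    {"student_id": 3, "school_id": 11, "uses_simulators": None},  # missing item
]
dl_scores = {1: 540, 2: 610, 3: 480}  # DL test results keyed by student

# Join survey responses with DL scores on the student identifier.
merged = [{**row, "dl_score": dl_scores[row["student_id"]]} for row in survey]

# Keep only complete cases, mirroring the "removing missing cases" step.
complete = [r for r in merged if all(v is not None for v in r.values())]
print(len(complete))  # 2 complete cases enter the analysis
```

In the study itself this filtering reduced the sample from 4353 to 3780 students (and 939 to 787 teachers) across 107 schools.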
We used the Akaike Information Criterion (AIC), the adjusted Bayesian Information Criterion (aBIC), the Bootstrap Likelihood Ratio Test (BLRT), and entropy values to determine the quality of the model and the optimal number of clusters and classes. MLCA was performed using the "poLCA" and "glca" packages in R. After performing the MLCA, we assigned the resulting clusters to each school, deriving a profile of each school in the sample that included information about students' digital literacy status, teachers' assignment of tasks requiring the use of digital tools, and students' initiative use of digital tools. To test differences in students' digital literacy levels across the schools in the sample, Pearson's chi-squared test was used.
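The Pearson chi-squared comparison can be illustrated with a minimal sketch. The contingency table below is invented for the example (the rows standing for digital literacy levels and the columns for school profiles are assumptions); it is not the study's data, which yielded X-squared = 4929.1 on the real sample.

```python
# Illustrative Pearson chi-squared test on an invented 3x3 contingency
# table: rows = hypothetical digital literacy levels, columns =
# hypothetical school profiles. Counts are made up for the sketch.
observed = [
    [120,  80,  40],   # below-basic DL level per school profile
    [ 60, 150,  90],   # basic DL level
    [ 20,  70, 160],   # advanced DL level
]
row_tot = [sum(r) for r in observed]
col_tot = [sum(c) for c in zip(*observed)]
total = sum(row_tot)

# Pearson statistic: sum of (observed - expected)^2 / expected,
# with expected[i][j] = row_total[i] * col_total[j] / grand_total.
chi2 = sum(
    (observed[i][j] - row_tot[i] * col_tot[j] / total) ** 2
    / (row_tot[i] * col_tot[j] / total)
    for i in range(3) for j in range(3)
)
dof = (len(observed) - 1) * (len(observed[0]) - 1)
print(round(chi2, 1), dof)  # dof = 4 for a 3x3 table
```

A statistic this far above the critical value for 4 degrees of freedom (9.49 at the 0.05 level) indicates that literacy levels and school profiles are not independent, which is the logic behind the between-school comparison reported below.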
Expected Outcomes
We identified four types of digital tool use in teaching and learning, which allowed us to profile schools from two perspectives: teachers' practices of task assignment in the classroom and students' experience of digital tool use in learning. Interpreting the groups of schools obtained in the MLCA, we identified key differences in ICT use between schools. For example, from the perspective of teachers' practices, the largest group of schools (84.2%) is one where only some teachers experiment with digital tools. At the same time, from the students' perspective, the largest group of schools shows no universal adoption of digital tools, with a focus on preparing presentations and searching for information. This is consistent with previous research on ICT use in schools (Shear et al., 2011). Moreover, in 27% of the schools in the sample, teachers do not actively use digital tools, yet their students engage with digital tools in the classroom. This observation suggests that students' digital tool usage may be influenced by factors beyond direct teacher guidance. Preliminary results of our analysis show that, while the out-of-school environment noticeably influences the formation of students' digital literacy, assigning active tasks that involve creating and producing content, such as developing texts and presentations, doing online research, working with digital learning materials, conducting experiments, and using simulators or virtual labs, helps to shape digital literacy. The findings also highlight that there are distinct school profiles concerning digital tool use in learning from the teachers' and students' points of view. Moreover, significant differences in students' digital literacy between school profiles are confirmed by Pearson's chi-squared test (X-squared = 4929.1, p < 2.2e-16).
Further qualitatively driven studies are needed to explore the links between the use of digital tools and levels of digital literacy.
References
1. Avdeeva, S., Uvarov, A., & Tarasova, K. (2022). Digital transformation of schools and students' information and communication literacy. Voprosy Obrazovaniya / Educational Studies Moscow, 1, 218–243. https://doi.org/10.17323/1814-9545-2022-1-218-243
2. Avdeeva, S. M., & Tarasova, K. V. (2023). On measuring digital literacy: Methodology, conceptual model and measurement tool. Voprosy Obrazovaniya / Educational Studies Moscow, 2, 8–32. https://doi.org/10.17323/1814-9545-2023-2-8-32
3. Bundsgaard, J., & Gerick, J. (2017). Patterns of students' computer use and relations to their computer and information literacy: Results of a latent class analysis and implications for teaching and learning. Large-Scale Assessments in Education, 5(1), 16. https://doi.org/10.1186/s40536-017-0052-8
4. Digital Education Action Plan (2021–2027). https://education.ec.europa.eu/focus-topics/digital-education/action-plan (Date of access: 28.01.2025).
5. Ferrara, S., et al. (2016). Principled approaches to assessment design, development, and implementation. In A. A. Rupp & J. P. Leighton (Eds.), The Handbook of Cognition and Assessment (pp. 41–74). Hoboken, NJ: John Wiley & Sons.
6. Fraillon, J., Ainley, J., Schulz, W., Friedman, T., & Duckworth, D. (2020). Preparing for Life in a Digital World: IEA International Computer and Information Literacy Study 2018 International Report. Springer International Publishing. https://doi.org/10.1007/978-3-030-38781-5
7. Gerick, J. (2018). School level characteristics and students' CIL in Europe: A latent class analysis approach. Computers & Education, 120, 160–171. https://doi.org/10.1016/j.compedu.2018.01.013
8. Kastorff, T., Sailer, M., & Stegmann, K. (2023). A typology of adolescents' technology use before and during the COVID-19 pandemic: A latent profile analysis. International Journal of Educational Research, 117, 102136. https://doi.org/10.1016/j.ijer.2023.102136
9. Kirschner, P. A., & De Bruyckere, P. (2017). The myths of the digital native and the multitasker. Teaching and Teacher Education, 67, 135–142. https://doi.org/10.1016/j.tate.2017.06.001
10. De Klerk, S., Eggen, T. J. H. M., & Veldkamp, B. P. (2016). A methodology for applying students' interactive task performance scores from a multimedia-based performance assessment in a Bayesian network. Computers in Human Behavior, 60, 264–279. https://doi.org/10.1016/j.chb.2016.02.071
11. Shear, L., Gallagher, L., & Patel, D. (2011). Innovative Teaching and Learning Research: 2011 Findings and Implications. Microsoft Partners in Learning.
12. Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9(5), 1–6. https://doi.org/10.1108/10748120110424816