Session Information
16 SES 07 A, ICT, Language Learning and Media Literacy
Paper Session
Contribution
As our interaction with the world becomes more and more mediated by screens, digital and physical realities intertwine. It is important to understand how the nature of this new reality affects us in our everyday lives. In this paper we explore third-grade schoolchildren's understanding of the algorithmic nature of the digital platforms they use daily and of those platforms' influence on their behavior. The growing use of Artificial Intelligence (AI), algorithms and machine learning in applications popular among children is changing the ways they see the world and themselves. To understand how these applications affect children's experience, we studied their understanding of the role of algorithms in their use of digital media content. We thus focus on one aspect of media and digital literacy: algorithmic literacy.
Digital literacy goes beyond the mere use of digital applications, the simple ability to operate them. To be literate, to read more than what is seen, one should be aware of the underlying algorithms shaping our experiences of interacting with applications. Recent research on the distinctions between multi-platform and single-platform users has demonstrated that diverse platform engagement significantly enhances algorithmic understanding (Espinoza-Rojas et al., 2023; Shin et al., 2020; Andersen, 2020). These studies underline users' adaptive behaviors in response to algorithmic outputs and highlight the importance of the emotional and ethical dimensions of digital interactions.
Algorithm literacy (AL) can be defined as an understanding of how algorithms are utilized in online applications, platforms, and services. It involves knowledge of how algorithms function, the ability to critically assess algorithmic decision-making, and the skills necessary to navigate and potentially influence algorithmic operations (Andersen, 2020; Dogruel, 2021; Shin et al., 2022). Algorithmic literacy can be considered the informed ability to critically examine, interrogate, propose solutions for, contest and agree with digital services (Long & Magerko, 2020). At the core of algorithmic literacy is explicability, which shapes individuals' attitudes towards and views on algorithmic decision-making technologies (Hermann, 2022).
To explore children as users of algorithmic media, we conducted a teaching experiment in a third-grade classroom (ages 9 to 10) in [nation]. At the beginning of the experiment the students (N=18) filled in a questionnaire measuring their awareness of algorithmic media content. The same questionnaire was filled in after the teaching experiment.
At the core of the teaching experiment was the students' own project work, done in small teams (2-3 students each). During the classes the students designed advertisements, each consisting of two photos taken by the students and two slogans they invented and attached to the photos. The task was (1) to design a good advertisement for carrots and (2) a bad advertisement for carrots. To work on their photos, each team got a bag of carrots.
In the second class the students voted for the best five advertisements. The children were then shown the tally of votes and the resulting top five advertisements with the number of votes each had received. Based on the results, the students were asked to allocate media time to each advertisement. In this way the children, working in teams, were acting like a human algorithm. We did not give them any mathematical examples for calculating the shares, but let them figure it out (or not) by themselves. The small-team discussions were audio recorded both while the teams designed the advertisements and while they decided how much media time each advertisement should get. At the end of the second class we demonstrated how a computer algorithm would share the media time, based on the votes given.
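The computer algorithm demonstrated at the end of the second class distributes media time in proportion to votes. A minimal sketch of such a proportional allocation follows; the function name, the vote counts, and the 60-minute total are illustrative assumptions, not the values used in the classroom:

```python
def share_media_time(votes, total_minutes=60):
    """Allocate media time to each advertisement in proportion to its votes.

    `votes` maps an advertisement label to its vote count; `total_minutes`
    is the total media time to distribute (an assumed figure).
    """
    total_votes = sum(votes.values())
    return {ad: total_minutes * n / total_votes for ad, n in votes.items()}

# Hypothetical vote counts for the top five advertisements (18 voters):
shares = share_media_time({"A": 6, "B": 5, "C": 4, "D": 2, "E": 1})
# An ad with 6 of 18 votes gets one third of the time: 20.0 minutes.
```

The children's human-algorithm task was to arrive at this kind of division by their own reasoning, without being shown the calculation.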
Method
Children's understanding of algorithmic media content was studied with the Algorithmic Media Content Awareness scale (AMCA-scale) (Zarouali et al., 2021) and by collecting qualitative data: audio recordings of their work in small teams. Through the AMCA questionnaire — localized for the purpose — we assessed four dimensions of the children's algorithmic awareness: ‘content filtering’, ‘automated decision-making’, ‘human-algorithm interplay’, and ‘ethical considerations’. In the questionnaire we used statements with a simple scale: “yes”, ”no”, “I don’t know”. The 13 statements concerned the role of algorithms in media content recommendation, content tailoring, automated decision-making, and their ethical implications (e.g., “YouTube makes independent decisions about which videos to show me”). By combining the results from the questionnaire with the analysis of the audio recordings, we aimed to learn how children perceive ethical considerations in algorithmic media, assessing their understanding of transparency, potential biases, and privacy concerns. With the teaching experiment we wanted to explore whether working on the advertisement task and acting as a human algorithm would affect the students' understanding of algorithmic media and its logic. The students therefore completed the questionnaire twice: before and after the teaching experiment. The audio recordings of each team's two working sessions — while designing their advertisements and while acting as a human algorithm deciding on the media time — were collected to analyze the children's thinking processes. In the analysis of the qualitative data we will apply Thematic Content Analysis (TCA) (Anderson, 2007; Smith, 1992). The results of the content analysis will be combined with the results from the questionnaire, although identifying all the individual students from the audio recordings has proved impossible.
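Summarizing the three-option answers to an AMCA statement amounts to computing the percentage distribution of responses per administration. A minimal sketch of that tally follows; the response lists are hypothetical illustrations, not the study's data:

```python
from collections import Counter

def answer_shares(responses):
    """Percentage distribution (rounded to whole percent) of the
    "yes"/"no"/"I don't know" answers to one AMCA statement."""
    counts = Counter(responses)
    n = len(responses)
    return {opt: round(100 * counts[opt] / n)
            for opt in ("yes", "no", "I don't know")}

# Hypothetical pre- and post-questionnaire answers from 18 students
# to a single statement:
pre = ["yes"] * 5 + ["no"] * 4 + ["I don't know"] * 9
post = ["yes"] * 7 + ["no"] * 4 + ["I don't know"] * 7
print(answer_shares(pre))   # e.g. half "I don't know" before the experiment
print(answer_shares(post))
```

Comparing the two distributions statement by statement is how the pre/post shifts reported below can be read off the questionnaire data.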
The principal of the school approved the research plan, and informed consent was obtained from the children's guardians and the children. The nature of the research was explained to the children by their teacher and the researchers. The questionnaire data were stored on a secure server, and the audio recordings on a hard disk accessible only to the researchers. The research followed the guidelines and recommendations of the [nation] National Board on Research Integrity and its ethical principles for conducting research with child participants: participant consent, the right to self-determination, prevention of harm, and privacy and data protection.
Expected Outcomes
The students' initial understanding of how algorithms affect their media content, and of how data is collected and used, was very limited. In the pre-questionnaire, almost 80% of the students answered “yes” to the statement “YouTube knows how to recommend videos for me”. On the other hand, 45% of the students answered “no” or “I don’t know” to the statement “YouTube can estimate how interested I am in any video”. The answers possibly point to mystification in the students' thinking: they know that YouTube is able to “know” and recommend videos for them, but they do not understand how this happens. The pre-questionnaire answers to the statements on ethics and privacy showed few signs of concern but, again, rather a lack of understanding. To the statement “Videos YouTube shows for me, may be inaccurate or biased. They may increase prejudices”, 30% answered “yes”, 50% “I don’t know”, and 20% “no”. The large share of uncertain answers may indicate that the students had never thought about the issue. The results from the post-questionnaire demonstrate a slight change in the students' understanding of algorithms. In their answers to the privacy questions, the students were somewhat more concerned. Whereas in the pre-questionnaire 50% of the students answered “I don’t know”, 22% “no" and 28% “yes” to the statement “Computer programs on YouTube use information collected about me in order to recommend certain types of videos to me. This affects my privacy”, in the post-questionnaire 40% still answered “I don’t know" and 20% “no”, but 40% answered “yes”. Similar patterns exist in the students' answers to the other questions. These first insights from the pre- and post-questionnaires will guide the qualitative data analysis of the students' thinking before, during and after the teaching experiment.
References
Andersen, J. (2020). Understanding and interpreting algorithms: Toward a hermeneutics of algorithms. Media, Culture & Society, 42(7–8), 1479–1494. https://doi.org/10.1177/0163443720919373
Anderson, R. (2007). Thematic content analysis (TCA): Descriptive presentation of qualitative data, 3, 1–4.
Dogruel, L. (2021). What is algorithm literacy? A conceptualization and challenges regarding its empirical measurement. Digital Communication Research, 9, 67–93.
Espinoza-Rojas, J., Siles, I., & Castelain, T. (2023). How using various platforms shapes awareness of algorithms. Behaviour & Information Technology, 42(9), 1422–1433. https://doi.org/10.1080/0144929X.2022.2078224
Hermann, E. (2022). Artificial intelligence and mass personalization of communication content—An ethical and literacy perspective. New Media & Society, 24(5), 1258–1277.
Long, D., & Magerko, B. (2020). What is AI literacy? Competencies and design considerations. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–16).
Shin, D., Rasul, A., & Fotiadis, A. (2022). Why am I seeing this? Deconstructing algorithm literacy through the lens of users. Internet Research, 32(4), 1214–1234.
Shin, D., Zhong, B., & Biocca, F. A. (2020). Beyond user experience: What constitutes algorithmic experiences? International Journal of Information Management, 52, 102061.
Smith, C. P. (Ed.). (1992). Motivation and personality: Handbook of thematic content analysis. Cambridge University Press.
Zarouali, B., Boerman, S. C., & de Vreese, C. H. (2021). Is this recommended by an algorithm? The development and validation of the algorithmic media content awareness scale (AMCA-scale). Telematics and Informatics, 62, 101607.