Session Information
06 SES 02 A, Focussing Media Literacies and Competencies: Data Privacy, Fake News and Algorithms
Paper Session
Contribution
Algorithms are a central structural element of digitalized environments. So-called recommendation engines in particular are a type of algorithm especially relevant to everyday media use (Schrage, 2020): they are implemented in platforms such as YouTube, Instagram, or TikTok, which many children in German-speaking Switzerland access routinely (Waller et al., 2019). The output of these systems is recommendations that suggest specific content, for example videos, and continuously adapt those suggestions to the user (Louridas, 2020). Recommendation systems are not used solely for the selfless purpose of paving paths through the unmanageable number of possible retrieval options: algorithmic recommender systems are closely intertwined with commercial interests (Beer, 2017), reflect dominant social categories (Noble, 2018), and can shape how users construct reality (Just & Latzer, 2017).
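To make the adaptive logic described above concrete, it can be sketched as a toy content-based recommender that ranks candidate videos by how much they overlap with a user's viewing history. This is a purely illustrative sketch: the function, tags, and video IDs are invented for this example, and real platform recommenders are far more complex (collaborative filtering, learned models, commercial weighting).

```python
def recommend(watched, candidates, top_n=2):
    """Rank candidate video IDs by tag overlap with previously watched videos.

    watched:    dict mapping watched video IDs to lists of tags
    candidates: dict mapping candidate video IDs to lists of tags
    Returns the top_n candidate IDs, best match first.
    """
    # Collect every tag the user has already encountered.
    seen_tags = set()
    for tags in watched.values():
        seen_tags.update(tags)

    # Score each candidate by how many of its tags the user has seen before.
    scored = sorted(
        candidates.items(),
        key=lambda item: len(seen_tags & set(item[1])),
        reverse=True,
    )
    return [video_id for video_id, _ in scored[:top_n]]


# Hypothetical example: a history of gaming videos pulls gaming-related
# candidates to the top of the recommendation list.
watched = {"v1": ["minecraft", "gaming"], "v2": ["gaming", "letsplay"]}
candidates = {
    "v3": ["minecraft", "letsplay"],  # overlaps on two tags
    "v4": ["cooking"],                # no overlap
    "v5": ["gaming"],                 # overlaps on one tag
}
print(recommend(watched, candidates))  # ['v3', 'v5']
```

Even this minimal version shows why recommendations feel personalized and self-reinforcing: what was watched before directly shapes what is surfaced next.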
From a media-educational perspective, the ability to critically reflect on algorithms, to act self-determinedly in relation to them, and thus to constructively shape the societies in which they are embedded has therefore become increasingly important: algorithm literacy. Sharing basic assumptions of the digital and media literacy approach as proposed by Hobbs (2021), algorithm literacy refers to “the ever-changing set of knowledge, skills, and habits of mind” (ibid., p. 4) in relation to algorithms. On this basis, an ideally "algorithm literate" person demonstrates an awareness of the operation of algorithms, has knowledge of how algorithms work, is able to critically evaluate algorithms and their results, and has the skills to actively engage with them (Dogruel et al., 2022; Swart, 2021). Becoming literate in algorithms should not be understood solely as an effect of pedagogical efforts. Based on socio-phenomenological assumptions about the importance of everyday experience for the formation of competencies (Berger & Luckmann, 2016), the acquisition of algorithm literacy can also be seen as rooted in everyday interaction with algorithm-driven platforms. Although algorithms are not directly visible in everyday use, users make sense of their output – even without knowing the mathematical-technical details or the term "algorithm" (Bucher, 2018). In this sense, several empirical studies have addressed aspects of algorithm literacy in adolescents and adults (Bell et al., 2022; Brodsky et al., 2020; Swart, 2021). However, while the role of algorithms, including recommender systems, in children's digital "ecosystems" has been discussed extensively (Cotter & Reisdorf, 2020), few empirical studies contribute to knowledge about children's algorithm literacy.
Against this background, I conducted a qualitative study of children's algorithm literacy in primary school. The study aims to address this desideratum by empirically investigating children's algorithm literacy along the following research questions:
- What are children's everyday experiences with algorithmic recommendations? (RQ1)
- In which ways do they explain algorithmic recommendations? (RQ2)
- How and to what extent do they criticize algorithmic recommendations? (RQ3)
The presentation will cover key findings of the study and implications for the teaching of algorithms in the context of media education in primary schools.
Method
To investigate children’s algorithm literacy, I chose a qualitative research approach. In total, I conducted about 26 group discussions with 120 children between the ages of 11 and 13. The groups were recruited in cooperation with different primary schools in the canton of Zurich. To create a regionally diverse sample, the schools were selected from urban, suburban, and rural districts. In the absence of empirical research specifically on children's algorithm literacy that would have allowed the formulation of falsifiable hypotheses, and lacking a suitable instrument for quantification, this study is exploratory: instead of deductively applying a predefined set of skills, algorithm literacy was analyzed from the bottom up, starting with children's everyday experiences and their life-world-situated assessments. The understanding of algorithm literacy acquisition as located in everyday media use was reflected in the design of the focus groups: conceptualizing algorithms as “experience technologies” (Cotter & Reisdorf, 2020), the discussions focused on a specific phenomenon through which algorithmic systems appear in users' daily lives: video recommendations on YouTube, a platform that continuously enjoys great popularity among the majority of children in German-speaking Switzerland (Waller et al., 2019). The discussions included authentic screenshots of recommendations as well as recommendation bars. Furthermore, in contrast to other studies (e.g., Bell et al., 2022; Dogruel et al., 2020; Gran et al., 2021), the moderators did not explicitly ask about "algorithms". Instead, the focus groups discussed everyday experiences with video recommendations, possible explanations for their genesis, and wishes for change related to them. All focus group sessions were video-recorded.
The analysis combines open coding of the video material with an ethnomethodologically oriented fine analysis of selected sequences transcribed for this purpose (Garfinkel, 1984).
Expected Outcomes
The presentation will outline the results of the study. So far, the analysis has revealed the following key experiences that children have with algorithmic recommendations on a daily basis (RQ1): First, children in all groups report both pleasant and unpleasant emotions when interacting with algorithmic recommendations. While pleasure and fun are described when recommendations match one's interests and situational motives for use, negative emotions occur when recommendations do not match expectations. Second, children report observing certain quasi-algorithmic "logics" of recommendations: orders and hierarchies in the way recommendations relate to each other on the surface of a platform. These experiences are described as platform-specific. Children's explanations of algorithmic recommendations (RQ2) focus on the activities of different actors: the appearance of a recommendation is explained in terms of (a) the children's own media use, (b) the use of parents or siblings, or (c) the actions of vaguer 'others' such as 'YouTube'. In the discussion of algorithmic recommendations, criticism also arises (RQ3): on the one hand, negative effects of age-inappropriate video recommendations on children were discussed; in addition to "better" personalization by the platform, regulation and the platform's responsibilities were also raised. On the other hand, the entanglement of one's own time and attention with the commercial functionality of the platforms also became an issue. Overall, the results point to a variety of manifestations of algorithm literacy. In the focus groups, children show awareness not only of algorithmic operations but also of the affects that recommendations might trigger. Children also demonstrate knowledge of how recommendation algorithms might work, especially of their socially intertwined character.
Furthermore, the questions raised in the discussions about the attention economy of platforms and the protection of children from harmful effects also demonstrate the ability to critically evaluate algorithms in societal contexts.
References
Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1–13. https://doi.org/10.1080/1369118X.2016.1216147
Bell, A. R., Tennfjord, M. K., Tokovska, M., & Eg, R. (2022). Exploring the role of social media literacy in adolescents’ experiences with personalization: A Norwegian qualitative study. Journal of Adolescent & Adult Literacy, n/a(n/a). https://doi.org/10.1002/jaal.1273
Berger, P. L., & Luckmann, T. (2016). Die gesellschaftliche Konstruktion der Wirklichkeit: Eine Theorie der Wissenssoziologie (M. Plessner, Trans.; 26th ed.). Fischer.
Brodsky, J., Zomberg, D., Powers, K., & Brooks, P. (2020). Assessing and fostering college students’ algorithm awareness across online contexts. Journal of Media Literacy Education, 12(3), 43–57. https://doi.org/10.23860/JMLE-2020-12-3-5
Bucher, T. (2018). If...Then: Algorithmic Power and Politics. Oxford University Press.
Cotter, K., & Reisdorf, B. C. (2020). Algorithmic knowledge gaps: A new horizon of (digital) inequality. International Journal of Communication, 14.
Dogruel, L., Facciorusso, D., & Stark, B. (2020). ‘I’m still the master of the machine.’ Internet users’ awareness of algorithmic decision-making and their perception of its effect on their autonomy. Information, Communication & Society, 25(9), 1311–1332. https://doi.org/10.1080/1369118X.2020.1863999
Dogruel, L., Masur, P., & Joeckel, S. (2022). Development and validation of an algorithm literacy scale for Internet users. Communication Methods and Measures, 16(2), 115–133. https://doi.org/10.1080/19312458.2021.1968361
Garfinkel, H. (1984). Studies in Ethnomethodology (1st ed.). Polity.
Gran, A.-B., Booth, P., & Bucher, T. (2021). To be or not to be algorithm aware: A question of a new digital divide? Information, Communication & Society, 24(12), 1779–1796. https://doi.org/10.1080/1369118X.2020.1736124
Hobbs, R. (2021). Media Literacy in Action: Questioning Media. Rowman & Littlefield.
Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the Internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157
Louridas, P. (2020). Algorithms. MIT Press.
Mascheroni, G., & Siibak, A. (2021). Datafied Childhoods: Data Practices and Imaginaries in Children’s Lives (Vol. 124). Peter Lang.
Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.
Schrage, M. (2020). Recommendation Engines. MIT Press.
Swart, J. (2021). Experiencing algorithms: How young people understand, feel about, and engage with algorithmic news selection on social media. Social Media + Society, 7(2), 20563051211008828. https://doi.org/10.1177/20563051211008828
Waller, G., Suter, L., Bernath, J., Külling, C., Willemse, I., Martel, N., & Süss, D. (2019). Ergebnisbericht zur MIKE-Studie 2019.
Willson, M. (2019). Raising the ideal child? Algorithms, quantification and prediction. Media, Culture & Society, 41(5), 620–636. https://doi.org/10.1177/0163443718798901