16 SES 12 B, Digital Literacy
Information seeking is one of the most common activities in Swedish schools (National Agency for Education, 2013), and the national curriculum requires compulsory schools to ensure that pupils “can use modern technology as a tool in the search for knowledge, communication, creativity and learning” (NAE, 2011, p. 16). Teachers need to learn more about how to teach, and also assess, the competencies related to this. Qualitative studies have contributed greatly to the field (e.g. Enochsson, 2001; 2005; Limberg, 1998; 2014), and over the past 15 years there has also been an increasing number of quantitative studies aimed at measuring different aspects of competencies related to information technology (e.g. Fraillon, Ainley & Schulz, 2015).
One criticism of previous research is the lack of instruments to measure digital skills in everyday life (Calvani, Fini & Ranieri, 2009). Another is that many studies on digital competence measure only respondents’ self-evaluated skills (Samuelsson & Olsson, 2014). One of the few answers to this criticism is a performance test developed in a Dutch context by van Deursen, van Dijk and Peters (2011). The test measured a particular form of digital competence through the level of separate Internet skills, assessed on the basis of the respondent’s ability to carry out different tasks, in this case searching for, evaluating and using information.
Researchers agree that information literacy is not only about locating information but also about evaluating it with a critical eye and using it, preferably wisely – however wisely is defined. Focus on how to use information has been especially important in educational settings: Eisenberg and Berkowitz (1992) presented the well-known ‘Big6’ model, Kuhlthau (1993) incorporated users’ feelings in her model, Bruce (1997) described the seven faces of information literacy, and Limberg (1998) pointed out how content affects process. All these ‘pioneers’ talk about meaningfulness and about how information literacy is part of lifelong learning (e.g. Limberg, 2013; 2014; Bruce, Hughes & Somerville, 2012).
Different researchers focus on different aspects and use different concepts. Many concepts are, however, similar, involving formulating needs, finding, analysing and evaluating information, and using the information (for an explanatory background, see for example Catts, 2012). The European Commission approved an updated digital competence framework in 2017, DigComp 2.1, comprising five competence areas with detailed proficiency levels for each of them (Carretero, Vuorikari & Punie, 2017). The competence area “Information and data literacy” includes all the above-mentioned aspects. However, the medium is not particularly stressed in this specific area, since the whole competence framework concerns digital environments. This study focuses on information literacy in relation to a specific digital environment – the Internet – which adds a specific dimension, since information literacy differs depending on the medium (Forsman, 2014).
The test developed by van Deursen, van Dijk and Peters (2011) covers both the information literacy aspects described in research and the related digital competence. The aim of this paper is to study how well Swedish teenagers, in their last year of compulsory school, carry out different information-searching tasks on the Internet, using van Deursen, van Dijk and Peters’ test revised for a Swedish context.
To solve the assignments, the participants had to master the medium, i.e. a digital device connected to the Internet and, mainly, a web browser. Examples of medium-related items are opening websites by entering a URL or saving content as a favourite. These items could not be controlled for directly, other than that some assignments could not be solved unless the participant mastered them; for example, it was easier to answer the final questions in an assignment if a separate window or tab was open in the web browser. Content-related items, like defining search options, could not be evaluated systematically in this study, but observations and follow-up talks with teachers gave an indication of the success rate of these items. Evaluating sources from certain aspects appeared as both multiple-choice questions and open-ended questions.

In total, the results from 125 participating pupils in their last year of Swedish compulsory school were analysed. At this age, participants are considered capable of deciding on participation in a research study themselves. Apart from written and oral information to the pupils, their parents and the schools, there were links to written information at the end of the test, together with a question asking whether they still agreed to let us use their answers for research purposes. Teachers were present during the test sessions, and they gave written consent to short follow-up interviews. Depending on the school, the test was carried out on iPads or laptops. The assignments were presented in web-based software, in which the pupils also gave their answers. The searches were performed on the Internet, without restrictions or predefined solutions. Some questions required written explanations, while others had multiple-choice answers. Pupils who normally used, for example, text-to-speech software were allowed to use that type of resource. The aim was to keep the working environment as similar to the pupils' normal setting as possible.
After some test rounds, some of the assignments had to be changed because the accessible information had moved. The questions guiding the analysis were: Are the pupils able to organise searches, describe how to access data, access data and navigate between them, organise search strategies, interpret information, store and retrieve information, argue from facts, values and perspectives, sort through a large body of information, and test the reliability of sources? All of these aspects are included in the DigComp 2.1 framework, and they are also requirements in the Swedish curriculum for this age group.
All in all, the pupils at the four schools did not solve the assignments very well. Although the final answers in the test were in focus, it was clear that not all pupils had the digital skills needed to search for them; one example of this was storing and retrieving. Sixty-three percent of the pupils found information about how large a subsidy a pupil can receive in upper secondary school if he or she attends a school at a certain, quite long, distance from home. This was the most easily solved assignment, although it included reading a long text and interpreting a table (access data, sort through a large body of information). It was more difficult to carry out several steps before information could be found, or to compare information from different sites (navigate between data, organise searches and strategies). Searching for images was one of the most difficult tasks, as was finding the original source of a news item (test the reliability of sources, interpret data). A final assignment concerned choosing a brand that cares about the environment. Only very few pupils found a solution to this; some of them did well, while others did not show that they could use argumentation skills based on facts, values and various perspectives (argue from facts, values and perspectives). The results differed between the participating classes, which points to the importance of covering all aspects of information literacy when teaching.
References
Bruce, C. (1997). The seven faces of information literacy. Adelaide: Auslib Press.
Bruce, C., Hughes, H., & Somerville, M. M. (2012). Supporting informed learners in the twenty-first century. Library Trends, 60(3), 522-545. doi:10.1353/lib.2012.0009
Calvani, A., Fini, A., & Ranieri, M. (2009). Assessing digital competence in secondary education. In M. Leaning (Ed.), Issues in information and media literacy. Santa Rosa, CA: Informing Science Press.
Carretero, S., Vuorikari, R., & Punie, Y. (2017). The digital competence framework for citizens: With eight proficiency levels and examples of use. Luxembourg: Publications Office of the European Union.
Catts, R. (2012). Indicators of adult information literacy. Journal of Information Literacy, 6(2). doi:10.11645/6.2.1746
van Deursen, A. J. A. M., van Dijk, J. A. G. M., & Peters, O. (2011). Rethinking Internet skills. Poetics, 39(2), 125-144. doi:10.1016/j.poetic.2011.02.001
Eisenberg, M., & Berkowitz, R. (1992). Information problem-solving: The Big6 skills approach. School Library Media Activities Monthly, 8(5), 27-29, 37, 42.
Enochsson, A. (2005). The development of children's web searching skills - a non-linear model. Information Research, 11(1).
Forsman, M. (2014). Medie- och informationskunnighet i Sverige - En kartläggning av aktörer [Media and information literacy in Sweden - a mapping of actors]. Retrieved from http://www.statensmedierad.se/download/18.6e2654261506810579b2ec6a/1452243731603/MIK-kartlaggning-Sverige-2014.pdf
Fraillon, J., Ainley, J., & Schulz, W. (2015). Preparing for life in a digital age. Cham: Springer International Publishing.
Kuhlthau, C. (1993). Seeking meaning. Norwood, NJ: Ablex.
Limberg, L. (1998). Att söka information för att lära [Seeking information in order to learn]. Gothenburg: Gothenburg University.
Limberg, L. (2013). Informationskompetens i undervisningspraktiker [Information literacy in teaching practices]. In U. Carlsson (Ed.), Medie- och informationskunnighet i nätverkssamhället: Skolan och demokratin (pp. 67-76). Gothenburg: Nordicom.
Limberg, L. (2014). Informationsaktiviteter och lärande i skola och bibliotek [Information activities and learning in school and library]. In J. Rivano Eckerdal & O. Sundin (Eds.), Medie- och informationskunnighet (pp. 27-38). Stockholm: Svensk biblioteksförening.
National Agency for Education (NAE) (2011). Curriculum for the compulsory school, preschool class and the recreation centre 2011. Stockholm: National Agency for Education.
National Agency for Education (NAE) (2013).
Oxstrand, B. (2013). Från Media Literacy till Mediekunnighet [From media literacy to media proficiency]. Doctoral dissertation, Department of Journalism, Media and Communication. Gothenburg: Gothenburg University.
Samuelsson, U., & Olsson, T. (2014). Digital inequality in primary and secondary education. In M. Stocchetti (Ed.), Media and education in the digital age. Bern: Peter Lang.
Vuorikari, R., Punie, Y., Carretero, S., & Van den Brande, L. (2016). DigComp 2.0: The digital competence framework for citizens (JRC Science for Policy Report). Brussels: EU.