Session Information
06 SES 13 A, Open Learning in Higher Education
Paper Session
Contribution
In today's society, which is marked by information overload (Batista & Marques, 2017; Bawden & Robinson, 2020), one of the key competencies is the ability to navigate vast amounts of information. At the European policy level, it is emphasized that “education should more actively support learners in developing the skills to critically assess, filter, and evaluate information, particularly in identifying disinformation and managing information overload” (European Commission, 2020).
Although diverse terms and concepts attempt to unify the set of skills related to information management, this skill set is most commonly referred to as information literacy (IL). It was first defined as a set of skills for utilizing information sources and tools in "shaping" information solutions to problems (Zurkowski, 1974, as cited in Landoy et al., 2020). Since then, a number of authors have studied the concept further. A better understanding of what comprises IL is offered by process models (e.g., Bruce, 1997; Eisenberg & Berkowitz, 2000; Herring, 2011; Kuhlthau et al., 2007; Pappas & Tepe, 2002), which recognize the following processes involved in solving information problems: defining and understanding the problem, conducting initial research on a topic, selecting information, and synthesizing information. Although information literacy is not limited to a digital context, in practice, information searching predominantly relies on the Internet. In this context, it is interesting to explore how information is selected from different online sources and integrated into an answer to the initial problem.
Several studies have explored users' approaches to information sources in the context of solving different information problems. One such study (Walraven et al., 2009) aimed to explore students' evaluation strategies for selecting sources for school tasks. The results showed that students spent most of their time searching and scanning, and only a small amount of time processing and organizing information. They rarely evaluated results, information, and sources, although a post-task group interview showed that they were aware of the importance of source reliability.
The qualitative study by Kiili et al. (2008) provided more in-depth insight, as it used a think-aloud protocol and a more comprehensive essay-writing task. Five evaluation profiles emerged from the data: 1) versatile evaluators, 2) relevance-oriented evaluators, 3) limited evaluators, 4) disoriented readers, and 5) uncritical readers. Similar results were found by List et al. (2016), who identified 1) comprehensive source users (similar to versatile evaluators), 2) critical-analytical source users (similar to relevance-oriented evaluators), 3) accessibility-driven users (focused on efficiently finding an answer and selecting sources), and 4) non-critical source users. Additionally, as the study also examined the number of sources used, a distinction emerged between disengaged source users (who used the fewest sources) and engaged source users (who used a relatively high number of sources).
The study by List et al. (2016) was significant in that it considered different types of sources and their reliability (e.g., Wikipedia articles, which provide highly accessible information of moderate reliability, compared to journal articles). However, it relied on a limited, pre-defined library of sources selected by the researchers. It would be interesting to explore further which types of sources are primarily used when practically unlimited options are available. It would also be valuable to investigate the patterns that emerge when the task involves debating a controversial topic that is widely discussed on the Internet, where both information and disinformation abound. In line with this, the aim of this study was to explore what types of information sources (in terms of accessibility and reliability) are used to argue a controversial topic, and to identify potential patterns in how information from these sources is utilized to support arguments.
Method
The study used an argumentative open-ended task related to the topic of child vaccination practices. Respondents were asked to take a stance on this topic and provide three key arguments. The topic was chosen because of the ongoing global debate over the importance versus the potential harm of vaccines. Several common vaccination arguments are present in public discussion, such as medical evidence supporting the idea that vaccines are essential for protecting against serious and potentially fatal diseases (on the pro-vaccine side). On the other hand, there are common anti-vaccination claims, such as beliefs that vaccines contain toxic substances, cause autism, are unnecessary because the diseases they protect against are harmless, and that natural immunity is sufficient for the body to fight disease. A key part of the task was for participants to provide links to information supporting their arguments. Even if they already had a formed opinion and a repertoire of arguments, they were encouraged to use the Internet to support their position and provide relevant links. If they did not have a clear stance, they were encouraged to research the topic, provide arguments and a conclusion, and include supporting links. The sample consisted of 110 respondents, predominantly female (Nf = 97, Nm = 13). The participants were third- and fourth-year secondary school students (ages 17-19) and university students (ages 19+). Data collection was part of a broader study focusing on the operationalization of information literacy through the development of an instrument consisting of a series of graduated closed-ended and open-ended questions and tasks measuring different components of information literacy. The intention behind an argumentative task such as the one described here was to mimic real-life information problems.
To gather the data, an online questionnaire was used, with respondents contacted through their schools, faculties, and teachers/professors. The data were analyzed using MAXQDA software and a general inductive approach (Thomas, 2006). In analyzing the referred sources, the unit of analysis was a single link, while for the analysis of arguments and conclusions, the unit of analysis was defined as meaningful parts of sentences conveying specific claims.
Expected Outcomes
In the analysis, four types of Internet source users were identified. The first type relied predominantly on official health institutions and provided elaborated responses with multiple pro-vaccine arguments. The second type relied on thematic portals or online media, providing slightly less diverse arguments. The third type gave responses without citing sources, often relying on prior knowledge or anecdotes, with more anti-vaccine arguments. The fourth type, less common, referred to research articles or media citing research; these users took a pro-vaccine stance and predominantly listed more than one source, which was uncommon among the other types. Respondents in this study favoured easily understandable, comprehensive sources (e.g., official portals, online media, thematic portals), apparently without cross-checking, as most provided only a single source; this resembles the accessibility-driven users of List et al. (2016). Research-based content was used less frequently, possibly due to its complex language and level of detail. However, the profile of these users suggests they are comprehensive users, relying on highly relevant sources and engaging with multiple sources. Nevertheless, caution should be exercised before concluding that decisions about which sources to use are based solely on the content they offer, as other studies have shown that users also rely heavily on the rankings provided by search engines (Haas & Unkel, 2017; Kammerer & Gerjets, 2014; Pan et al., 2007). Given these findings, some implications for educational practice can be drawn. Information literacy is a cross-curricular competency, so it should be addressed through varied learning content, with an emphasis on using multiple sources, cross-checking information, understanding differences between sources, and recognizing the writer's intentions.
In the context of managing controversial topics, it is important to address major concerns rather than avoid them, with educators facilitating discussions on the genesis and dissemination of (mis)information.
References
Batista, J., & Marques, R. P. (2017). An overview on information and communication overload. In R. P. Marques & J. Batista (Eds.), Information and communication overload in the digital age (pp. 1–19). IGI Global. https://doi.org/10.4018/978-1-5225-2061-0.ch001
Bawden, D., & Robinson, L. (2020). Information overload: An introduction. Oxford Research Encyclopedia of Politics. https://doi.org/10.1093/acrefore/9780190228637.013.1360
Bruce, C. (1997). The seven faces of information literacy. Auslib Press.
Eisenberg, M. B., & Berkowitz, R. E. (2000). Teaching information & technology skills: The Big 6 in secondary schools. Linworth Publishing.
European Commission. (2020). Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions. Digital Education Action Plan 2021-2027. Resetting education and training for the digital age. Publications Office of the European Union.
Haas, A., & Unkel, J. (2017). Ranking versus reputation: Perception and effects of search result credibility. Behaviour & Information Technology, 36(12), 1285–1298. https://doi.org/10.1080/0144929X.2017.1381166
Herring, J. E. (2011). Improving students' web use and information literacy: A guide for teachers and teacher librarians. Facet Publishing. https://doi.org/10.29085/9781856048811
Kammerer, Y., & Gerjets, P. (2014). The role of search result position and source trustworthiness in the selection of web search results when using a list or a grid interface. International Journal of Human–Computer Interaction, 30(3), 177–191. https://doi.org/10.1080/10447318.2013.846790
Kiili, C., Laurinen, L., & Marttunen, M. (2008). Students evaluating internet sources: From versatile evaluators to uncritical readers. Journal of Educational Computing Research, 39(1), 75–95. https://doi.org/10.2190/EC.39.1.e
Kuhlthau, C. C., Maniotes, L. K., & Caspari, A. K. (2007). Guided inquiry: Learning in the 21st century school. Libraries Unlimited. https://doi.org/10.5040/9798400660603
Landoy, A., Popa, D., & Repanovici, A. (2020). Collaboration in designing a pedagogical approach in information literacy. Springer Cham. https://doi.org/10.1007/978-3-030-34258-6
List, A., Grossnickle, E. M., & Alexander, P. A. (2016). Profiling students' multiple source use by question type. Reading Psychology, 37(5), 753–797. https://doi.org/10.1080/02702711.2015.1111962
Pan, B., Hembrooke, H., Joachims, T., Lorigo, L., Gay, G., & Granka, L. (2007). In Google we trust: Users' decisions on rank, position, and relevance. Journal of Computer-Mediated Communication, 12(3), 801–823. https://doi.org/10.1111/j.1083-6101.2007.00351.x
Pappas, M. L., & Tepe, A. E. (2002). Pathways to knowledge and inquiry learning. Libraries Unlimited. https://doi.org/10.5040/9798400695575
Thomas, D. R. (2006). A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation, 27(2), 237–246. https://doi.org/10.1177/1098214005283748
Walraven, A., Brand-Gruwel, S., & Boshuizen, E. (2009). How students evaluate sources and information when searching the World Wide Web for information. Computers and Education, 52(1), 234–246. https://doi.org/10.1016/j.compedu.2008.08.003