Session Information
16 SES 01 A, Information Literacy
Paper Session
Contribution
A pivotal part of being an informed citizen today is being able to read digital news in constructive ways. False and biased news is created and spread across the globe at great speed (Del Vicario et al., 2016; Lazer et al., 2018; Vosoughi, Roy, & Aral, 2018), and propaganda and disinformation have been noted as a major challenge to democracy (Wardle & Derakhshan, 2017). Technology for automated fact-checking has been found to have important limitations, which means that citizens themselves need to be able to discern credible news from biased and false information (Babakar & Moy, 2016). However, determining the credibility of online information has proven difficult. Even history professors and students at elite universities may struggle to separate trustworthy digital information from false and biased information, and young people born in a digital era are not very good at evaluating online information (McGrew, Breakstone, Ortega, Smith, & Wineburg, 2018; Wineburg & McGrew, 2017; Nygren & Guath, 2019). Given that news consumption is a pivotal part of democracy in an age of digital disinformation (Carlsson, 2018; Kahne & Bowyer, 2017), it is central to understand how people with different backgrounds, education and attitudes are able to determine the credibility of different types of digital news. To better understand how people’s abilities to determine the credibility of news are associated with education, mind-set and background variables, we study the following:
To what extent ordinary citizens are able to differentiate credible digital news from biased and false information.
How differences in gender, age, education, and personal beliefs and attitudes about digital information relate to people’s abilities to determine the credibility of digital news.
Knowledge, skills and attitudes needed to navigate digital news have been described as an important part of what UNESCO (2011) calls media and information literacy, and the EU describes media literacy as a key competence for democratic participation and lifelong learning. In theory, media literacy has been described as the ability to access, analyze, evaluate and create information in digital environments (Livingstone, 2004). Citizens need to be able to consider who is creating the news, the purpose of the information, and manipulative strategies, and to navigate information to underpin informed arguments regarding civic issues in society. Civic online reasoning is defined as "the ability to effectively search for, evaluate, and verify social and political information online" (McGrew et al., 2018, p. 1); a pivotal skill in a digital world where digital news media are used to spread biased information, rumours and lies. Even if "the call for more news literacy programs has been deafening recently" (Wardle & Derakhshan, 2017, p. 68), we know little about how skilled different people in Sweden and across Europe actually are at evaluating digital news, and how this relates to, for instance, education and self-reported media literacy skills.
We will present findings from an online survey and performance test designed to investigate the ability to determine the credibility of false, biased and credible digital news among 1022 Swedish adults, in relation to their background, education and self-reported habits and abilities. We find that most respondents struggle to separate credible information from biased and false information. Skills of civic online reasoning are associated with higher education (especially in the humanities), age, and valuing access to credible news. We find that civic online reasoning is complex and may relate to disciplinary literacy, a self-reflective attitude and a mind-set of science curiosity, and, in contrast, may be hampered by overconfidence. Our study is a call for more detailed research on how people (mis)understand digital news and how education may support people with different backgrounds and mind-sets.
Method
A representative sample (N = 1222) of the Swedish adult population, aged 19-65, completed an online survey administered in SurveyMonkey. Three hundred and fifty-nine participants were excluded as outliers, based on the criterion that time on task should be more than 4 minutes and less than an hour (M = 13.24 min, SD = 8.27 min). The age categories were 19-29 years (N = 206), 30-44 years (N = 212), 45-60 years (N = 206) and >60 years (N = 127), with 370 women and 381 men (112 participants did not wish to reveal their gender). Participants had diverse levels of education, from 9 years of schooling to a PhD. Participation was voluntary, and all participants were informed about the purpose of the study and their right to withdraw from the study at any point. No traceable data were collected, in line with ethical guidelines.
The test was inspired by previous research on civic online reasoning (McGrew et al., 2017, 2018). The survey consisted of 20 questions. The participants were also asked to self-rate their ability to fact-check online information, rate the importance of consuming credible news, rate how reliable information on the internet is, and report how much they have worked with fact-checking. The test items covered native advertisements, unknown commenters, and scientific evidence. These categories reflect three basic skills for assessing the credibility of information on the internet: i) sourcing – identifying where the news comes from; ii) corroboration – checking what other sources say about the news; and iii) evidence – evaluating the presented evidence. To assess these skills, three different tasks with multiple test items were used: i) detecting sponsored material in newspapers, ii) comparing articles with credible and biased information, and iii) scrutinising credible and biased comments as well as authentic and manipulated images. Each category of civic online reasoning was tested in at least two test items.
For the self-rated variables – internet-info reliability, fact-checking ability, credibility importance, and sourcing at work – we conducted ANOVAs (analyses of variance) with education, orientation, age and gender as independent variables. For the objective abilities, we conducted a number of regressions. We performed Poisson regressions with the number of correct answers as the dependent variable and education, work orientation, sourcing at work, credibility importance, internet-info reliability, search ability, language spoken at home, party score, and gender as predictor variables. The results describe the expected number of correct items.
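As a rough illustration of this kind of analysis (a sketch under assumptions, not the authors' actual code), a Poisson regression of the number of correct items on the predictor variables could be fitted in Python with statsmodels; the data file and column names below are hypothetical placeholders.

```python
# Minimal sketch: Poisson regression of the total number of correct test items
# on background and attitude variables, mirroring the analysis described above.
# All file and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical data file with one row per respondent.
df = pd.read_csv("civic_online_reasoning_survey.csv")

model = smf.glm(
    "n_correct ~ education + work_orientation + sourcing_at_work"
    " + credibility_importance + internet_info_reliability"
    " + search_ability + home_language + party_score + gender",
    data=df,
    family=sm.families.Poisson(),
).fit()

print(model.summary())

# Exponentiated coefficients are rate ratios: the multiplicative change in the
# expected number of correct items for a one-unit change in each predictor.
print(np.exp(model.params))
```

In such a model, a rate ratio above 1 for, say, credibility importance would mean that respondents who value credible news more are expected to answer more items correctly, holding the other predictors constant.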
Expected Outcomes
We will present results indicating that highly valuing access to credible information and having a higher education seem to support civic online reasoning, while having worked with fact-checking (as reported by participants) may not provide the knowledge, skills and attitudes necessary to determine news credibility. In addition, we find an advantage of having an educational orientation towards the humanities or social sciences, being older, and having more conservative party sympathies. Higher education was associated with better performance on half of the items and on the total number of correct answers. The fact that studies in the humanities in particular were associated with good overall performance on the test highlights how civic online reasoning may be a disciplinary literacy closely connected to knowledge, skills and attitudes developed in the humanities (Moore, 2011; Shanahan & Shanahan, 2012). The concept of credibility importance seems to capture a mind-set that is associated with high performance on the overall score, on sourcing an online newspaper, and on debunking a manipulated image presented as evidence. One explanation may be that this is a mind-set of openness towards the knowledge of others and an interest in news. This may indicate that people who actively look for news to become more informed, so-called news-seekers (Strömbäck et al., 2013), are better at navigating digital information. Another tentative explanation is that valuing credibility in news is linked to science curiosity (Kahan et al., 2017), which is characterized by a propensity to seek out surprising information even when it contradicts people’s political views. We also note that there were no results pointing to an advantage for the youngest age group, the "digital natives". Our results speak against the assumption that growing up in a digital world automatically leads to a better ability to navigate and interact online.
References
Babakar, M., & Moy, W. (2016). The State of Automated Factchecking: How to Make Factchecking Dramatically More Effective with Technology We Have Now. Full Fact, 28.
Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., . . . Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554-559. doi:10.1073/pnas.1517441113
Kahne, J., & Bowyer, B. (2017). Educating for democracy in a partisan age: Confronting the challenges of motivated reasoning and misinformation. American Educational Research Journal, 54(1), 3-34.
Lazer, D. M., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., . . . Rothschild, D. (2018). The science of fake news. Science, 359(6380), 1094-1096.
McGrew, S., Breakstone, J., Ortega, T., Smith, M., & Wineburg, S. (2018). Can students evaluate online sources? Learning from assessments of civic online reasoning. Theory & Research in Social Education, 1-29.
McGrew, S., Ortega, T., Breakstone, J., & Wineburg, S. (2017). The challenge that's bigger than fake news: Civic reasoning in a social media environment. American Educator, 41(3), 4.
Nygren, T., & Guath, M. (2018). Swedish teenagers’ problems and abilities to determine digital news credibility. Nordicom Review (forthcoming).
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151.
Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an Interdisciplinary Framework for Research and Policymaking.
Wineburg, S., & McGrew, S. (2017). Lateral Reading: Reading Less and Learning More When Evaluating Digital Information. Retrieved from https://papers.ssrn.com/abstract=3048994