ERG SES E 10, Evaluation in Education
As stated in the National Research Council report (NRC, 2007), the main aspects of science proficiency involve knowing, using, and interpreting scientific explanations; generating and evaluating scientific evidence and explanations; understanding the development and nature of scientific knowledge; and participating productively in scientific practices and discourse. Indeed, proficiency in science entails understanding the practices underlying the development of scientific knowledge, as well as the ability to engage in scientific practices, which involve the processes of how scientists work and how scientific knowledge is produced and justified (Lederman, Lederman, Bartos, Bartels, Meyer, & Schwartz, 2014; Strimaitis, Schellinger, Jones, Grooms, & Sampson, 2014). Accordingly, scientific practices include two main aspects. The discovery aspect requires making observations, formulating hypotheses, obtaining, analyzing, interpreting, and presenting data, as well as engaging in inductive and deductive reasoning. The justification aspect, on the other hand, requires argumentation about how the data relate to different hypotheses or theories, or why certain evidence should be favored over other evidence (Matthews, 2015; Okasha, 2002). According to Strimaitis et al. (2014), these scientific practices are necessary for individuals to critically evaluate the scientific claims appearing in the popular media, an ability expected of scientifically literate people, who critically evaluate and discuss reports about science in the media. In a study conducted by Leung (2013), the quality of non-science major college students' evaluation of scientific reports in the media was examined in relation to their understanding of the nature of science, a core concept of scientific literacy.
As the sample, the researcher intentionally chose non-science major college students because they represent the majority of society. Results of the study indicated very weak or no relations between the quality of evaluation of scientific news and understanding of the nature of science. Based on these results, it was suggested that the findings might have been influenced by the students' level of media literacy, since media literacy plays a role in the ability to access, analyze, evaluate, and create written or unwritten information through several channels such as the Internet, television, and newspapers (Kellner & Share, 2005; Thoman & Jolls, 2003). Accordingly, the current study addresses the following research question: Is there a relationship between non-science major students' knowledge of the scientific practices needed to critically evaluate scientific reports in popular media and their level of media literacy?
Sample
A total of 266 undergraduate students (185 female and 78 male) from various departments, including philosophy, history, Turkish language and literature, sociology, geography, mathematics, and economics, participated in the study. They ranged in age from 21 to 27 years (M = 23.20, SD = 1.12).
Instruments
Evaluating Scientific Claims: a 12-item two-tier multiple-choice diagnostic instrument developed by Strimaitis et al. (2014). The items concern two articles from the popular media, one about the dangers of high heels and the other about dangers in energy drinks. The items target specific scientific practices necessary to evaluate scientific claims critically: the first tier of each item assesses students' ability to evaluate the claim, and the second tier assesses the logic behind the response to the first tier. Students were assigned one point only if their responses to both tiers were correct.
Media Literacy Scale: a 17-item instrument on a 5-point Likert scale (1 = never to 5 = always) developed by Karaman and Karataş (2009). The scale consists of three dimensions: being knowledgeable, with 7 items (e.g., "I examine and criticize the messages given in mass media"); analyzing and reacting, with 6 items (e.g., "I give positive or negative reactions to the messages in the mass media"); and judging/being aware of implicit messages, with 4 items (e.g., "I realize hidden advertisements in mass media"). In the present study, the Cronbach's alpha coefficient for the whole scale was .82, and reliability coefficients ranged from .61 to .70 for the subscales.
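The two-tier scoring rule and the reported reliability coefficient can be illustrated with a short sketch. The answer key, student responses, and item scores below are invented for illustration only; they are not the actual instrument data, and the alpha formula is the standard sample-variance form.

```python
from statistics import variance

def score_two_tier(responses, key):
    """Award 1 point per item only when BOTH the tier-1 answer and the
    tier-2 justification are correct (the scoring rule described for the
    Strimaitis et al., 2014 instrument)."""
    return sum(1 for resp, correct in zip(responses, key) if resp == correct)

def cronbach_alpha(item_scores):
    """Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / variance(totals)).
    item_scores is a list of per-item score lists (items x persons)."""
    k = len(item_scores)
    persons = list(zip(*item_scores))        # transpose to persons x items
    totals = [sum(p) for p in persons]       # each person's total score
    return (k / (k - 1)) * (1 - sum(variance(i) for i in item_scores) / variance(totals))

# Hypothetical answer key: (tier-1 answer, tier-2 justification) per item
key = [("A", "2"), ("C", "1"), ("B", "4")]
student = [("A", "2"), ("C", "3"), ("B", "4")]   # item 2: tier 1 right, tier 2 wrong
print(score_two_tier(student, key))               # -> 2 (no credit for item 2)

# Hypothetical Likert responses: 3 items answered by 4 respondents
items = [[1, 2, 3, 4], [2, 2, 3, 3], [1, 3, 3, 4]]
print(round(cronbach_alpha(items), 2))            # -> 0.9
```

The all-or-nothing rule is what distinguishes two-tier scoring from counting each tier separately: a correct answer with faulty reasoning earns nothing.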
Frequency distribution tables were created to determine whether undergraduate students have adequate knowledge of scientific practices to critically evaluate scientific reports in popular media. In general, the results revealed that the participants' knowledge was insufficient. For example, the majority of participants (91%) could not correctly identify the controlled, randomized experimental study design necessary for establishing a cause-and-effect relationship; the participants appeared to think that correlation implies causation. Indeed, 98.9% of the participants responded to both tiers of the related item incorrectly. In addition, the participants appeared to have difficulty recognizing the importance of peer review for scientific claims. For instance, only 35.3% of them correctly identified that the publication of the article about the dangers of high heels in the January 2012 issue of the Journal of Applied Physiology was the most essential consideration for evaluating its claims. Moreover, regarding measurement error, only about one quarter of the students (26.3%) correctly reasoned that the 95% confidence interval used in a separate study was suitable because human error can never be completely eliminated. Results also showed that the majority of participants were not aware that hypotheses are never proven but only supported, and that scientific claims have to be based on evidence. For instance, only 14.7% of participants correctly identified that the claim made in the article about dangers in energy drinks was not validated because the long-term side effects on human health are not yet known. To examine the role of media literacy in students' ability to evaluate scientific reports in popular media, a multiple regression analysis was conducted, with the dimensions of media literacy (i.e., being knowledgeable; analyzing and reacting; and judging/being aware of implicit messages) as independent variables.
Results showed that the model with these three independent variables was not significant: R = .13, F(3, 244) = 1.30, p = .274.
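The reported regression statistics (multiple R and the omnibus F test) can be computed in principle as sketched below. The tiny dataset of "students" is invented purely for illustration, and the solver is a minimal normal-equations sketch, not the software actually used in the study.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]      # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))  # pivot row
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                        # back substitution
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def multiple_r_and_f(X, y):
    """OLS of y on the columns of X (plus intercept) via the normal
    equations; returns (multiple R, F statistic for the whole model)."""
    n = len(y)
    Xd = [[1.0] + row for row in X]                       # design matrix with intercept
    p = len(Xd[0])
    XtX = [[sum(r[a] * r[b] for r in Xd) for b in range(p)] for a in range(p)]
    Xty = [sum(Xd[i][a] * y[i] for i in range(n)) for a in range(p)]
    beta = solve(XtX, Xty)
    yhat = [sum(b * v for b, v in zip(beta, row)) for row in Xd]
    ybar = sum(y) / n
    ss_tot = sum((v - ybar) ** 2 for v in y)
    ss_res = sum((v - h) ** 2 for v, h in zip(y, yhat))
    r2 = (ss_tot - ss_res) / ss_tot
    k = p - 1                                             # number of predictors (3 here)
    f = (r2 / k) / ((1 - r2) / (n - k - 1))               # F(k, n - k - 1)
    return r2 ** 0.5, f

# Invented scores on the three media-literacy dimensions (X) and on the
# evaluation instrument (y) for six hypothetical students:
X = [[1, 2, 0], [2, 1, 1], [3, 3, 2], [0, 1, 4], [2, 4, 1], [1, 0, 3]]
y = [3, 5, 8, 2, 6, 3]
R, F = multiple_r_and_f(X, y)
```

A nonsignificant F, as in the study (p = .274), means the three dimensions together explain no more variance in evaluation scores than chance would.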
References
Karaman, M. K., & Karataş, A. (2009). Media literacy levels of the candidate teachers. Elementary Education Online, 8(3), 798-808.
Kellner, D., & Share, J. (2005). Toward critical media literacy: Core concepts, debates, organizations, and policy. Discourse: Studies in the Cultural Politics of Education, 26(3), 369-386.
Lederman, J. S., Lederman, N. G., Bartos, S. A., Bartels, S. L., Meyer, A. A., & Schwartz, R. S. (2014). Meaningful assessment of learners' understandings about scientific inquiry—The views about scientific inquiry (VASI) questionnaire. Journal of Research in Science Teaching, 51, 65-83.
Leung, S. J. (2013). Understanding of nature of science and evaluation of science in the media among non-science majors (Unpublished doctoral dissertation). University of Hong Kong, Pokfulam, Hong Kong SAR. Retrieved from http://dx.doi.org/10.5353/th_b5016262
Matthews, M. R. (2015). Science teaching: The contribution of history and philosophy of science (20th anniversary revised and expanded ed.). New York: Routledge.
National Research Council. (2007). Taking science to school: Learning and teaching science in grades K-8. Committee on Science Learning, Kindergarten Through Eighth Grade. R. A. Duschl, H. A. Schweingruber, & A. W. Shouse (Eds.). Board on Science Education, Center for Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
Okasha, S. (2002). Philosophy of science: A very short introduction. New York: Oxford University Press.
Strimaitis, A. M., Schellinger, J., Jones, A., Grooms, J., & Sampson, V. (2014). Development of an instrument to assess student knowledge necessary to critically evaluate scientific claims in the popular media. Journal of College Science Teaching, 43(5), 55-68.
Thoman, E., & Jolls, T. (2003). Literacy for the 21st century: An overview and orientation guide to media literacy education. Center for Media Literacy.