Session Information
12 SES 16 A JS, Open Epistemologies. Open Science, Open Truth, Open Data and the Age of Uncertainty
Joint Session with NW 06 and NW 12. Full details in NW 12, 12 SES 16 JS
Contribution
The so-called "replication crisis" in (social) psychology roughly a decade ago showed how uncertain scientific findings can be. In many cases, seemingly undisputed effects that had been published in high-ranking journals after peer review and taught in university courses could not be replicated (Open Science Collaboration, 2015). This phenomenon is not limited to psychology: it echoes the dictum ‘Why Most Published Research Findings Are False’ (Ioannidis, 2005) and can likewise be expected in educational research (Makel et al., 2021). This raises the question of how scientific knowledge can be improved and made more reliable. There are indications that Open Science, and more specifically its components Open Materials and Open Data, can make a substantial contribution (e.g. Krammer & Svecnik, 2021). Open data in particular offers an opportunity to generate stable findings in educational research, but it also raises a number of related questions. For example, the sequence of theory, hypotheses, data collection, analysis, and conclusion required as good practice in the classical NHST paradigm (Neyman & Pearson, 1928) is disrupted when the data are already available. This threatens the validity of statistical hypothesis tests on the one hand and encourages HARKing (hypothesizing after the results are known; Kerr, 1998) on the other. Both endanger the value and validity of scientific findings. Furthermore, the re-use of data raises, among other things, questions about the comparability and consistency of the constructs measured. These and other questions about gaining knowledge through empirical research are discussed in this contribution.
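To make the HARKing concern concrete, the following minimal simulation (an illustrative sketch, not part of the contribution itself; all variable names and parameter values are hypothetical) shows why screening an already-available dataset for promising patterns and then testing the best one inflates the false-positive rate far beyond the nominal alpha of a single pre-specified hypothesis test.

```python
# Illustrative sketch: why NHST loses its nominal error rate when
# hypotheses are chosen after inspecting already-available (open) data.
# All names and parameter values here are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)
n_simulations = 2000   # simulated "open datasets"
n_participants = 100   # cases per dataset
n_candidates = 20      # candidate predictors screened per dataset
alpha = 0.05           # nominal significance level of a single test

false_positive_runs = 0
for _ in range(n_simulations):
    # Data generated under the null: the outcome is unrelated to every predictor.
    outcome = rng.normal(size=n_participants)
    predictors = rng.normal(size=(n_participants, n_candidates))
    # HARKing-style analysis: correlate every candidate with the outcome
    # and keep only the smallest p-value, as if it had been pre-specified.
    p_values = [stats.pearsonr(predictors[:, j], outcome)[1]
                for j in range(n_candidates)]
    if min(p_values) < alpha:
        false_positive_runs += 1

print(f"Nominal alpha per test:         {alpha:.2f}")
print(f"Rate of at least one 'finding': {false_positive_runs / n_simulations:.2f}")
# With 20 independent tests under the null, roughly 1 - (1 - 0.05)**20 ≈ 0.64
# of the datasets yield a spurious "significant" effect.
```

Under these assumed settings, about two thirds of purely random datasets produce a nominally significant result, which is why pre-specifying hypotheses before analysing pre-existing data matters for the validity of the test.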
References
Ioannidis, J. P. A. (2005). Why most published research findings are false. PLoS Medicine, 2(8), e124.
Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2(3), 196–217.
Krammer, G., & Svecnik, E. (2021). Open Science als Beitrag zur Qualität in der Bildungsforschung. Zeitschrift für Bildungsforschung, 10(3), 263–278. https://doi.org/10.1007/s35834-020-00286-z
Makel, M. C., Hodges, J., Cook, B. G., & Plucker, J. A. (2021). Both questionable and open research practices are prevalent in education research. Educational Researcher, 50(8), 493–504. https://doi.org/10.3102/0013189X211001356
Neyman, J., & Pearson, E. S. (1928). On the use and interpretation of certain test criteria for purposes of statistical inference: Part I. Biometrika, 20A(1/2), 175–240. https://doi.org/10.2307/2331945
Open Science Collaboration (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716