30 SES 06 C JS, Challenges and Risks in Open Access, Open Educational Resources and Open Learning
Joint Paper Session NW 02, NW 06, NW 12, NW 30
Over the last two decades, educational research has expanded tremendously, especially with regard to school quality. This trend is driven by funding organisations at both international and national levels. Producing knowledge about the quality of schools and instruction goes hand in hand with the development of research instruments to measure and compare the quality of schools and of educational systems. Various studies focus on a large variety of school-related outcomes, processes and conditions of schooling, as well as on individual and institutional background information. These questionnaire instruments are collected and documented in the “Database for Quality of Schools (DaQS)” and made available for reuse free of charge. The service is part of the Research Data Centre (RDC) Education hosted by the DIPF | Leibniz Institute for Research and Information in Education. It offers central access to questionnaire instruments via search options and via a browsing option based on a framework of theoretical constructs for the assessment of schooling and instruction.
Every questionnaire scale in the database is linked to up to three theoretical constructs of schooling and instruction. The resulting construct scheme is the theoretical backbone of the database. It draws on several meta-studies that identified well-established constructs of educational research (see Scheerens & Bosker 1997, Ditton 2000, Seidel 2008, Wang et al. 1993). The identified constructs were arranged in an input-process-output model with four major categories on the first level. In a next step, a second level of constructs was added so that scales could be allocated to constructs. From this starting point, the construct scheme was extended inductively: once a critical number of allocated scales had been reached, the constructs were reviewed and split to extend the framework. Finally, a third level was added to the scheme.
Initially, the scheme was developed and validated against a pool of German studies such as “Deutsch Englisch Schülerleistungen International (DESI)” (Klieme et al. 2008) or the “Studie zur Entwicklung von Ganztagsschulen (StEG)” (Fischer et al. 2011). By including international large-scale assessments such as the “Programme for International Student Assessment (PISA) 2015” (Kuger, Jude & Klieme 2016) or the “Trends in International Mathematics and Science Study (TIMSS) 2007” (Bos et al. 2009), the scheme was opened up to an international dimension. Using this systematic allocation, we were able to include every scale taken from these international studies in our theoretical framework. Even historical studies from 1978, such as “Wissenschaftliche Begleitung von Gesamtschulen” (Haenisch et al. 1979), fit well into the scheme.
The database currently comprises the questionnaire instruments of 50 studies and a total of approx. 6,500 scales allocated to the theoretical constructs. Every scale is allocated to a construct on the second or, if available, the third level. For example, the scale “loss on individual level – truancy” from PISA 2015 was allocated on the third level to the construct “deviant behaviour”, which is linked on the second level to the construct “school climate”, which in turn belongs to the major construct “processes at the school level”. Four major constructs are located on the first level. For the major construct “individual and institutional background” we identified seven constructs on the second level, such as “socio-demographic background” or “professional background of the staff”, and another 31 constructs on the third level that make the allocation more precise. The second major construct, “processes at the school level”, is divided into six constructs, for example “school climate” or “co-operation”, with another ten constructs on the third level. The major construct “processes of teaching and learning” consists of five constructs, such as “instruction” and “classroom management”, with twelve constructs on the third level. For the construct “learning outcomes” we identified three constructs on the second level and four on the third. The database takes only non-cognitive and metacognitive outcomes into account; cognitive outcomes are assessed by test instruments, which are not part of the questionnaire material. This lack of cognitive learning outcomes introduces a bias. Nevertheless, the presence of third-level constructs indicates a large variety of available instruments and a high level of assessment activity.
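The three-level allocation described above can be pictured as a simple tree with scales attached at the most specific available node. The following sketch is purely illustrative (it is not the actual DaQS data model; all identifiers are hypothetical), using the truancy example from the text:

```python
# Illustrative sketch of the construct scheme as a nested mapping:
# first level (major construct) -> second level -> third level.
CONSTRUCT_SCHEME = {
    "processes at the school level": {      # first level (major construct)
        "school climate": {                 # second level
            "deviant behaviour": {},        # third level
        },
        "co-operation": {},
    },
}

# Each scale is stored with the full path of constructs it is allocated to,
# down to the most specific level available (here: the third level).
SCALE_ALLOCATIONS = {
    "loss on individual level - truancy": (
        "processes at the school level",
        "school climate",
        "deviant behaviour",
    ),
}


def allocation_path(scale: str) -> str:
    """Return the construct path of a scale, from major construct downwards."""
    return " > ".join(SCALE_ALLOCATIONS[scale])


print(allocation_path("loss on individual level - truancy"))
```

A scale allocated only on the second level would simply carry a two-element path.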
Using these data, we can identify major fields of quantitative educational research activity by analysing both the database itself and its reuse. Which constructs and subjects are assessed most often? Which constructs are assessed less often, or do not appear in the scheme at all? What does the analysis tell us about current reuse behaviour? Answers to these questions will come from a descriptive analysis of the database. Another approach uses word-cloud visualisations of the construct scheme to provide insights into the theoretical arrangement of studies and the research field at a glance. Furthermore, download and view statistics will be reviewed to analyse the reuse behaviour of the service. Further validation of the scheme depends on the integration of new studies. The database is open for expansion, which leaves room for discussion and cooperation on including new constructs in the scheme, as well as on the allocation of specific scales to constructs. Additionally, the statistics and insights resulting from this meta-analysis can be used to reveal blind spots in educational research, to inspire new studies and to enhance the development of new instruments for a better assessment of school and instruction.
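The descriptive analysis of assessment frequencies mentioned above amounts to counting how many scales are allocated to each construct. A minimal sketch, using a hypothetical mini-sample of second-level allocations (the real database holds roughly 6,500 scales across 50 studies):

```python
from collections import Counter

# Hypothetical sample of second-level construct allocations, one entry
# per scale; in practice these would be read from the database.
allocations = [
    "school climate", "instruction", "school climate",
    "classroom management", "instruction", "instruction",
]

# Frequency of scales per construct, most assessed first. Constructs that
# never appear in such a tally point to potential blind spots.
counts = Counter(allocations)
for construct, n in counts.most_common():
    print(f"{construct}: {n} scales")
```

The same tally, aggregated over download and view statistics instead of scale counts, would describe reuse behaviour rather than assessment activity.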
Bos, W., Bonsen, M., Kummer, N., Lintorf, K. & Frey, K. (2009). TIMSS 2007: Dokumentation der Erhebungsinstrumente zur Trends in International Mathematics and Science Study. New York, München, Berlin: Waxmann.
Ditton, H. (2000). Qualitätskontrolle und Qualitätssicherung in Schule und Unterricht. Ein Überblick zum Stand der empirischen Forschung. In A. Helmke, W. Hornstein & E. Terhart (Hrsg.), Qualität und Qualitätssicherung im Bildungsbereich: Schule, Sozialpädagogik, Hochschule (S. 73-92). Weinheim: Beltz.
Fischer, N., Holtappels, H. G., Klieme, E., Rauschenbach, T., Stecher, L. & Züchner, I. (2011). Ganztagsschule: Entwicklung, Qualität, Wirkungen. Längsschnittliche Befunde der Studie zur Entwicklung von Ganztagsschulen (StEG). Weinheim und Basel: Beltz Juventa.
Haenisch, H., Lukesch, H., Klaghofer, R. & Krüger-Haenisch, E.-M. (1979). Gesamtschule und dreigliedriges Schulsystem in Nordrhein-Westfalen: Schulleistungsvergleiche in Deutsch, Mathematik, Englisch und Physik. In: Schule und Weiterbildung. Arbeitsmaterial und Berichte zur Sekundarstufe I, Vergleichsuntersuchung in Nordrhein-Westfalen durchgeführt am Zentrum I-Bildungsforschung der Universität Konstanz unter Leitung von Prof. Dr. Helmut Fend (8). Paderborn: Schöningh.
Klieme, E., Eichler, W., Helmke, A., Lehmann, R., Nold, G., Rolff, H.-G., Schröder, K., Thomé, G. & Willenberg, H. (2008). Unterricht und Kompetenzerwerb in Deutsch und Englisch: Ergebnisse der DESI-Studie. Weinheim: Beltz.
Kuger, S., Jude, N. & Klieme, E. (2016). Documentation of Scales (online): PISA 2015 Field Trial – Core 6, Context. In: Datenbank zur Qualität von Schule (DaQS). Frankfurt am Main: Deutsches Institut für Internationale Pädagogische Forschung (DIPF).
Scheerens, J. & Bosker, R. J. (1997). The foundations of educational effectiveness. Oxford: Pergamon Press.
Seidel, T. (2008). Stichwort: Schuleffektivitätskriterien in der internationalen empirischen Forschung. Zeitschrift für Pädagogik, 54, 1-20.
Wang, M. C., Haertel, G. D. & Walberg, H. J. (1993). Toward a knowledge base for school learning. Review of Educational Research, 63, 249-294.