Session Information
00 SES 12, Educational Assessment in an Era of Innovative Learning Technologies: New Challenges, New Solutions?
Symposium
Contribution
Baker argues that innovations in assessment are routinely touted as the next major solution for finding out what students know and can do, for providing trusted and useful information to teachers and policy makers, and for influencing the practices of design, administration, analysis, and reporting. Unfortunately, most proclaimed changes have had little actual impact on the development of dependable assessments for the full range of purposes. This paper describes innovations that can fit within many existing approaches as well as those that could result in their wholesale rejection. Two major innovations on the horizon are certain to influence testing in the next 30 years.

The first of these is the infusion of highly systematic qualitative methods and tools to remake the process of current test development. This strategy engages design approaches that use ontologies (or content maps) to make visible the range and relationships of the content to be learned, and that employ domain-independent design components such as cognitive demands, text features, and task and response complexity. Domain-independent attributes allow systematic comparisons and predictions across examinations and relevant instructional materials, with the goal of predicting student performance (and then interrupting that prediction). Tools have been developed to conduct such studies and are in use in cross-national collaborations (e.g., Spain, Korea, and the US). These approaches will provide smarter methods for national and cross-national comparisons over time or across populations. Also underway are systematic architectures for generating comparable, and therefore more trustworthy, performance assessments.

The second innovative driver is technology itself. The use of analytics to find out what people like and wish to buy on social media and e-commerce platforms has, in a preliminary way, come to education. Analyses of students' patterns of learning have first been used as supplements, giving teachers feedback in addition to reliance on end-of-course or standardized tests. One obvious and current application is to use artificial intelligence approaches to judge the quality of either in-process or outcome learning. This work includes neural nets as well as approaches that move beyond structural analyses and the earlier latent semantic analysis to design and to score learner performance. Current limitations are that such approaches are descriptive and have limited ability to discern meaning unless they are systematically linked to human raters. Our vision is that such in-process measures can predict outcome measures with accuracy, obviating the need for end-of-course or program measures.
References
Baker, E. L. (2016, March). Research to controversy in 10 decades. Educational Researcher, 45, 122-133.
Baker, E. L., Chung, G. K. W. K., & Cai, L. (2016). Assessment gaze, refraction, and blur: The course of achievement testing in the past 100 years. Review of Research in Education (Centennial Issue), 40, 94-142.