Session Information
Paper Session
Contribution
This paper addresses the use of virtual reality (VR) in education. Our study was guided by the question: what would an instrument look like with which the educational effectiveness of virtual reality can be easily estimated?
Forms of VR (including eXtended, Mixed and Augmented Reality, and educational games) have developed strongly over the last decades. Research has shifted from the realism of virtual environments to the (dis)comfort users experience. These studies revealed insights about, for instance, presence, immersion, motion sickness, tired eyes, and emotional arousal (see e.g., Bareišytė et al., 2024). Nowadays, VR is increasingly used in educational settings, and high expectations of its usefulness are held (e.g., OECD, 2024). The OECD suggests the DICE framework for deciding when the application of VR is indicated, but this framework says nothing about an application's effectiveness. Educational effectiveness goes beyond the affordances of the technology; it is primarily about the effects of the technology on learning. Indeed, several organizations call for research that addresses the thorough pedagogical design and use of technologies such as VR (Ex et al., 2023; Unesco, 2018).
We regard the educational effectiveness of VR as the extent to which an application supports the realization of the (learning) goals. In order to evaluate the educational effectiveness of VR, insights from both the educational sciences and information technology (IT) are relevant. From the educational sciences, insights about the design of environments that foster learning are relevant. Education is always oriented towards specific learning outcomes and thus aims at specified learning goals. Insights about the function of feedback and appropriate ways of evaluation are useful in this regard. Furthermore, effective learning stresses the importance of interaction, active learning and collaboration (e.g., Baume & Scanlon, 2018; OECD, 2017). With this in mind, learning content, pedagogical approach and (technological) tools should be composed in such a way that together they form a logical path towards the desired outcomes. Research from the field of IT reveals that VR designers have to make choices about the so-called 6 P's (i.e., presence, perspective, participation, proximity, point of view, and place) in order to define the user's experience (Fuchs et al., 2011). Designing the P's so that they contribute to a positive experience activates learning processes and thus may foster learning (Fokides, 2023). According to Fokides, characteristics of the technical affordances, the emotional response of the user, and characteristics associated with the learning content contribute to a positive learning experience. However, research has not yet reached consensus about which characteristics contribute most to a positive experience, and when (Antonopoulos et al., 2024).
Insights from the educational sciences and IT can be brought together in the CIMO-model (Context, Intervention, Mechanism, Outcome; Denyer et al., 2008) to form a solid basis for establishing the educational effectiveness of VR. In educational design, this model is regularly used to ensure a consistent design towards learning goals. This means that the elements (C, I, M, and O) each contain correct, up-to-date and relevant information and that no information is omitted. Also, the elements are developed according to the educational and technological insights about effective learning. Furthermore, the individual components are logically related to each other. As an illustration, a social goal triggers social learning mechanisms through a targeted pedagogical approach and matching 6P-characteristics, and shows a social effect and no undesirable side effects.
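To make the structure of a CIMO-based design concrete, the sketch below shows one possible, highly simplified way of representing the four elements. This is purely illustrative: the class name, field descriptions, the completeness check and the example content are our own hypothetical additions, not part of Denyer et al.'s model or of our scale.

```python
from dataclasses import dataclass

@dataclass
class CimoDesign:
    """Hypothetical, simplified representation of a VR design along the CIMO elements."""
    context: str       # C: learning goal and target group
    intervention: str  # I: pedagogical approach and the VR application itself
    mechanism: str     # M: learning processes the design intends to trigger (incl. 6 P's)
    outcome: str       # O: how realization of the goal is evaluated

def is_fully_specified(design: CimoDesign) -> bool:
    """Trivial completeness check: every element must contain a description.
    Real alignment (e.g., a social goal matched by social learning mechanisms)
    still requires expert judgement and is not captured here."""
    return all([design.context, design.intervention, design.mechanism, design.outcome])

# Hypothetical example: a social learning goal sketched along the four elements.
example = CimoDesign(
    context="Students practising difficult conversations (social learning goal)",
    intervention="Role-play scenario in VR with branching dialogue and peer debriefing",
    mechanism="Perspective taking from the other person's point of view; social presence",
    outcome="Observed conversation skills assessed against the social learning goal",
)
print(is_fully_specified(example))  # True
```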
Based on the CIMO-model, we aim to develop a scale that can be used to estimate whether it is reasonable to expect that a VR application supports the realization of the learning goals. The scale also aims to identify the elements of the VR design that should be optimized for educational effectiveness.
Method
A mixed-methods design was used to answer the main question, which we divided into two sub-questions: (RQ1) Which guidelines for each CIMO-element contribute to an educationally effective VR design? (RQ2) How do users perceive the usefulness of the instrument that resulted from RQ1?

First, we developed a questionnaire based on the CIMO-model, drawing on insights from the educational sciences (particularly digital pedagogy, learning psychology and educational design) and from IT. Some example statements for the four design elements, with the field they are based on, are:
- C (educational sciences): There is a specified, clearly described learning goal that is understandable to all involved (developer, teacher, user).
- I (educational sciences): For users who need to refresh prior knowledge, an auxiliary route (pre-instruction) has been built in.
- M (IT): The perspective the user takes in the virtual world supports learning (from yourself, from another person, or from a third person).
- O (educational sciences): The evaluation fits seamlessly with the learning goal. In other words, for a knowledge goal, the evaluation focuses on that knowledge, ...

A draft scale was sent for feedback to an international expert on digital pedagogy and digital literacy. After reformulating some items to make them more understandable and removing overlapping items, version 1 was set up. The questionnaire consists of four parts, each containing 11 or 12 items, rated on a 5-point Likert scale (1 = totally disagree to 5 = totally agree, representing educationally ineffective to educationally effective).

Then, five student teachers at our university, acting as consultants for five VR design projects, used the scale to underpin their advice. The scale was used twice: during a mid-term review to give advice about optimizing the CIMO-elements, and during a final review to evaluate the educational effectiveness of the VR application. The use of the scale was followed by semi-structured interviews in order to enrich the descriptives. For the five consultants, questions concerned overall ease of use, usefulness of the items, and the impact of the advice given. Five product owners were interviewed about their initial expectations and the outcomes they identified.

The scale was analyzed quantitatively: descriptive statistics were calculated to get an overall impression of the quality of the VR designs. At the time of writing this proposal, the interviews are taking place. These qualitative data will be analyzed intuitively, using the four CIMO elements as categories.
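To illustrate the quantitative analysis step, the sketch below computes descriptive statistics per CIMO part from 5-point Likert ratings. Only the overall setup (four parts of 11 or 12 items, scores from 1 to 5) follows the questionnaire described above; the ratings, the distribution of scores and the variable names are hypothetical examples, not data from our study.

```python
# Minimal sketch: descriptive statistics per CIMO part from 5-point Likert ratings
# (1 = totally disagree ... 5 = totally agree). All numbers below are hypothetical.
from statistics import mean, stdev

# Hypothetical ratings given by one consultant for one VR design.
ratings = {
    "C": [2, 3, 2, 2, 3, 2, 3, 2, 2, 3, 2],        # Context items (11)
    "I": [3, 2, 3, 2, 3, 3, 2, 3, 2, 3, 2, 3],     # Intervention items (12)
    "M": [3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3],        # Mechanism items (mostly neutral)
    "O": [3, 3, 4, 3, 3, 3, 3, 3, 3, 3, 3, 3],     # Outcome items (mostly neutral)
}

for part, scores in ratings.items():
    print(f"Part {part}: mean = {mean(scores):.2f}, sd = {stdev(scores):.2f}, n = {len(scores)}")

# Overall impression across all parts (compare: the mid-term review averaged 2.8).
all_scores = [s for scores in ratings.values() for s in scores]
print(f"Overall mean = {mean(all_scores):.2f}")
```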
Expected Outcomes
At this moment, the results from the mid-term review are available. During the conference, the results from the final review and the interviews will also be presented. The VR designs scored 2.8 on average, which tends towards ineffective designs (RQ1), mainly due to the scores on parts C and I. The items of parts M and O were mostly evaluated as neutral. This could be caused by insufficient understanding of these components, leading raters to choose a safe evaluation. No part especially pointed at an effective design (scores totally agree and agree). Overall, the scale gave indications of the quality of each CIMO part and of the steps that can be taken to improve the final VR design.

From the interviews (RQ2), we expect that the consultants will rate the scale's ease of use as more than satisfactory. Because the Mechanism and Outcome guidelines were formulated less specifically, we expect that both the consultants and the developers will be most challenged by the items of these parts. Consultants must have knowledge of stimulating specific learning processes that is not automatically part of their teacher training. Developers then presumably receive too little targeted information to translate it independently into the design of the 6 P's. Thinking in terms of goals and goal realization also requires a perspective that developers do not automatically take. As a preliminary conclusion, we can say that there is now a promising CIMO scale that covers the main insights from the educational sciences about the design of educational environments and that can be used to feed the design dialogue about the educational effectiveness of a VR application. However, differences in communication between the stakeholders involved pose a challenge to conducting the design dialogue at a deeper level.
References
Bareišytė, L., Slatman, S., Austin, J., Rosema, M., van Sintemaartensdijk, I., Watson, S., & Bode, C. (2024). Questionnaires for evaluating virtual reality: A systematic scoping review. Computers in Human Behavior Reports, 16, 100505. https://doi.org/10.1016/j.chbr.2024.100505
Baume, D., & Scanlon, E. (2018). What the research says about how and why learning happens. In R. Luckin (Ed.), Enhancing learning and teaching with technology: What the research says. IOE Press, University College London.
Denyer, D., Tranfield, D., & van Aken, J. E. (2008). Developing design propositions through research synthesis. Organization Studies, 29(3), 393-413. https://doi.org/10.1177/0170840607088020
Ex, L., Nieuwenhuizen, W., Hijstek, B., Roolvink, S., & van Huijstee, M. (2023). Immersieve technologieën [Immersive technologies]. Rathenau Institute.
Fokides, E. (2023). Development and testing of a scale for examining factors affecting the learning experience in the Metaverse. Computers & Education: X Reality, 2, 100025. https://doi.org/10.1016/j.cexr.2023.100025
Fuchs, P., Moreau, G., & Guitton, P. (Eds.). (2011). Virtual reality: Concepts and technologies. Routledge.
OECD (2017). The OECD handbook for innovative learning environments. OECD Publishing. https://doi.org/10.1787/9789264277274-en
OECD (2024). OECD digital economy outlook 2024 (Volume 1): Embracing the technology frontier. OECD Publishing. https://doi.org/10.1787/a1689dc5-en
Unesco (2018). Digital pedagogies for building peaceful & sustainable societies. Issue 8. Unesco MGIEP/The Blue Dot.