Session Information
22 SES 08 A, Digital Learning and Teaching
Paper Session
Contribution
The primary aim of graduate education is to develop independent researchers (Gardner, 2008). To this end, research methods courses are offered in many graduate programs. These courses predominantly focus on quantitative research methods, in which statistical analyses play a significant role (Leech & Haug, 2015). However, students, particularly in the fields of education and the social sciences, often experience high levels of statistical anxiety in these courses (Pan & Tang, 2004), which hinders their learning of inferential statistical methods.
As a result, numerous studies have been conducted on statistical achievement. The literature mostly focuses on factors predicting statistical achievement, such as statistical anxiety (e.g., Ciftci et al., 2014), self-efficacy beliefs (e.g., Huang & Mayer, 2019), and attitudes toward statistics (e.g., Lavidas et al., 2020). More recent studies have examined the use of digital tools and platforms in quantitative research methods education and have conducted quantitative and qualitative research to measure their effectiveness. These recent studies are limited in number and focus on basic statistical knowledge rather than the inferential statistical methods covered in research methods courses. Additionally, the scope of technology use in these studies is generally narrow, often centered on a single tool (e.g., Arjomandi et al., 2023; Yilmaz et al., 2023). Moreover, the few studies on learning environment design for postgraduate students (e.g., Jiang et al., 2019; Ritzhaupt et al., 2020) do not incorporate students' insights into the design.
Given the advancing nature of technology and its potential to provide a more resilient and flexible structure in extraordinary circumstances, there is a need for broader technology-focused studies on quantitative research methods education for postgraduate students in education and social sciences. Thus, the current study aims to design an innovative learning environment for teaching quantitative research methods by combining pedagogical foundations, design principles, and students’ insights.
In this study, the theoretical framework integrates Sociocultural Theory and Situated Cognition, emphasizing the role of cultural and community interactions in shaping knowledge and behavior (Nathan & Sawyer, 2014). Learning is mediated by culturally constructed tools, meanings, and social contexts, with culture and community defining what is possible within educational practice (Wilson & Myers, 2000). Accordingly, the environment will be designed using the Community of Inquiry (CoI) framework (Garrison et al., 2000) both as its pedagogical foundation and as its source of design principles. CoI highlights the importance of communication within a community sharing a common goal and of reflective, critical discussions to construct personal meaning and shared understanding. Therefore, the content and features of the environment will support collaborative learning. To enhance the cognitive and social presences of this model, users will be provided with ill-structured, problem-based scenarios developed within the framework of Problem-Based Learning (PBL) theory (Lu et al., 2014).
The research questions are:
RQ1) What are the insights of graduate students regarding the skills necessary to succeed in a quantitative research methods course and the ways to improve those skills?
RQ2) What are the insights of graduate students regarding the sources of statistics anxiety in a quantitative research methods course and the strategies for coping with it?
RQ3) What are the design suggestions of graduate students for an online collaborative environment that can decrease statistics anxiety and increase self-efficacy beliefs?
RQ4) How can the findings of the case study be integrated into the social, cognitive, and teaching presences of CoI to inform the design elements?
Method
This mixed methods study follows a three-phase exploratory sequential design. The first phase starts with qualitative data collection through a case study. In the second phase, the online learning environment will be designed based on the findings from the case study and the literature. In the final phase, the effectiveness of the environment will be analyzed quantitatively. The first phase of the study has been completed, the second phase will be completed by August, and the third phase is outside the scope of this conference. The instruments are semi-structured interviews, the Statistics Anxiety Scale (SAS; Vigil-Colet et al., 2008), and the Current Statistics Self-Efficacy Scale (CSSE; Finney & Schraw, 2003).

Phase 1: A case study was conducted on graduate students' experiences of statistical anxiety and self-efficacy beliefs. Participants were five graduate students chosen through critical sampling; they are in the same Ph.D. program but hold M.A. degrees in different fields. The first interview was open coded by two independent coders, who compared their codes, resolved discrepancies, and created a codebook. The second interview was coded by the same coders using this codebook. Inter-coder consistency was calculated (see the illustrative sketch below), and any remaining inconsistencies were identified and resolved. The remaining interviews were analyzed by a single coder. To ensure the reliability and validity of the data, the face-to-face interview data were triangulated with the SAS and the CSSE.

Phase 2: The findings from the case study will be combined with the literature to design the online environment. The design elements of the environment will reflect the presences of CoI. The content will be delivered through ill-structured problem scenarios based on Problem-Based Learning theory.
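The abstract does not specify which inter-coder consistency statistic was computed; the following is a minimal Python sketch assuming Cohen's kappa, a common choice for two coders assigning one code per excerpt. The code labels and assignments below are hypothetical illustrations, not the study's data.

from collections import Counter

def cohen_kappa(coder_a, coder_b):
    # Observed agreement: proportion of excerpts both coders labeled identically.
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, estimated from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned by the two coders to ten interview excerpts.
coder_1 = ["exam", "terminology", "exam", "software", "exam",
           "terminology", "workload", "exam", "software", "workload"]
coder_2 = ["exam", "terminology", "workload", "software", "exam",
           "terminology", "workload", "exam", "exam", "workload"]
print(f"Cohen's kappa = {cohen_kappa(coder_1, coder_2):.2f}")  # ~0.72

Values above roughly 0.60 are conventionally read as substantial agreement; remaining discrepancies would then be discussed and resolved, as described for Phase 1.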
Expected Outcomes
The results for the first research question showed that graduate students believe it is important to master the steps of statistical analysis: selecting the correct test, interpreting and reporting results, and understanding foundational terminology. To improve these skills, they use books, videos, sample reports, and AI, and they practice with datasets, their own research, and past mistakes.

The findings for the second research question showed that anxiety stems from course-related factors such as exams, selecting and interpreting analyses, and cognitive overload (terminology, topics, mathematical background), as well as non-course factors (general anxiety and interactions). Coping strategies include planning, using visual aids and instructor notes, practicing with small datasets, and creating summary tables.

For the third research question, suggestions included differentiating theory and practice, using real-life data, simulations, and visualizations, and excluding statistical software. For assessment, students proposed peer-reviewed homework and research projects. Scaffolding suggestions included incorporating feedback mechanisms, journal-article interpretation tasks, and individualized learning features.

Triangulation: In the SAS, exam anxiety had the highest mean; similarly, exams were mentioned as a major anxiety source during the interviews (see the illustrative sketch below). The highest-rated item was taking a statistics exam without sufficient time to revise. Interviewees mentioned feeling overwhelmed by the volume of topics and blaming themselves for delayed studying. The CSSE results showed discrepancies: interviewees highlighted applying the correct statistical procedures and interpreting results as essential skills they struggled with, yet rated their confidence in these skills as high on the CSSE.

For the fourth research question, preliminary design features can include emphasizing the interpretation and reporting of findings through ill-structured problems for social presence (SP) and cognitive presence (CP), chatbots or interactive checklists as feedback or control mechanisms for CP and teaching presence (TP), asynchronous videos for theory and application for CP and TP, and article interpretation tasks for TP, CP, and SP.

Future Implications: While the study focuses on graduate students in Turkey, the findings may apply to graduate and undergraduate students globally due to shared experiences and may encourage the creation of learning communities. The design can serve as a flexible framework adaptable to diverse contexts.
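As an illustration of the scale-based triangulation step, the Python sketch below shows how subscale means could be computed from Likert-type responses to identify the dominant anxiety source. The subscale grouping and all response values are hypothetical and illustrative only; they are not the study's SAS data.

# Hypothetical 1-5 Likert responses of five participants to three items per subscale.
responses = {
    "examination anxiety":     [[5, 4, 5], [4, 5, 4], [5, 5, 4], [3, 4, 5], [5, 4, 4]],
    "asking for help anxiety": [[2, 3, 2], [3, 2, 2], [2, 2, 3], [3, 3, 2], [2, 3, 3]],
    "interpretation anxiety":  [[4, 3, 4], [3, 4, 3], [4, 4, 3], [3, 3, 4], [4, 3, 3]],
}

# Mean score per subscale across all participants and items; the subscale with the
# highest mean points to the dominant anxiety source, to be compared with interview codes.
for subscale, participants in responses.items():
    scores = [score for participant in participants for score in participant]
    print(f"{subscale}: {sum(scores) / len(scores):.2f}")

With these illustrative values, examination anxiety yields the highest mean, mirroring the pattern reported above in which scale results and interview accounts converge on exams as the main source of anxiety.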
References
Arjomandi, A., Paloyo, A., & Suardi, S. (2023). Active learning and academic performance: The case of real-time interactive student polling. Statistics Education Research Journal, 22(1).
Ciftci, S., Karadag, E., & Akdal, P. (2014). Instruction of statistics via computer-based tools: Effects on statistics' anxiety, attitude, and achievement. Journal of Educational Computing Research, 50.
Finney, S. J., & Schraw, G. (2003). Self-efficacy beliefs in college statistics courses. Contemporary Educational Psychology, 28(2).
Gardner, S. K. (2008). What’s too much and what’s too little?: The process of becoming an independent researcher in doctoral education. Journal of Higher Education, 79.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3).
Huang, X., & Mayer, R. E. (2019). Adding self-efficacy features to an online statistics lesson. Journal of Educational Computing Research, 57(4).
Jiang, M., Ballenger, J., & Holt, W. (2019). Educational leadership doctoral students' perceptions of the effectiveness of instructional strategies and course design in a fully online graduate statistics course. Online Learning, 23(4).
Lavidas, K., Barkatsas, T., Manesis, D., & Gialamas, V. (2020). A structural equation model investigating the impact of tertiary students' attitudes toward statistics, perceived competence at mathematics, and engagement on statistics performance. Statistics Education Research Journal, 19.
Leech, N. L., & Haug, C. A. (2015). Investigating graduate level research and statistics courses in schools of education. International Journal of Doctoral Studies, 10.
Lu, J., Bridges, S., & Hmelo-Silver, C. E. (2014). Problem-based learning. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences.
Nathan, M., & Sawyer, R. (2014). Foundations of the learning sciences. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences.
Pan, W., & Tang, M. (2004). Examining the effectiveness of innovative instructional methods on reducing statistics anxiety for graduate students in the social sciences. Journal of Instructional Psychology, 31.
Ritzhaupt, A. D., Valle, N., & Sommer, M. (2020). Design, development, and evaluation of an online statistics course for educational technology doctoral students: A design and development case. Journal of Formative Design in Learning, 4.
Vigil-Colet, A., Lorenzo-Seva, U., & Condon, L. (2008). Development and validation of the Statistics Anxiety Scale. Psicothema, 20(1).
Wilson, B. G., & Myers, K. M. (2000). Situated cognition in theoretical and practical context. In D. H. Jonassen & S. M. Land (Eds.), Theoretical foundations of learning environments.
Yilmaz, Z., Ergul, K., & Asık, G. (2023). Role of context in statistics: Interpreting social and historical events. Statistics Education Research Journal, 22(6).