Session Information
22 ONLINE 23 B, Perspectives on Academics Teaching Skills and Experiences
Paper Session
Meeting ID: 875 9290 5086, Code: ur5bip
Contribution
The pandemic has forced universities to go online and to use digital learning and assessment formats intensively. The experience gathered during this period should be capitalized on in the post-pandemic period, as generations of students have grown used to almost entirely online education and increasingly ask for intensive digital education, learning and assessment. However, academic staff showed very uneven levels of competence in managing digital tools during remote teaching and beyond, which uncovered the need for continuing professional development in this regard.
Going green and digital is a landmark of the European Higher Education Area's development for the coming period, meant to ensure that graduates acquire the digital competences needed in the digital society and for sustainable development. Almost 90% of European HEIs (Higher Education Institutions) have a strategy for digitally enhanced teaching, learning and evaluation, with special measures enabling their academic staff to provide a secure, high-quality digital learning environment, in line with the European Digital Education Action Plan 2021-2027 (Humpl & Andersen, 2022).
Digital assessment, the "new normal" of this disruptive period, has challenged traditional models of assessment and forced academic staff, more or less accustomed to adopting technologies, to look for tools and solutions for authentic and reliable evaluation. However, digitally embedded learning environments require teachers to move from acquiring skills to mastering specific tools and technological competencies, including new digital technologies such as artificial intelligence or virtual and augmented reality, so that they can take a more active role in the design and implementation of assessment with technological tools (Bennett et al., 2017) and "ensure their effective, desirable, and inclusive use in the future" (Humpl & Andersen, 2022).
Such training needs were also underlined by empirical research, both before and during the pandemic. The systematic literature highlights that, whether we are talking about the assessment for/of/as learning that teachers carry out (Brady, Devitt & Kiersey, 2019), about ensuring the security, integrity and honesty of evaluation and avoiding cheating (Garg & Goel, 2021), about a digitally embedded culture, or about teachers' differing pedagogic approaches, beliefs and attitudes towards using digital tools (Slade et al., 2021), the quality of assessment in HEIs (Liu, Geertshuis & Grainger, 2020; Bekmanova et al., 2021) and its effectiveness (Deeley, 2018; Montenegro-Rueda et al., 2021) cannot be imagined without ensuring adequate digital competencies of the academic staff.
Thus, drawing on pedagogic models of technology knowledge for assessment (Akram et al., 2021) and on the Technology Acceptance Model (TAM) (Venkatesh & Davis, 2000), we aimed to identify the variables influencing academic teachers' use of digital assessment: its perceived usefulness and ease of use, their attitudes, intentions and actual use of it, as well as the training needs academic staff have in order to improve and extend the use of digital assessment tools in teaching, learning and evaluation settings.
Seeking to expand on the findings of Brady, Devitt and Kiersey (2019), we focused our systematic review on empirical research published in the last five years (2017-January 2022), addressing the following research questions: How do academic staff perceive and use digital and online assessment? What challenges, competencies and needs do they have in carrying out digital evaluation/assessment? What is the impact of digital and online assessment on academic staff, during the pandemic and beyond?
The findings can be of use to HEI management, researchers, teachers, and developers of tailored training curricula for academic staff.
Method
The research was conducted as a systematic literature review, in order to identify relevant research findings about the views, experiences, needs and competencies of academic staff in designing and running digital assessment. As these practices and needs intensified during the pandemic, we restricted our analysis to the last five years, building on previous findings (Brady, Devitt & Kiersey, 2019). To systematically map the research findings on academic teachers' views and use of technology for (digital) assessment, we applied a mixed methods review (Gough & Thomas, 2016). In a review context, this refers to a combination of review approaches, quantitative and qualitative, so that the analysis may characterize both literatures and look for correlations between characteristics, identify gaps, emerging topics, research trends, etc. (Gough & Thomas, 2016; Alexander, 2020; Zawacki-Richter et al., 2020). The search was carried out in the two most prestigious databases for high-quality peer-reviewed research, Web of Science (WoS) and Scopus, and returned 110 documents from Scopus and 175 from WoS, using the following search string: ("digital assessment" OR "online assessment" OR e-assessment OR "virtual assessment" OR "computer assisted assessment" OR "technology aided assessment" OR "technology enhanced assessment" OR "technology assisted assessment" OR "online evaluation" OR "digital evaluation" OR "technology for assessment") (All Fields) AND ("higher education" OR universit* OR colleg*) (All Fields) AND ("academic staff" OR "faculty staff" OR "college staff" OR "university staff" OR teacher*) (All Fields) AND (needs OR competenc* OR abilit* OR skill*) (All Fields). The search was conducted in November 2021 and updated in January 2022.
Following the steps of systematic analysis stated in the PRISMA guidelines (Page et al., 2021), we removed the records common to the two databases, excluded articles not fitting our research interest and further articles after reading the abstracts, added other relevant articles through snowball search, and updated the corpus with articles newly published between November and January; in the end, 37 articles met the eligibility criteria. We worked with Biblioshiny (a web-based software package built on the R language) and VOSviewer to analyze and visualize the research status and trends in the field, and with EPPI Reviewer Web for the systematic review (EPPI Centre, 2017), which is suitable for both small- and large-scale reviews. These software programs are freely available online and effective for performing literature reviews.
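The first screening step above, removing the records common to the two database exports, can be sketched as follows. This is a minimal illustration only: the field names ("doi", "title") and the sample records are hypothetical assumptions, not the actual Scopus/WoS export schema.

```python
# Hedged sketch of deduplicating merged Scopus and WoS exports before
# PRISMA screening. Field names and records are illustrative assumptions.

def normalize_title(title: str) -> str:
    """Lowercase and strip non-alphanumeric characters so that
    near-identical titles from different databases match."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first occurrence of each record, matching on DOI when
    present, otherwise on the normalized title."""
    seen: set[str] = set()
    unique: list[dict] = []
    for rec in records:
        key = (rec.get("doi") or "").lower() or normalize_title(rec.get("title", ""))
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Hypothetical sample records (not from the actual review corpus)
scopus = [{"doi": "10.1111/bjet.12439", "title": "How technology shapes assessment design"}]
wos = [
    {"doi": "10.1111/bjet.12439", "title": "How Technology Shapes Assessment Design"},
    {"doi": "", "title": "Another eligible study"},
]
merged = deduplicate(scopus + wos)
print(len(merged))  # the duplicate WoS record is dropped, leaving 2 records
```

In practice such matching is done inside tools like EPPI Reviewer; the sketch only shows why a two-key match (DOI first, normalized title as fallback) catches duplicates that differ in capitalization or lack a DOI.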
Expected Outcomes
Our study aims to inform both theoretical discussion and practical implications for the improved, extended and high-quality use of digital tools for assessment in HEIs. The views, experiences and challenges of academic staff regarding digital assessment have been researched less than those of students (Brady, Devitt & Kiersey, 2019). During the COVID-19 pandemic, as teachers were exposed to intensive online teaching and assessment, identifying and improving their skills and competences for (digital/e-)assessment became even more important. Thus, we map the research findings on their competencies, experiences, perspectives and training needs for performing quality assessment for/as/of learning, also highlighting the technological assessment mechanisms and strategies that can lead to a fair, relevant and honest assessment process (Akram et al., 2021; Deeley, 2018; Slade et al., 2021). Besides the traditional components of assessment as a learning tool in HEIs, our study focused on the needs and competences related to different assessment methods in online learning, such as projects, digital portfolios, self- or peer assessment, feedback for/as learning, timed tests and quizzes, asynchronous discussions, simulations, exams, etc. (Slade et al., 2021; Montenegro-Rueda et al., 2021). The findings highlight the needs of academic staff both in mastering the technological tools (as their competencies are quite often at a low level) and in the pedagogical use and integration of digital tools for assessment. They also point to the need to encourage teachers to thoughtfully design student assessments while being aware of the technical solutions that can prevent dishonest behaviours (Garg & Goel, 2021).
The study can play a significant role in the integration of technology for evaluation in HEIs and in enhancing the technological and pedagogical competencies of academic staff for performing quality digital assessment.
References
Akram, H., Yingxiu, Y., Al-Adwan, A. S., & Alkhalifah, A. (2021). Technology Integration in Higher Education During COVID-19: An Assessment of Online Teaching Competencies Through Technological Pedagogical Content Knowledge Model. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.736522
Alexander, P. A. (2020). Methodological Guidance Paper: The Art and Science of Quality Systematic Reviews. Review of Educational Research, 90(1), 6-23. https://doi.org/10.3102/0034654319854352
Bekmanova, G., et al. (2021). Personalized training model for organizing blended and lifelong distance learning courses and its effectiveness in Higher Education. Journal of Computing in Higher Education, 33, 668-683. https://doi.org/10.1007/s12528-021-09282-2
Bennett, S., Dawson, P., Bearman, M., Molloy, E., & Boud, D. (2017). How technology shapes assessment design: Findings from a study of university teachers. British Journal of Educational Technology, 48(2), 672-682. https://doi.org/10.1111/bjet.12439
Brady, M., Devitt, A., & Kiersey, R. A. (2019). Academic staff perspectives on technology for assessment (TfA) in higher education: A systematic literature review. British Journal of Educational Technology, 50(6), 3080-3098. https://doi.org/10.1111/bjet.12742
Deeley, S. J. (2018). Using technology to facilitate effective assessment for learning and feedback in higher education. Assessment & Evaluation in Higher Education, 43(3), 439-448.
EPPI Centre. (2017). EPPI-Reviewer 4: Software for Systematic Reviews. http://eppi.ioe.ac.uk/eppireviewer4/eppireviewer4.aspx
Garg, M., & Goel, A. (2021). A systematic literature review on online assessment security: Current challenges and integrity strategies. Computers & Security. https://doi.org/10.1016/j.cose.2021.102544
Gough, D., & Thomas, J. (2016). Systematic reviews of research in education: Aims, myths and multiple methods. Review of Education, 4(1), 84-102. https://doi.org/10.1002/rev3.3068
Humpl, S., & Andersen, T. (2022). The future of digital and online learning in higher education. European Commission. https://data.europa.eu/doi/10.2766/587756
Liu, Q., Geertshuis, S., & Grainger, R. (2020). Understanding academics' adoption of learning technologies: A systematic review. Computers & Education. https://doi.org/10.1016/j.compedu.2020.103857
Montenegro-Rueda, M., Luque-Rosa, A., Sánchez-Serrano, S. J. L., & Fernández-Cerero, J. (2021). Assessment in Higher Education during the COVID-19 Pandemic: A Systematic Review. Sustainability, 13(19), 10509. https://doi.org/10.3390/su131910509
Page, M. J., McKenzie, J. E., Bossuyt, P. M., et al. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372, n71. https://doi.org/10.1136/bmj.n71
Slade, C., Lawrie, G., Taptamat, N., Browne, E., Sheppard, K., & Matthews, K. E. (2021). Insights into how academics reframed their assessment during a pandemic: Disciplinary variation and assessment as afterthought. Assessment & Evaluation in Higher Education. https://doi.org/10.1080/02602938.2021.1933379
Venkatesh, V., & Davis, F. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186-204. https://doi.org/10.1287/mnsc.46.2.186.11926
Zawacki-Richter, O., Kerres, M., Bedenlier, S., Bond, M., & Buntins, K. (2020). Systematic Reviews in Educational Research: Methodology, Perspectives and Application. Wiesbaden: Springer. https://doi.org/10.1007/978-3-658-27602-7