Session Information
99 ERC SES 03 D, Ignite Talks
Ignite Talk Session
Contribution
Online peer editing is the process of commenting on or editing peers’ writing, with the intention of improving the competencies of both the receiver and the editor through discussing ideas and questions and through reviewing, critiquing, or offering advice on each other’s work (Ebadi & Rahimi, 2017; Liu & Edwards, 2018). Students not only gain audience awareness by having their writing reviewed, but also practice thinking about issues from multiple perspectives by critically reading their peers’ comments (MacArthur, 2018). Peer editing can therefore enhance students’ learning by closing the gap between their current level of knowledge and their expected performance (Hattie & Clarke, 2018). Students judge the information they receive through self-reflection and can take steps to narrow this gap (Zhu & Carless, 2018).
Previous research has not broken down the content of peer feedback into detailed categories, nor has it empirically investigated how the different focuses of peer feedback affect students' writing performance. The present study therefore explores how different categories of peer feedback affect students' academic writing performance in online collaborative learning, from both individual and dyadic perspectives. To this end, the study reports the results of a ten-week investigation of the online peer editing of 76 students at a Korean university. Specifically, it draws on data from students engaged in online peer feedback, collecting all comments that authors received on five academic manuscript chapters in order to gain insight into the impact of feedback type on dyads and individuals for each of the five chapters.
Accordingly, the following five research questions are proposed, one for each set of student-written chapters:
RQ1: Which (a) broad, (b) general, and (c) specific types of peer feedback are associated with improved (i) dyadic and (ii) individual academic writing performance for introductions?
RQ2: Which (a) broad, (b) general, and (c) specific types of peer feedback are associated with improved (i) dyadic and (ii) individual academic writing performance for methodologies?
RQ3: Which (a) broad, (b) general, and (c) specific types of peer feedback are associated with improved (i) dyadic and (ii) individual academic writing performance for results?
RQ4: Which (a) broad, (b) general, and (c) specific types of peer feedback are associated with improved (i) dyadic and (ii) individual academic writing performance for discussion and conclusion chapters?
RQ5: Which (a) broad, (b) general, and (c) specific types of peer feedback are associated with improved (i) dyadic and (ii) individual academic writing performance for abstracts?
Method
The present study explored the peer editing of 76 students from four sections of a graduate-level scientific writing course at a Korean university over 10 instructional weeks. Of these, 49 participants were enrolled in master's-level programs and 27 in doctoral-level programs. The 10 instructional weeks were divided into five two-week units, each devoted to one of the five main parts of a journal manuscript: 1) Introduction, 2) Methodology, 3) Results, 4) Discussion & Conclusion, and 5) Abstract. In the second week of each unit, students reviewed a further set of lecture videos on the writing style, language, and grammar specific to that manuscript section. Students were asked to write a first draft of the section before the second Zoom meeting. There were no specific word-count requirements; students were advised to determine their own format and length based on journal style guides and published papers in their field of study. During the Zoom sessions, the instructor led brief discussions, answered questions, and then provided guidance for the peer editing sessions. Over the course of the program, all dyads were asked to conduct peer editing for each of the five sections, each corresponding to a major chapter of the academic paper. All peer editing was recorded in dyad-specific Google Docs created by the course instructor, into which the first drafts of the students' manuscript sections were copied. Students were required to share the documents with their peers and grant them editing access. Instructions on how to edit a partner's writing were provided within the Google Docs, and, to further assist students with peer editing, instructional videos on how to peer edit others' assignments were available on the students' learning management system.
Students were required to track changes by using Google Docs' "Suggesting" mode rather than "Editing" mode when proposing any changes to their peers' writing. This made every suggested change visible and allowed the author to easily accept or reject it based on his or her own judgment. In addition, students were encouraged to use the embedded comments feature of the Google Docs platform so that the section of text a collaborator selected for commenting was highlighted.
Expected Outcomes
This study examined the relationship between different categories of comments and writing performance across manuscript chapters through an in-depth analysis of the online peer editing documents of students at a Korean university. Comments were broadly divided into elaboration, verification, and general feedback. At the general level, these were subdivided into informative elaboration, suggestive elaboration, positive verification, neutral verification, and negative verification; at the specific level, their focus was classified as abstract general, criteria general, criterion specific, or language. The study then examined how these categories affect writing performance in the five academic chapters for both dyads and individuals. Unlike previous studies, the present study finds no benefit of elaboration for student writing, possibly because elaboration feedback is too lengthy and complex and thus reduces students' attention and self-reflection after they receive peer comments. The study's contribution is empirical evidence that not all types of comments have a positive impact on student writing; in particular, verification feedback is negatively associated with student writing scores. These findings may be valuable to teachers, peer editing instructors, and students.
References
Ebadi, S., & Rahimi, M. (2017). Exploring the impact of online peer-editing using Google Docs on EFL learners’ academic writing skills: A mixed methods study. Computer Assisted Language Learning, 30(8), 787-815.
Hattie, J., & Clarke, S. (2018). Visible learning: Feedback. Routledge. https://doi.org/10.4324/9780429485480
Liu, J., & Edwards, J. H. (2018). Peer response in second language writing classrooms. University of Michigan Press.
MacArthur, C. A. (2018). Evaluation and revision. Best practices in writing instruction, 287.
Zhu, Q., & Carless, D. (2018). Dialogue within peer feedback processes: Clarification and negotiation of meaning. Higher Education Research & Development, 37(4), 883-897. https://doi.org/10.1080/07294360.2018.1446417