One of the main insights from working with chatbots as a teacher or instructor concerns their proper and reflected use. Specifically, a first and decisive step is a thoroughly crafted prompt. A prompt in generative AI is a specific form of interaction between a human and a large language model that leads the model to generate the intended output, in this study the constructive feedback for the learner.
One can almost regard the importance of prompts as a research result in itself when starting to work with chatbots systematically and for educational purposes. In line with the old proverb that “we reap what we sow”, one needs to consider thoroughly how to design a prompt. Whereas chatbot applications for learners are implemented and under research, for instance in Learning Management Systems (Lee et al., 2020) to assist student learning (Edubots, n.d.), applications for teachers, specifically on assessment, are less in focus, with some exceptions: just 6 % of the Edubots support assessment activities (Okonkwo & Ade-Ibijola, 2021, p. 5-6). Therefore, the prompts and approaches researched here should support teachers’ feedback work on student learning. Besides different types of prompts to address different purposes and styles of answers, one needs to respect principles that can be found in publications based on the experience of language modelers for AI bots (Atlas, 2023). These principles have influenced the approaches developed and presented in this paper.
These principles are described differently in the literature, but they can be summarized in the following basic handling principles:
• choose the words carefully
• define the conversation’s purpose
• define the conversation’s focus
• specify and be concise
• provide context
Other recommendations are to include the following types of components, illustrated in a sketch after this list (Research project at our university, n.d.):
• role (the expertise or the perspective which should be taken)
• task (the specific task or objective the bot should carry out)
• format (intended presentation format for the bot answer)
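As a minimal, hypothetical sketch (not taken from the cited recommendations; the teaching scenario, wording, and function name are invented for illustration), a prompt built from these three components could be assembled as follows before it is sent to a chatbot:

```python
# Hypothetical sketch: assembling a prompt from the three recommended
# components (role, task, format). All wording is illustrative only.

def build_feedback_prompt(student_text: str) -> str:
    role = "You are an experienced writing teacher in secondary education."
    task = ("Give constructive feedback on the student text below, naming one "
            "strength and two concrete suggestions for improvement.")
    fmt = ("Answer in at most 120 words, as a short bulleted list "
           "addressed to the student.")
    return f"{role}\n\n{task}\n\n{fmt}\n\nStudent text:\n{student_text}"


print(build_feedback_prompt("The industrial revolution changed how people worked ..."))
```

Keeping the three components in separate variables makes it easy to vary one of them (for instance the role) while holding the others constant, which matches the idea of comparing different prompt types.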
Ekin (2023, p. 4) presents five factors that influence the so-called “engineering” of prompts; in a way these include the handling principles and the types of components, but they add a bigger picture of the understanding of the technology itself. Two of the factors, domain specificity and constraints, are illustrated in a sketch after the list.
User intent: Understand the user’s goal and desired output. This helps in crafting a prompt that aligns with the user’s expectations.
Model understanding: Familiarize yourself with the strengths and limitations of ChatGPT. This knowledge assists in designing prompts that exploit the model’s capabilities while mitigating its weaknesses. Keep in mind that even state-of-the-art models like ChatGPT may struggle with certain tasks or produce incorrect information.
Domain specificity: When dealing with a specialized domain, consider using domain-specific vocabulary or context to guide the model towards the desired response. Providing additional context or examples can help the model generate more accurate and relevant outputs.
Clarity and specificity: Ensure the prompt is clear and specific to avoid ambiguity or confusion, which can result in suboptimal responses. Ambiguity can arise from unclear instructions, vague questions, or insufficient context.
Constraints: Determine if any constraints (e.g., response length or format) are necessary to achieve the desired output. Explicitly specifying constraints can help guide the model towards generating responses that meet specific requirements, such as character limits or structured formats.
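To make the domain-specificity and constraints factors more concrete, the following hedged sketch (not taken from Ekin, 2023; the rubric, the worked example, and the word limit are invented for illustration) adds domain context in the form of a short rubric and one example of the expected feedback style, together with an explicit length constraint:

```python
# Hypothetical sketch: adding domain context (a rubric and one worked example
# of the expected feedback style) and an explicit length constraint to a prompt.
# All criteria and example texts are invented for illustration.

RUBRIC = ("Assessment criteria: clarity of the main claim, use of evidence, "
          "paragraph structure, spelling and grammar.")

EXAMPLE = ("Student text: 'Plants need sun and water to grow.'\n"
           "Feedback: The claim is clear; add one piece of evidence and expand "
           "the paragraph with a concrete example.")

def build_constrained_prompt(student_text: str, max_words: int = 100) -> str:
    return (
        "You assess short student texts against the following rubric.\n"
        f"{RUBRIC}\n\n"
        "Here is an example of the expected feedback style:\n"
        f"{EXAMPLE}\n\n"
        f"Now give feedback on this text in at most {max_words} words:\n"
        f"{student_text}"
    )
```

Making the constraint an explicit parameter (here max_words) keeps it visible and adjustable instead of leaving the response length to the model’s defaults.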
Independent of which factors to consider, basic principles to follow, or components to apply, a choice has to be made in order to use the bots purposefully and efficiently. One can find literature and training programs for so-called “prompt engineering” (see Ekin, 2023). The research question is: How will the use of different prompt types influence the support for teachers’ writing feedback?