Assessing Learning Outcomes Abroad: What should we measure and how?
Author(s):
Asunción Martínez-Arbelaiz (presenting / submitting)
Conference:
ECER 2016
Format:
Paper

Session Information

22 SES 01 B, Assessment and Conditions of Student Learning

Paper Session

Time:
2016-08-23
13:15-14:45
Room:
NM-Theatre O
Chair:
Rosemary Deem

Contribution

The design of appropriate assessment tools for language programs has been a challenge and a source of debate among language practitioners in recent years (Norris, 2006; Norris and Watanabe, 2013). Few examples of foreign language program assessment at the college level have been published (Mathews and Hansen, 2004; Kiely, 2009; Norris, Davis, Sinicrope, and Watanabe, 2009), and there are even fewer assessments of language programs abroad (Arnett, 2013). There is a growing body of research (cf. Allen, 2010; Goldoni, 2013; Hernández, 2010; Isabelli-García, 2006; Kinginger, 2013; Segalowitz and Freed, 2004) documenting the wide variation in students' experiences while abroad, but little attention has been paid to the design and overall value of the actual classes students attend during their semester abroad. Collentine (2009) refers to this gap as the "missing study abroad methodology". Nevertheless, if we want to understand how a language program functions and, particularly, how to improve it, language program evaluation and assessment are indispensable (Davis, 2013).

In our institution, in order to gauge the impact of our Spanish courses in eight programs abroad in Spain, Chile and Costa Rica, we have developed a number of instruments and tried several approaches over the past years. We began by asking students to complete, at the end of the semester, the same multiple-choice placement exam that they had taken at the beginning. The score differences were statistically significant in all courses, but we realized that although this satisfied the accountability function of assessment, it did not prove useful for program improvement (Wiggins, 1998). The scoring was simple, but the interpretation of the test results was challenging, and the scores shed little light on the processes that produced the gains.
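The pre/post comparison above amounts to a paired-samples design, since each student contributes two scores. A minimal sketch in Python, with invented scores purely for illustration (not data from the study):

```python
import math
from statistics import mean, stdev

# Hypothetical placement-exam scores for the same eight students,
# before and after the semester abroad (illustrative numbers only).
pre  = [54, 61, 48, 70, 66, 58, 63, 52]
post = [62, 68, 55, 74, 71, 66, 69, 60]

# Paired design: test the per-student gains against zero.
diffs = [b - a for a, b in zip(pre, post)]
t_stat = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
print(f"mean gain = {mean(diffs):.2f} points, t = {t_stat:.2f}")
```

A paired test is appropriate because the two scores per student are not independent; in practice the p-value would be read from a t distribution with n-1 degrees of freedom (e.g. via `scipy.stats.ttest_rel`).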

In subsequent semesters we asked students to write a composition at the beginning of the study abroad and a similar one at the end. To keep the topic constant, we asked for the narration of a trip, a task that intermediate students could complete. The comparison of the two compositions proved valuable, since it pointed to the grammatical and vocabulary areas that had improved in two of the levels of Spanish. The compositions at the most advanced level, however, did not show any statistically significant changes. After several discussions with the teachers involved, we concluded that the lack of results on this task was due to a ceiling effect and that a more demanding task could offer a more nuanced reflection of the students' performance in Spanish as a second language.

After a number of attempts with tests and in-class compositions, in the fall of 2013 we asked students to write a reflection on their overall experience, with specific attention to the people who had an impact on their study abroad. The students wrote the essay in Spanish during class time at the end of their program. This can be classified as an alternative form of assessment (Brown & Hudson, 1998); it aimed at eliciting a critical incident (Ortega and Iberri-Shea, 2005) or a description of the people who were crucial to their language learning process.

Method

Each teacher asked her students to write a maximum of 250 words using the past tense, without any help and in no more than thirty minutes of class time. A total of 216 students across the different programs completed the required composition. The teachers gave feedback to each student and sent an unmodified copy of each composition to the language coordinator, who analyzed the data both for form and meaning (Purpura, 2004). With respect to form, special attention was paid to the accurate use of orthography, verbal morphology, noun-modifier agreement and syntactic complexity (mean length of utterance), and the results were shared and discussed with the teachers in order to identify what could be improved in each class. In addition, this type of composition turned out to be a very informative reflection of the students' experiences and their perceived value, some of which were provided by the program, such as language exchanges with local students, internships and volunteer work. Most of the compositions mentioned the teachers as people who not only helped the students learn the target language but also acted as experts in the students' overall language socialization process (Duff, 2007).

Expected Outcomes

These compositions position teachers as language experts, emotional supporters and informants about the local community who help students make meaning of their lived and observed experiences. After all, learning a language means learning the social languages and the related situated meanings that are negotiated between people in communicative social interaction (Gee, 2004). In sum, the advantages of this form of assessment are the following:

i. It is a direct window into learners' grammatical ability and measures the ability to use grammatical forms to convey meaning (Purpura, 2004: 139).
ii. Teachers can provide feedback on the composition, offering learning-oriented assessment (Carless, 2007).
iii. It promotes students' reflection on their learning experience.
iv. For teachers, language coordinators and directors, the compositions provide valuable information about the grammar areas that need to be strengthened in the design of future courses.
v. It gives a vivid picture of the experiences that students go through and value, some of which are orchestrated by the program.

We propose that this type of non-intrusive written task can satisfy growing demands for accountability (Kiely, 2009) while providing information about particular areas of the instruction abroad that need to be reinforced.

References

Arnett, C. (2013). Syntactic gains in short-term study abroad. Foreign Language Annals, 46(4): 705-712.
Brown, H. D. (2004). Language Assessment: Principles and Classroom Practices. Longman.
Brown, J. D., & Hudson, T. (1998). The alternatives in language assessment. TESOL Quarterly, 32(4): 653-675.
Carless, D. (2007). Learning-oriented assessment: Conceptual bases and practical implications. Innovations in Education and Teaching International, 44(1): 57-66.
Collentine, J. (2009). Study abroad research: Findings, implications and future directions. In Long, M. H. & Doughty, C. J. (Eds.), The Handbook of Language Teaching. Oxford: Wiley-Blackwell: 218-233.
Davis, L. (2013). Language assessment in program evaluation. In Chapelle, C. A. (Ed.), The Encyclopedia of Applied Linguistics. Blackwell Publishing Ltd.: 2936-2945.
Duff, P. A. (2007). Second language socialization as sociocultural theory: Insights and issues. Language Teaching, 40: 309-319.
Gee, J. P. (2004). Learning languages as a matter of learning social languages within Discourses. In Hawkins, M. R. (Ed.), Language Learning and Teacher Education: A Sociocultural Approach (pp. 13-31). Clevedon: Multilingual Matters.
Kiely, R. (2009). Small answers to the big question: Learning from language programme evaluation. Language Teaching Research, 13(1): 99-116.
Kinginger, C. (2013). Identity and language learning in study abroad. Foreign Language Annals, 46(3): 339-358.
Mathews, T. J., & Hansen, C. M. (2004). Ongoing assessment of a university foreign language program. Foreign Language Annals, 37(4): 630-640.
Norris, J. M. (2006). The why (and how) of assessing student learning outcomes in college foreign language programs. The Modern Language Journal, 90(4): 576-583.
Norris, J. M., & Watanabe, Y. (2013). Program evaluation. In Chapelle, C. A. (Ed.), The Encyclopedia of Applied Linguistics. Cambridge: Blackwell.
Norris, J. M., Davis, J. Mc., Sinicrope, C., & Watanabe, Y. (2009). Toward Useful Program Evaluation in College Foreign Language Education. Honolulu, HI: National Foreign Language Resource Center, University of Hawai'i.
Ortega, L., & Iberri-Shea, G. (2005). Longitudinal research in second language acquisition: Recent trends and future directions. Annual Review of Applied Linguistics, 25: 26-45.
Purpura, J. (2004). Assessing Grammar. Cambridge: Cambridge University Press.
Wiggins, G. (1998). Educative Assessment: Designing Assessments to Inform and Improve Student Performance. San Francisco: Jossey-Bass.

Author Information

Asunción Martínez-Arbelaiz (presenting / submitting)
University Studies Abroad Consortium, Spain
