09 SES 03 A, Assessment in Language Education: Early literacy, oracy and spelling
Spelling ability is a critical literacy skill of sustained concern among educators, parents and employers, as it can impact one's capacity to read (Martin-Chang, Ouellette, & Madden, 2014) and write (Daffern, Mackenzie, & Hemmings, 2017a) effectively. According to Triple Word Form Theory (TWFT), spelling in the English language is a complex linguistic process involving the integration of phonology, orthography and morphology (Daffern, 2015). Phonological processing requires awareness of spoken sounds, at the smallest speech sound (phoneme) level and at the syllable level, and is activated when encoding (spelling) words. Orthographic processing requires sensitivity to letter strings or patterns within words, including knowing plausible alternative grapheme (alphabetic letter) combinations that apply under certain conditions. Morphological processing requires sensitivity to the smallest meaningful units in words, including knowing how suffixes and prefixes attach to base words (Apel, 2014). Breakdowns in any of these linguistic processes can lead to spelling errors (Bahr, 2015).
While spelling assessments should be used to inform teaching priorities (Kohnen, Nickels, & Castles, 2009), the value of an instrument is in its capacity to precisely determine which underlying linguistic skills may be impeding spelling accuracy. A dictation task which provides a framework for ‘spelling error analysis’ can be of benefit ‘for screening, progress monitoring, and diagnostic purposes’ (Al Otaiba & Hosp, 2010, p. 4). By analysing spelling errors, it may be possible to understand which ‘cognitive strategies children are using in their spelling’, and it can ‘provide a wealth of information about children’s phonological, orthographic and morphological knowledge’ (Varnhagen, McCallum, & Burstow, 1997, p. 451).
However, when a measure uses real words only, it is difficult to confirm whether a test-taker knows the underlying linguistic rules that underpin Standard English spelling (Kohnen et al., 2009). Furthermore, spelling tests should reflect current theoretical understandings of how children learn to spell. Given that students are capable of integrating phonological, orthographic and morphological skills from the early years of learning to spell (Bahr, 2015; Daffern, 2017; Devonshire & Fluck, 2010; Garcia, Abbott, & Berninger, 2010), assessment instruments of spelling ability should include distinct measures of these three core linguistic features.
Aligning with TWFT, the Components of Spelling Test (CoST) was originally developed as a real-word dictation assessment, designed to measure phonological, orthographic and morphological skills in spelling (Daffern, 2017; Daffern, Mackenzie, & Hemmings, 2015, 2017b). However, as this measure does not include pseudo-words, a degree of caution is needed in interpreting performance results because potential influences of prior word-specific knowledge may lead to an over-estimation of one’s underlying linguistic skills in spelling (Kohnen et al., 2009).
In response to this limitation, a new dictation test, labelled the CoST: Pseudo-word Version, was developed and tested using Classical Test Theory (CTT) and Item Response Theory (IRT). CTT and IRT provide the reliability and validity measures conventionally reported in psychological test calibration (Hambleton & Jones, 1993). This paper describes the theoretical underpinnings of the CoST: Pseudo-word Version and the process of developing its items. Following computation of item difficulty, discrimination indices and item misfit statistics for the new instrument, some items were removed. Selected reliability and validity results are reported in this paper. Specifically, the following research questions are addressed:
- How is the CoST: Pseudo-word Version informed by Triple Word Form Theory?
- What are the internal consistency results for the CoST: Pseudo-word Version?
- What are the correlations in performance as measured by the CoST: Real-word Version and the CoST: Pseudo-word Version?
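The item-level screening mentioned above relied on classical item indices. As a minimal illustrative sketch (using invented 0/1 score data, not the study's), item difficulty and a discrimination index can be computed as:

```python
import numpy as np

def item_statistics(scores):
    """Classical item statistics for an (examinees x items) 0/1 matrix:
    difficulty = proportion of examinees answering correctly;
    discrimination = corrected item-total (point-biserial) correlation,
    i.e. the item is excluded from the total it is correlated with."""
    scores = np.asarray(scores, dtype=float)
    difficulty = scores.mean(axis=0)
    totals = scores.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(scores[:, j], totals - scores[:, j])[0, 1]
        for j in range(scores.shape[1])
    ])
    return difficulty, discrimination

# Hypothetical responses: 4 students x 3 items
diff, disc = item_statistics([[1, 1, 0],
                              [1, 0, 0],
                              [1, 1, 1],
                              [0, 0, 0]])
```

Items with very extreme difficulty or low discrimination are typical candidates for removal, although the study also used IRT misfit statistics, which are not reproduced here.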
Five schools from an Australian metropolitan city were selected to participate in the testing through a convenience sampling method. These schools represent a socio-economic demographic marginally higher than the national mean for Australian schools, as determined by the Index of Community Socio-Educational Advantage (ICSEA) (ACARA, 2015). In line with previous applications of the CoST: Real-word Version, it was deemed necessary to involve students in the middle and upper primary school years. The participants were 381 students (178 boys and 203 girls) from Grades 3, 4, 5 and 6, aged between 8 and 13 years. Content validity was addressed by developing items that align with the linguistic word forms underpinning Triple Word Form Theory. Like the CoST: Real-word Version, the pseudo-word version of the instrument was constructed to include three subscales: i) phonological; ii) orthographic; and iii) morphological. The internal consistency of each subscale was determined using Cronbach's alpha and separation reliability. Predictive validity was examined by computing bivariate correlations between the CoST: Real-word Version and the CoST: Pseudo-word Version for all subscales. Data were analysed in R (version 3.2.3) and SPSS (version 22.0).
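For context, Cronbach's alpha, the internal-consistency statistic used in the study, can be computed directly from an item-score matrix. A minimal sketch with hypothetical data (the study itself used R and SPSS):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (examinees x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 0/1 scores: 4 students x 3 items
alpha = cronbach_alpha([[1, 1, 1],
                        [1, 1, 0],
                        [0, 0, 0],
                        [1, 0, 0]])
```

Values approaching 1 indicate that the items in a subscale behave consistently as a group, which is the property reported for each CoST subscale below.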
A design principle for the phonological subscale [PS] is its capacity to measure accurate encoding of one-, two- and three-syllable pseudo-words containing regular phoneme-grapheme mapping; the initial design stage produced 30 items. The orthographic subscale [OS] is concerned with the sub-lexical conventions that govern the typical or constrained arrangement of letter groups (or strings of letters) within words; the initial design stage produced 39 items across two constructs. The morphological subscale [MS] is designed to determine a learner's ability to apply morphological regularities when spelling; the initial design stage produced 55 items across four constructs. Taking into account students' age profiles, two versions of the test were produced: Pseudo-word (Grades 3-4) and Pseudo-word (Grades 5-6). The number of items in each subscale was then reduced on the basis of statistical evidence. Reliability values show strong internal consistency for all three subscales in both versions of the test. For the Pseudo-word (Grades 3-4) version, Cronbach's alpha was PS=0.885, OS=0.859 and MS=0.908, while separation reliability was PS=0.882, OS=0.850 and MS=0.910. For the Pseudo-word (Grades 5-6) version, Cronbach's alpha was PS=0.892, OS=0.807 and MS=0.929, while separation reliability was PS=0.903, OS=0.791 and MS=0.918. The new test also correlates significantly (at the 0.01 level) with the real-word version, supporting predictive validity: correlation values ranged from 0.713 to 0.903, with the highest correlation between the morphological real-word and pseudo-word subscales. The CoST: Pseudo-word Version has been designed to help teachers plan spelling instruction effectively in school contexts, as well as for research purposes.
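The predictive-validity statistics above are bivariate (Pearson) correlations between subscale scores on the two test versions. As a sketch with invented subscale totals (illustration only, not the study's data):

```python
import numpy as np

# Invented subscale totals for five students (hypothetical data)
real_word = np.array([12.0, 15.0, 9.0, 20.0, 17.0])
pseudo_word = np.array([10.0, 14.0, 8.0, 19.0, 18.0])

# Pearson correlation between real-word and pseudo-word scores
r = np.corrcoef(real_word, pseudo_word)[0, 1]
```

A coefficient near 1, as in the reported 0.713-0.903 range, indicates that students who score highly on one version tend to score highly on the other.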
This novel instrument fills a gap in the research literature on spelling ability by providing the first pseudo-word dictation-based measure of students' phonological, orthographic and morphological spelling skills.
References
Al Otaiba, S., & Hosp, J. (2010). Spell it out: The need for detailed spelling assessment to inform instruction. Assessment for Effective Intervention, 36(1), 3-6.
Apel, K. (2014). A comprehensive definition of morphological awareness: Implications for assessment. Topics in Language Disorders, 34(3), 197-209. doi:10.1097/TLD.0000000000000019
Australian Curriculum, Assessment and Reporting Authority (ACARA). (2015). ICSEA: My School fact sheet. Retrieved 12 November 2016, from https://acaraweb.blob.core.windows.net/resources/About_icsea_2014.pdf
Bahr, R. (2015). Spelling strategies and word formation processes. In R. Bahr & E. Silliman (Eds.), Routledge handbook of communication disorders (pp. 193-203). London: Routledge.
Daffern, T. (2015). Helping students become linguistic inquirers: A focus on spelling. Literacy Learning: The Middle Years, 23(1), 33-39.
Daffern, T. (2017). Linguistic skills involved in learning to spell: An Australian study. Language and Education, 31(1), 307-329. doi:10.1080/09500782.2017.1296855
Daffern, T., Mackenzie, N. M., & Hemmings, B. (2015). The development of a spelling assessment tool informed by Triple Word Form Theory. Australian Journal of Language & Literacy, 38(2), 72-82.
Daffern, T., Mackenzie, N. M., & Hemmings, B. (2017a). Predictors of writing success: How important are spelling, grammar and punctuation? Australian Journal of Education, 61(1), 75-87. doi:10.1177/0004944116685319
Daffern, T., Mackenzie, N. M., & Hemmings, B. (2017b). Testing spelling: How does a dictation method measure up to a proofreading and editing format? Australian Journal of Language & Literacy, 40(1), 28-45.
Devonshire, V., & Fluck, M. (2010). Spelling development: Fine-tuning strategy-use and capitalising on the connections between words. Learning and Instruction, 20, 361-371.
Garcia, N., Abbott, R., & Berninger, V. (2010). Predicting poor, average, and superior spellers in grades 1 to 6 from phonological, orthographic, and morphological, spelling, or reading composites. Written Language and Literacy, 13(1), 61-98.
Hambleton, R. K., & Jones, R. W. (1993). Comparison of classical test theory and item response theory and their applications to test development. Educational Measurement: Issues and Practice, 39-47.
Kohnen, S., Nickels, L., & Castles, A. (2009). Assessing spelling skills and strategies: A critique of available resources. Australian Journal of Learning Difficulties, 14(1), 113-150.
Martin-Chang, S., Ouellette, G., & Madden, M. (2014). Does poor spelling equate to slow reading? The relationship between reading, spelling, and orthographic quality. Reading and Writing, 27(8), 1485-1505. doi:10.1007/s11145-014-9502-7
Varnhagen, C., McCallum, M., & Burstow, M. (1997). Is children's spelling naturally stage-like? Reading and Writing, 9, 451-481. doi:10.1023/A:1007903330463