Session Information
09 SES 11 C, Issues in Computer-Based Assessment
Paper Session
Contribution
A long-standing principle of measuring change is not to change the measure. However, it is now accepted that changes in methods and content are necessary for assessments to remain relevant (von Davier & Mazzeo, 2009). This is especially the case when studying changes in the digital competence of students (called ICT literacy in Australia and Computer and Information Literacy by the IEA). Digital competence is one of the competences for lifelong learning articulated by the European Commission (2006).
This paper reports on the computer-based assessments of ICT literacy conducted in Australia in 2005 and 2008, the methods used to equate those assessments, and the measurement of change over that three-year period (MCEETYA, 2007). ICT literacy is seen as a set of generalisable and transferable knowledge, skills and understandings concerned with the use of computer technology to investigate, create and communicate information in a variety of contexts (Catts & Lau, 2007; ETS, 2002).
The assessments of ICT literacy in Australia were conducted in Grades 6 and 10 with large, nationally representative samples. The assessment was computer-based and combined the performance of specific software functions with the creation of digital products in a rotated set of thematic modules. In 2005 each student completed three of the seven modules, generating more than 200 score points. In 2008 the appearance of the material and the method of response were the same as in 2005. The 2008 assessment included three modules from 2005 plus four new modules; each student completed two of the trend modules and two of the new modules. The 2008 tasks included new software contexts but relied on the same fundamental ICT receptive, productive and evaluative processes as the 2005 tasks. In both 2005 and 2008 students completed a questionnaire about their use of ICT at school and at home. For the 2008 cycle some innovations in delivery methods were introduced without altering the assessment experience.
For each cycle IRT methods (the Rasch model) were used to analyse the student responses and generate a scale for locating items from each module and reporting student achievement. The scale was unidimensional and reliable (reliability 0.93). The logit scale was transformed to a reporting scale set to a mean of 400 and a standard deviation of 100 for Grade 6 students in 2005. The scale was characterised by descriptions of proficiency levels based on item difficulties. A proficient standard for each Grade was established by a panel of ICT education experts.
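The transformation from the logit metric to the reporting scale described above is a linear rescaling anchored on the reference group (Grade 6, 2005). A minimal sketch, using hypothetical logit values rather than the study's data:

```python
# Linear transformation of Rasch logit estimates onto a reporting scale
# with mean 400 and standard deviation 100 in the reference group.
# The logit values below are illustrative, not from the study.
from statistics import mean, stdev

def to_reporting_scale(logits, ref_mean, ref_sd,
                       target_mean=400.0, target_sd=100.0):
    """Map logit scores onto the reporting metric via y = a*x + b."""
    a = target_sd / ref_sd
    b = target_mean - a * ref_mean
    return [a * x + b for x in logits]

# Hypothetical Grade 6 (2005) reference calibration
ref_logits = [-1.2, -0.4, 0.0, 0.5, 1.1]
scaled = to_reporting_scale(ref_logits, mean(ref_logits), stdev(ref_logits))
```

By construction, applying the same transformation constants to later cycles (or to Grade 10) preserves comparability with the 2005 Grade 6 anchor.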
Thirty-seven items were used to compare the relative performance of the Grade 6 and Grade 10 students, and 39 items from the trend modules were used to link the 2008 results to those from 2005. Comparisons of achievement over time in such a rapidly developing field were made possible through instruments that reflected relevant technological changes while maintaining the integrity of the core processes of the ICT literacy construct.
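A common-item link of this kind can be illustrated with a mean/sigma transformation of item difficulties, one standard equating method (cf. Kolen, 1999). The paper does not specify this particular procedure, and the difficulty values below are hypothetical:

```python
# Mean/sigma common-item linking: place 2008 item difficulties on the
# 2005 scale using the trend items calibrated in both cycles.
# All difficulty values are hypothetical, for illustration only.
from statistics import mean, stdev

def mean_sigma_link(b_new, b_old):
    """Return constants (A, B) so that A*b + B maps the new
    calibration onto the old scale, matching mean and spread."""
    A = stdev(b_old) / stdev(b_new)
    B = mean(b_old) - A * mean(b_new)
    return A, B

b_2005 = [-0.8, -0.2, 0.3, 0.9]   # trend-item difficulties, 2005 run
b_2008 = [-0.6, -0.1, 0.4, 1.0]   # same items, 2008 calibration
A, B = mean_sigma_link(b_2008, b_2005)
linked_2008 = [A * b + B for b in b_2008]
```

Once the linking constants are found from the common items, the same transformation is applied to all 2008 estimates, putting both cycles on one scale for measuring change.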
The principles and experience of the ICT literacy assessment will inform the development of the IEA International Computer Information Literacy Study (ICILS).
Method
Expected Outcomes
References
Catts, R., & Lau, J. (2008). Towards Information Literacy Indicators. Paris: UNESCO.
European Commission (2006). Key Competences for Lifelong Learning. Official Journal of the European Union. Brussels: Author.
Kolen, M. (1999). Equating of tests. In G. Masters & J. Keeves (Eds.), Advances in Measurement in Educational Research and Assessment (pp. 164-175). New York: Pergamon.
Ministerial Council for Education, Employment, Training and Youth Affairs (MCEETYA) (2007). National Assessment Program - ICT Literacy Years 6 & 10 Report. Carlton: Curriculum Corporation.
von Davier, M., & Mazzeo, J. (2009). Review of PISA Test Design: Recommendations for Fostering Stability in Assessment Results. Paper presented at the PISA Research Conference, Kiel, Germany.