Session Information
ERG SES E 07, Education Practices
Paper Session
Contribution
Across the national and international post-secondary landscape, the trend towards collecting and analyzing multiple forms of data is evident (Black, 2010; Bresciani, 2006; Gottheil & Smith, 2011; Hollowell, Middaugh, & Sibolski, 2006; Laidler, 2005; Middaugh, 2010; Steele, 2010). This trend has emerged within the broader context of internationalization and globalization across post-secondary institutions (Trilokekar, Shanahan, Axelrod, & Wellen, 2013). Scrutiny of, and emphasis on, accountability from internal and external stakeholders, including governments, agencies, and the general public, are increasing; these stakeholders expect evidence of improved graduation and retention rates and of enhanced student success, including career transition (Bresciani, 2006; Hollowell et al., 2006; Schuh, 2009). Kuk, Banning, and Amey (2010) argued that this trend will continue: “As accountability and metrics assume a greater role in institutional decision making, emphasis on being able to measure and to justify the effectiveness of existing organizations, and their programs and services, will increase” (Kuk et al., 2010, p. 204). This emphasis on data is especially important in an environment of increasing competition for scarce resources (Kuk et al., 2010) and of increasing pressure on higher education institutions to pursue an internationalization agenda and, therefore, to compete on an international scale (Austin & Jones, 2016).
The purpose of this qualitative meta-synthesis (Finlayson & Dixon, 2008; Zimmer, 2006) is to examine the assessment imperative to which universities across the globe must respond. The researcher will use a critical lens to examine the current state of the assessment agenda in higher education institutions. First, this paper outlines, from national and international perspectives, the types of assessment that institutions are expected to engage in and the purposes those types of assessment serve. Second, the nature of the global dialogue regarding comparative institutional data is examined critically. Third, the researcher highlights how particular assessment projects can be leveraged to inform decision making within individual institutions and across institutions. Lastly, the researcher proposes how understanding the assessment landscape in the post-secondary environment can assist institutions in preparing for emerging global trends, in meeting external and internal stakeholders’ calls for accountability and transparency, and in responding to emergent issues.
This focused and critical discussion is especially important in meeting the demands of multiple stakeholders who have different agendas. Hollowell et al. (2006) contended that the “escalating emphasis on accountability is related, in part, to perceptions that colleges and universities do not plan carefully or assess their effectiveness” (p. 3). Middaugh (2010) pointed out that this type of assessment is “different from scholarly research, more akin to action research. The primary objective of institutional assessment is to produce information that can be used in decision making and institutional improvement” (p. 124). In addition to measuring student performance, the data can be used to improve instructional practices, to gather feedback on program outcomes, and to enhance the student experience. Not only must data about the current state of the institution be collected and analyzed, but it must also be used to inform subsequent planning and action. Hollowell et al. (2006) asserted: “Quantitative and qualitative information about all facets of a college or university’s operations—and how they relate to the institutional mission—is absolutely essential to good planning” (p. 69). Given this context of increasing pressure to measure and report on progress, financial sustainability, student outcomes, and education and research outputs, universities need to develop a thoughtful, strategic approach to meeting these demands. The paper explores the multiple assessment demands through a critical lens in order to address the question of how universities can develop an assessment agenda that meets the most compelling data demands in a globally competitive environment.
Method
Expected Outcomes
References
Austin, I., & Jones, G. A. (2016). Governance of higher education: Global perspectives, theories, and practices. New York, NY: Routledge.
Black, J. (Ed.). (2010). Strategic enrolment intelligence. London, ON: Academica Group.
Bresciani, M. J. (2006). Outcomes-based academic and co-curricular program review: A compilation of institutional good practices. Sterling, VA: Stylus Publishing.
Finlayson, K., & Dixon, A. (2008). Qualitative meta-synthesis: A guide for the novice. Nurse Researcher, 15(2), 59-71.
Gewurtz, R., Stergiou-Kita, M., Shaw, L., Kirsh, B., & Rappolt, S. (2008). Qualitative meta-synthesis: Reflections on the utility and challenges in occupational therapy. Canadian Journal of Occupational Therapy, 75(5), 301-308.
Gottheil, S., & Smith, C. (2011). SEM in Canada: Promoting student and institutional success in Canadian colleges and universities. Washington, DC: American Association of Collegiate Registrars and Admissions Officers.
Hollowell, D., Middaugh, M. F., & Sibolski, E. (2006). Integrating higher education planning and assessment: A practical guide. Ann Arbor, MI: Society for College and University Planning.
Kuk, L., Banning, J. H., & Amey, M. J. (2010). Positioning student affairs for sustainable change: Achieving organizational effectiveness through multiple perspectives. Sterling, VA: Stylus Publishing.
Laidler, D. (2005). Incentives facing Canadian universities: Some possible consequences. In C. M. Beach, R. W. Boadway, & R. M. McInnis (Eds.), Higher education in Canada (pp. 35–49). Kingston, ON: John Deutsch Institute for the Study of Economic Policy.
Middaugh, M. F. (2010). Planning and assessment in higher education: Demonstrating institutional effectiveness. San Francisco, CA: Jossey-Bass.
Saldaña, J. (2013). The coding manual for qualitative researchers (2nd ed.). Thousand Oaks, CA: Sage.
Schuh, J. H., & Associates. (2009). Assessment methods for student affairs. San Francisco, CA: Jossey-Bass.
Steele, K. (2010). The changing Canadian PSE landscape. In J. Black (Ed.), Strategic enrolment intelligence (pp. 27–50). London, ON: Academica Group.
Thorne, S., Jensen, I. A., Kearney, M. H., Noblit, G. W., & Sandelowski, M. (2004). Qualitative metasynthesis: Reflections on methodological orientation and ideological agenda. Qualitative Health Research, 14, 1342-1365.
Trilokekar, R. D., Shanahan, T., Axelrod, P., & Wellen, R. (2013). Making post-secondary education policy: Toward a conceptual framework. In P. Axelrod, R. D. Trilokekar, T. Shanahan, & R. Wellen (Eds.), Making policy in turbulent times: Challenges and prospects for higher education (pp. 33-58). Montreal, QC: McGill-Queen’s University Press.
Zimmer, L. (2006). Qualitative meta-synthesis: A question of dialoguing with texts. Journal of Advanced Nursing, 53(3), 311-318.