Session Information
12 SES 13 A, Paper Session: Information Literacy and Open Research Practice
Paper Session
Contribution
The question of how users interact with and use websites, databases, and other online portals is as old as the internet itself (Zheng & Peltsverger 2015). To answer it, website administrators began early on to employ data collection technologies that record the use of a website, which can be grouped under the term web analytics. According to the still widely used definition by the Web Analytics Association from 2008, “Web analytics is the measurement, collection, analysis and reporting of web data for purposes of understanding and optimizing web usage”. In contrast to qualitative user research, web analytics is less laborious and costly to carry out. More importantly, it does not confine data collection to a limited period but continuously generates data about usage (Palomino et al. 2021). Web statistics reveal, for example, how often users visit an online portal, which pages are particularly popular, how long visitors stay on the site, and which paths they take across the website (Kaushik 2007). The collected data is measured against pre-defined key performance indicators that are directly tied to the main objective of the website and aligned with the business strategy (Hassler 2019, Jyothi 2017). This approach provides web operators not only with insights into user behaviour but also with areas for development in their web product, such as missing content or usability issues. Although web analytics is most commonly employed in online marketing, it is also used in education, in the area of digital information and reference systems offered by libraries, educational institutions, and research facilities.
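To make these standard metrics concrete: analytics tools such as Matomo (formerly Piwik, cf. Chandler & Wallace 2016, Quintel & Wilson 2020) expose visits, page views, bounce rate, and visit duration through an HTTP Reporting API. The following minimal sketch in Python shows how such figures could be retrieved; the instance URL, site ID, and token are placeholders, not values tied to any portal discussed here.

```python
# Minimal sketch: retrieving standard web metrics (visits, page views,
# bounce rate, average visit duration) from a Matomo instance via its
# HTTP Reporting API. The base URL, site ID and auth token are
# placeholders and must be replaced for a real installation.
import requests

MATOMO_URL = "https://analytics.example.org/index.php"  # hypothetical instance
PARAMS = {
    "module": "API",
    "method": "VisitsSummary.get",   # summary metrics per period
    "idSite": 1,                     # numeric site ID in Matomo
    "period": "day",
    "date": "last30",                # the last 30 days
    "format": "JSON",
    "token_auth": "YOUR_TOKEN",      # read-access API token
}

response = requests.get(MATOMO_URL, params=PARAMS, timeout=30)
response.raise_for_status()

for day, metrics in response.json().items():
    if metrics:  # days without traffic are returned as empty lists
        print(day, metrics.get("nb_visits"), metrics.get("nb_actions"),
              metrics.get("bounce_rate"), metrics.get("avg_time_on_site"))
```

By varying the period and date parameters, the same call yields daily, weekly, or monthly series, which is the raw material for the key performance indicators discussed below.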
A number of papers provide information on the use of web analytics in these settings. Some deal with operational issues such as data protection (Quintel & Wilson 2020, Chandler & Wallace 2016) or survey the use of web analytics in institutional contexts (Redkina 2018, Böhm & Rittberger 2016), whereas others describe usage scenarios for specific web portals based on web analytics (Perifanou & Economides 2022, Keil et al. 2015). Beyond the findings of these studies, various research objectives remain for web analytics in the context of educational portals. In the technological domain, work on how to measure specific indicators would be useful. Furthermore, it is critical to investigate how cross-portal insights into the use of information systems can be obtained and what benefits this might offer for user research. In the management field, studies detailing optimization cycles would be valuable, for instance by approaching the subject with contrasting case studies for various organizations. In the area of open research data, questions arise about the prerequisites for making web analytics data available for subsequent use.
In my paper, I would like to address some of the questions raised above and underpin them with a case study at DIPF | Leibniz Institute for Research and Information in Education. DIPF operates a number of infrastructures for educational research, educational practice, and the general public, including the German Education Server, which provides edited and curated online information, the "Education Research Portal", which offers literature databases, and the "Research Data Centre for Education", which processes research data for re-use. DIPF monitors and evaluates the use of these infrastructures through web analytics. To implement web analytics efficiently across portals and thereby gain insights into the web use behaviour of stakeholder groups in education, I propose that web analytics research become an open research practice. Sharing web analytics data, transparent descriptions of data collection techniques, and an open and collaborative culture are required to expand research beyond individual case studies. By offering insights into web analytics practice at DIPF and showing cross-portal collaboration, I aim to exemplify open research in action.
Method
Methodologically, the paper draws on the literature on web analytics and open research practice. The analysis is supported by examples of good practice from DIPF, whose open research practice in the domain of web analytics serves as the case study here. The paper highlights the benefits and constraints of web analytics as an open research practice along the analytics process, which consists of four steps: collection, measurement, analysis, and reporting. I will discuss the individual phases using real examples from DIPF.

There are two basic approaches to collecting data: page tagging and log file analysis (Zheng & Peltsverger 2015). The different collection methods result in different counting methods and data. These discrepancies, which ultimately lead to compatibility issues, must be taken into account when data is exchanged. In this context, I will also make some suggestions for making web analytics data more open and reusable, an area where, unlike for research based on social media data (Bayer et al. 2021), such guidance has so far been largely absent. The paper then addresses the software that is typically used for the full web analytics process (Kaushik 2007). I will contrast the benefits and drawbacks of two prevalent systems, as well as data privacy challenges and open source alternatives.

In terms of measurement and analysis, I will emphasize the significance of portal-specific key performance indicators as opposed to common metrics such as visitors, page views, dwell time, and bounce rate. Using the DIPF portals as an example, I will also demonstrate how a development process for key performance indicators, their implementation, and their use as altmetrics for performance evaluation might proceed, and what insights can be derived from indicators that are closely linked to a portal's overall goal.

I will address the final phase, reporting, in light of performance measurement on the one hand and optimization on the other, linking it to various methods of website review. Finally, I will discuss what further actions are needed to foster an open culture in the context of user research with web analytics. Collaborations, I propose, should be explored as a foundation for sharing data and knowledge, as should more practice-oriented forms of publication that make operational knowledge available. Overall, it is vital to invest in participatory and collaborative processes and to balance the benefits and drawbacks of a sharing culture for web analytics.
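To illustrate the log-file-analysis approach to data collection named above, the following sketch reconstructs page views, visits, and a simple bounce rate from a web server access log in Combined Log Format. The 30-minute session timeout, the filtering rules, and the file name are simplifying assumptions for illustration rather than DIPF's actual counting rules; they also hint at why log-based and tag-based figures are rarely directly comparable.

```python
# Minimal sketch of log file analysis: page views and visits are
# reconstructed from a web server access log (Combined Log Format).
# Sessionisation rule (30 minutes of inactivity ends a visit), filtering
# and file name are illustrative assumptions only.
import re
from datetime import datetime, timedelta
from collections import defaultdict

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)
SESSION_TIMEOUT = timedelta(minutes=30)

def parse_line(line):
    """Extract visitor key, timestamp, path and status from one log line."""
    m = LOG_PATTERN.match(line)
    if not m:
        return None
    ts = datetime.strptime(m.group("time").split()[0], "%d/%b/%Y:%H:%M:%S")
    return m.group("ip"), m.group("agent"), ts, m.group("path"), m.group("status")

page_views = 0
sessions = defaultdict(list)          # (ip, agent) -> list of visit lengths
last_seen = {}                        # (ip, agent) -> timestamp of last hit

with open("access.log") as log:       # hypothetical log file
    for line in log:
        parsed = parse_line(line)
        if not parsed:
            continue
        ip, agent, ts, path, status = parsed
        if status != "200" or path.endswith((".css", ".js", ".png", ".ico")):
            continue                  # count only successful content requests
        page_views += 1
        key = (ip, agent)
        if key not in last_seen or ts - last_seen[key] > SESSION_TIMEOUT:
            sessions[key].append(0)   # start a new visit
        sessions[key][-1] += 1
        last_seen[key] = ts

visits = sum(len(v) for v in sessions.values())
bounces = sum(1 for v in sessions.values() for length in v if length == 1)
if visits:
    print(f"page views: {page_views}, visits: {visits}, "
          f"bounce rate: {bounces / visits:.1%}")
```

A page-tagging tool applies its own, partly different rules (client-side JavaScript, bot filtering, cookie-based visitor identification), which is exactly the kind of discrepancy that needs to be documented transparently before data can be exchanged across portals.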
Expected Outcomes
The paper highlights the opportunities and challenges of opening up research with web analytics. One of the major challenges is to enable access to web analytics data that is open and at the same time protected, taking into account both data protection and the dynamic nature of data that is newly generated every day. Also important is the development of web analytics indicators that reflect portal specifics and can thus be used to measure success, but that also permit comparisons between different websites. To further the discipline, research on metrics should be encouraged, for example on how to accurately quantify reading on a page or how to better track and assess on-site searches using open analytics tools. An important organizational and technological issue is reconciling web analytics requirements with the constraints of open source tools, and developing and testing solutions where this is not possible. These obstacles are countered by the benefits of opening up, which are similar to the prospects offered by Open Science in general (Pampel & Dallmeier-Tiessen 2014). The publication of organizational and technical procedures may result in improved infrastructure practices, not only for the use of web analytics but also for the infrastructures themselves, because user research based on online data can be employed to optimize infrastructures (Beasley 2013). I propose that by making processes and data openly available, the research output might increase, which can lead to a better understanding of information behaviour in the educational field. Furthermore, web analytics data can be integrated with or contrasted against other data sources, such as social media data, to create a cross-media picture of educational information activities.
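As one possible direction for the open yet protected access described above, the following sketch shows a common mitigation: direct identifiers are truncated and only aggregated daily counts are published. The column names, file paths, and the use of truncated IP addresses as a rough visitor proxy are assumptions for illustration, not a description of DIPF practice.

```python
# Minimal sketch of preparing hit-level web analytics data for sharing:
# IP addresses are truncated (similar to the IP anonymisation offered by
# tools such as Matomo) and only aggregated daily figures are written out.
# Input columns (date, ip, url) and file names are illustrative assumptions.
import csv
from collections import Counter, defaultdict

def truncate_ip(ip: str) -> str:
    """Zero the last IPv4 octet, e.g. 192.0.2.17 -> 192.0.2.0."""
    parts = ip.split(".")
    return ".".join(parts[:3] + ["0"]) if len(parts) == 4 else "0.0.0.0"

page_views = Counter()                 # (date, url) -> page views
visitors = defaultdict(set)            # date -> set of truncated IPs

with open("raw_hits.csv", newline="") as f:      # hypothetical hit-level export
    for row in csv.DictReader(f):                # expected columns: date, ip, url
        page_views[(row["date"], row["url"])] += 1
        visitors[row["date"]].add(truncate_ip(row["ip"]))

with open("open_pageviews.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "url", "page_views"])
    for (date, url), count in sorted(page_views.items()):
        writer.writerow([date, url, count])

with open("open_visitors.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "approx_visitors"])
    for date, ips in sorted(visitors.items()):
        writer.writerow([date, len(ips)])
```

The aggregated files contain no personal data and can be refreshed automatically, which addresses both the data protection concern and the daily dynamics of the data.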
References
Bayer, S., Breuer, J., Lösch, T. & Goebel, J. W. (2021). Nutzung von Social-Media-Daten in der Bildungsforschung. forschungsdaten bildung informiert 9, Version 1. https://www.forschungsdaten-bildung.de/files/fdb-informiert-nr-9.pdf (23/01/23)
Beasley, M. (2013). Practical Web Analytics for User Experience: How Analytics Can Help You Understand Your Users. Waltham: Morgan Kaufmann.
Chandler, A. & Wallace, M. (2016). Using Piwik Instead of Google Analytics at the Cornell University Library. The Serials Librarian, 71(3-4), 173-179. https://doi.org/10.1080/0361526X.2016.1245645
Dragoş, S.-M. (2011). Why Google Analytics cannot be used for educational web content. 2011 7th International Conference on Next Generation Web Services Practices, Salamanca, Spain, pp. 113-118. https://doi.org/10.1109/NWeSP.2011.6088162
Hassler, M. (2019). Digital und Web Analytics: Metriken auswerten, Besucherverhalten verstehen, Website optimieren. Frechen: mitp Business.
Jyothi, P. (2017). A Study on Raise of Web Analytics and its Benefits. International Journal of Computer Sciences and Engineering, 5, 61-66.
Kaushik, A. (2007). Web Analytics: An Hour a Day. Indianapolis, Ind.: Wiley.
Keil, S., Böhm, P. & Rittberger, M. (2015). Qualitative web analytics: New insights into navigation analysis and user behavior - a case study of the German Education Server. In: Pehar, F. et al. (eds.): Re:inventing Information Science in the Networked Society. Glückstadt: Hülsbusch, pp. 252-263.
Palomino, F., Paz, F. & Moquillaza, A. (2021). Web Analytics for User Experience: A Systematic Literature Review. In: Soares, M.M., Rosenzweig, E. & Marcus, A. (eds.): Design, User Experience, and Usability: UX Research and Design. HCII 2021. Lecture Notes in Computer Science, vol. 12779. Cham: Springer. https://doi.org/10.1007/978-3-030-78221-4_21
Pampel, H. & Dallmeier-Tiessen, S. (2014). Open Research Data: From Vision to Practice. In: Bartling, S. & Friesike, S. (eds.): Opening Science. https://link.springer.com/book/10.1007/978-3-319-00026-8
Perifanou, M. & Economides, A.A. (2022). Analyzing repositories of OER using web analytics and accessibility tools. Universal Access in the Information Society. https://doi.org/10.1007/s10209-022-00907-6
Quintel, D. & Wilson, R. (2020). Analytics and Privacy: Using Matomo in EBSCO's Discovery Service. Information Technology and Libraries, 39(3). https://doi.org/10.6017/ital.v39i3.12219
Redkina, N.S. (2018). Library Sites as Seen through the Lens of Web Analytics. Automatic Documentation and Mathematical Linguistics, 52, 91-96. https://doi.org/10.3103/S0005105518020073
Web Analytics Association (2008). Web Analytics Definitions. https://www.slideshare.net/leonaressi/waa-web-analytics-definitions (23/01/12)
Zheng, G. & Peltsverger, S. (2015). Web Analytics Overview. https://doi.org/10.4018/978-1-4666-5888