22 SES 11 F JS, Digital Scholarship, Metrics and Reputation
Joint Paper Session NW 12 and NW 22
Growing international competition among universities increasingly requires accreditation by renowned international agencies, such as the AACSB (The Association to Advance Collegiate Schools of Business), for which ISEG is currently seeking accreditation. One of the priority criteria for accreditation is the evaluation of scientific productivity through bibliometric indicators of articles published in renowned international journals. These journals are classified in highly reputed international indexes, such as ISI, SCOPUS and ABS. For a number of reasons discussed below, such journals are essentially mainstream, which penalises research from countries with less developed scientific communities, such as Portugal. Within their narrow discretion, the academic authorities of these countries often try to adapt those criteria to national realities and to introduce some flexibility into the evaluation of scientific production.
The evaluation conditions and career progression of ISEG's faculty also depend on the publication of their research in international journals and indexes such as those mentioned above. However, some of ISEG's faculty's scientific output falls outside those indexes, although it may prove to be of great importance for clarifying pressing problems of the Portuguese economy and business world.
In this paper we aim to assess the relative weight of ISEG's scientific output that risks being obscured by not being published in the reference indexes, and whether its content addresses relevant Portuguese issues in the fields of economics and management. We will investigate which methodologies are then followed by ISEG's responsible bodies, and in particular by its Scientific Council, to assess the quality of this non-mainstream production and validate it.
We will then compare the way ISEG classifies scientific publications that fall outside these indexes with the way it classifies those inside them, and examine how the resulting classification affects the evaluation of teachers and researchers.
Academic authorities and research policies in general often fall into ambivalent positions. The transformation that globalization has imposed on research systems requires scientific results to be increasingly harmonized and to comply with criteria and benchmarks of international recognition. The reputation of these benchmarks, such as the bibliographic indexes mentioned above, stems largely from the size of the market they serve and is greatly enhanced by the largest US and UK research and higher education institutions. Having English as their primary language favors an increase in the number of readings, downloads and citations (Solomon, 2014). This reputational process has been self-feeding on the basis of an erroneous inference: if an article is widely cited, it will have a high impact factor and hence will be considered of high quality. As the US controls a large share of the world's R&D budget, half of which is reinvested in the US itself, the citation and cross-referencing mechanism increasingly favors US research units and their values, which are then spread throughout the world (Mingers and Willmott, 2012; Altbach, 2015).
At the risk of losing access to scientific research certification and funding, particularly from external sources, the authorities and research policies of peripheral countries are forced to embrace the mainstream research path. On the other hand, they are confronted with a mismatch between research aligned with the dominant thinking and the real problems of the societies in which they live. This raises the question of the social utility of scientific research (Apple, 2014; Bornmann, Stefaner, de Moya Anegón and Mutz, 2014). The problem is further aggravated by the fact that scientific policy in such countries attempts to follow that of the dominant economies, mainly because they lack a scientific strategy of their own (Sotarauta and Kosonen, 2003).
• Altbach, Ph. G. (2015). The Tyranny of Citations. International Higher Education, e-journal (http://ejournals.bc.edu/ojs/index.php/ihe/article/viewFile/7889/7040).
• Anninos, L. (2013). Research performance evaluation: some critical thoughts on standard bibliometric indicators. Studies in Higher Education, 39(9) (http://www.tandfonline.com/doi/full/10.1080/03075079.2013.801429?scroll=top&needAccess=true).
• Apple, M. (2014). Official knowledge: democratic education in a conservative age. New York: Routledge.
• Beigel, F. (2014). Publishing from the periphery: Structural heterogeneity and segmented circuits. The evaluation of scientific publications for tenure in Argentina's CONICET. Current Sociology, 62(5), 743-765 (http://journals.sagepub.com/doi/pdf/10.1177/0011392114533977).
• Bertocchi, G., Gambardella, A., Jappelli, T., Nappi, C. and Peracchi, F. (2013). Bibliometric Evaluation vs. Informed Peer Review: Evidence from Italy. IZA DP No. 7739 (http://ftp.iza.org/dp7739.pdf).
• Bornmann, L., Mutz, R., Neuhaus, Ch. and Daniel, H.-D. (2008). Citation counts for research evaluation: standards of good practice for analyzing bibliometric data and presenting and interpreting results. Ethics in Science and Environmental Politics, 8, 93-102, doi: 10.3354/esep00084 (http://www.int-res.com/articles/esep2008/8/e008p093.pdf).
• Bornmann, L. and Leydesdorff, L. (2014). Scientometrics in a changing science landscape. EMBO Reports, 15(12), 1228-1232 (http://embor.embopress.org/content/15/12/1228).
• Bornmann, L., Stefaner, M., de Moya Anegón, F. and Mutz, R. (2014). Ranking and mapping of universities and research-focused institutions worldwide based on highly-cited papers: A visualisation of results from multi-level models. Online Information Review, 38(1), 43-58.
• Chavarro, D. A., Tang, P. and Rafols, I. (2016). Why Researchers Publish in Non-Mainstream Journals: Training, Knowledge Bridging, and Gap Filling. SWPS 2016-22. Available at SSRN.
• Falagas, M., Pitsouni, E., Malietzis, G. and Pappas, G. (2008). Comparison of PubMed, Scopus, Web of Science and Google Scholar: strengths and weaknesses. The FASEB Journal, 22(2), 338-342.
• Mingers, J. and Willmott, H. (2012). Taylorizing business school research: On the 'one best way' performative effects of journal ranking lists. Human Relations, 66(8), 1051-1073 (http://journals.sagepub.com/doi/pdf/10.1177/0018726712467048).
• Solomon, D. J. (2014). A survey of authors publishing in four megajournals. PeerJ, 2:e365 (https://doi.org/10.7717/peerj.365).
• Sotarauta, M. and Kosonen, K.-J. (2003). Institutional Capacity and Strategic Adaptation in Less Favored Regions. University of Tampere (https://smartech.gatech.edu/bitstream/handle/1853/43152/BengtAkeLundvall1.pdf).
• Zanon, B. (2012). Research quality assessment and planning journals. The Italian perspective. Università degli Studi di Trento.