Chapter 14: Meta-research and researcher evaluation
Open access

Meta-research (also known as ‘research on research’ and the ‘science of science’) is a broad field of inquiry that seeks to understand how research is performed and to guide improvements in the systems and policies for its conduct and management. Research evaluation has become increasingly entwined with meta-research, especially the branch of the field dealing with the quantitative analysis of research performance at the level of researchers, research groups, institutions, and journals. In this chapter we focus on researcher-level evaluation and examine how quantitative and qualitative approaches have been developed to conceptualise and evidence the achievements of individual researchers, with a particular focus on diverse indicators and the evaluation of societal impact.
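As an illustration of the researcher-level indicators the chapter discusses, the h-index (Hirsch, 2005, listed below) is defined as the largest number h such that a researcher has h publications each cited at least h times. A minimal sketch of the computation, assuming only a list of per-paper citation counts as input:

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    each have at least h citations (Hirsch, 2005)."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        # A paper at rank r (1-based, sorted descending) contributes
        # to the h-index only if it has at least r citations.
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A researcher with papers cited [10, 8, 5, 4, 3] has h-index 4:
# four papers have at least 4 citations each, but not five with >= 5.
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

The sketch also makes visible a limitation raised in several of the references below (e.g. Waltman and van Eck, 2012; Rowlands, 2018a): the index discards both highly cited outliers and the long tail of lightly cited work, so very different citation distributions can yield the same h.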

  • AESIS. (n.d.) https://aesisnet.com/ accessed 23 September 2023.

  • Baas, J., Schotten, M., Plume, A., Côté, G. and Karimi, R. (2020). Scopus as a curated, high-quality bibliometric data source for academic research in quantitative science studies. Quantitative Science Studies, 1(1), 377–386. https://doi.org/10.1162/qss_a_00019.

  • Baldwin, M. (2018). Scientific autonomy, public accountability, and the rise of ‘peer review’ in the cold war United States. ISIS, 109(3), 538–558. https://doi.org/10.1086/700070.

  • Barnes, C. (2017). The h-index Debate: An Introduction for Librarians. Journal of Academic Librarianship, 43(6), 487–494. https://doi.org/10.1016/j.acalib.2017.08.013.

  • Bernal, J. D. (1939). The Social Function of Science. George Routledge and Sons.

  • Bhandari, M., Guyatt, G. H., Kulkarni, A. V., Devereaux, P. J., Leece, P., Bajammal, S., Heels-Ansdell, D. and Busse, J. W. (2014). Perceptions of authors’ contributions are influenced by both byline order and designation of corresponding author. Journal of Clinical Epidemiology, 67(9), 1049–1054. https://doi.org/10.1016/j.jclinepi.2014.04.006.

  • Bloch, C., Sørensen, M. P. and Young, M. (2019). Tales of serendipity in highly cited research: an explorative study. Journal of the Knowledge Economy, 1(18). https://doi.org/10.1007/s13132-019-00625-0.

  • Bornmann, L. (2013). What is societal impact of research and how can it be assessed? A literature survey. Journal of the American Society for Information Science and Technology, 64(2), 217–233. https://doi.org/10.1002/asi.22803.

  • Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903. https://doi.org/10.1016/j.joi.2014.09.005.

  • Bornmann, L., Haunschild, R. and Marx, W. (2016). Policy documents as sources for measuring societal impact: how often is climate change research mentioned in policy-related documents? Scientometrics, 109(3), 1477–1495. https://doi.org/10.1007/s11192-016-2115-y.

  • Bornmann, L., Tekles, A., Zhang, H. H. and Ye, F. Y. (2019). Do we measure novelty when we analyze unusual combinations of cited references? A validation study of bibliometric novelty indicators based on F1000 Prime data. Journal of Informetrics, 13(4), 100979. https://doi.org/10.1016/j.joi.2019.100979.

  • Bornmann, L. and Marewski, J. N. (2019). Heuristics as conceptual lens for understanding and studying the usage of bibliometrics in research evaluation. Scientometrics, 120(2), 419–459. https://doi.org/10.1007/s11192-019-03018-x.

  • Braun, T. (2012). Editorial. Scientometrics, 92(2), 207–208. https://doi.org/10.1007/s11192-012-0754-1.

  • Burrows, C. J., Huang, J., Wang, S., Kim, H. J., Meyer, G. J., Schanze, K., Lee, T. R., Lutkenhaus, J. L., Kaplan, D., Jones, C., Bertozzi, C., Kiessling, L., Mulcahy, M. B., Lindsley, C. W., Finn, M. G., Blum, J. D., Kamat, P., Choi, W., Snyder, S. and Aldrich, C. C. (2020). Confronting Racism in Chemistry Journals. Journal of the American Chemical Society, 142(26), 11319–11321. https://doi.org/10.1021/jacs.0c06516.

  • Ciarli, T. and Ràfols, I. (2019). The relation between research priorities and societal demands: The case of rice. Research Policy, 48(4), 949–967. https://doi.org/10.1016/j.respol.2018.10.027.

  • Conroy, G. (2020) What’s wrong with the h-index, according to its inventor. Nature Index. https://www.natureindex.com/news-blog/whats-wrong-with-the-h-index-according-to-its-inventor accessed 23 September 2023.

  • Dahler-Larsen, P. (2011). The Evaluation Society. Stanford University Press.

  • Derrick, G., Faria, R., Benneworth, P., Budtz Pedersen, D. and Sivertsen, G. (2018a). Towards characterising negative impact: Introducing Grimpact. STI 2018 Conference Proceedings, 1199–1213.

  • Derrick, G. (2018b). Evaluation Mechanics. In The Evaluators’ Eye (pp. 57–94). Springer International Publishing. https://doi.org/10.1007/978-3-319-63627-6_3.

  • Derrick, G. (2020). Missing a grant should be a beginning, not an end. Research Professional News. https://www.researchprofessionalnews.com/rr-news-political-science-blog-2020-2-missing-a-grant-should-be-a-beginning-not-an-end/ accessed 23 September 2023.

  • Doan, D. and Knight, B. (2020). Measuring What Matters. Global Fund for Community Foundations. https://globalfundcommunityfoundations.org/resources/measuring-what-matters/ accessed 23 September 2023.

  • DORA. (2013). https://sfdora.org/ accessed 23 September 2023.

  • Douglas, D. R. B., Grant, J. and Wells, J. (2020). Advancing University Engagement: University engagement and global league tables. Nous Group. https://nousgroup.com/wp-content/uploads/2020/07/Engagement-Report-Digital.pdf accessed 23 September 2023.

  • De Kleijn, M., Jayabalasingham, B., Falk-Krzesinski, H. J., Collins, T., Kuiper-Hoyng, L., Cingolani, I., Zhang, J., Roberge, G. et al. (2020). The Researcher Journey Through a Gender Lens: An Examination of Research Participation, Career Progression and Perceptions Across the Globe. Elsevier. www.elsevier.com/gender-report accessed 23 September 2023.

  • Encyclopedia.com. (n.d.). Quantification. International Encyclopedia of the Social Sciences. https://www.encyclopedia.com/social-sciences/applied-and-social-sciences-magazines/quantification accessed 23 September 2023.

  • Falagas, M. E., Kouranos, V. D., Arencibia-Jorge, R. and Karageorgopoulos, D. E. (2008). Comparison of SCImago journal rank indicator with journal impact factor. The FASEB Journal, 22(8), 2623–2628. https://doi.org/10.1096/fj.08-107938.

  • Ferretti, F., Pereira, Â. G., Vértesy, D. and Hardeman, S. (2018). Research excellence indicators: Time to reimagine the ‘making of’? Science and Public Policy, 45(5), 731–741. https://doi.org/10.1093/SCIPOL/SCY007.

  • Fontana, M., Iori, M., Montobbio, F. and Sinatra, R. (2020). New and atypical combinations: An assessment of novelty and interdisciplinarity. Research Policy, 49(7), 104063. https://doi.org/10.1016/j.respol.2020.104063

  • Fortunato, S., Bergstrom, C. T., Börner, K., Evans, J. A., Helbing, D., Milojević, S., Petersen, A. M., Radicchi, F., Sinatra, R., Uzzi, B., Vespignani, A., Waltman, L., Wang, D. and Barabási, A. L. (2018). Science of science. Science, 359(6379), eaao0185. https://doi.org/10.1126/science.aao0185.

  • Garfield, E. (1955). Citation indexes for science. Science, 122(3159), 108–111. https://doi.org/10.1126/science.122.3159.108.

  • Garfield, E. (2006). The history and meaning of the journal impact factor. Journal of the American Medical Association, 295(1), 90–93. https://doi.org/10.1001/jama.295.1.90.

  • Gasson, K., Herbert, R. and Ponsford, A. (2019). Fractional Authorship & Publication Productivity. ICSR Perspectives. https://www.elsevier.com/icsr/perspectives/fractional-authorship-and-publication-productivity accessed 23 September 2023.

  • Greenhalgh, T., Raftery, J., Hanney, S. and Glover, M. (2016). Research impact: A narrative review. BMC Medicine, 14(1), 78. https://doi.org/10.1186/s12916-016-0620-8.

  • Grove, J. (2017). Nobel winners share tips on their success. Times Higher Education. https://www.insidehighered.com/news/2017/10/12/nobel-winners-share-tips-their-success accessed 23 September 2023.

  • Gumpenberger, C., Hölbling, L. and Gorraiz, J. I. (2018). On the Issues of a “Corresponding Author” Field-Based Monitoring Approach for Gold Open Access Publications and Derivative Cost Calculations. Frontiers in Research Metrics and Analytics, 3(1). https://doi.org/10.3389/frma.2018.00001.

  • Hammarfelt, B. and Rushforth, A. D. (2017). Indicators as judgment devices: An empirical study of citizen bibliometrics in research evaluation. Research Evaluation, 26(3), 169–180. https://doi.org/10.1093/reseval/rvx018.

  • Helmer, S., Blumenthal, D. B. and Paschen, K. (2020). What is meaningful research and how should we measure it? Scientometrics, 125(1), 153–169. https://doi.org/10.1007/s11192-020-03649-5.

  • Herbert, R. (2020). Accept me, accept me not: What do journal acceptance rates really mean? ICSR Perspectives. https://www.elsevier.com/icsr/perspectives/accept-me-accept-me-not.

  • Hessels, L. K., van Lente, H. and Smits, R. (2009). In search of relevance: The changing contract between science and society. Science and Public Policy, 36(5), 387–401. https://doi.org/10.3152/030234209X442034.

  • Hicks, D., Wouters, P., Waltman, L., De Rijcke, S. and Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429–431. https://doi.org/10.1038/520429a.

  • Hicks, D. and Isett, K. R. (2020). Powerful numbers: Exemplary quantitative studies of science that had policy impact. Quantitative Science Studies, 1(3), 969–982. https://doi.org/10.1162/qss_a_00060.

  • Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences of the United States of America, 102(46), 16569–16572. https://doi.org/10.1073/pnas.0507655102.

  • Hirsch, J. E. and Buela-Casal, G. (2014). The meaning of the h-index. International Journal of Clinical and Health Psychology, 14(2), 161–164. https://doi.org/10.1016/S1697-2600(14)70050-X.

  • Hirsch, J. E. (2020). Superconductivity, what the H? The emperor has no clothes. ArXiv. http://arxiv.org/abs/2001.09496.

  • Holbrook, J. B. (2019). Designing responsible research and innovation to encourage serendipity could enhance the broader societal impacts of research. Journal of Responsible Innovation, 6(1), 84–90. https://doi.org/10.1080/23299460.2017.1410326.

  • Hug, S. E. and Aeschbach, M. (2020). Criteria for assessing grant applications: a systematic review. Palgrave Communications, 6, 37. https://doi.org/10.1057/s41599-020-0412-9.

  • Ioannidis, J. P. A., Boyack, K. W. and Klavans, R. (2014). Estimates of the Continuously Publishing Core in the Scientific Workforce. PLoS ONE, 9(7), e101698. https://doi.org/10.1371/journal.pone.0101698.

  • James, C., Colledge, L., Meester, W., Azoulay, N. and Plume, A. (2019). CiteScore metrics: Creating journal metrics from the Scopus citation index. Learned Publishing, 32(4), 367–374. https://doi.org/10.1002/leap.1246.

  • Jump, P. (2015). REF 2014: impact element cost £55 million. Times Higher Education. https://www.timeshighereducation.com/news/ref-2014-impact-element-cost-55-million/2019439.article accessed 23 September 2023.

  • Kim, J. and Kim, J. (2015). Rethinking the comparison of coauthorship credit allocation schemes. Journal of Informetrics, 9(3), 667–673. https://doi.org/10.1016/j.joi.2015.07.005.

  • King’s College London and Digital Science (2015). The nature, scale and beneficiaries of research impact. https://www.kcl.ac.uk/policy-institute/assets/ref-impact.pdf accessed 23 September 2023.

  • Kousha, K. and Thelwall, M. (2017). Patent citation analysis with Google. Journal of the Association for Information Science and Technology, 68(1), 48–61. https://doi.org/10.1002/asi.23608.

  • Kryl, D., Allen, L., Dolby, K., Sherbon, B. and Viney, I. (2012). Tracking the impact of research on policy and practice: Investigating the feasibility of using citations in clinical guidelines for research evaluation. BMJ Open, 2(2), e000897. https://doi.org/10.1136/bmjopen-2012-000897.

  • Kudos. (2020). Researchers’ needs in maximizing broader impacts of research. https://info.growkudos.com/btd-summary-dl accessed 23 September 2023.

  • Kuhn, T. S. (1962). The Structure of Scientific Revolutions. University of Chicago Press.

  • Leydesdorff, L., Wouters, P. and Bornmann, L. (2016). Professional and citizen bibliometrics: complementarities and ambivalences in the development and use of indicators—a state-of-the-art report. Scientometrics, 109(3), 2129–2150. https://doi.org/10.1007/s11192-016-2150-8.

  • Leyser, O. (2020). CaSE Annual Lecture 2020. CaSE Events. https://www.sciencecampaign.org.uk/engaging-with-policy/events/case-annual-lecture-2020.html accessed 23 September 2023.

  • Louder, E., Wyborn, C., Cvitanovic, C. and Bednarek, A. T. (2021). A synthesis of the frameworks available to guide evaluations of research impact at the interface of environmental science, policy and practice. Environmental Science and Policy, 116, 258–265. https://doi.org/10.1016/j.envsci.2020.12.006.

  • Manh, H. D. (2015). Scientific publications in Vietnam as seen from Scopus during 1996–2013. Scientometrics, 105(1), 83–95. https://doi.org/10.1007/s11192-015-1655-x.

  • McKiernan, E. C., Schimanski, L. A., Nieves, C. M., Matthias, L., Niles, M. T. and Alperin, J. P. (2019). Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations. PeerJ Preprints. https://doi.org/10.7287/peerj.preprints.27638v2.

  • McNutt, M. K., Bradford, M., Drazen, J. M., Hanson, B., Howard, B., Jamieson, K. H., Kiermer, V., Marcus, E., Pope, B. K., Schekman, R., Swaminathan, S., Stang, P. J. and Verma, I. M. (2018). Transparency in authors’ contributions and responsibilities to promote integrity in scientific publication. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2557–2560. https://doi.org/10.1073/pnas.1715374115.

  • Mingers, J. and Leydesdorff, L. (2015). A review of theory and practice in scientometrics. European Journal of Operational Research, 246(1), 1–19. https://doi.org/10.1016/j.ejor.2015.04.002.

  • Moed, H. F. (2011). The Source-Normalized Impact per Paper (SNIP) is a valid and sophisticated indicator of journal citation impact. Journal of the American Society for Information Science and Technology, 62(1), 211–213. https://doi.org/10.1002/asi.21424.

  • Moher, D., Naudet, F., Cristea, I. A., Miedema, F., Ioannidis, J. P. A. and Goodman, S. N. (2018). Assessing scientists for hiring, promotion, and tenure. PLOS Biology, 16(3), e2004089. https://doi.org/10.1371/journal.pbio.2004089.

  • Morgan Jones, M., Manville, C. and Chataway, J. (2017). Learning from the UK’s research impact assessment exercise: a case study of a retrospective impact assessment exercise and questions for the future. Journal of Technology Transfer. https://doi.org/10.1007/s10961-017-9608-6.

  • Morris, Z. S., Wooding, S. and Grant, J. (2011). The answer is 17 years, what is the question: Understanding time lags in translational research. Journal of the Royal Society of Medicine, 104(12), 510–520. https://doi.org/10.1258/jrsm.2011.110180.

  • Moxham, N. and Fyfe, A. (2018). The Royal Society and The Prehistory of Peer Review, 1665–1965. The Historical Journal, 61(4), 863–889. https://doi.org/10.1017/S0018246X17000334.

  • Moya-Anegón, F., Guerrero-Bote, V. P., Bornmann, L. and Moed, H. F. (2013). The research guarantors of scientific papers and the output counting: A promising new approach. Scientometrics, 97(2), 421–434. https://doi.org/10.1007/s11192-013-1046-0.

  • National Science Foundation (NSF). (n.d.) https://www.nsf.gov/od/oia/special/broaderimpacts/ accessed 23 September 2023.

  • Nature (2015). Technical support. Nature, 517(7536), 528. https://doi.org/10.1038/517528a.

  • Newson, R., King, L., Rychetnik, L., Milat, A. and Bauman, A. (2018). Looking both ways: A review of methods for assessing research impacts on policy and the policy utilisation of research. Health Research Policy and Systems, 16(1). https://doi.org/10.1186/s12961-018-0310-4.

  • Nguyen, T. V., Ho-Le, T. P. and Le, U. V. (2017). International collaboration in scientific research in Vietnam: an analysis of patterns and impact. Scientometrics, 110(2), 1035–1051. https://doi.org/10.1007/s11192-016-2201-1.

  • Quan, W., Mongeon, P., Sainte-Marie, M., Zhao, R. and Larivière, V. (2019). On the development of China’s leadership in international collaborations. Scientometrics, 120(2), 707–721. https://doi.org/10.1007/s11192-019-03111-1.

  • Reale, E., Avramov, D., Canhial, K., Donovan, C., Flecha, R., Holm, P., Larkin, C., Lepori, B., Mosoni-Fried, J., Oliver, E., Primeri, E., Puigvert, L., Scharnhorst, A., Schubert, A., Soler, M., Soòs, S., Sordé, T., Travis, C. and Van Horik, R. (2018). A review of literature on evaluating the scientific, social and political impact of social sciences and humanities research. Research Evaluation, 27(4), 298–308. https://doi.org/10.1093/reseval/rvx025.

  • Robinson-Garcia, N., Costas, R., Isett, K., Melkers, J. and Hicks, D. (2017). The unbearable emptiness of tweeting — about journal articles. PLOS ONE, 12(8), e0183551. https://doi.org/10.1371/journal.pone.0183551.

  • Rowlands, I. (2018a). Is it time to bury the h-index? The Bibliomagician. https://thebibliomagician.wordpress.com/2018/03/23/is-it-time-to-bury-the-h-index/ accessed 23 September 2023.

  • Rowlands, I. (2018b). What are we measuring? Refocusing on some fundamentals in the age of desktop bibliometrics. FEMS Microbiology Letters, 365(8), 59. https://doi.org/10.1093/femsle/fny059.

  • Siddiqi, A., Stoppani, J., Anadon, L. D. and Narayanamurti, V. (2016). Scientific Wealth in Middle East and North Africa: Productivity, Indigeneity, and Specialty in 1981–2013. PLOS ONE, 11(11), e0164500. https://doi.org/10.1371/journal.pone.0164500.

  • Sivertsen, G., Rousseau, R. and Zhang, L. (2019). Measuring scientific contributions with modified fractional counting. Journal of Informetrics, 13(2), 679–694. https://doi.org/10.1016/j.joi.2019.03.010.

  • Smith, K. E., Bandola-Gill, J., Meer, N., Stewart, E. and Watermeyer, R. (2020). The Impact Agenda —Controversies, Consequences and Challenges. Policy Press.

  • Squazzoni, F., Bravo, G., Grimaldo, F., García-Costa, D., Farjam, M. and Mehmani, B. (2020). Only Second-Class Tickets for Women in the COVID-19 Race. A Study on Manuscript Submissions and Reviews in 2329 Elsevier Journals. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3712813.

  • Thelwall, M. (2020). The Pros and Cons of the Use of Altmetrics in Research Assessment. Scholarly Assessment Reports, 2(1), 2. https://doi.org/10.29024/sar.10.

  • Traag, V. A. (2021). Inferring the causal effect of journals on citations. Quantitative Science Studies. https://doi.org/10.1162/qss_a_00128.

  • UKRI. (n.d.). REF impact. https://re.ukri.org/research/ref-impact/

  • VSNU, NFU, KNAW, NWO and ZonMw. (2019). Room for everyone’s talent. https://www.nwo.nl/en/position-paper-room-everyones-talent accessed 23 September 2023.

  • Waltman, L. and Larivière, V. (2020). Special issue on bibliographic data sources. Quantitative Science Studies, 1(1), 360–362. https://doi.org/10.1162/qss_e_00026.

  • Waltman, L. and Traag, V. A. (2020). Use of the journal impact factor for assessing individual articles need not be statistically wrong. F1000Research, 9, 366. https://doi.org/10.12688/f1000research.23418.1.

  • Waltman, L. and van Eck, N. J. (2012). The inconsistency of the h-index. Journal of the American Society for Information Science and Technology, 63(2), 406–415. https://doi.org/10.1002/asi.21678.

  • Waltman, L. and van Eck, N. J. (2015). Field-normalized citation impact indicators and the choice of an appropriate counting method. Journal of Informetrics, 9(4), 872–894. https://doi.org/10.1016/j.joi.2015.08.001.

  • Wang, L. and Wang, X. (2017). Who sets up the bridge? Tracking scientific collaborations between China and the European Union. Research Evaluation, 26(2), 124–131. https://doi.org/10.1093/reseval/rvx009.

  • Wang, Y., Jones, B. F. and Wang, D. (2019). Early-career setback and future career impact. Nature Communications, 10(1), 1–10. https://doi.org/10.1038/s41467-019-12189-3.

  • Wellcome Trust. (2020). What Researchers Think About the Culture They Work In. https://wellcome.org/reports/what-researchers-think-about-research-culture accessed 23 September 2023.

  • Willems, L. and Plume, A. (2021). Great power or great responsibility: What is the meaning of ‘corresponding authorship’ in modern research? ICSR Perspectives. https://www.elsevier.com/icsr/perspectives/corresponding-authorship-in-modern-research accessed 23 September 2023.

  • Willems, L., Wade, E., Herbert, R. and Plume, A. (2022). Tales of the unexpected: Designing for serendipity in research. ICSR Perspectives. https://www.elsevier.com/icsr/perspectives/designing-for-serendipity-in-research accessed 23 September 2023.

  • Wuchty, S., Jones, B. F. and Uzzi, B. (2007). The increasing dominance of teams in production of knowledge. Science, 316(5827), 1036–1039. https://doi.org/10.1126/science.1136099.

  • Yaqub, O. (2018). Serendipity: Towards a taxonomy and a theory. Research Policy, 47(1), 169–179. https://doi.org/10.1016/j.respol.2017.10.007.

  • Yousif, A., Niu, Z., Tarus, J. K. and Ahmad, A. (2019). A survey on sentiment analysis of scientific citations. Artificial Intelligence Review, 52(3), 1805–1838. https://doi.org/10.1007/s10462-017-9597-8.

  • Zhang, L. and Sivertsen, G. (2020). The New Research Assessment Reform in China and Its Implementation. Scholarly Assessment Reports, 2(1), 3. https://doi.org/10.29024/sar.15.