Chapter 13: Methods development in evidence synthesis: a dialogue between science and society

This chapter is about the science of evidence synthesis: the way that academics bring together knowledge from multiple studies into a coherent whole, to present the current state of understanding in a given area. While scientists have been doing this for centuries in the form of literature reviews, the advent of ‘evidence-informed’ decision-making over the past 30–40 years has forced academics to develop a form of literature review that is demonstrably the sum of available knowledge in its area, rather than a partial and potentially biased picture. The key methodological challenge has been to provide useful and usable evidence that can inform decisions, whilst not compromising the high standards that usually need to be met to make claims about causality. Addressing this challenge has required the evolution of new research methods across multiple disciplines, something that seems likely to continue into the future.
