When I use a word . . . Evidence synthesis—terms and types
BMJ 2025;389:r892 doi: https://doi.org/10.1136/bmj.r892 (Published 02 May 2025)
Cite this as: BMJ 2025;389:r892
Centre for Evidence Based Medicine, Nuffield Department of Primary Care Health Sciences, University of Oxford, Oxford, UK
Reviews
At one time literature reviews were commonly constructed using the evidence that the writers wanted to rely on, omitting any evidence that might contradict the views they wanted to promulgate. Now we expect better. We expect that any review worthy of the name should include all the available evidence, or at least all the available evidence of a quality sufficiently high to make it worthy of inclusion, regardless of what the author thinks of it. We call this “evidence synthesis.”
Deciding what evidence to include in a review can be difficult, given the many different types of evidence available, their highly variable quality, and their biases; this stands in the way of perfect evidence synthesis, but the principles nevertheless remain.
The general principles are that the relevant literature has to be sought and collected, omitting nothing of relevance; that the data in the publications selected for inclusion have to be extracted and evaluated; and that the results have to be synthesised, interpreted, and presented.
Evidence synthesis
The idea of systematically reviewing all the published and unpublished evidence on a subject has a long history, some of it antedating the evidence based medicine movement.
In his presidential address to the 54th meeting of the British Association for the Advancement of Science, held in Montreal in 1884, Lord Rayleigh (John William Strutt, 1842–1919) observed that “Two processes are at work side by side [in the advancement of science], the reception of new material and the digestion and assimilation of the old ...”1
Rayleigh’s use of the word “digestion” implies merely that one should take notice of the old research when contemplating the new. But his use of the word “assimilation” suggests something more: “to absorb and incorporate,” as “assimilate” is defined in the Oxford English Dictionary (OED).2 And “assimilation” is “The process whereby the individual acquires new ideas, by interpreting presented ideas and experiences in relation to the existing contents of his or her mind.”3 In other words, one should not only seek out and take notice of the old research but also combine it with the new to reach a better understanding—evidence synthesis.
Early instances
A few may have taken note of Rayleigh’s view at the time, and others who were not in Montreal to hear his address, or who never read the published version, must have thought similarly, because several examples of what he was alluding to, namely the need to take account of previous research when interpreting new findings, did appear, although they did not start to emerge until the 20th century.
Others, for example, have cited the case of Karl Pearson’s synthesis of data from 11 studies of the beneficial effects of typhoid vaccination and several other comparable cases in the fields of environmental studies, education, physics, and agriculture.4 Evidence synthesis undoubtedly existed before it got its name.
Terminology
Since those early examples, the idea of evidence synthesis has grown steadily, and the terms surrounding it have burgeoned bewilderingly.
“Research synthesis” is an alternative general term used to cover the field, but I prefer “evidence synthesis”: apart from the fact that it is used more often—as a crude example, 8946 versus 822 hits as textwords in PubMed—what we want to synthesise is not the research but the evidence that proceeds from it.
Other terms that are commonly used are “systematic review” and “meta-analysis.” “Systematic” is defined in the OED as “Of a writing or treatise: containing or presenting a complete exposition or outline of a subject.”5 And “meta-analysis,” a word invented by Glass in 1976,6 implies an analysis of analyses7; its history has been well documented.8
The authors of a review of different types of review identified 14: critical review, literature review, mapping review, meta-analysis, mixed studies review, overview, qualitative systematic review, rapid review, scoping review, state-of-the-art review, systematic review, systematic search and review, systematised review, and umbrella review.9
This collection of types is incomplete and its heterogeneity shows that the whole subject needs detailed re-evaluation, perhaps in a systematic review.
A personal experience
A few years ago, when preparing a talk on the use of plasma, serum, or blood concentration measurements of drugs in monitoring drug therapy, I consulted a paper on the use of radioimmunoassay in measuring plasma digoxin concentrations that I had written in 1975, published as a book chapter.10 In that paper I listed the results of 20 studies of the use of plasma digoxin concentrations, as measured by radioimmunoassay in 14 cases and other methods in six, to compare individuals with evidence of digoxin toxicity and those without. I also included four studies in which myocardial and plasma concentrations had been measured simultaneously. Averaging the results of the 14 studies in which radioimmunoassay had been used, I found that the mean concentration in patients with evidence of toxicity was 3.1 ng/mL compared with 1.3 ng/mL in those without toxicity, a highly significant difference.
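As a rough illustration of the kind of crude, unweighted pooling described above, and not the author’s actual method or data, a simple averaging of study-level means might look like the sketch below. The per-study figures are hypothetical placeholders, not values from the 14 radioimmunoassay studies; a modern meta-analysis would instead weight each study (for example by inverse variance) and examine heterogeneity rather than averaging the reported means directly.

```python
# Illustrative sketch only: a crude, unweighted pooling of study-level means.
# The numbers below are hypothetical placeholders, not data from the studies
# cited in the text.

from statistics import mean

# Hypothetical mean plasma digoxin concentrations (ng/mL) reported per study
toxic_means = [2.8, 3.4, 3.0, 3.2]      # patients with evidence of toxicity
non_toxic_means = [1.2, 1.4, 1.3, 1.3]  # patients without evidence of toxicity

# Unweighted averages of the study means (the "crude" approach described above)
pooled_toxic = mean(toxic_means)
pooled_non_toxic = mean(non_toxic_means)

print(f"Pooled mean, toxic group: {pooled_toxic:.1f} ng/mL")
print(f"Pooled mean, non-toxic group: {pooled_non_toxic:.1f} ng/mL")
```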
Rereading my paper many years later, I realised, to my surprise, that what I had done was in fact an evidence synthesis, involving a systematic review of as much of the literature as I could find at the time and a meta-analysis of the resultant data, although I am sure that my methods, particularly my statistical analysis, would not pass muster by today’s standards.
Thinking back, I also realised that, having been asked to contribute the talk that eventually became the book chapter, I had in fact been struggling with a large amount of published literature on the subject, which I needed to tame. Without knowing about meta-analysis, a term that had yet to be invented, or the whole topic of evidence synthesis, including systematic reviews, I had simply tackled my own struggle by corralling the evidence and summarising it.
A final thought
In an essay titled “Why I Write,” which she admitted having stolen from George Orwell, the late Joan Didion wrote: “I write entirely to find out what I’m thinking, what I’m looking at, what I see and what it means. ... What is going on in these pictures in my mind? ... The picture tells you how to arrange the words and the arrangement of the words tells you, or tells me, what’s going on in the picture. Nota bene: It tells you. You don’t tell it.”11 Many other writers have expressed the same view, albeit in different words.
And that is precisely why we undertake evidence synthesis, to find out what we’re looking at, what we see, and what it means. And what it tells us.
Footnotes
Competing interests: None.
Provenance and peer review: Not commissioned; not externally peer reviewed.