In 1996, when the Joanna Briggs Institute (JBI) was established, the Cochrane Collaboration had been operating for only three years. From its inception, we in JBI saw ourselves as a kind of "sister" organization to Cochrane - indeed, the inaugural Chairperson of the JBI Committee of Management was the Director of the Australasian Cochrane Centre, and Cochrane has, until very recently, been represented on the JBI Committee of Management (later to become the Advisory Board). The Cochrane approach to the systematic review of evidence of the effects of interventions has always been regarded as the "gold standard" by JBI reviewers and is still the methodology used for JBI reviews of effects.
Although the early JBI team was multidisciplinary, the initial focus of its work was essentially on the broad field of nursing and nurses,a a professional group whose evidence/knowledge interests, the early team felt, could not be met by the Cochrane project. Generally, there were very few high-quality controlled trials in nursing, and the majority of questions of interest to nurses concerned the practicalities of implementing the changes in practice required to enact evidence; patients' experiences of health, illness and healthcare; and processes of care (questions that were, on the whole, not related to cause-and-effect relationships). Because of this, the JBI mission was to complement and build on the work of Cochrane by focusing on programs to translate knowledge/evidence into action in health policy and practice, and on the systematic review of evidence other than that derived from clinical trials.
The work of JBI over the past 20 years on translating knowledge/evidence into action has been enormous, and the role JBI has played in getting evidence into action in numerous health systems in Australia and across the world is arguably greater than that of any other organization.
It is in the area of the systematic review of evidence other than that derived from clinical trials, however, that I think JBI has excelled, and this work is worthy of closer examination, particularly because it is rarely recognized - or even referenced - in the mainstream literature on evidence synthesis.
A small group of Australian scholars from various methodological backgrounds started work on trying to develop a theoretically grounded approach to searching for, appraising and synthesizing the findings of qualitative studies.b This work led to the development of meta-aggregation, founded in the work of Edmund Husserl1 and of the American Pragmatists (namely William James and others from the American Pragmatist School2) and grounded in the ideas on aggregation set out by Estabrooks, Field and Morse (1994).3 By 2001, this methodological work was supported by an online software program (QARI - the Qualitative Assessment and Review Instrument) and a structured training program for potential qualitative reviewers.4 This was the first comprehensive methodology and electronic "toolkit" designed for the synthesis of qualitative data - and from it flowed, over the years, a suite of methods and online software for reviews of narrative (non-research) data, economic data, diagnostic test accuracy data, and prevalence and incidence data, as well as scoping reviews, umbrella reviews and mixed methods reviews.
Given this significant contribution to scholarship in the field of evidence-based healthcare, it is curious to note, when examining the literature, how rarely this work is referred to, particularly since very little work had been carried out in this area before the commencement of the JBI work.
Although there is earlier work on synthesizing two or more qualitative studies within sociology and the social sciences - for example, Zhao (1991) and Noblit and Hare (1988)5 - very little substantive work emerged in the health field before the year 2000 apart from the work of Estabrooks, Field and Morse3 and of Sandelowski (1997).6 Estabrooks and colleagues, and Sandelowski and colleagues, have continued to contribute to the discourse on the synthesis of qualitative data, and they continue to be well cited in contemporary literature. From 2000, interest emerged in the use of meta-ethnography as a method of qualitative synthesis in the health field, notably in the work of Britten et al. (2002),7 who are again cited frequently in the extant literature. The work of JBI in this area is, however, relatively invisible. For example, an apparently well-designed "critical review" of methods for the synthesis of qualitative research by Barnett-Page and Thomas for the Economic and Social Research Council in the UK8 makes no reference to the JBI meta-aggregative approach to qualitative synthesis, as is the case in much of the mainstream literature on this topic. This is indeed strange given that the number of systematic reviews published using the JBI methodology and software far exceeds that of any other approach, and given the outstanding analytical work on the approach by JBI scholars such as Kylie Porritt,9-12 Craig Lockwood9,10 and Catalin Tufanaru. Is it possibly because of where those involved in the JBI global project choose to publish? Or are there other reasons why this important body of work is invisible in the evidence-based healthcare literature?
References