14 and 15 March 2012
Led by Evidence Aid and hosted by the Centre for Global Health, Trinity College Dublin, Ireland
This inaugural 2-day training workshop on systematic reviews, with a particular focus on their use in disasters and other humanitarian emergencies, was led by Mike Clarke (one of the founders of Evidence Aid). The course was hosted and introduced by Elish McAuliffe of the Centre for Global Health, Trinity College Dublin, and was provided free of charge to people with an interest in this area. Fourteen participants attended from a variety of organisations, including NGOs, charities and academic institutions. Their backgrounds ranged from people who were already conversant with the conduct of systematic reviews, but from outside the disaster setting, to those who had never done or used them before.

Mike started with the importance of formulating a clear question for the review, as the foundation for the rest of its conduct, including the eligibility criteria and search strategy. He also stressed the need to ensure that the question fits with the uncertainties that the audience for the review will wish to see addressed. Whilst a researcher could formulate a perfectly valid and interesting question, it would be of little or no value to decision makers if it was not relevant to the problems they face as policy-makers, practitioners, patients or the public.
The course made extensive use of small-group, problem-based learning, which took place both inside and outside the classroom. This allowed everyone to share their views and experiences, and to contribute to the discussions of the various elements that are key to the design of reviews. These discussions focused on topics that had been suggested by the participants themselves. They showed how much of the planning and conduct of systematic reviews can be tackled with common sense, applied within the framework of the systematic review process, and revealed how the process can be much less onerous than first thought.
When the workshop moved on to the dreaded statistics, the participants discovered how these might be related to horse racing. Odds, risks and ratios all find their matches in betting on a horse and, by the end of the afternoon, the differences between these statistics were clearer. So too were the importance of identifying which of them is being used when assessing the findings of a review or one of its included studies, and the need for a careful plan when extracting data.
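The distinction between risks, odds and their ratios can be shown with a small worked example (the numbers below are invented for illustration and are not from the workshop or any real study):

```python
# Hypothetical 2x2 table: event counts in a treated and a control group.
# (Invented numbers, purely for illustration.)
treated_events, treated_total = 20, 100
control_events, control_total = 40, 100

# Risk: proportion of the group experiencing the event.
risk_treated = treated_events / treated_total    # 20/100 = 0.20
risk_control = control_events / control_total    # 40/100 = 0.40

# Odds: events divided by non-events.
odds_treated = treated_events / (treated_total - treated_events)  # 20/80 = 0.25
odds_control = control_events / (control_total - control_events)  # 40/60 ~ 0.667

risk_ratio = risk_treated / risk_control  # 0.50
odds_ratio = odds_treated / odds_control  # 0.375

print(f"risk ratio = {risk_ratio:.3f}, odds ratio = {odds_ratio:.3f}")
```

From the same table, the risk ratio is 0.50 while the odds ratio is 0.375, which is why knowing which statistic a study reports matters when interpreting or extracting its results.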