By Les Willis, Nutrition Consultant www.healthyaction.co.uk
For anyone entering the profession of dietetics, few concepts will be more fundamental than that of evidence-based practice. Indeed, in healthcare and medicine as a collection of professions, evidence-based practice lies at the heart of what practitioners do every day. Understanding what evidence-based practice encompasses is in itself a core skill for any healthcare professional. And, as we shall see, evidence-based practice in dietetics has its own particular challenges.
The concept of evidence-based practice is simple enough: it does exactly what it says on the tin; what professionals do is supported by evidence. David Sackett, universally recognized as a founding father of evidence-based medicine, explained it much more eloquently: 'the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients'1. This sounds simple; however, on closer examination (rather like a patient), it not only raises a number of questions but also places a significant amount of responsibility on the practitioner.
Not least of these questions are: how do we know what the best evidence is, and how do we use it to make a decision? Effective evidence-based practice relies on the ability of the practitioner (that's us!) to ask effective, answerable clinical questions of the evidence, to critically appraise that evidence, and then to use the answers, in combination with existing knowledge, in the treatment of those entrusted to our care.
Hierarchy of Evidence:
1a: Systematic review of randomized controlled trials
1b: Individual randomized controlled trials
1c: All-or-none case series
2a: Systematic review of cohort studies
2b: Individual cohort study or randomized controlled trial with <80% follow-up
2c: Outcomes research; ecological studies
3a: Systematic review of case-control studies
3b: Individual case-control study
4: Case series
5: Expert opinion
Taken from Gray & Gray. Evidence-based medicine: applications in dietetic practice. Journal of the American Dietetic Association, September 2002; 102(9).
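As a minimal illustration (my own sketch, not part of the Gray & Gray article), the hierarchy above can be encoded as an ordered mapping, which makes it easy to compare two sources of evidence; the numeric ranking is simply the table's order, with a lower rank meaning stronger evidence:

```python
# Hypothetical sketch: the hierarchy of evidence as an ordered mapping.
# Level codes and descriptions are from the table above; the rank-and-compare
# helper is an illustrative assumption, not an established tool.

EVIDENCE_HIERARCHY = {
    "1a": "Systematic review of randomized controlled trials",
    "1b": "Individual randomized controlled trials",
    "1c": "All-or-none case series",
    "2a": "Systematic review of cohort studies",
    "2b": "Individual cohort study or RCT with <80% follow-up",
    "2c": "Outcomes research; ecological studies",
    "3a": "Systematic review of case-control studies",
    "3b": "Individual case-control study",
    "4": "Case series",
    "5": "Expert opinion",
}

# Lower rank = higher in the hierarchy = stronger evidence.
RANK = {level: i for i, level in enumerate(EVIDENCE_HIERARCHY)}

def stronger(level_a: str, level_b: str) -> str:
    """Return whichever of the two levels sits higher in the hierarchy."""
    return level_a if RANK[level_a] < RANK[level_b] else level_b

# A systematic review of RCTs (1a) outranks an individual cohort study (2b).
print(stronger("2b", "1a"))
```

Of course, as the text below notes, a clinical decision may legitimately rest on any level of evidence depending on the circumstances; the ranking is a guide, not a rule.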
The responsibility to give sound advice based on evidence synthesized with current knowledge and practice is incumbent not only on healthcare professionals but on anyone giving advice – be that nutritional or otherwise. Quackery has no place in the canon of evidence-based practice.
What counts as Evidence?
Evidence is, of course, central to evidence-based practice. The key is to base decisions on the best evidence. To aid us in our quest we use a hierarchy of evidence by which to classify the best sources (interestingly, the Daily Mail doesn't appear). The hierarchy grades the evidence: the higher up, the better. However, clinical decisions can be made on any type of evidence in the hierarchy, depending on the prevailing circumstances.
Within this hierarchy of evidence it is assumed that the publication is peer reviewed. Peer review is a way of ensuring that what is published is accurate and useful – though, of course, how much faith you put in the source, as well as in the author(s) of the article concerned, is a matter of personal judgment.
Where to find the Evidence
There are many searchable databases of medical information, and a good search strategy is essential to navigate them. When setting out, however, it is useful to know a good place to start; here are just three examples:
Bandolier: (www.medicine.ox.ac.uk/bandolier/) is an evidence-based medicine journal that provides summary appraisals of primary research. A good launch pad when considering any clinical question, whether you are just setting out or deeply immersed. And it is free, too!
The Cochrane Library: (www.cochrane.org/) is widely considered to conduct the gold standard of systematic reviews. A Cochrane systematic review is a critical assessment and evaluation of research, combining results to give a summary estimate. These reviews are characterized by clearly defined literature searches and a heavy weighting towards randomized controlled trials in their summary results. Another great free resource.
PubMed: (www.pubmed.com) is a free searchable database of articles with a much-overused 'find related articles' feature; it is easy to turn up many thousands of results if you search without a strategy. Some of the articles are available free – most need a subscription.
Types of Research
There are two main categories of research to be aware of: quantitative and qualitative. In healthcare the emphasis has traditionally been on the quantitative, with its focus on the measurable. Numerical results can be analyzed for significance and give a sense of concreteness to the findings. However, healthcare is not as simple as 'did it work or not?'; there is the important element of quality, and it is in measuring the experience and quality of healthcare that qualitative research matters. The two combine to complete the healthcare research picture and the delivery of effective services. After all, an effective treatment is useless if no one takes it up because it is completely unbearable when it's delivered.
Knowing where to start looking is one thing; asking the right questions of the literature is essential to make sure you get the right answers. After all, the applicability of your answer is only as good as the question you asked.
Formulating a question – The PICO method
P – Patient, population or problem that you are interested in
I – Intervention; this can be an exposure or prognostic factor
C – Comparison, control, alternative treatment or intervention
O – Outcome(s)
The PICO method guides us in asking effective and answerable questions. As we research the evidence, the question can, and probably will, be reformulated; often more than once.
By refining the question we direct ourselves to the best possible evidence available. Searching randomly may turn up something helpful, but it will also turn up an awful lot that isn't, and it will take an awful lot longer.
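The PICO steps above can be sketched in code. The snippet below is a hypothetical illustration of my own (the `PicoQuestion` class and its search-string helper are not a real library): it captures the four PICO components and joins them into a simple Boolean search string of the kind you might paste into a database search box.

```python
# Illustrative sketch of the PICO method: capture the four components of a
# clinical question, then build a crude Boolean search string from them.
# The class and example question are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class PicoQuestion:
    patient: str       # P - patient, population or problem of interest
    intervention: str  # I - intervention, exposure or prognostic factor
    comparison: str    # C - comparison, control or alternative treatment
    outcome: str       # O - outcome(s)

    def search_string(self) -> str:
        """Quote each non-empty component and join them with AND."""
        parts = [self.patient, self.intervention, self.comparison, self.outcome]
        return " AND ".join(f'"{p}"' for p in parts if p)

# A made-up example question: does increased dietary fibre, compared with a
# usual diet, lower LDL cholesterol in adults with raised cholesterol?
question = PicoQuestion(
    patient="adults with raised cholesterol",
    intervention="increased dietary fibre",
    comparison="usual diet",
    outcome="LDL cholesterol",
)
print(question.search_string())
```

In practice you would refine both the components and the resulting search terms as your reading of the evidence reshapes the question.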
Appraising the Evidence
Once you are asking quality questions, you need to ensure that the evidence you are questioning can give you a good-quality answer. You can ask my five-year-old what the best car for going fast is; you can clearly define the particular road, its conditions, even the time and day of the week, and he will give you a clear and definite answer. I wouldn't trust it much, even if his favourite programme is Top Gear!
Validity and Reliability
Validity asks whether the research measures what it thinks it is measuring, and whether that measure applies. It is surprisingly common for the measurable outcome to be of dubious relevance to the question being asked. In sports studies, for example, researchers often test untrained subjects when the intended beneficiaries are trained athletes. Sometimes the mismatch between the outcome measure and the intended outcome is far more subtle.
Validity can be affected in other ways too; the killer blow to any research is the introduction of bias. Unintentional or otherwise, bias can fundamentally undermine any piece of research. That is why the randomized double-blind controlled trial is considered by many to be the best type of research: no one knows who is getting the intervention until the study is over, so neither subjects nor researchers can affect the results by knowing who is who.
The more chance there is of the researcher or subject knowing who is who in the study, the more likely the results are to have been affected by that knowledge. The pressure to produce a successful, publishable project can also influence research easily and subtly.
Another significant blow to validity can be dealt by a high dropout rate: if people cannot stay the course, you have to ask just how useful the intervention being studied really is.
Reliability asks whether the research could be repeated and get the same results (or results that are close). This is often hard to tell from an article that is pressed for space, but there needs to be a way of knowing what the researchers did. It is no use having a great intervention or treatment if it only works once – and only when you do it. And if the results cannot be replicated, that raises some very serious questions about what really happened in the original study compared with the published article.
Validity and reliability are really questions about the integrity of the article. Do you trust it? Does it give you confidence that what the authors say happened actually happened, and that it will happen again when other people do the same? Once you are confident you have a good-quality study, there are some other checks to be made. Check that the study applies to the same people you have in mind; there may be problems, for example, in implementing an exercise regime that lowers blood pressure in healthy teens with a group of nursing home residents. Happy that the evidence applies, it is worth considering whether your people will actually do it: asking vegans to increase their fish intake to get omega-3 is not going to go well. Personal values and preferences need to inform practice too. Finally, is it even possible to implement what the evidence suggests? Salmon and broccoli may be a great meal for improving health, but for the unemployed single mum of two?
The Final Piece in the Puzzle – That's you!
The final part of evidence-based practice: having formulated an effective question and reviewed and appraised quality evidence, it is up to you to decide what to do next. Often there is no clear-cut answer; sources of evidence can be anything from unhelpful and undecided to downright contradictory, or almost completely non-existent.
Making decisions is a lot easier following the tenets of evidence-based practice, but different practitioners will still arrive at different conclusions, whatever the question. This is why clinical guidance, such as that from NICE and SIGN, is so useful. Clinical guidance is produced by a panel of experts who have considered the evidence pertinent to a specific clinical question and formulated a set of best strategies as a result of that consideration.
There is a lot more to evidence-based practice than first appears. I have only brushed lightly over some of the concepts and considerations to give a feel for what evidence-based practice really involves. Hopefully, its importance has become clear: without it, any profession is nothing more than a collection of opinions. And a collection of opinions is no way to conduct a profession, because everyone has their own. Evidence-based practice provides the framework in which to formulate an opinion that is both relevant and informed.