What Evidence Based Practice is NOT:
- Scanning a few pages of Wikipedia, finding some that seem to relate to what you are doing and then claiming it is 'evidence based'
- Doing a quick, after-the-event, 'back of a fag packet' evaluation of a single intervention and then claiming it worked and provides an evidence base for all future similar initiatives (commonly called 'projects'...)
- Doing a serious, well-thought-through evaluation of an intervention but overlooking the need for any kind of control group - ideally a randomised one (commonly called 'pilots'...)
- Not knowing what a control group is... (or what 'randomised' means)
- Conflating correlation with causation
- Lurching to conclusions without any peer challenge to the methods you have used or the conclusions you have drawn
- Confusing evidence (in the scientific sense) with evidence (in the forensic, courtroom sense)
- Avoiding statistical analysis because the 'numbers are so small', and/or not knowing what role probability plays in all this (see the sketch after this list)
- Just putting a lot of spurious references at the end of your report
- Art, rhetoric, flimflam or politics
- Justifying what you feel like doing anyway and trying to pretend it is scientific!!!
Am I shouting loudly enough yet?
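To make the point about control groups and probability concrete, here is a minimal sketch in Python, using entirely made-up numbers, of the kind of question a randomised comparison lets you ask: how likely is it that a difference this big arose by chance alone? It is an illustration, not a recipe.

```python
import random

# Hypothetical data, made up purely for illustration: outcome scores for
# participants randomly assigned to the new initiative vs 'business as usual'.
treatment = [14, 11, 16, 9, 13, 15]
control = [10, 12, 8, 11, 9, 10]

observed_diff = sum(treatment) / len(treatment) - sum(control) / len(control)

# Permutation test: if the initiative made no difference, the group labels
# are arbitrary, so shuffle them many times and count how often chance alone
# produces a gap at least as large as the one actually observed.
pooled = treatment + control
n_treat = len(treatment)
extreme = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    diff = (sum(pooled[:n_treat]) / n_treat
            - sum(pooled[n_treat:]) / (len(pooled) - n_treat))
    if diff >= observed_diff:
        extreme += 1

print(f"Observed difference: {observed_diff:.2f}")
print(f"One-sided p-value (approx.): {extreme / trials:.3f}")
```

If the p-value comes out large, chance alone is a perfectly good explanation for your 'result' - which is precisely why small numbers are a reason for statistical thinking, not an excuse to skip it.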
I was fortunate to spend three years of my life studying psychology. During that time, I learnt a great deal about science, method and just what conclusions can reasonably be drawn from experiments. Because psychologists are generally scorned by 'proper' scientists (like physicists or geologists), and surrounded by the 'pop' psychology of journalism and gossip, we have to have methods of inquiry and analysis that are really 'hot' when it comes to drawing definitive conclusions.
Whilst I do not claim to be an expert on 'evidence based practice', I think I know what it isn't!
So please (please!) do not claim some new initiative or practice is 'evidence based' without it really being so!