#PTonICE Daily Show – Thursday, October 12th, 2023 – Evidence-based medicine: are you doing it wrong?

In today’s episode of the PT on ICE Daily Show, ICE COO Alan Fredendall discusses the three pillars of evidence-based medicine: clinical expertise, current best peer-reviewed evidence, and patient input. He gives suggestions on how clinicians can better incorporate all three pillars to improve their practice.

Take a listen to the podcast episode or read the full transcription below.

If you’re looking to learn more about courses designed to help you start your own practice, check out our Brick by Brick practice management course. For our online physical therapy courses, check out our entire list of continuing education courses for physical therapy, including our physical therapy certifications, on our website. Don’t forget about all of our FREE eBooks, prebuilt workshops, free CEUs, and other physical therapy continuing education on our Resources tab.



Team, good morning. Welcome to the PT on ICE Daily Show. Happy Thursday morning. I hope your morning is off to a great start. My name is Alan, happy to be your host today. I currently have the pleasure of serving as the Chief Operating Officer here at ICE and a faculty member in our Fitness Athlete Division. We’re here on YouTube, Instagram, and the podcast on Thursday. It’s Leadership Thursday, and that also means it is Gut Check Thursday. Gut Check Thursday this week is four rounds for time, some interval work. Four rounds: 10 handstand pushups. Those can be strict or kipping. Read the caption on Instagram for some help with modifications if you’re still working on those. 10 handstand pushups right into a 50-foot double kettlebell front rack walking lunge. Kettlebells in front of the body, working the thoracic spine, working the legs, 50 feet of a front rack lunge, and then out the door for a 200-meter run, or on the treadmill, whatever. The goal there is one-to-one work to rest. That means we’re looking to finish that round in about two minutes. Work two minutes, rest two minutes, complete four rounds, and you’ll be done in ideally about 16 minutes. So read the caption, check for modifications, and scale as needed to try to get your round time as close to two minutes as possible: modify the handstand pushups as needed, reduce the load on the lunge as needed, and sub out the run for a row or bike as needed. So I hope you have fun with that one. That’s a great one that really facilitates intensity. You’ve got some upper body with the handstand pushups, some lower body with the lunging, and then some monostructural work with the running. So a great workout to really drop the hammer, rest, and repeat a couple times, really working that anaerobic glycolysis system. Before we get started, just some quick courses coming your way. Today I want to highlight our cervical and lumbar spine courses.
A couple chances left toward the end of the year, as we get near the holidays, to catch cervical spine management. This weekend you can join Zach Morgan up in Waterford, Connecticut. The weekend of November 11th and 12th, you can join Jordan Berry up in Bridgewater, Massachusetts. That’s kind of the greater Boston area. And then December 2nd and 3rd, you can join Zach Morgan at his home base at Onward Tennessee in Hendersonville, Tennessee. Lumbar management, also a couple chances left before the end of the year. Next weekend, October 21st and 22nd, Jordan will be in Frederick, Maryland. That’s kind of west of the Baltimore area. He will also be in Fort Worth, Texas the weekend of November 4th and 5th. And then you have two chances the weekend of December 2nd and 3rd. You can catch our newest spine faculty member, Brian Melrose, up in Helena, Montana, and you can catch Jordan Berry at his home base at Onward Charlotte, also the weekend of December 2nd and 3rd.


Today’s topic: evidence-based medicine. There are a couple different ways to frame this. Are you doing it right? Are you doing it wrong? Either way, it’s worth really drilling down and better understanding what comprises evidence-based practice. Many folks think it’s the research. Others think it’s many, many years of clinical expertise and pattern recognition. And others believe none of that matters; what matters most is what the patient believes is happening, what they believe will help them, and matching our treatments, our interventions, and our education as best as possible to the patient input side of the equation. If you’re on the podcast, I’m gonna show a Venn diagram. You’re not missing much, if I’m being honest; I’ve got it right here on the whiteboard. What we know about evidence-based medicine is that it’s actually all of that stuff, right? It is three different spheres, the three legs of a stool, whatever analogy or metaphor you’ve heard used for this before. When we look at evidence-based medicine, it is an overlapping of, yes, scholarly evidence and peer-reviewed research; yes, clinician experience, practice, and pattern recognition; and yes, patient expectations and beliefs. The point at which these three areas overlap, the middle, is where we have evidence-based medicine, evidence-based practice. But what you’ll find is that because of this overlap, none of these areas can be evidence-based on its own. So our goal today is not just to show you this Venn diagram, but to show you when evidence-based medicine goes wrong, how it goes wrong, and how we can all get a little bit sharper at evidence-based practice in our clinics with our patients. So, let’s tackle these points one by one. The first, the one we’re all most comfortable with as clinicians, is our own clinical expertise.
Probably more important than anything else with expertise and experience is the pattern recognition, the dose-response relationship, that begins to form in our brain the more patients we see and the longer we’ve been seeing patients. You could call this the 10,000-hour rule, whatever you want to call it: the belief that the more work and the more time you put in, the more you will, maybe, theoretically, begin to master your craft. There’s some truth to that, and there’s some non-truth to that as well.


The biggest issue, as I have it written out here on the whiteboard, is that if you focus only on this area in your practice, the bias is that you become really prone to dogma, becoming a dogmatic person, becoming almost a guru. We see this, of course, and we’re going to mention it a lot, on social media, with approaches on one side of the continuum or the other. On one end, it’s “manual therapy sucks” or “physical therapy doesn’t do anything”; at the far end of that same continuum, it’s “I believe I’m putting people’s bones back into place with things like spinal mobilization and manipulation.” It doesn’t really matter where people fall on that dogmatic continuum, which is not great, because the further they get into their own dogma and guru-like behavior, the less they tend to incorporate research evidence from peer-reviewed sources and patient input. From these people, over time, you may have heard phrases like “I use what works with most people.” The key there is that it works with most people, not all people. The true person practicing evidence-based medicine, the true clinical expert, is the person who gets almost every single person better. It’s not enough to get 50% of your patients better, or 60, or 70. You should, or we hope you would, be pursuing excellence in such a manner that you’re thinking, how can I help 99.99% of people? And again, focusing so much on just one of the three aspects of evidence-based medicine, your clinical expertise, is not gonna cut it. I often think of how much pattern recognition informs practice, but that doesn’t mean that’s what we do with every person. I often think of when people come into the clinic presenting with anterior shoulder pain: what we might call instability, the feeling of looseness in the joint, or otherwise just pain or maybe even stiffness on the front of the shoulder.
I look at it as something wrong with the relationship between the deltoid and the lat. I understand the need to treat the rotator cuff, to load the rotator cuff, but I also understand that the rotator cuff is ultimately paying the price for what the deltoid and the lat are not doing for the shoulder complex itself. When these folks present with limited range of motion overhead, getting in and treating, particularly, the internal rotators, the subscapularis, can have a lot of value in restoring that range of motion and increasing tolerance to load long-term. However, that pattern recognition in my head is, yes, where I’m going to go first, but again, I can’t get caught up too much in thinking, this is what works with most people, this is what I’m gonna do no matter what. I have to be aware, I have to be humble, that if it’s not working for the patient in front of me, I need to go back and ask: what does the evidence say, what other treatments could I pursue, and what input does the patient have into the equation? Are we maybe, yes, identifying the right cause and using the right treatment, but the patient’s expectation is that they can continue to do three to five hours a day of elite-level CrossFit training on top of trying to move through the rehab of their shoulder? Those two things are always going to be at odds, and until I can start to incorporate more of the other arms of evidence-based medicine, I’m going to be limited in how many people I can potentially help. Rather than helping most, I’m thinking again, how can I help that 99% of people?


That moves really nicely into making sure we understand that yes, evidence-based medicine does include evidence. It includes what we would call, and what’s labeled as, current best evidence. That’s the second aspect of evidence-based medicine. I think we can be really hard on ourselves here, and social media can make you feel like you’re not doing a good job of keeping up with the research. Because the truth, if we’re being really intellectually honest, is that no one can keep up with the research. There are 1.8 million scientific journal articles published every year. That’s about 35,000 articles being published every single week. It is impossible for any individual practitioner to read all of those. Ever. It wouldn’t matter if that was your full-time job; you would not be able to keep up with it. So what we tend to see is that we tend to focus on specialty areas in practice. I think that’s okay. I think that helps narrow our lens, as long as we are finding a good source. My bias here is that I think we do a good job with Hump Day Hustling, and there are other great sources as well that take a bunch of research and condense it in a way that can be absorbed, especially when it’s then classified by specialty area. But understand, it’s really impossible to always be up to date on the current best evidence. And just being up to date and reading new articles doesn’t mean that that evidence necessarily has any value. We need to be mindful of that fact as well: just because something new has been published doesn’t mean it has value. Here’s a great example. This is an article you may have seen make the rounds on social media. The title is “One and Done: The Effectiveness of a Single Session of Physiotherapy Compared to Multiple Sessions to Reduce Pain and Improve Function in Patients with Musculoskeletal Disorders, a Systematic Review and Meta-Analysis.” This paper was published just a couple days ago, so it’s brand new off the press, right?
We tend to associate newer with better in research, which is not always the case, and we tend to try to immediately incorporate articles like this into practice and make giant conclusions that the paper often does not support. Already there are people on social media posting this article and saying, look, physical therapy doesn’t work, you should not go to physical therapy. There are folks posting this and saying, see, I told you manual therapy does suck; in some of the studies in this systematic review, they did manual therapy, and I told you it was worthless. Dry needling does nothing. Spinal manipulation does nothing. Cupping does nothing. People who practice that are committing malpractice; they should be fined, or lose their license, or be in prison for doing dry needling. And all of those giant conclusions are being made from just this one article. They’re being made in such a manner, too, that tells a lot of us who read a lot of research that these folks probably haven’t actually read the full paper, right? They’ve probably just read the abstract. Because if we read the full paper, what this paper is really saying is that more physical therapy doesn’t seem to help as long as all we care about measuring is pain. No information was given about any other outcome measure: strength, changes in vital signs, did people’s blood pressure get better, did things like depression, anxiety, or kinesiophobia get better? All these other things we can measure about a patient that we would expect to change with physical therapy intervention were not measured in any of these studies. And probably the most important thing missing from this study, all the studies it analyzes, and pretty much every piece of physical therapy research is that there’s absolutely no information on what was actually done to these people, in a way that would let the study be replicated in the future and possibly validated, or that would give us any idea of what was done.
It’s entirely possible that folks in some of these studies only got manual therapy, and that some folks, yes, maybe got exercise, but how was it dosed? Did they test a sub-max lift? Did they train at or above 60% of that sub-max number to ensure that strengthening was actually happening? The answer to all those questions is usually no. So it’s really important we don’t get too deep down the evidence-based hole, knowing that for the most part, a lot of the research that comes out, even though there’s a high volume of it, is quite weak and doesn’t necessarily get incorporated into practice because it doesn’t really help change and inform practice significantly. Also from this study: most of these patients had a spinal fracture, diagnosed osteoarthritis of the knee, or some sort of whiplash disorder of the neck. Those are kind of specialty populations whose results can’t just be extrapolated to the general population to say that physical therapy doesn’t work. Nonetheless, people grab this article and they cite it. That kind of shows us an overlap between the sphere of clinical expertise and pattern recognition and the sphere of evidence. I’ve written it right here on the whiteboard: we would call that person a cherry picker. That person has a very shallow knowledge of the research, and they’re basically using the research to better inform their own dogma, right? That is not evidence-based medicine. That is just cherry-picking research that supports your bias, ignoring the rest, and not really taking a deep dive into the research.
We have to remember as well that it is evidence-based, not evidence-only: we have to act in the absence of evidence. We actually have to do something with people, and we don’t always have the best research to inform what we’re currently doing in practice. If we are treating a patient and doing certain interventions, and they are making progress both according to their own input, their own goals, their subjective input, and according to what we’re measuring objectively, then by every way we can measure it, to both us and the patient, the patient is making satisfactory progress. Sometimes we don’t have research to support that, and that’s okay. We need to also be intellectually honest that some of the research we would like to see happen can’t happen. A lot of research is either done on folks who are already healthy, or it’s done in a manner such that whatever intervention is given can’t potentially make that person less healthy or more injured. We often see one group of people with low back pain get some sort of treatment while another group gets what we call usual care. Either way, everybody is getting some sort of intervention that is designed to improve their symptoms, not potentially worsen their symptoms. I would love to see research on folks lifting near or at their one-rep max with a deadlift: what happens to a group of people who lift with a focus on a braced, neutral spine, what happens to people who intentionally flex their spine throughout the deadlift, and what happens to people who intentionally extend their spine throughout the deadlift. Is that research ever likely to happen? No. Why? Because it would be really unethical to take a group of people who have nothing wrong with them and potentially cause them maybe a lifetime of debilitating injury just to try to prove a point, and that is not the point of research.
We have to be mindful that we’re conducting research on human beings who have lives, families, and jobs, and as much as we would like to see some specific lines of research come to fruition, we’ll probably never see some of it, because the risk of the interventions is simply too high and it probably won’t pass review from something like an institutional review board at a university. So we need to be mindful that, yes, we’re always trying to keep up with the current best evidence, but that doesn’t mean it’s actually the best, even if it is current, and it doesn’t mean it’s the research we would actually like to see happen, because it can be limited, again, by the ethics of conducting that research on living human beings. The bias here, being so far in this camp, is what I’ve written here on the Venn diagram as being up in the ivory tower: only doing things that have a lot of evidence to support them. Again, in the absence of evidence, we still need to do something with that patient. We still need to understand their condition. We still need to at least try some other evidence-based interventions to help that patient out. What you can’t do is have a patient come in for an evaluation and say, “I don’t have a current-best-evidence way to treat you; you’ll need to leave now.” That usually doesn’t go very well. And we need to recognize, as well, that the patient is probably just gonna go see another provider anyways. Even if you were being very, very intellectually honest with them that there was no evidence on treatment for their current condition, they’re probably just gonna go somewhere else and get less evidence-informed care there anyways. So it’s probably for the best that they stick with you for the long term.


Our final aspect is including patient expectations, values, and input. I think this is the weakest area for all of us, the thing we probably consider last, when maybe it should be what we consider first. It’s forgotten far too often that the patient, again, is a living human being with thoughts, feelings, and beliefs in front of us, and doing our best to match our interventions to their expectations, beliefs, and values is really, really important. Tying back into current best evidence, we have really good evidence to show that as well. Say a patient comes in and says, “Hey, you know what, you may not remember, but you saw my husband about six months ago for some really bad low back pain. He was in so much pain he was off work, and you did something with some needles and electricity or something, and anyways, he felt so much better, he was able to go back to work, he has no issues anymore, and that’s fantastic. I was hoping, with my back pain, that we could try something like that.” Now, of course, what that patient did not get from her husband is all the other stuff you probably, hopefully, did with him. But what she took away from it was that dry needling appeared to cure that person. And so it’s really helpful, I think, if you can match that expectation as much as possible.
Yes, you could give that patient a 45-minute lecture on how dry needling for low back pain doesn’t have as much evidence to support it as strengthening the spine, increasing cardiorespiratory fitness, reducing an inflammatory diet, getting more sleep, and managing stress, and you can go all the way down that pain neuroscience rabbit hole to the point at which maybe that patient doesn’t come back to see you anymore. Or, if your long-term goal is to help that person, and you know the most evidence-based way to do that is to get their back stronger and to help them with their current lifestyle habits, then remember that the shortest line between two points is a straight line. It means that if you can just offer the dry needling, that’s probably going to be the most beneficial thing, right? You’re matching that patient’s expectations, beliefs, and values. Does it take time? Yes, but it doesn’t take a lot of time. Does it take a lot of resources? No, it doesn’t; it costs a couple cents for the needles, right? And it lets us get to what we ultimately want to get to with that person, which is addressing their lifestyle, getting them loading, getting them moving if they’re not currently moving, and overall changing their life for the better from both a physical fitness and an overall health and lifestyle perspective. I think far too often we have an agenda, we have a bias with certain treatments, where it doesn’t matter who comes in the door. We can be on either side of the dogmatic perspective: everybody gets spinal manipulation, everybody gets dry needling, without actually consulting the patient. Do they want this or not? Are they open to another treatment? And what will ultimately get us to what we know works best for most people, which is to get them moving more, get them stronger, get their heart rate up, and address their lifestyle? Otherwise, you could end up with many sessions of education only.
You would think you’re practicing in the most current evidence-based way, but we know we can’t talk patients better. We actually need to do some stuff. And at the end of the day, I would challenge you that it’s probably better if they do that stuff with you versus leaving your care and going to see another healthcare provider. That’s another thing that articles like this one do not address: how much follow-up care did patients receive after they left the study? Overwhelmingly, that is not addressed. If you do not provide the treatments the patient wants, whether that’s manual therapy, or strengthening when you don’t have the time or equipment to provide it, whatever they want, if you do not match those expectations and values, they’re probably gonna go somewhere else. They’re gonna spend healthcare dollars somewhere else. And that might be with a healthcare provider who’s not as evidence-based as you are. So challenge yourself. Are you actually practicing within all three of these different spheres? Are you trying your best to keep up on the scholarly research, at least as it relates to the areas of practice you’re passionate about? Are you honest with yourself that your clinical pattern recognition has value but also has its limitations, and are you willing to adjust your treatment when things don’t work? And are you combining your practice expertise and the current best evidence with patient expectations and values to ensure that the treatment you’re offering is actually the treatment the patient wants? So check yourself. Evidence-based medicine: are you actually doing it? I hope this was helpful. I hope you all have a fantastic weekend. Have fun with Gut Check Thursday. If you’re gonna be at a live course, I hope you have a fantastic time. We’ll see you next week. Bye, everybody.


Hey, thanks for tuning in to the PT on ICE Daily Show. If you enjoyed this content, head on over to iTunes and leave us a review, and be sure to check us out on Facebook and Instagram at the Institute of Clinical Excellence. If you’re interested in getting plugged into more ICE content on a weekly basis while earning CEUs from home, check out our Virtual ICE online mentorship program at ptonice.com. While you’re there, sign up for our Hump Day Hustling newsletter for a free email every Wednesday morning with our top five research articles and social media posts that we think are worth reading. Head over to ptonice.com and scroll to the bottom of the page to sign up.