Does Experience Matter?

Yes. Experience matters. Thanks for reading! I’ll look for your comments below…

Of course I’m kidding. I turned off the comments long ago – the last people that I want to hear from are the loons on the internet…

So, does experience matter? Well, that’s an ambiguous question. It’s kind of like asking, “Does education matter?” It’s great that you received a formal education or have many years of practice, but its value depends on what you actually learned through the process. Two people with the same experience can arrive at completely different conclusions.

Most people mistake “experience” for some kind of inherent credibility. They assume experience matters so much that practitioners with more of it must have better outcomes. They don’t. Their outcomes are worse. Systematically worse.

How could that be? Think about day-to-day practice. Patient comes in and says, “This hurts.” You apply Treatment X (patent pending). After a couple months of Treatment X (patent pending) the patient is better. Rinse and repeat for 40 years.

Here’s a dirty little secret. Gather in really close so I can let you in on it. ALL PATIENTS GET BETTER!!!! It’s called regression to the mean. Giving them any kind of attention also helps. 40 years of applying your treatment to patients and watching the majority of them get better reinforces your belief that your treatment is effective. THIS is the “experience” you are relying upon. It actually means nothing and can reinforce your belief in something outdated and poorly supported. Maybe your treatment works. For all you know, you might actually be slowing their recovery. Most likely, you are merely entertaining the patient while time solves their problem or they are simply telling you, “Thanks, it’s better!” so they can stop coming in and not hurt your feelings. What I am saying is that experience in this capacity means very little.
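Regression to the mean is easy to demonstrate with a toy simulation. The sketch below is purely illustrative (the pain scale, thresholds, and fluctuation sizes are invented for this example): patients seek care on an unusually bad day, are re-measured later with no treatment at all, and still look “better” on average.

```python
import random

random.seed(0)

# Hypothetical model: each "patient" has a stable baseline pain level,
# but day-to-day pain fluctuates randomly around it.
def pain(baseline):
    return baseline + random.gauss(0, 2)  # random daily fluctuation

baselines = [random.uniform(3, 6) for _ in range(100_000)]

intake, followup = [], []
for b in baselines:
    p = pain(b)
    if p >= 7:                    # only people on a "bad day" come in
        intake.append(p)
        followup.append(pain(b))  # re-measured later, with NO treatment

avg_in = sum(intake) / len(intake)
avg_out = sum(followup) / len(followup)
print(f"avg pain at intake:    {avg_in:.1f}")
print(f"avg pain at follow-up: {avg_out:.1f}")  # lower, despite zero treatment
```

Because the sample is selected on an extreme measurement, the follow-up average falls back toward each patient’s true baseline on its own. Apply Treatment X (patent pending) in the gap and the improvement looks like it was yours.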

Why does experience even come up when we talk about Evidence-Based Practice? David Sackett originally described Evidence-Based Practice back when I was in PT school in the mid-1990s, and it is now sacred gospel. Honestly, it confuses the hell out of me.

One thing that pisses me off is the often oversimplified listing of the components of Evidence-Based Practice:

1. Best Research Evidence
2. Clinical Expertise
3. Patient Values and Preferences

Why does that piss me off? Well, because the simple-minded come along and say:

I want to do Treatment X (patent pending).

  • “Best Research Evidence”? I’ve got some poorly conducted study to cite – Check!
  • “Clinical Expertise”? I’ve been doing this for 20 years and I know it works – Check!
  • “Patient Preferences/Values”? Of course they want it after my convincing sales pitch – Check!

I’m applying Evidence-Based Practice motherfuckers!!!!

Yeah. That’s what Sackett meant [sarcasm font]. Let’s face it, human bias and self-delusion will overcome all barriers and contort themselves into whatever guidelines you throw at them. The smarter you are, the better you will be at rationalizing your way around them. An oversimplified three-point checklist is no match! Practitioners use that experience clause to tell themselves, “When in doubt, go with whatever you want! You’re the expert!”

We don’t need Evidence-Based Practice. We just need science, that’s all. “But wait,” you say. “Science doesn’t account for clinical expertise or patient values/preferences!” Two things:

1. Yes it does.
2. You’re a moron.

Science is our understanding of reality. As you gain experience, you learn how to apply that science in a clinically relevant way. It all starts to fit together as you practice, and you are able to incorporate new information as it comes along. But you are applying science all the way: the science of therapeutic alliance, the science of pain perception, the science of exercise physiology, the science of pathology, the science of socioeconomics, the science of statistics, and so on. The most experienced practitioners can see the overall picture of that science while embracing the remaining uncertainty, refusing to “fill in the blanks” with guesses. Of course, bad studies are published all the time. Scientific experience also keeps you skeptical, always looking for the unaccounted-for explanation, always reflecting and self-critical to keep an eye on that encroaching bias.

And what of patient preferences/values? That doesn’t mean just doing whatever the patient wants – patient satisfaction alone is a recipe for disaster. What it means is informed consent. Educate the patient on what is known and list all of their science-supported options (including “Do nothing”). Let them choose what they think is best for them.