Your Doctor Probably Won't Get This Question Right...

Your doctor could be killing you.

I've been saying this for years. Blind obedience to everything your doctor tells you is dangerous and potentially deadly. Just consider a small study conducted by Harvard Medical School.

The researchers asked doctors (at an unnamed hospital in Boston) this question:

If a test to detect a disease whose prevalence is one out of 1,000 has a false-positive rate of 5%, what is the chance that a person found to have a positive result actually has the disease, assuming that you know nothing about the person's symptoms or signs?

The most popular answer was 95%. That's wrong. And it could be deadly.

The real answer is about 2%.

Just 20% of medical students knew the answer, and only 24% of attending physicians got it right. The results were similar for other hospital staff.

This is a classic mistake known as base-rate neglect – ignoring how rare a disease is when judging what a positive test means. This might not seem important... until you realize it means doctors struggle to interpret the results of screening tests designed to show the likelihood that you have a disease.

Most people – and doctors – naturally focus on the test's 95% accuracy.

But to get the correct answer, you have to focus on both the disease's prevalence and the test's ability to diagnose the disease.

Look at it this way...

Only one person in 1,000 has the disease. If you test 1,000 people, you'll find the one true positive plus about 50 false positives (5% of the other 999 people, who are disease-free).

So out of 51 positive results, only one is a true positive. That makes the likelihood 1 in 51, or about 1.96%.
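The arithmetic above is just Bayes' theorem. Here's a minimal sketch of it in Python, assuming (as the question implies) that the test catches every true case, i.e., perfect sensitivity:

```python
def positive_predictive_value(prevalence, false_positive_rate, sensitivity=1.0):
    """Chance that a positive result is a true positive (Bayes' theorem)."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(prevalence=1 / 1000, false_positive_rate=0.05)
print(f"{ppv:.2%}")  # prints 1.96% – not 95%
```

Notice that the test's accuracy never changes in this calculation; only the disease's rarity drives the answer down to 2%.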

According to a variety of studies – not just our own conjecture – a lot of doctors miss this. The first study that showed doctors bungling this calculation was called "Interpretation by Physicians of Clinical Laboratory Results" by W. Casscells, A. Schoenberger, and T.B. Graboys in 1978. Many other studies have shown it since.

For doctors out there... please don't brush this off as a "trick question." It's not a trivial matter.

False positives can lead to unnecessary, expensive, and invasive treatments. In the best case, they cause worry and anguish for a healthy patient. In the worst case, they can lead to disfiguring surgeries, dangerous drugs, and unnecessary radiation.

One reader wrote in the last time I brought this up:

The supposedly "correct" answer of 2% would only apply if the test were used as a screening tool by giving it to everyone or to a random sampling of people, as opposed to testing only those people with symptoms of the disease.

Yup, that's exactly the point...

Using tests, even highly accurate tests, without considering the full story of a patient's symptoms gives a skewed diagnosis. And without a basic understanding of probability, your doctor could easily do harm and still believe he was right in ordering the test and recommending treatment.

And as far as medical science has come, there's still no such thing as a perfect test.

The key is starting with the pre-test probability of the disease, combining it with the statistical characteristics of the test, and only then estimating the post-test probability – all before thinking about sending patients to get any sort of lab work.
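To see why the pre-test probability matters so much, here's a short sketch reusing the same Bayes arithmetic with a few illustrative pre-test probabilities (the 5% and 30% figures are hypothetical, not from any study):

```python
def post_test_probability(pre_test, false_positive_rate=0.05, sensitivity=1.0):
    """Probability of disease after a positive result, given a pre-test estimate."""
    true_positives = pre_test * sensitivity
    false_positives = (1 - pre_test) * false_positive_rate
    return true_positives / (true_positives + false_positives)

# Mass screening vs. patients whose symptoms already suggest the disease
for pre_test in (0.001, 0.05, 0.30):
    print(f"pre-test {pre_test:>5.1%} -> post-test {post_test_probability(pre_test):.1%}")
```

The same positive result means almost nothing for a randomly screened person but is strong evidence for a symptomatic patient – which is exactly what the reader's objection, and this article, are pointing at.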

You should check if your doctor truly understands the math. Ask the question we posed. If he misses it, get out of there. Find a doctor who knows the answer.