How to understand health statistics
Health statistics are confusing. For patients. For journalists. For politicians. And, most worryingly, for doctors. This confusion can kill. Jim Pollard gets a headache so you don't have to.
In 1995, it was reported that so-called 'third generation oral contraceptives' (better known as the pill) were twice as likely to cause thrombosis as second generation pills. The risk of life-threatening blood clots prompted many women to stop taking the pill. Many got pregnant. The cost to the NHS of the resulting 13,000 extra abortions was estimated at £46 million.
But what were the women really worried about? It is true that taking the third gen pill doubled the risk. But double what? On a second gen pill, 1 woman in 7,000 had a thrombosis. On a third gen pill, 2 women in 7,000 did. Yes, an increase of 100%. Double the risk. But double a very small absolute risk. Two in 7,000 is twice as much as 1 in 7,000, but it is still tiny.
The authors of a new paper Helping Doctors and Patients Make Sense of Health Statistics tell this story to show how failing to fully understand statistics can seriously damage your health.
Relative risk is the chance of one thing happening compared to something else. Absolute risk is the risk of that thing happening without comparison with anything.
Relative risks can often be very high. But it's the absolute risk that counts.
When reading a media story or talking to your doctor about risks, look for or ask for statistics on absolute risk. This should be given in real numbers (e.g. 1 in 10), not percentages.
Relative risks can cover up what is really important. For example, medicine X might lower the risk of Y from 20% to 10%. This is clearly a far more important reduction than lowering the risk from 0.002% to 0.001%, yet the relative reduction (50%) is the same in both examples. Even in this case the real figures are more helpful than the percentages: X reduces the risk of Y from 20 cases in 100 to 10 in 100.
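For readers who like to check the sums, the contrast above can be sketched in a few lines of Python. The percentages are the hypothetical ones from the example, not real trial data.

```python
# A 50% relative reduction can hide two very different absolute changes.
# Figures are the hypothetical ones from the text, not real trial data.

def risk_change(baseline, treated):
    """Return (relative reduction, absolute reduction) for a fall in risk."""
    return (baseline - treated) / baseline, baseline - treated

for baseline, treated in [(0.20, 0.10), (0.00002, 0.00001)]:
    rel, absolute = risk_change(baseline, treated)
    print(f"{baseline:.3%} -> {treated:.3%}: relative cut {rel:.0%}, "
          f"absolute cut of {absolute * 100_000:.0f} case(s) per 100,000 people")
```

Both lines report the same 50% relative cut, but one saves 10,000 cases per 100,000 people and the other saves just one.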
When the former mayor of New York, Rudy Giuliani, was running to become president of the USA in 2007, he tried to compare the USA's privatised health system with the UK's public one, the NHS. Because of his political views, he wanted to show that private medicine was best. He did this using himself as an example.
Giuliani had had prostate cancer. He claimed that his chance of surviving prostate cancer had been 82% in the USA and only 44% in England. On the face of it, what he said was true. His data was based on 'five-year survival stats'. These figures show how many people are still alive five years after diagnosis. But the key words here are 'after diagnosis'. Giuliani was not comparing like with like.
In the USA, men are usually diagnosed as a result of prostate screening (the PSA test), whereas in the UK, where the PSA is less widely used, diagnosis is usually on the basis of symptoms of cancer. As a result, patients in the USA are diagnosed earlier. When you realise this, it is no longer a surprise that USA patients live longer after diagnosis. They would have anyway.
Take the example of Joe Prostate. He dies at the age of 80. If Joe Prostate lives in America, his cancer would have been diagnosed at, say, 73 as the result of the PSA test. Five years later he would still be alive, so the five-year survival data would show that he had 'survived' prostate cancer in the USA. If Joe lives in the UK, he may have been 76 before his cancer was diagnosed. Five years later he'd be dead, so the five-year survival stats would show that he had 'not survived' cancer in the UK. But either way Joe is still dead at 80. The absolute result for Joe Prostate is the same; presenting it relatively creates only confusion.
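The Joe Prostate arithmetic can be sketched as a toy function. The ages are the hypothetical ones from the story, not real data.

```python
# Same man, same death at 80; only the age at diagnosis differs.
# Ages are the hypothetical ones from the Joe Prostate story.

def five_year_survivor(age_at_diagnosis, age_at_death):
    """True if the patient is still alive five years after diagnosis."""
    return age_at_death - age_at_diagnosis >= 5

AGE_AT_DEATH = 80
print(five_year_survivor(73, AGE_AT_DEATH))  # USA: diagnosed early by PSA
print(five_year_survivor(76, AGE_AT_DEATH))  # UK: diagnosed from symptoms
```

Earlier diagnosis flips the 'survived' label without changing the outcome, which is exactly why five-year survival stats mislead here.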
To be fair to Giuliani it is unlikely that he did this deliberately for political ends. He probably just didn't understand the figures.
Most politicians, like most journalists, most patients and — gulp — even most doctors don't understand stats.
Although doctors do better at stats tests than the general population, one doctor in four does not have basic statistical literacy and most docs, like the rest of us, tend to misunderstand exactly what relative risks mean in absolute terms.
Do you have basic statistical literacy? Try the three questions in the box. (The answers are at the foot of the article.)
Having read all this about the PSA test, you might think that all the same it's still better to live in the USA: at least you get diagnosed sooner. That's true, but only if the diagnosis is useful. By the age of 70, two men in three will have some degree of prostate cancer, but the vast majority will die of something else. Overdiagnosis inflates survival rates.
Obviously, if you diagnose people who have very mild prostate cancer that will never do them any harm anyway, it's no wonder you have high survival rates.
Neither the stats (nor the PSA test, for that matter), can answer the real question: is your prostate cancer aggressive enough to kill you before something else does? If it isn't, overdiagnosis will not only distort the statistics but also damage your health as you will have had a lot of treatment, stress and worry for nothing.
You could say it's better to be safe than sorry, but the actual effect of overdiagnosis may be that you're simply sorry about something else: erectile dysfunction, for example. In at least 1 man in 3, prostate surgery will cause peeing and erection problems.
The figures Giuliani should have been quoting to make a meaningful comparison — your chances of dying from prostate cancer in the USA or the UK — show little difference between the two countries. There are about 26 prostate cancer deaths per 100,000 men in the USA compared to 27 per 100,000 in the UK.
So what does a test like the PSA actually show? The PSA is a blood test that measures levels of prostate-specific antigen, a protein produced by the prostate gland. An enlarged prostate will produce more PSA.
But an enlarged prostate, and therefore a high PSA score, does not automatically mean cancer. In fact, according to the US National Cancer Institute, most men with a high PSA score do not have prostate cancer. When a biopsy to look for cancer is carried out, only about 25-30% of men with high PSA scores actually have it.
In other words, even if you have a high PSA score, your chance of actually having cancer is at worst only about 1 in 3.
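If you want to see how a 'positive' test can still mean only a 1-in-3 chance of cancer, here is a sketch of the sums. The sensitivity, specificity and prevalence figures below are illustrative assumptions, not published PSA data; they are chosen only to show how the arithmetic works.

```python
# Natural-frequency sketch: of the men who test positive, how many are ill?
# Sensitivity, specificity and prevalence are illustrative assumptions,
# NOT real PSA figures.

def positive_predictive_value(sensitivity, specificity, prevalence, n=1000):
    """Share of positive testers who actually have the disease."""
    ill = n * prevalence
    true_positives = ill * sensitivity
    false_positives = (n - ill) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

ppv = positive_predictive_value(sensitivity=0.90, specificity=0.80, prevalence=0.10)
print(f"Chance of disease given a positive test: {ppv:.0%}")
```

With these assumed figures, a cohort of 1,000 men yields 90 true positives and 180 false positives, so a positive result means only about a 1-in-3 chance of disease. The false positives swamp the true ones because most men tested don't have the disease.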
Understanding the stats enables you to see the PSA for what it is: just a guide. It doesn't show cancer, it suggests the possibility of it. But not all men — and not all doctors — know this. A survey by the German equivalent of Which? in 2004 found that only 2 out of 20 urologists understood enough to give their patients the full pros and cons of the PSA test.
Not only is it important to know what a test actually shows, it is also important to understand how accurate it is. All screening tests make mistakes. There can be false positives (the test detects something that isn't actually there) and false negatives (the test doesn't detect something that is). Even the best tests of all, including DNA testing, throw up false positives and false negatives.
Not understanding that tests can make mistakes can be deadly.
In the early days of HIV/AIDS, there were cases of people who had positive tests killing themselves even though there was a 50-50 chance of error. (Today, of course, there is also far better treatment.)
It is important to understand the risk of false positives as it applies to your own situation. To continue with the HIV example, heterosexual men are far, far less likely to get HIV/AIDS than gay men. But the test, of course, doesn't know this. It is just as likely to throw up a false positive for a straight man as for a gay man, so a positive result in a low-risk man is more likely to be a false alarm.
Regular screening programmes can increase the risk of an individual getting a false positive. On a one-off test your chance of a false positive may be, say, 1 in 1,000. Have ten tests and the risk rises to roughly 10 in 1,000 (or 1 in 100).
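The repeated-testing arithmetic can be checked in a couple of lines. Strictly, the exact chance of at least one false positive (assuming each test errs independently) is slightly below the simple '10 in 1,000' addition, but the approximation is a good one.

```python
# Chance of at least one false positive across repeated screening tests,
# assuming each test has the same false-positive rate and errs independently.

def at_least_one_false_positive(rate, n_tests):
    """P(at least one false positive) = 1 - P(no false positives at all)."""
    return 1 - (1 - rate) ** n_tests

print(at_least_one_false_positive(0.001, 1))   # one test: 1 in 1,000
print(at_least_one_false_positive(0.001, 10))  # ten tests: just under 1 in 100
```

So ten screens at 1-in-1,000 each give close to a 1-in-100 chance of a false alarm somewhere along the way, just as the simple addition suggests.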
Remember also that while screening can help detection, it makes no difference at all to your risk of actually getting the disease. Research shows that people often don't realise this and believe, for example, that breast screening reduces the risk of breast cancer.
So what should you do?
When you read about medical risks or discuss them with your doctor, here are some questions to ask:
- Risk of what? Risk of dying or of simply developing a symptom?
- Risk when? Over the next five years, ten years, a lifetime?
- How big a risk? Demand this in absolute terms (i.e. you have a 1 in X chance of Y) rather than relative terms. Be very wary of percentages.
- Risk to whom? Make sure you understand the risk to you as a male of whatever age from whatever ethnic group. There is massive variation. To give an obvious example, the risk of getting breast cancer is far higher for women than for men.
Make sure you understand exactly what risk a doctor is talking about. On anti-depressant drugs like Prozac, for example, a doctor might say 'there is a 30-50% chance of developing erection or other sexual problems'. Sounds clear enough. Or does it? It means that for every 10 men who take Prozac, 3 to 5 will develop a sexual problem, but some men hear this as meaning there is a 30-50% chance of erection problems every single time they have sex, which is a gross overestimation of the risk.
Of course the doctor ought to make this clear but probably won't. You need to ask. Not only do many doctors not understand the stats themselves, they are also concerned that patients will no longer trust them if they disclose their own uncertainty.
Read between the lines. The media loves relative risk as it makes a better story. Relative risks are also often simpler to express even if they're not easier to understand. You're much more likely to read a story headlined 'X doubles your risk of Y' than 'X increases your chance of dying of Y from 1 in 7,000 to 2 in 7,000 over the next ten years'.
We live in a world that doesn't really want to understand statistics.
We like to bandy stats about but we don't want to accept what they are really telling us. Because what statistics, all statistics, are really telling us is that we have to live with uncertainty.
For many years medicine never wanted to accept this. In a way, statistics go against the scientific grain. Scientists want to show that X causes Y. Statisticians always add an element of doubt: X causes Y in Z number of cases out of 100. Early doctors believed science was about finding causes not weighing up chances. That's one reason it took so long for things we now take for granted — like washing hands to avoid infection — to become common practice.
Today, the reason statistics are still badly abused is that all of us want them to do something they cannot: provide certainty. None of us, whether advertisers, politicians, media or public, wants to live with the uncertainty that is implicit in a genuine understanding of statistics. Advertisers want certainty. Our product does X. Politicians want certainty. Our policy does Y. And we, as decision-makers, want certainty. Life's a lot simpler that way.
Now, in general, I believe simpler is best. This whole website is about making health choices simpler. But oversimplification is dangerous. Oversimplification of health statistics increases your chance of dying by Wednesday week by 57.4%.
The statistical literacy answers are:
- 10 (out of 1,000)
- 0.1%
In research cited by Gigerenzer, 28% of doctors get at least one of these questions wrong. The one they find hardest is number 2, converting a proportion to a percentage. The general public find this hardest too, with only about 1 in 4 of us answering it correctly.
- This article is drawn from the paper Helping Doctors and Patients Make Sense of Health Statistics by Gigerenzer et al (Psychological Science in the Public Interest, Volume 8, No. 2, November 2007)
Page created on November 3rd, 2008
Page updated on March 24th, 2010