There is nothing like a good health scare to get the pulse racing. And as far as the media are concerned, they don't get much better than this: a study in a leading medical journal claiming that a widely used medication increases the risk of heart attacks. The story is based on the findings of an international team of researchers published late last month in the British Medical Journal. Taken at face value, it implies that people taking calcium supplements on the advice of their doctors to combat brittle bone disease may face a 30 per cent higher risk of a heart attack.
By now, of course, the media have moved off in search of their next scare, but what should the rest of us make of it? The advice from academics in such cases is usually to ignore the stories as journalistic hype and to consult those who really know - such as, well, academics. Yet anyone who regularly reads research papers and the resulting media stories soon uncovers a shocking fact: the journalists generally get the basic facts right. In the case of the calcium pill story, an independent expert assessment of the media coverage performed for the UK National Health Service found that in general, the news reports "correctly reflected the findings".
What academics are less willing to concede is that often it is the studies themselves that raise questions of credibility, rather than the subsequent media coverage. Much of the research that finds its way into life science journals is based on small samples - perhaps just a few dozen volunteers who agreed to take part in return for a fee. It does not require a PhD in statistics to know that such studies may lack evidential weight.
This is most obvious in studies claiming to have "debunked" some finding or other that academics do not care for - such as, say, the effectiveness of complementary medicine. Such studies are often based on so few patients that they would fail to detect the efficacy even of proven conventional drugs. Much research in pharmacology and genetics also makes use of animal experiments, the relevance of which to humans is notoriously variable. For example, a 2006 study found that of 100 treatments for stroke that had shown promise in animal studies, not one had worked in human trials.
None of these criticisms can be levelled at the calcium pill research, which was based on an analysis of 11 studies involving almost 12,000 patients. But as so often with such reports, a little digging soon starts to undermine its apparently impressive conclusions. Take that headline figure for the increased risk of heart attack: 30 per cent. This is much less scary than it seems. That is because - in common with so many health scares - it is merely a statement of the relative risk, not the absolute risk. In other words, according to the study, someone taking the calcium pills faces a 30 per cent increase in their pre-existing risk of a heart attack. Given the relatively low chances of having a heart attack at all, that is a lot less worrying than facing a 30 per cent absolute risk of having one. Indeed, according to the researchers' own figures, if 100 people took calcium pills for five years, it would lead to only one or two extra heart attacks compared with a similar group of people who stayed off them.
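To make the distinction concrete, here is a minimal sketch of the arithmetic. The 5 per cent baseline risk is an assumed round number for illustration, not a figure taken from the study itself:

```python
# Illustrative arithmetic: relative vs absolute risk.
# The 5% baseline five-year heart-attack risk is an ASSUMED round
# number, not a figure from the BMJ study.

baseline_risk = 0.05          # assumed risk without supplements
relative_increase = 0.30      # the "30 per cent" headline figure

risk_on_pills = baseline_risk * (1 + relative_increase)
absolute_increase = risk_on_pills - baseline_risk

print(f"Risk without pills:       {baseline_risk:.1%}")      # 5.0%
print(f"Risk with pills:          {risk_on_pills:.1%}")      # 6.5%
print(f"Absolute increase:        {absolute_increase:.1%}")  # 1.5 points

# Per 100 people over five years, that is roughly one or two extra
# events - consistent with the researchers' own figures.
print(f"Extra events per 100 people: {absolute_increase * 100:.1f}")
```

The same 30 per cent figure thus translates into an absolute change of only a percentage point or two, which is why the headline number sounds far more alarming than the underlying risk warrants.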
So the headline finding hardly suggests calcium pills are causing a healthcare catastrophe. And that is if you believe the figure at all. Some critics have highlighted disturbing features of the research, such as the fact that it drew largely on unpublished studies that were never designed to examine the link between calcium pills and heart attacks. To statisticians, this smacks of "data dredging", in which databases are trawled for interesting anomalies - ignoring the fact that any that turn up may well be nothing but flukes.
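To see how easily such flukes arise, consider a short simulation - a sketch under wholly invented assumptions, with no connection to the calcium data. Split a dataset in which no effect exists into a dozen arbitrary subgroups (star signs, say) and test each one:

```python
# A purely illustrative simulation of "data dredging": one null
# dataset is split into 12 arbitrary subgroups and each is tested.
# No real effect exists anywhere, yet spurious "significant"
# findings turn up regularly. All parameters are invented.

import random

random.seed(1)

def fluke_rate(n_datasets=2000, n_subgroups=12, n_per_group=50):
    flukes = 0
    for _ in range(n_datasets):
        for _ in range(n_subgroups):
            # Outcomes are pure coin flips: no treatment effect.
            treated = sum(random.random() < 0.5 for _ in range(n_per_group))
            control = sum(random.random() < 0.5 for _ in range(n_per_group))
            # Crude "significance": a gap of 10+ events between arms
            # (about two standard deviations here).
            if abs(treated - control) >= 10:
                flukes += 1
                break  # count each dataset at most once
    return flukes / n_datasets

print(f"Datasets with at least one 'significant' subgroup: {fluke_rate():.0%}")
```

Run as written, roughly half the simulated datasets throw up at least one "significant" subgroup, even though every apparent effect is pure chance - which is precisely the trap the following episode was designed to expose.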
The dangers of data dredging were highlighted some years ago by other researchers studying treatments for heart attacks. Their findings showed that aspirin could cut the risk of death following a heart attack by 25 per cent. On submitting their paper to The Lancet, however, the researchers were asked to scour their dataset for more insights. They complied, but only after insisting on including an example of the perils of such data dredging - by analysing their results for astrological influences.
Sure enough, the researchers found that the benefits of aspirin applied to everyone except those born under Gemini and Libra. The finding was a fluke - but one that serves as a salutary example of the pitfalls of pushing data too far.

Yet the biggest concern critics seem to have about the calcium study is one that rears its head repeatedly in health scares: sheer implausibility. Studies of the biochemical effects of calcium supplements have shown that they reduce blood pressure and improve cholesterol levels - hardly what one expects from a medication that now stands accused of boosting the risk of heart attack.
All of which leads us back to the original question: whom should we believe? It is clear that the choice we face is not - as we are so often told - between academics and hack reporters. Rather, it is between the competing claims of the technical papers published in the medical literature. Yet few of us have the time or expertise to make such judgements. Instead, we must trust the academic journals to separate the scientific wheat from the chaff.
Robert Matthews is visiting reader in science at Aston University, Birmingham, England

