Faculty of 1000

Post-publication peer review


Private investigations

Posted by rpg on 20 January, 2010

One of the really great things about science is its potential for self-correction. If you have a hypothesis, a result (strange or otherwise), or a set of data, it can be tested by anyone. This is encouraged, in fact: when you publish you’re not just saying ‘look how clever I am’ but also ‘here’s something new! Can you do it too?’. This philosophy is diametrically opposed to that behind Creationism, say, or homeopathy. In those belief systems whatever the High Priest says is of necessity true, and experiment must bend around it until the results fit.

This means that, in science, a finding or publication that people get very excited about at the time can be shown to be wrong—either through deliberate fraud, experimental sloppiness (although the boundary between the two can be fuzzy), or simply because we as scientists are wiser now than we were then. This happens, and it’s normal and part of the process. We should welcome it; indeed, my friend Henry Gee has claimed that everything Nature publishes is wrong, or at least provisional.

So what we have to do is be completely open about this, no matter how embarrassing it is for the journal that published the work in the first place.

You know where I’m going with this, don’t you?

It was Derek Lowe who first alerted me to a paper published in Science last year, with the ome-heavy title Reactome Array: Forging a Link Between Metabolome and Genome. This was flagged as a ‘Must Read’ (free link) back in November, because according to our reviewer Ben Davis

If this worked it could be marvellous, superb.

However, as Ben said in his evaluation,

this work should be read with some caveats. Try as we might, my group, as well as many colleagues, and I have tried to determine the chemistry described […] In my opinion, this is a work that deserves a “Must Read” rating and I strongly encourage the reader to read the source material and reach their own conclusions.

And as Derek points out, Science published an ‘Editorial expression of concern’, noting a request for evaluation of the original data and records by officials at the authors’ institutions, as well as mentioning it on their blog. Heavy. As soon as I saw this, I let our Editorial team know we might have a problem, and we published a note to warn our readers that the work described in the paper was suspect.

Today we published a dissent to the evaluation from Michael Gelb, who says

There are many reactions shown that seem unusual and controversial […] My colleagues and I have tried to decipher the chemistry shown in Figure 1 of the main text and in the supplemental material. Many of the indicated reactions seem highly unlikely to occur, and the NMR data showing that some of the structures that were made are confusing and controversial.

We’ve also published a follow-up from Ben:

I agree wholeheartedly with the sentiments expressed in the Dissenting Opinion. The chemistry presented in this paper and in the online SI has varied in its description and content worryingly over the last 2 months.

and, rather tellingly,

as yet no chemical samples or key reagents have yet been made generally available.

(One of the usual conditions of publishing in reputable journals is that you make reagents available to other scientists, so that they can repeat your work. Failing to honour this commitment is not playing by the rules.)

It’ll be interesting to see when, not if, the original paper is retracted; and by whom.

And this, people, is the self-correcting wonder of science. Remember this, next time someone starts rabbiting about scientific conspiracies, or sends you their new theory of general relativity, or anything else that sounds crazy. It probably is.



Posted in Journals, Literature, Science | 3 Comments »

Half The Lies You Tell Ain’t True

Posted by rpg on 6 August, 2009

You’ve probably seen all the fuss over Wyeth and the ghost-writing of medical articles, along with the associated smugness of certain commentators. According to my contacts in the medical comms industry, the practice as such is nothing new, and there are very, very strong guidelines. The creative outrage we’re seeing is really rather misplaced:

Well, this is 1998 information and back then, things were a LOT slacker and this kind of thing did go on. The last 5 years have seen a big change and the policy that Wyeth now has is pretty much in line with everyone else


Pharma has sorted this out and anyone behaving like that gets fired. In fact, not stating the source of funding for writing invokes the OIG Federal Anti-kickback Statute, and that is two years in chokey.

{REDACTED} have EXAMS on compliance and anyone breaching compliance in a way that results in negative press for us or our clients is fired.

Teapot, there’s a storm a-brewin’.

Anyway, I didn’t want to talk about that, except it’s a nice hook on which to hang examples of a different-but-similar kind of spin.

Via John Graham-Cumming (go sign his Turing petition) I found this wonderful, wonderful site that shows you just how medical comms, pharma companies, eco-terriers, homeopaths, publishers, bloggers, GPs, PR agencies, newspapers, organic interest groups and in fact just about anyone can lie to you without really lying. It’s precisely the sort of thing that Ben Goldacre tries to get the numpty public (and let’s face it, most scientists/medics) to understand, except with pretty graphs and funky webby clicknology.

2845 ways to spin the risk uses an interactive animation to show exactly how drugs, interventions, whatever can be made to look good, bad or indifferent, simply by displaying the same data in different ways.


Play with the animation for a while, then go read the explanations. The whole ‘relative risk/absolute risk/number needed to treat’ thing is pretty well explained, along with bacon butties:

Yet another way to think of this is to consider how many people would need to eat large bacon sandwiches all their life in order to lead to one extra case of bowel cancer. This final quantity is known as the number needed to treat (NNT), although in this context it would perhaps better be called the number needed to eat. To find the NNT, simply express the two risks (with and without whatever you are interested in) as decimals, take the smaller from the larger and invert: in this case we get 1/(0.06 – 0.05) = 100. Now the risks do not seem at all remarkable.

Mmm. Bacon.
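The arithmetic in the quoted passage is simple enough to sketch in a few lines of Python, using the same figures (a 5% lifetime bowel-cancer risk without the bacon habit, 6% with):

```python
# Number needed to treat (or 'number needed to eat'), as in the quote:
# express both risks as decimals, subtract, and invert.
baseline_risk = 0.05  # lifetime risk without daily bacon sandwiches
exposed_risk = 0.06   # lifetime risk with daily bacon sandwiches

absolute_risk_increase = exposed_risk - baseline_risk
nnt = 1 / absolute_risk_increase

print(f"Absolute risk increase: {absolute_risk_increase:.2%}")
print(f"Number needed to eat: {nnt:.0f}")  # 100
```

A hundred lifetimes of bacon sandwiches per extra case: the same data, a much less alarming frame.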

Interesting tidbits abound:

One of the most misleading, but rather common, tricks is to use relative risks when talking about the benefits of a treatment, for example to say that “Women taking tamoxifen had about 49% fewer diagnoses of breast cancer”, while harms are given in absolute risks – “the annual rate of uterine cancer in the tamoxifen arm was 30 per 10,000 compared to 8 per 10,000 in the placebo arm”. This will tend to exaggerate the benefits, minimise the harms, and in any case make it unable to compare them. This is known as ‘mismatched framing’

which is quite intriguing, but then we find that it

was found in a third of studies published in the British Medical Journal.


It’s a splendid breakdown of all the things we need to understand when, for example—oh I don’t know—interpreting clinical trial results, perhaps; and I certainly haven’t got to grips with it all yet.

I should, but it’s lunchtime and I now fancy some bacon…

Posted in Communication, Statistics | Comments Off on Half The Lies You Tell Ain’t True