The seventh International Congress on Peer Review and Biomedical Publication was held in Chicago a couple of weeks ago.
Participants tend to be editors of medical journals. They meet every four years. They are not happy about the current situation. Neither am I.
Scientific American was there. I quote Hilda Bastian:
Most findings are false
– “John Ioannidis pointed to the very low rate of successful replication of genome-wide association studies (not much over 1%) as an example of very deep-seated problems in discovery science. Half or more of replication studies are done by the authors of the original research:
“It’s just the same authors trying to promote their own work.”
Industry, he says, is becoming more concerned with replicability of research than most scientists are.
Bias, bias everywhere
Why is there so much irreproducible research?
- Ioannidis points to the many sources of bias in research.
- Chavalarias and he trawled through more than 17 million articles in PubMed
- They found discussion of 235 different kinds of bias.
Return to science
What would help?
- We need to go back to considering what science is about:
- Validation practices have to be at the core of what we do.
- We have to get used to small genuine effects and not expect (and fall for) excessive claims.
- We need to have – and use – research reporting standards.
- [Ioannidis] advocates registering research: protocols through to datasets.
Lots of articles have no impact at all
Isuru Ranasinghe looked at un-cited and poorly cited research in cardiovascular research.
- The overall quantity is rising dramatically as the biomedical literature grows.
- 1 in 4 journals has more than 90% of its content going uncited or poorly cited five years down the track.
- Judged by citations, about half of all articles have no influence at all.
Citations are a poor measure
There was a lot of agreement… on the general lousiness of citation as a measure of, and influence on, research.
- Tobias Opthof, presenting his work on journals pushing additional citation of their own papers, called citation impact factors “rotten” and “stupid”.
- Elizabeth Wager [reported] on analyses of overly prolific authors.
- Surely research has to be about doing research, not just publishing a lot of articles.
Commercial relationships are hidden
Kristine Rasmussen told us that in Denmark, doctors have to apply for permission to collaborate with industry. That enabled Rasmussen and her colleagues to study whether doctors who ran clinical trials were declaring their commercial relationships with drug manufacturers.
- Out of 171 trial authors, 11% did not disclose a conflict of interest with the trial sponsor or drug manufacturer.
- Another 26% did not disclose that they had a commercial relationship with the manufacturer of another drug for the same use.
- From the audience, Leslie Citrome remarked that some academic departments are involved in so many industry trials that they should now be regarded as contract research organizations rather than academia.
The evidence is lost
- Timothy Vines tracked the fate of the data behind published papers: as the years went by, a dwindling minority of papers were accompanied by author email addresses that still functioned.
- Even for papers published as recently as 2011, only 37% of the data were still findable and retrievable.
- By the time they got to papers published in 1991, only 7% of the data could be confirmed to still exist and be retrievable.
- By then, few authors could be found, and most of those who could reported that their data were lost or inaccessible.
- Researchers who had held the data had died or retired, or the work had been done five computers and two universities ago.
- Or the data were in software or hardware that no one could access any more.
- Vines thinks years from now people will look back and think it was silly not to publish data at the same time as the article.