Famous food researcher Brian Wansink, director of Cornell's Food and Brand Lab and widely considered a world-renowned expert on eating behavior, was recently accused of academic misconduct. Several of his papers were retracted by JAMA in a single day.
For years, my bullshit meter had been pinging. I even reached out to a few colleagues and told them so. Something is amiss, I said. Nobody can publish this many studies. The man was a study machine, and with so many positive results. I thought I knew what was going on, and it turns out I was right. He had committed some of the research sins I've talked about many times, sins that are something of a "dirty little secret" in the field.
One of these unethical and damaging practices is sometimes referred to as "self-plagiarism." Let's say I write what is essentially the same article, with the same thesis and conclusion, and publish it with different wording in several different places. Is this a big deal? Not really. These are just informal articles. If I were to do this, it would be only to fit the article to the particular audience I was aiming for. And if there were a "parent" article, I would most likely acknowledge it in some way.
But what if a scientific researcher does this? Then it is a very damaging and, to me, quite unethical practice. The major problem is using data, or some other part of an earlier study or paper, in a new paper without cross-referencing it, making it appear as if the data, study sample, and so on are completely new and original to the paper in question. This throws a wrench into the whole enterprise we call science. And it is done constantly. Many studies I come across, in fact, have been published in several different journals with no mention that they were ever published before. The question is, is it done deceptively? If no mention is made of how parts of one paper overlap with another, it is always deceptive. If we think something is new when it is not, it skews our evidence base, making it hard to make informed decisions.
Usually, once an article has been published in a peer-reviewed journal, it is not acceptable to republish it elsewhere; doing so may even constitute copyright infringement.
Many times you will see the same name pop up again and again in connection with a specific scientific claim, and it will seem as if the researcher is publishing reams of research on that claim. This seems to have happened with Brian Wansink. When you see this, consider that the person may be engaging in another form of self-plagiarism called salami slicing: taking what is essentially one large study that could be reported in a single paper and slicing it into smaller chunks scattered across many journals, mostly to increase the author's publication count. This makes the importance of the work seem greater than it actually is. Call it a shotgun effect.
Don't blame it entirely on the researchers, though. Few of them pad their publication counts just for the sake of it; it's publish or perish. According to Nature Materials: "Much of the problem arises not from an inherent desire among researchers to maximize their publication count, but from the conditions that are set by funding and appointment bodies, which determine what gets funded and who gets tenure. In the 'publish or perish' climate that has evolved over recent decades, overemphasis on the size of an individual's (and, increasingly, entire research group's) publication record as a means of quantifying their research output inevitably rewards quantity over quality. Moreover, this has the effect of abdicating responsibility for such assessment to the journals in which they publish — a responsibility that is neither appropriate nor desired."1,2
The flood of researchers and journals is large enough without the above practices adding to the pile. I have used the term "peer-reviewed" a couple of times in this post, but it is becoming something of a joke: too many papers appear every day for all of them to be reviewed carefully.
You can bet that the average fitness article referring to some research study (without actually citing it) is completely oblivious to these problems.
Resources
1. "The Cost of Salami Slicing." Nature Materials. Web. 10 Jan. 2012. <http://www.nature.com/nmat/journal/v4/n1/full/nmat1305.html>
2. Cicutto, L. "Plagiarism: Avoiding the Peril in Scientific Writing." Chest 133.2 (2008): 579-81. <http://chestjournal.chestpubs.org/content/133/2/579.full>