Why You Shouldn’t Believe Everything You Read in the Newspapers about Medical Studies

One of my favorite journals is PLOS ONE, a journal that supports open access. That means anyone can read any article it publishes without paying a fee, so the medical studies it publishes are accessible to everyone.

Most people don’t realize that when you see a medical study quoted in a newspaper article, you usually can’t read the original study on the Internet without paying a hefty fee, typically $20–$40! If you have access to a medical library you may be able to get the article, but for most readers the original articles are off-limits without paying large fees.

PLOS (plos.org) is an organization that supports the open-access publication of scientific articles.

That’s why I admire them.

Back to my main story. A recent study in PLOS ONE looked at how often medical research results are replicated, that is, whether a second or third similar study shows the same results.

The researchers in this study looked at 4723 studies that were included in 306 meta-analysis articles. (A meta-analysis is a study that combines the results of many other research studies in order to get an overview of the findings.) They divided the studies into lifestyle-related studies, which looked at things like drinking coffee or smoking cigarettes, and non-lifestyle studies, such as those on genetic markers for Alzheimer’s. There were 639 lifestyle studies and 4084 non-lifestyle studies.
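For readers who want to see what “combining the results” looks like mathematically, here is a sketch of the standard fixed-effect (inverse-variance) pooling formula that meta-analyses commonly use; it is a textbook formula, not one quoted from the PLOS ONE paper itself:

```latex
% Fixed-effect (inverse-variance) meta-analysis of k studies:
% \hat{\theta}_i is study i's effect estimate and v_i is its variance.
\[
  w_i = \frac{1}{v_i}, \qquad
  \hat{\theta}_{\mathrm{pooled}}
    = \frac{\sum_{i=1}^{k} w_i \, \hat{\theta}_i}{\sum_{i=1}^{k} w_i}
\]
```

In words: each study’s result is weighted by its precision, and the weighted average becomes the overview estimate.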

The question is: of the studies that were picked up by newspapers, how many were replicated by subsequent studies? The answer is that only about half of them held up when tested again in another study. Another interesting finding in this article was that when studies failed to replicate, newspapers never reported the failure. Notable examples included studies that linked a specific gene to depression, schizophrenia, or autism. None of these studies replicated successfully, which you would think would be big news reported by many newspapers, but the truth is that not a single newspaper article reported these failures to replicate.

This shows that newspapers don’t have much genuine interest in good science reporting. Good science reporting always involves being skeptical of new and different results, as well as following up on attempts to replicate those results.

So, what does this mean about science results reported in popular media? It probably means that if a finding is new, exciting, and different, you should be highly skeptical that it is true. And the more esoteric the finding (genetic markers, for example), the more skeptical you should be.

For instance, a recent study funded by drug companies looked at whether the statin class of medications has real side effects or whether those side effects are just a placebo effect. I’ll write more extensively about this study later, but its finding, that people experienced side effects only when they knew they were taking statins, should probably be viewed very skeptically, since many other studies have shown side effects from statins and many clinical reports have confirmed them. (And of course any study funded by the manufacturer of a drug should be viewed with extra skepticism.)

The bottom line is this: finding the truth is hard, and science is no shortcut. Only findings that have been repeated and replicated in numerous studies should be believed.

——————————————————————————————————————


Bad Science, Reported Badly, and Then Corrected Thanks to Your Intrepid Blogger!

I read a lot. One of my favorite online magazines is Slate.com, a wide-ranging online mag that covers politics, news, the arts, business, and science. Reading it the other night, I noticed an article by the writer Will Saletan about some scientific research on “gaydar,” the supposed ability to discern whether a person is homosexual simply by looking at them.

In the original article, Saletan quoted research by Nicholas Rule, Nalini Ambady, Reginald Adams Jr., and Neil Macrae at Tufts University. The researchers took personal-ad photos from gay and straight men, and then had college students look at them and rate whether each man was straight or gay. For some reason the researchers chose to report their data as correlation coefficients, or R scores. The highest R scores were 0.31, which in the original version of the article Saletan incorrectly stated was the equivalent of an accuracy rate of 65%. I’m not sure where he got the 65% number, but I immediately recognized that this was a mistake. An R score, when squared, represents the proportion of the variance being explained. Squaring an R score of 0.31 gives about 0.096, meaning that roughly 10% of the variance has been explained and roughly 90% of the variance in the dependent variable remains unexplained.
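To make the arithmetic behind that point concrete, here is the calculation, using the standard relationship between a correlation coefficient and the proportion of variance explained (my own worked example, not a figure taken from the paper or from Slate):

```latex
% Variance explained by a correlation of r = 0.31
\[
  r = 0.31, \qquad
  r^2 = (0.31)^2 \approx 0.096 \quad (\text{about } 10\%\ \text{of the variance explained}),
\]
% Variance left unexplained
\[
  1 - r^2 \approx 0.904 \quad (\text{about } 90\%\ \text{unexplained}).
\]
```

In other words, even the strongest result in the study leaves roughly 90% of the variance unexplained.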

In the original article Saletan had called these experiments “impressive.” Given the tiny amount of variance explained by even the strongest of the experiments, I would call them less than impressive. And given the subject of the experiment, I would actually call them “oppressive.” This is a great example of taking extremely weak scientific findings and spinning them into something approaching meaningfulness. There are many alternative explanations for such tiny effects that do not require any assumption of accurate “gaydar.”

I wrote a comment on the article explaining the mistake. To the credit of Saletan (and Slate magazine), they read my comment on the inaccurate reporting of the statistical findings and, after an e-mail correspondence with me about the correct interpretation of the statistics, posted a revised version of the article. That’s honest and impressive. It also shows that it’s worth writing comments on online articles, and that writers read the comments.

I still think the original research doesn’t merit even the corrected coverage that Slate gave it, but at least the science is now accurately reported. Of course, the biggest flaw in the research is that it looked only at photos of gay men who were openly gay, while the article is really about whether you can tell if a man is secretly gay. So the bottom line is that even if the researchers had done better research, it still wouldn’t answer the article’s original question.

I should add that I question the use of science to pursue questions that tread dangerously close to prejudice and stereotyping. But we live in a free country, and scientists have every right to do research on any topic they choose. I’m just not sure that the National Science Foundation should be funding such research. In any case, I was glad to be able to correct misinterpretations of the statistical results of the study.

Notes:

The original version of the article is in Google’s cache, here, at least for now. (Google updated the page, so now it’s the same as the corrected page.)

The corrected version of the article is here.

The research that the article is based on is here.

 

Copyright © 2010, 2011 Andrew Gottlieb, Ph.D. /The Psychology Lounge/TPL Productions


——————————————————————————————————————

Dr. Andrew Gottlieb is a clinical psychologist in Palo Alto, California. His practice serves the greater Silicon Valley area, including the towns of San Jose, Cupertino, Santa Clara, Sunnyvale, Mountain View, Los Altos, Menlo Park, San Carlos, Redwood City, Belmont, and San Mateo. Dr. Gottlieb specializes in treating anxiety, depression, relationship problems, OCD, and other difficulties using evidence-based Cognitive Behavioral Therapy (CBT). CBT is a modern no-drug therapy approach that is targeted, skill-based, and proven effective by many research studies. Visit his website at CambridgeTherapy.com or watch Dr. Gottlieb on YouTube. He can be reached by phone at (650) 324-2666 and email at: Dr. Gottlieb Email.