Why You Shouldn’t Believe Everything You Read in the Newspapers about Medical Studies

One of my favorite journals is PLOS ONE, an open-access journal: anyone can read any article it publishes without paying a fee. Medical studies published there are accessible to everyone.

Most of you probably don’t realize it, but when you see a medical study quoted in a newspaper article, you usually can’t read the original study on the Internet without paying a hefty fee, often $20–$40. If you have access to a medical library you may be able to get the article, but for most people the original articles are off-limits.

Plos.org is the organization that supports open-access publication of scientific articles.

That’s why I admire them.

Back to my main story. A recent study in PLOS ONE looked at how often medical research results are replicated: that is, whether a second or third similar study shows the same results.

The researchers in this study looked at 4,723 studies that were included in 306 meta-analysis articles. (A meta-analysis is a study that combines the results of many other research studies in order to get an overview of the findings.) They divided the studies into lifestyle studies, which looked at things like drinking coffee or smoking cigarettes, and non-lifestyle studies, such as genetic markers for Alzheimer’s. There were 639 lifestyle studies and 4,084 non-lifestyle studies.

The question is: of the studies that were picked up by newspapers, how many were replicated by subsequent studies? The answer is that only about half of them held up when tested again. The other interesting finding was that when studies failed to replicate, the newspapers never reported the failure. Interesting examples included studies that linked a specific gene to depression, schizophrenia, or autism. None of these studies replicated successfully, which you would think would be big news, widely reported, but in fact not a single newspaper article reported these failures to replicate.

This suggests that newspapers don’t have much genuine interest in good science reporting. Good science reporting means being skeptical of new and surprising results, and following up on attempts to replicate them.

So, what does this mean about science results reported in popular media? What it probably means is that if the finding is new and exciting and different, you probably should be highly skeptical of it being true. And the more esoteric the finding is (such as genetic markers) the more skeptical you should be.

For instance, a recent study funded by drug companies looked at whether the statin class of medications has real side effects or whether those side effects are just a placebo effect. I’ll write more extensively about this study later, but its finding (that people experienced side effects only when they knew they were taking statins) should probably be viewed very skeptically, since many other studies have shown side effects from statins and many clinical reports have confirmed them. (And of course any study funded by the manufacturer of a drug should be viewed with extra skepticism.)

The bottom line is this: finding the truth is hard, and science is no shortcut. Only findings that have been repeated and replicated in numerous studies should be believed.


Dr. Andrew Gottlieb is a clinical psychologist in Palo Alto, California. His practice serves the greater Silicon Valley area, including the towns of San Jose, Cupertino, Santa Clara, Sunnyvale, Mountain View, Los Altos, Menlo Park, San Carlos, Redwood City, Belmont, and San Mateo. Dr. Gottlieb specializes in treating anxiety, depression, relationship problems, OCD, and other difficulties using evidence-based Cognitive Behavioral Therapy (CBT). CBT is a modern no-drug therapy approach that is targeted, skill-based, and proven effective by many research studies. Visit his website at CambridgeTherapy.com or watch Dr. Gottlieb on YouTube. He can be reached by phone at (650) 324-2666 and email at: Dr. Gottlieb Email.

Why You Should Never Read Online Illness or Medication Forums, and Why You Should be Skeptical of Google Search Results as Well

The first thing many people seem to do when they get a diagnosis of a physical or mental illness is to go to the Internet and search on that illness. Patients who are prescribed medications do the same. Often the search results lead to Internet forums. These forums consist of user-generated content that usually is not moderated or edited by any professional; anyone can post. This seems reasonable, right? But in this article I’m going to tell you why, for the most part, you should avoid reading these forums. I’ll also tell you why you should be skeptical of Google search results regarding any illness.

When people read on forums about their illness or medication, they get scared. Many of the forum posts will say that your illness leads to awful and dire outcomes, and that the medications prescribed to you will make you depressed, addicted, or crazy.

For instance, I often treat tinnitus patients. A sampling of the forums that cover tinnitus suggests that most of the people who post there are completely miserable and suffering terribly from their tinnitus.

So what’s the problem here? Isn’t this useful information? Can’t patients learn something interesting and helpful from these forums?

Unfortunately, Internet illness forums often present a distorted, grim, and negative picture of most illnesses and most medications. Why? The main reason is selection and sampling bias. The people who post on illness forums are not a representative sample of people with a particular illness. Let’s use tinnitus as an example. If you read the tinnitus forums, you would assume that everybody with tinnitus is anxious and depressed about it.

But actually, we know from research studies that roughly 20% to 40% of the population experiences tinnitus symptoms from time to time, and that only about 2% of those with symptoms suffer psychologically, with anxiety and depression, as a result. Most people with tinnitus symptoms (the other 98%) either don’t suffer significantly or have adapted over time and gotten past their suffering.

But the forums are full of posts from the people who suffer the most. People who don’t suffer don’t spend their time posting. And people who have overcome their suffering also don’t post. So reading the forums gives a tinnitus patient a distorted and scary view of the experience of tinnitus.
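A back-of-the-envelope simulation makes this sampling bias concrete. The 2% suffering rate comes from the research cited above; the posting probabilities are entirely made up for illustration:

```python
import random

random.seed(1)

POPULATION = 100_000          # people with tinnitus symptoms
P_SUFFER = 0.02               # fraction who suffer psychologically (from the research above)
P_POST_IF_SUFFERING = 0.30    # hypothetical: sufferers are highly motivated to post
P_POST_OTHERWISE = 0.002      # hypothetical: everyone else rarely bothers

posts_total = posts_from_sufferers = 0
for _ in range(POPULATION):
    suffering = random.random() < P_SUFFER
    p_post = P_POST_IF_SUFFERING if suffering else P_POST_OTHERWISE
    if random.random() < p_post:
        posts_total += 1
        posts_from_sufferers += suffering

share = posts_from_sufferers / posts_total
print(f"{share:.0%} of forum posts come from the 2% who suffer most")
```

With these invented posting rates, roughly three-quarters of all posts come from the small minority who suffer most, which is exactly the distortion a forum reader experiences.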

The other problem with reading Internet information about illnesses is the way Google ranks and orders search results. When you search on tinnitus, what you might not realize is that Google presents pages in order of popularity, not in order of how accurate or scientific they are. Sites that are clicked on more often rise in the search results, and sites that are clicked on less often fall. And people typically click on the most shocking and scary links. “Tinnitus caused by alien abduction” will get a lot of clicks even if it is a site run by a single person who claims to have been abducted by aliens. Thus the alien-abduction tinnitus site moves up in the Google rankings.

Boring scientific sites fall in the search rankings, because their dry, technical titles don’t invite clicks.

So how can patients get accurate information about their illness or about medication treatments?

One way is to search within scientific and medical sites. For instance, Medscape is an excellent website that offers medical articles on almost every illness. WebMD, a site designed more for lay people, also offers good information. If you want to search the scientific literature, you can use the PubMed search engine, which indexes published research articles.

Let’s do a Google search on tinnitus. Overall, the first page of Google results is fairly representative of medical and scientific sites. But the third listing, under “In the news,” is an article from the Belfast Telegraph titled “Martin McGuinness tells of misery living with tinnitus.” Pretty grim, you think. Misery!

But if you actually clicked through to the article you would get a very different impression because Martin McGuinness actually says that “it had a limited impact on day-to-day life and work and that family, friends and work colleagues were very supportive.… It does not limit me in a professional or personal capacity.” This is a much more positive view than suggested by the title and the Google link.

This is a great example of why the Internet is dangerous. The headline is what’s called clickbait: a link designed to attract clicks by misrepresenting the page it points to.

Forums about medication are also problematic. Many psychiatric medications can have side effects. For most people these side effects are minimal or tolerable and are outweighed by the benefits of the medications. For a minority of patients the side effects are not minimal, and these are the patients who are over-represented in most Internet medication forums. Also, on an Internet forum you never really know all the medications a poster is taking, the actual dosages, or the underlying illness.

There is one more problem with reading about illnesses on the Internet. It’s one that particularly disturbs me. Many websites, even websites that purport to be objective, actually are selling something. They may be selling a supplement or vitamin, or an e-book or some other kind of program to treat an illness. Obviously, to increase sales, these commercial websites will paint a distorted negative picture of any illness or condition. They may also disparage other more traditional and scientifically validated treatments or drugs. In general, you should be skeptical of any information that comes from a website that sells products or services.

To review:

  1. Take Google search results with many grains of salt. Remember that Google orders search results by popularity, not by accuracy.
  2. Beware of Internet illness and medication forums. By and large, they are populated with an unrepresentative sample of illness sufferers, the ones who suffer the most and cope the least well. Reading them will depress you and make you anxious.
  3. If you want information about your illness or potential treatments, use established and reputable medical and psychological information sites. An exhaustive list of the best medical sites can be found at the Consumer and Patient Health Information Site. Good medical sites include Medscape, WebMD, and MayoClinic. Some of the best sites for mental health information include PsychCentral, NIMH, the American Psychiatric Association, and the American Psychological Association.
  4. Finally, remember that a very large percentage of websites are actually selling something, and be skeptical of information from these sites.

In conclusion, suffering any illness or condition is unpleasant and sometimes scary. Don’t make it worse by consuming information on the Internet in a random way. Be skeptical and selective and remember that Google is not always your friend. Often a good physician or good psychologist can give you clear and balanced information.




How Reporters Screw up Health and Medical Reporting (and How You Can Catch Them Doing So)

I’ve written before about common mistakes in interpreting medical research in my blog post How to Read Media Coverage of Scientific Research: Sorting out the Stupid Science from Smart Science. I recently read a very interesting post by Gary Schwitzer about the most common mistakes that journalists make when reporting health and medical findings.

The three mistakes that he discusses:

 1. Absolute versus relative risk/benefit data

“Many stories use relative risk reduction or benefit estimates without providing the absolute data. So, in other words, a drug is said to reduce the risk of hip fracture by 50% (relative risk reduction), without ever explaining that it’s a reduction from 2 fractures in 100 untreated women down to 1 fracture in 100 treated women. Yes, that’s 50%, but in order to understand the true scope of the potential benefit, people need to know that it’s only a 1% absolute risk reduction (and that all the other 99 who didn’t benefit still had to pay and still ran the risk of side effects).

 2. Association does not equal causation

A second key observation is that journalists often fail to explain the inherent limitations in observational studies – especially that they cannot establish cause and effect. They can point to a strong statistical association but they can’t prove that A causes B, or that if you do A you’ll be protected from B. But over and over we see news stories suggesting causal links. They use active verbs in inaccurately suggesting established benefits.

 3. How we discuss screening tests

The third recurring problem I see in health news stories involves screening tests. … “Screening,” I believe, should only be used to refer to looking for problems in people who don’t have signs or symptoms or a family history. So it’s like going into Yankee Stadium filled with 50,000 people about whom you know very little and looking for disease in all of them. … I have heard women with breast cancer argue, for example, that mammograms saved their lives because they were found to have cancer just as their mothers did. I think that using “screening” in this context distorts the discussion because such a woman was obviously at higher risk because of her family history. She’s not just one of the 50,000 in the general population in the stadium. There were special reasons to look more closely in her. There may not be reasons to look more closely in the 49,999 others.”

Let’s discuss each of these in a little more depth. The first mistake is probably the most common: statistically significant findings are not put into clinical perspective. Let me explain. Suppose we are looking at a drug that prevents a rare illness, which we will call Catachexia. The base rate of this illness is 4 in 10,000 people, and the drug reduces it to 1 in 10,000, a 75% decrease. Sounds good, right?

Not so fast. Let me add a few facts to this hypothetical case. Imagine that the drug costs $10,000 a year and has some bad side effects. In order to reduce the incidence from four people to one person in ten thousand, all 10,000 must be treated, including 9,996 people who would never have developed this rare but serious illness. Treating those 9,996 alone would cost $99,960,000, and they would be exposed to the side effects for no benefit.
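The arithmetic above is easy to check for yourself. Here is a minimal sketch using the hypothetical Catachexia numbers (the base rate, treated rate, and price are all invented for this example):

```python
def risk_summary(base_cases, treated_cases, n, cost_per_person):
    """Compare relative vs. absolute risk reduction for a hypothetical drug."""
    relative_reduction = (base_cases - treated_cases) / base_cases
    absolute_reduction = (base_cases - treated_cases) / n
    # Number needed to treat: people who must take the drug to prevent one case.
    nnt = n / (base_cases - treated_cases)
    # Cost of treating the people who would never have developed the illness.
    never_ill = n - base_cases
    wasted_cost = never_ill * cost_per_person
    return relative_reduction, absolute_reduction, nnt, wasted_cost

rel, abs_rr, nnt, wasted = risk_summary(4, 1, 10_000, 10_000)
print(f"Relative risk reduction: {rel:.0%}")       # the headline number: 75%
print(f"Absolute risk reduction: {abs_rr:.4f}")    # 3 cases per 10,000 people
print(f"Number needed to treat:  {nnt:,.0f}")
print(f"Cost for the never-ill:  ${wasted:,}")
```

The relative number (75%) and the absolute number (0.0003) describe the same drug; only the second tells you that thousands of people must be treated, at enormous cost, to prevent a single case.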

So which headline sounds better to you?

New Drug Prevents 75% of Catachexia Cases!


New Drug Lowers the Prevalence of Catachexia by Three Cases per 10,000, at a Cost of Almost $100 Million

The first headline reports the relative risk reduction, without cost data; the second reports the absolute risk reduction and the costs. The second is the only one that should be published, but unfortunately the first is much more typical of science and medical reporting.

The second error, treating association or correlation as causation, is terribly common as well. The best example is the stream of studies on the health effects of coffee. Almost every week a different study claims either a health benefit or a negative health impact of coffee. These findings are typically reported with active verbs, as in, “drinking coffee makes you smarter.”

So which headline sounds better to you?

Drinking Coffee Makes You Smarter


Smarter People Drink More Coffee


Scientists Find a Relatively Weak Association between Intelligence Levels and Coffee Consumption

Of course the first headline is the one that will get reported, even though the second headline is equally inaccurate. Only the third headline accurately reports the findings.

The theoretical problem with any correlational study of two variables is that we never know, nor can we ever know, what intervening variables might be correlated with both. Let me give you a classic example. There is a high correlation between ice cream consumption in Iowa and the death rate in Mumbai, India. This sounds pretty disturbing; maybe those people in Iowa should stop eating ice cream. But of course the intervening variable is summer heat. When the temperature goes up in Iowa, people eat more ice cream. And when the temperature goes up in India, people are more likely to die.
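You can watch a spurious correlation like this appear in a toy simulation. Nothing below is real data; ice cream sales and deaths each depend only on a shared temperature variable, never on each other:

```python
import random

random.seed(42)

# The hidden intervening variable: 120 months of temperatures.
temps = [random.uniform(0, 35) for _ in range(120)]

# Each series depends on temperature plus its own independent noise.
ice_cream_sales = [10 * t + random.gauss(0, 20) for t in temps]
heat_deaths = [2 * t + random.gauss(0, 10) for t in temps]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(ice_cream_sales, heat_deaths)
print(f"correlation between ice cream sales and deaths: r = {r:.2f}")
```

The correlation comes out strongly positive even though neither variable influences the other. This is exactly the trap in observational studies: in real data, the confounder is usually invisible.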

The only way to actually verify a correlational finding is a follow-up study in which you randomly assign people to consume or not consume the substance or food you wish to test. The problem is that you would have to get coffee drinkers to agree to give up coffee and non-coffee-drinkers to agree to drink it, which might be very difficult. But if you can do this with coffee, chocolate, broccoli, exercise, and so on, then at least you can demonstrate a real causal effect. (I’ve oversimplified some of the complexity of controlled random-assignment studies, but my point stands.)

The final distortion, confusion about screening tests, is also very common and unfortunately incredibly complex. Schwitzer’s main point, though, is simple: a screening test is one applied to a general population that is not at high risk for the illness in question. The usefulness of a screening test must be evaluated in a low-risk population, because that is how screening tests are used. Most people don’t get colon cancer, breast cancer, or prostate cancer, even over age 50. If you use a test only with high-risk individuals, it’s not really a screening test.

There is a whole other issue with reporting on screening tests that I’ll only mention briefly, because it is so complicated and so controversial: many screening tests may do as much harm as good. Recently there has been a lot of discussion of screening for cancer, especially prostate and breast cancer. The dilemma is that once you find a cancer, you are almost always obligated to treat it, for medical-malpractice reasons and psychological ones (“Get that cancer out of me!”). The problem with this automatic treatment is that current screening doesn’t distinguish between fast-growing, dangerous tumors and very slow-growing, indolent ones. Thus we may apply treatments with considerable side effects, or even mortality, to tumors that would never have harmed the person.

Another problem is that screening often misses the onset of fast-growing dangerous tumors because they begin to grow between the screening tests. The bottom line is that screening for breast cancer and prostate cancer may have relatively little impact on the only statistic that counts – the cancer death rate. If we had screening tests that could distinguish between relatively harmless tumors and dangerous tumors then screening might be more helpful, but that is not where we are yet.

One more headline test. Which headline do you prefer?

Screening for Prostate Cancer Leads to Detection and Cure of Prostate Cancer


Screening for Prostate Cancer Leads to Impotence and Incontinence in Many Men Who Would Never Die from Prostate Cancer

The first headline is the one that will get reported even though the second headline is scientifically more accurate.

I suggest that every time you see a health or medicine headline that you rewrite it in a more accurate way after you read the article. Remember to use absolute differences rather than relative differences, to report association instead of causation, and add in the side effects and costs of any suggested treatment or screening test. This will give you practice in reading health and medical research accurately.

Also remember the most important rule: one small study does not mean anything. It’s actually quite humorous how the media will seize upon a study, even one based on 20 people that hasn’t been replicated or repeated by anybody. They also typically fail to put one study’s results into context against other studies of the same question. A great example is eggs and type II diabetes. The same researcher, Luc Djousse, published a study in 2008 (link) showing a strong relationship between egg consumption and type II diabetes, but then in 2010 published another study finding no correlation whatsoever. Always be very skeptical, and most often you will be right.

I’m off to go make a nice vegetarian omelette…


Copyright © 2011 Andrew Gottlieb, Ph.D. /The Psychology Lounge/TPL Productions

