Why You Should Never Read Online Illness or Medication Forums, and Why You Should Be Skeptical of Google Search Results as Well

The first thing many people seem to do when they get a diagnosis of a physical or mental illness is to go to the internet and search on that illness. Patients who are prescribed medications do the same. Often the search results lead to internet forums. These forums consist of user-generated content that usually is not moderated or edited by any professional; anyone can post. This seems reasonable, right? But in this article I’m going to tell you why, for the most part, you should avoid reading these forums. I will also tell you why you should be skeptical of Google search results regarding any illness.

When people read on forums about their illness or medication, they get scared. Many of the forum posts will say that your illness leads to awful and dire outcomes, and that the medications prescribed to you will make you depressed, addicted, or crazy.

For instance, I often treat tinnitus patients. Samplings of the forums that cover tinnitus suggest that most of the people who post on these forums are completely miserable and suffering terribly from their tinnitus.

So what’s the problem here? Isn’t this useful information? Can’t patients learn something interesting and helpful from these forums?

Unfortunately, Internet illness forums often present a distorted, grim, and negative impression of most illnesses and most medications. Why is this? The main reason is selection and sampling bias. The people who post on illness forums are not a representative sample of people with a particular illness. Let’s use tinnitus as an example. If you read the tinnitus forums you would assume that everybody with tinnitus is anxious and depressed about it.

But research tells a different story. Roughly 20% to 40% of the population experience tinnitus symptoms from time to time, and only about 2% of people with tinnitus symptoms suffer psychologically, with anxiety and depression, as a result. The other 98% either are not significantly bothered or have adapted over time and gotten past their suffering.

But the forums are full of posts from the people who suffer the most. People who don’t suffer don’t spend their time posting. And people who have overcome their suffering also don’t post. So reading the forums gives a tinnitus patient a distorted and scary view of the experience of tinnitus.
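Here is a toy calculation showing how strong that selection effect can be. The 2% figure comes from the research above; the posting rates are made-up assumptions purely for illustration.

```python
# Toy model of forum selection bias (illustrative numbers only).
people_with_tinnitus = 100_000
severe_rate = 0.02                 # ~2% suffer significantly (from the research above)
posting_rate_severe = 0.10         # assumption: 1 in 10 severe sufferers posts
posting_rate_others = 0.002        # assumption: 1 in 500 non-sufferers posts

severe = people_with_tinnitus * severe_rate
others = people_with_tinnitus - severe

posts_from_severe = severe * posting_rate_severe
posts_from_others = others * posting_rate_others
severe_share_of_posts = posts_from_severe / (posts_from_severe + posts_from_others)

print(f"Severe sufferers in the population: {severe_rate:.0%}")
print(f"Forum posts written by severe sufferers: {severe_share_of_posts:.0%}")
# Under these assumptions, ~2% of people account for roughly half of all posts.
```

The exact numbers don’t matter; the point is that even a modest difference in who bothers to post makes the forum look nothing like the underlying population.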

The other problem with reading internet information about illnesses is the way Google ranks and orders search results. When you search on tinnitus, what you might not realize is that Google orders pages largely by popularity and engagement, not by how accurate or scientific they are. Sites that are clicked on more frequently rise in the search results, and sites that are clicked on less frequently fall. And people typically click on the most shocking and scary links. “Tinnitus caused by alien abduction” will get a lot of clicks, even though it may be a site run by a single person who claims to have been abducted by aliens. Thus the alien-abduction tinnitus site moves up in the Google rankings.

Boring scientific sites fall in the search rankings, because their dry scientific titles don’t encourage people to click.
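To make that dynamic concrete, here is a toy sketch in Python of a purely click-driven ranking. The page titles and click counts are made up for illustration.

```python
# Hypothetical pages with made-up click counts.
pages = [
    {"title": "Tinnitus caused by alien abduction!!!", "clicks": 9500, "scientific": False},
    {"title": "Clinical overview of tinnitus",         "clicks": 1200, "scientific": True},
    {"title": "Tinnitus: epidemiology and management", "clicks": 400,  "scientific": True},
]

# A ranking that rewards only clicks puts the sensational page on top,
# regardless of whether the page is accurate or scientific.
ranked = sorted(pages, key=lambda p: p["clicks"], reverse=True)
for rank, page in enumerate(ranked, start=1):
    label = "(scientific)" if page["scientific"] else "(not scientific)"
    print(rank, page["title"], label)
```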

So how can patients get accurate information about their illness or about medication treatments?

One way is to search within scientific and medical sites. For instance, Medscape is an excellent website that offers medical articles about almost every illness. WebMD is another good site, designed more for lay people. If you want to search the scientific literature, you can use the PubMed search engine, which indexes published research articles.

Let’s do a Google search on tinnitus. Overall, the first page of Google results is fairly representative of medical and scientific sites. But the third listing, under “In the news,” is an article from the Belfast Telegraph titled “Martin McGuinness tells of misery living with tinnitus.” Pretty grim, you think, misery!

But if you actually clicked through to the article, you would get a very different impression, because McGuinness says that “it had a limited impact on day-to-day life and work and that family, friends and work colleagues were very supportive.… It does not limit me in a professional or personal capacity.” This is a much more positive view than the title and the Google link suggest.

This is a great example of why the Internet is dangerous. The headline is what’s called clickbait: a link designed to attract clicks that misrepresents the actual page.

Forums about medication are also problematic. Many psychiatric medications can have side effects. For most people these side effects are minimal or tolerable and are outweighed by the benefits of the medications. For a minority of patients, the side effects are not minimal, and these are the patients who are over-represented in most Internet medication forums. Also, on an Internet forum you never really know all of the medications a person is taking, the actual dosages, or the underlying illness.

There is one more problem with reading about illnesses on the Internet. It’s one that particularly disturbs me. Many websites, even websites that purport to be objective, actually are selling something. They may be selling a supplement or vitamin, or an e-book or some other kind of program to treat an illness. Obviously, to increase sales, these commercial websites will paint a distorted negative picture of any illness or condition. They may also disparage other more traditional and scientifically validated treatments or drugs. In general, you should be skeptical of any information that comes from a website that sells products or services.

To review:

  1. Take Google search results with many grains of salt. Remember that Google orders search results by popularity, not by accuracy.
  2. Beware of Internet illness and medication forums. By and large, they are populated with an unrepresentative sample of illness sufferers, the ones who suffer the most and cope the least well. Reading them will depress you and make you anxious.
  3. If you want to get information about your illness or potential treatments, use established and reputable medical and psychological information sites. An exhaustive list of good medical sites can be found at the Consumer and Patient Health Information Site. Some of the good medical sites include Medscape, WebMD, and the Mayo Clinic. Some of the best sites for mental health information include PsychCentral, NIMH, the American Psychiatric Association, and the American Psychological Association.
  4. Finally, remember that a very large percentage of websites are actually selling something, and be skeptical of information from these sites.

In conclusion, suffering any illness or condition is unpleasant and sometimes scary. Don’t make it worse by consuming information on the Internet in a random way. Be skeptical and selective and remember that Google is not always your friend. Often a good physician or good psychologist can give you clear and balanced information.

 

——————————————————————————————————————

Dr. Andrew Gottlieb is a clinical psychologist in Palo Alto, California. His practice serves the greater Silicon Valley area, including the towns of San Jose, Cupertino, Santa Clara, Sunnyvale, Mountain View, Los Altos, Menlo Park, San Carlos, Redwood City, Belmont, and San Mateo. Dr. Gottlieb specializes in treating anxiety, depression, relationship problems, OCD, and other difficulties using evidence-based Cognitive Behavioral Therapy (CBT). CBT is a modern no-drug therapy approach that is targeted, skill-based, and proven effective by many research studies. Visit his website at CambridgeTherapy.com or watch Dr. Gottlieb on YouTube. He can be reached by phone at (650) 324-2666 and email at: Dr. Gottlieb Email.

Bad Science, Reported Badly, and Then Corrected Thanks to Your Intrepid Blogger!

I read a lot. One of my favorite online magazines is Slate.com, a wide-ranging online magazine that covers politics, news, the arts, business, and science. Reading it the other night, I noticed an article by the writer Will Saletan looking at some scientific research on “gaydar”: the supposed ability to discern whether a person is homosexual simply by looking at them.

In the original article, Saletan quoted research by Nicholas Rule, Nalini Ambady, Reginald Adams Jr., and Neil Macrae at Tufts University. The researchers took personal-ad photos of gay and straight men and had college students rate whether each man was straight or gay. For some reason the researchers chose to report their data as correlation coefficients, or r values. The highest correlation was 0.31, which in the original version of the article Saletan incorrectly stated was equivalent to an accuracy rate of 65%. I’m not sure where he got the 65% number, but I immediately recognized that this was a mistake. A correlation coefficient, when squared, represents the proportion of variance explained. Squaring 0.31 gives roughly 0.096, meaning that only about 10% of the variance is explained and roughly 90% of the variance in the dependent variable remains unexplained.
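To make the arithmetic explicit, here is the calculation, using the 0.31 correlation quoted above:

```python
r = 0.31                          # strongest correlation reported in the study
r_squared = r ** 2                # proportion of variance explained

print(f"r = {r}")
print(f"r squared = {r_squared:.3f}")                # ≈ 0.096
print(f"Variance explained: {r_squared:.1%}")        # ≈ 9.6%
print(f"Variance unexplained: {1 - r_squared:.1%}")  # ≈ 90.4%
```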

In the original article Saletan had called these experiments “impressive”. Given the tiny amount of variance explained by even the strongest of the results, I would call them less than impressive. And given the subject of the experiments, I would actually call them “oppressive”. This is a great example of taking extremely weak scientific findings and spinning them into something approaching meaningfulness. There are many alternative explanations for such tiny effects that do not require assuming any accurate “gaydar”.

I wrote a comment on the article explaining the mistake. To the credit of Saletan (and Slate), they read my comment about the inaccurate reporting of the statistics and, after an e-mail correspondence with me about their correct interpretation, posted a revised version of the article. That’s honest and impressive. It also shows that it’s worth writing comments on online articles, and that writers read them.

I still think the original research doesn’t merit even the corrected coverage that Slate gave it, but at least the science is now accurately reported. Of course, the biggest flaw in the research is that it looked only at photos of men who were openly gay, while the article’s real question is whether you can tell that a man is secretly gay. So the bottom line is that even if the researchers had done better research, it still wouldn’t answer the article’s original question.

I should add that I question the use of science to pursue questions that tread dangerously close to prejudice and stereotyping. But we live in a free country, and scientists have every right to do research on any topic they choose. I’m just not sure that the National Science Foundation should be funding such research. In any case, I was glad to be able to correct misinterpretations of the statistical results of the study.

Notes:

The original version of the article is in Google’s cache, here, at least for now. (Google updated the page, so now it’s the same as the corrected page.)

The corrected version of the article is here.

The research that the article is based on is here.

 

Copyright © 2010, 2011 Andrew Gottlieb, Ph.D. /The Psychology Lounge/TPL Productions


——————————————————————————————————————


How to Read Media Coverage of Scientific Research: Sorting Out the Stupid Science from Smart Science

Reading today’s headlines I saw an interesting title, “New Alzheimer’s Gene Identified.”

I was intrigued. Discovering a gene that caused late-onset Alzheimer’s would be a major scientific breakthrough, perhaps leading to effective new treatments. So I read the article carefully.

To summarize the findings, a United States research team examined the entire genome of 2,269 people who had late-onset Alzheimer’s and 3,107 people who did not, looking for differences between the two groups.

Among the people who had late-onset Alzheimer’s, 9% had a variation in the gene MTHFD1L, which lives on chromosome 6. Of those who did not have late-onset Alzheimer’s, 5% had this variant.

So is this an important finding? The article suggested it was. But I think this is a prime example of bad science reporting. For instance, the article went on to say that this particular gene is involved in the metabolism of folate, which influences levels of homocysteine, and elevated homocysteine has been linked to heart disease and Alzheimer’s. So is it the gene, or is it the level of homocysteine?

The main reason I consider this an example of stupid science reporting is that the difference is trivial. Here is a better way to report it: among people with late-onset Alzheimer’s, 91% did not have the variant, while among people without late-onset Alzheimer’s, 95% did not have it. That doesn’t sound very impressive, and it raises the question of whether measurement error alone could account for the difference.
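Here is the same comparison restated as rough counts, a back-of-the-envelope sketch using the rounded percentages and sample sizes reported in the article:

```python
cases, controls = 2269, 3107                             # sample sizes reported in the article
variant_rate_cases, variant_rate_controls = 0.09, 0.05   # rounded percentages reported

print(f"Cases with the variant:    ~{cases * variant_rate_cases:.0f} of {cases}")
print(f"Controls with the variant: ~{controls * variant_rate_controls:.0f} of {controls}")
print(f"Cases without the variant:    {1 - variant_rate_cases:.0%}")     # 91%
print(f"Controls without the variant: {1 - variant_rate_controls:.0%}")  # 95%
```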

So this very expensive genome scan yields essentially no predictive value in terms of who will develop Alzheimer’s and who will not. There is a known risk variant of the APOE gene (APOE ε4), which lives on chromosome 19. Forty percent of those who develop late-onset Alzheimer’s carry this variant, while only 25 to 30% of the general population does. So even this gene, which has a much stronger association with Alzheimer’s, isn’t a particularly useful clinical test.
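To see why even the APOE association makes a weak clinical test, here is a toy Bayes calculation. The 40% and 25 to 30% figures come from the paragraph above; the prevalence is an assumption I am adding purely for illustration.

```python
# Toy Bayes calculation (prevalence is an illustrative assumption, not from the article).
prevalence = 0.10                  # assumed rate of late-onset Alzheimer's in older adults
p_variant_given_ad = 0.40          # 40% of those who develop the disease carry the variant
p_variant_given_no_ad = 0.275      # midpoint of the 25 to 30% figure for everyone else

p_variant = (p_variant_given_ad * prevalence
             + p_variant_given_no_ad * (1 - prevalence))
p_ad_given_variant = p_variant_given_ad * prevalence / p_variant

print(f"P(late-onset Alzheimer's | carries the variant) ≈ {p_ad_given_variant:.0%}")
# ≈ 14% under these assumptions: most carriers never develop the disease.
```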

The other reason this is an example of stupid science reporting is that this is basically a negative finding. To scan the entire human genome looking for differences between normal elderly people and elderly people with Alzheimer’s, and to discover only a subtle and tiny difference, must have been a huge disappointment for the researchers. If I had been the journal editor reviewing this study, I doubt I would have published it. Imagine a similar study of an antidepressant in which 9% of the medication group got better and 5% of the placebo group got better. I doubt that would get published.

Interestingly enough, the study hasn’t been published yet, but is being presented as a paper at the April 14 session of the American Academy of Neurology conference in Toronto. This is another clue to reading scientific research. If it hasn’t been published in a peer-reviewed scientific journal, be very skeptical of the research. Good research usually gets published in top journals, while more dubious research often is presented at conferences but never published. It’s much easier to get a paper accepted at a conference than into a scientific journal.

It’s also important when reading media coverage of scientific research to read beyond the headlines, and to look at the actual numbers that are being reported. If they are very small numbers, or very small differences, be very skeptical of whether they mean anything at all.

As quoted in the article, “While lots of genetic variants have been singled out as possible contributors to Alzheimer’s, the findings often can’t be replicated or repeated, leaving researchers unsure if the results are a coincidence or actually important,” said Dr. Ron Petersen, director of the Mayo Alzheimer’s Disease Research Center in Rochester, Minnesota.

So to summarize, to be a savvy consumer of media coverage of scientific research:

1. Be skeptical of media reports of scientific research that hasn’t been published in top scientific journals. Good research gets published in peer-reviewed journals, which means that other scientists skeptically read the article before it’s published.

2. Read below the headlines and look for actual numbers that are reported, and apply common sense to these numbers. If the differences are very small in absolute numbers, it often means that the research has very little clinical usefulness. Even if the differences are large in terms of percentages, this doesn’t necessarily mean that they are useful findings.

An example would be a finding that drinking a particular type of bourbon raises the risk of a very rare type of brain tumor from one in 2,000,000 to three in 2,000,000. Reported in relative terms, the headline would say that drinking this bourbon triples the risk of brain tumor (a 200% increase), which would definitely put me and many other people off bourbon. (By the way, this is a completely fictitious example.) But if you compared the risk to something that people do every day, such as driving, and found that driving is 1,000 times riskier than drinking this type of bourbon, it would paint the research in a very different light. The sketch after this list works through the arithmetic.

3. Be very skeptical of research that has not been reproduced or replicated by other scientists. There’s a long history in science of findings that cannot be reproduced or replicated by other scientists, and therefore don’t hold up as valid research findings.

4. On the web, be very skeptical of research that’s presented on sites that sell products. Unfortunately, a common strategy for selling products, particularly vitamin supplements, is to present pseudoscientific research that supports the use of the supplement. In general, any site that sells a product cannot be relied on for objective information about that product. It’s much better to go to primarily informational sites like WebMD or the Mayo Clinic, or, in some cases, to go directly to the original scientific articles using PubMed.
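Here is the fictitious bourbon example from point 2, worked out to show the difference between relative and absolute risk:

```python
baseline_risk = 1 / 2_000_000        # fictitious baseline risk of the rare tumor
bourbon_risk = 3 / 2_000_000         # fictitious risk among drinkers of this bourbon

relative_risk = bourbon_risk / baseline_risk        # 3.0: the risk triples
percent_increase = (relative_risk - 1) * 100        # a 200% increase
extra_cases_per_million = (bourbon_risk - baseline_risk) * 1_000_000

print(f"Relative risk: {relative_risk:.0f}x (a {percent_increase:.0f}% increase)")
print(f"Absolute increase: {extra_cases_per_million:.0f} extra case per 1,000,000 people")
```

Same data, two very different headlines.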

So be a smart consumer of science, so that you can tell the difference between smart science and stupid science.

Copyright © 2010 Andrew Gottlieb, Ph.D. /The Psychology Lounge/TPL Productions
