
Live Long and Well with Dr. Bobby
Let's explore how you can Live Long and Well with six evidence-based pillars: exercise, good sleep, proper nutrition, mind-body activities, exposure to heat/cold, and social relationships. I am a physician-scientist and Ironman triathlete, and I have a passion for helping others achieve their best selves.
#22: Health Headlines: Helpful? Harmful? Or Just Plain Confusing?
In this episode, Dr. Bobby tackles the often perplexing world of health headlines. From bold claims about intermittent fasting to the benefits of wearing socks to bed, he breaks down how to evaluate these headlines critically. With nine key questions to ask about a headline, insights into the hierarchy of evidence, and two practical examples, Dr. Bobby provides listeners with tools to discern fact from fiction in health journalism.
And your Health Type influences how you might use information. Take the Health Quiz
Join the Mastermind Workshop Waitlist here: the Live Long and Well Jumpstart
Key Topics Covered:
- Understanding Health Headlines:
- Should you believe a health headline? How do you decide whether it is likely to be true, or not adequately based upon evidence?
- Common examples of sensational headlines and their flaws.
- Nine Essential Questions to Evaluate Headlines:
- Is the article published in a reputable outlet by a science writer?
- Was the headline based on actual scientific studies or just an expert's opinion?
- Is the study published in a peer-reviewed journal, or was it just presented at a meeting?
- What journal was it published in, and what is its impact factor?
- Who conducted the study, and where?
- How large was the study population?
- What type of study was it? (Randomized controlled trial vs. observational vs. model-based.)
- Was there an editorial discussing the study’s limitations?
- Does the headline sound "too good to be true"?
- Hierarchy of Evidence:
- From most likely credible to least likely credible:
- Meta-analyses.
- Randomized controlled trials (RCTs).
- Observational studies.
- Case series.
- Expert guidelines.
- Individual expert opinions.
- Explanation of each and when to trust them.
- Examples of Health Studies:
- Intermittent Fasting and Heart Risk: Why the headline about a 91% increased risk of death was flawed.
- Meal Replacement Shakes: Insights from a Chinese randomized trial and its limitations.
- The Problem of Data Manipulation (P-Hacking):
- How over-analysis of databases can lead to misleading conclusions.
- The importance of recognizing correlation vs. causation in studies.
Takeaways for Listeners:
- Use the 9 Questions Framework to critically evaluate health headlines and articles.
- Understand that the type of study (e.g., RCT vs. observational) significantly impacts its credibility.
- Remember that sensational headlines often oversimplify or distort study findings.
- Stay skeptical of small studies or ones with vague methodologies.
Engage with Dr. Bobby:
- Have a confusing health headline you’d like Dr. Bobby to analyze? Send it in!
- Take the health type quiz at DrBobbyLiveLongAndWell.com to better understand how your approach to wellness influences your perception of health information.
- Don’t forget to leave a review on Apple Podcasts, Spotify, or wherever you listen!
Hi, I'm Dr. Bobby Du Bois and welcome to Live Long and Well, a podcast where we will talk about what you can do to live as long as possible and with as much energy and vigor as you wish. Together, we will explore what practical and evidence-supported steps you can take. Come join me on this very important journey and I hope that you feel empowered along the way. I'm a physician, an Ironman triathlete, and have published several hundred scientific studies. I'm honored to be your guide. Welcome back everyone.
Speaker 1: This is episode 22: Health Headlines: Helpful, Harmful, or Just Plain Confusing? Almost every day there are new health headlines. Recently they might be ones like: apple cider vinegar leads to 12 pounds of weight loss. Or red light therapy reduces pain, inflammation, and aging skin. Or intermittent fasting causes heart attacks. How about hypothermia kills cancer cells? And finally, wearing socks to bed improves your sleep. Well, should we believe these or not? How do you tell? Today we're going to explore some issues that might help you decide whether to believe a headline and the article behind it or not. Now, this is just the beginning of our journey on this topic, because this type of knowledge and exploration will take some time; over the next few months I'll teach you how to do it, along with some tricks and some questions that will guide you along the way. Well, last episode we talked about health type and many of you took the quiz. Thank you. And if you haven't, you might want to, because your health type helps you understand how you approach health, and it may help you understand how you approach a headline and an article that you may read. For example, a holistic health hacker, who focuses on really all aspects of health, might read a headline and say: great, a new thing that I might dive into and add to my regimen. Now, that same headline for a purposeful path planner might get them overwhelmed: oh my gosh, there's this new thing I should do, and I don't know what to do and I don't know whether to believe it. And they might be confused. Well, a contentment creator, again, may read the same headline and say: well, you know, it doesn't really fit with my lifestyle. So if you're interested and you haven't done the quiz, just go to my website, drbobbylivelongandwell.com.
Speaker 1: Well, as you know, I like to begin each episode with a personal story. Now, the most important reason I want to talk about this topic is that I used to have a full head of hair and I've pulled most of it out when my family and friends and colleagues told me: oh my gosh, there's this headline, I just read this article, and now it's my new understanding of health and life. Don't eat seed oils (well, we talked about that in episode 19). Oh, you've got to take this new supplement packet that I read about. Or, oh my gosh, there was a wonderful article on hydrogen water and it's now my new secret weapon for health. Well, all kidding aside, yes, I did lose my hair, but I probably can't fully blame it on this.
Speaker 1: When I hear my friends and colleagues and family talk about this and say why they're so excited about trying this new thing they read about, I ask them: okay, so what compelled you to believe this is the way to go? And they have a couple of different answers. One is: well, you know, it was published in the New York Times or on CNN, and it must be right, because otherwise it wouldn't be in one of those magazines or newspapers. Or they might say: well, my friend's friend's friend tried it and it worked.
Speaker 1: Another challenge with headlines that people bring to me: they say, well, Dr. Bobby, you know, I read a headline and then sometime later I read the opposite. You know, dueling headlines. One headline might be: don't use hormone replacement therapy in menopause, even when you have significant menopausal symptoms. But then you might run across kind of the opposite article: you should consider hormone replacement therapy for menopausal symptoms. And which do you follow? Also keep in mind (and this helps explain why I pulled my hair out) that today's headline may be the opposite tomorrow. Remember when eggs were bad? Whoops, now eggs are good. Headlines not that long ago said: don't expose babies and infants to peanuts, because that will affect their development of peanut allergies. Oops, that was wrong.
Speaker 1: As folks may know, my career was spent looking at evidence and published articles and figuring out what does and doesn't work in healthcare, and for whom, and I've had an opportunity to publish about 180 peer-reviewed articles of my own on this and related topics. Whether it's back surgery and who should get it, an expensive new medication and who would benefit and who doesn't, or the latest fad. So in my day-to-day life, it truly pains me when folks get swept up by the latest headline. Well, our plan for today's episode has three parts. First, how to read a headline and the news article: what to think about. I'm going to give you nine questions to ask yourself as you go through the article, and, of course, I'll put all nine of these in the show notes. Second, there are various types of scientific studies and what's called a hierarchy of evidence, from the most believable type of study to perhaps the least believable, and I'll walk you through that. Then, in part three, I'll run through some examples to bring out a few of these issues. Again, this is just the beginning of a journey that we will travel together, so please suggest some headlines you might want me to talk about and, if you like the podcast, please leave a review on Apple or Spotify or wherever else you listen. All right, part one: reading a headline and what to think about. Nine questions.
Speaker 1: Well, there was a recent headline that said what you eat for breakfast influences weight loss, and, as I'll get to in a moment, what you're supposed to eat differs if you're a man or a woman. So question number one: is this headline and this article in a reputable publication, and was it written by a science writer? Well, this one, about the breakfast food you're supposed to eat to lose weight, came out in Newsweek. So if something comes out in Newsweek or the New York Times or the Wall Street Journal, you might think it's more credible than if it comes out in a fitness magazine or on Reddit. So that's the first thing: get a sense of where you found this article, and it may ratchet the believability up or down depending upon that.
Speaker 1: Two: is the headline and the article based upon an actual scientific study that was done in people, or just the opinion of scientists? So in this article about what you should eat for breakfast to lose weight, they recommend that men eat carbohydrates like oatmeal and that women's breakfasts should have fats like omelets. The researchers wrote an article in a journal and then this headline came out of it; basically, it was called the best breakfast foods for men and women revealed. So when I ask, was this based on an actual study in people, the answer for this one was: not exactly. The underlying study was really a mathematical equation. They did take 12 people and put them on a seven-day water-only fast, and they measured all sorts of proteins that were being produced by the body, and they then took this information and built a mathematical model. So they didn't actually test different breakfast foods in people and then see whether they lost weight or not. It was basically a biochemical analysis that they extrapolated with a model, and so when I read a headline that's based upon something that isn't actually a study testing different breakfast foods and looking at weight loss, all of a sudden I'm now less excited and less likely to believe it.
Speaker 1: Okay, number three: is the underlying study published yet, or was it just presented at a meeting and hasn't yet undergone a rigorous peer review? We're going to talk in a little bit about another study, where the headline was basically that intermittent fasting increases heart-related deaths, and, as we'll see, that one wasn't yet published. It was just based upon what was presented at a meeting. So if something was presented at a meeting and this is an article about it, it's much less rigorously reviewed than one that was published. Going back to our breakfast example, it was published. Okay.
Speaker 1:So what we've now done are three questions that relate to the article you just read and the headline. Now the next set of questions will need to be based upon finding what that scientific study was. So there may be a hyperlink in your news article. You might be able to just look it up, because they often say, well, this came out in Lancet and Jones was the author and you could try to look it up For these next questions. You will actually see need to get the article in hand.
Speaker 1: Okay, next question, number four: where was it published? Which journal was it published in, and was it a peer-reviewed journal? There's something called an impact factor. Every journal, whether it's JAMA, the New England Journal, or the Journal of Nutrition, has what's called an impact factor. The higher the number, the better the journal is likely to be, and it's based upon how often articles in the journal get cited by other scientific articles. If the number is above 30, like the New England Journal or JAMA, that's a high number. Some of the top cardiology journals have numbers in the range of 15 to 40. And nutrition journals tend to be much, much lower, in the range of one to three, and that's starting to get down to a level where maybe the journal isn't as prestigious and rigorous as it could be. Okay, so this article, the one with the mathematical model about what you should eat for breakfast to lose weight, was published in a journal with an impact factor of 20. So that's a good thing.
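For readers who want to see the arithmetic, here is a minimal sketch of how a two-year impact factor is commonly calculated. The episode only describes it informally, so the exact formula and the numbers below are illustrative assumptions, not figures from any specific journal.

```python
# A rough sketch of the standard two-year impact factor (assumption: the episode
# describes it informally; this is the commonly used definition, with made-up numbers).
def impact_factor(citations_this_year: int, articles_prior_two_years: int) -> float:
    """Citations received this year to articles the journal published in the
    previous two years, divided by the number of those articles."""
    return citations_this_year / articles_prior_two_years

# Hypothetical journal: 6,000 citations to 200 recent articles gives an impact
# factor of 30, in the range Dr. Bobby mentions for journals like NEJM or JAMA.
print(impact_factor(6000, 200))  # 30.0
```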
Speaker 1: Next question, number five: who did the work? Was this done in the US or outside the US? Was it done at a rigorous and well-known academic institution? Well, the breakfast study was done in London and in the Netherlands at a couple of different academic institutions.
Speaker 1: Number six, a critical one: how big was the study? The study I just talked about, the breakfast choices, was based on 12 people. A study of about 12 people, or 15 people, or 20 people doesn't excite me. Was it 500 people? 5,000 people? Now I'm more interested. So be wary when the studies are very, very small. Seven: what type of study was it? Was it a randomized controlled trial (we're going to talk about that in a bit), or was it an observational study, or, like the breakfast one, is it a model?
Speaker 1: Okay, here's number eight. If you're looking at the actual article, was there an editorial about it? That's a great way to learn about the limitations of the study. If you can read what other experts have said about the study, that's a great window into whether this is something to believe or not. And then there's the sort of grandmother test, question number nine: does it seem too good to be true? That's a good one, because so many of these headlines seem too good to be true, and if that's the case, again I'm going to wonder. Now, none of these nine questions by themselves will tell you whether to absolutely believe the study or not. But if you add up how many of the nine were answered in a positive way, suggesting it might be a good study, then all of a sudden you might want to believe it. If very, very few are supportive in these nine areas, then maybe I'll really wonder about the study.
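To make that "add up the answers" idea concrete, here is a minimal sketch of a tally over the nine questions. The questions come from the episode; the cutoffs for "take it seriously" versus "be skeptical" are illustrative assumptions, not a scoring rule Dr. Bobby gives.

```python
# A hypothetical tally of the nine questions for a single headline.
# The questions come from the episode; the score thresholds are illustrative only.
NINE_QUESTIONS = [
    "Reputable outlet, written by a science writer?",
    "Based on an actual study in people, not just opinion?",
    "Published, not merely presented at a meeting?",
    "Peer-reviewed journal with a solid impact factor?",
    "Done at a rigorous, well-known institution?",
    "Large enough study population?",
    "Strong study type (RCT rather than observational or a model)?",
    "Accompanying editorial discussing the limitations?",
    "Does NOT sound too good to be true?",
]

def tally(answers: list[bool]) -> str:
    """Count favorable answers and return a rough verdict."""
    favorable = sum(answers)
    if favorable >= 7:
        verdict = "worth taking seriously"
    elif favorable >= 4:
        verdict = "read with skepticism"
    else:
        verdict = "probably ignore the headline"
    return f"{favorable}/9 favorable: {verdict}"

# Example: the breakfast/weight-loss headline, answered roughly as the episode describes.
print(tally([True, False, True, True, True, False, False, False, False]))
```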
Speaker 1: Okay, part two. Let's talk about what's called the hierarchy of evidence. There are six different tiers or types of studies, and they're either more likely to be believable and valid or perhaps less likely. The best of the best is what's called a meta-analysis. Let's say there were three or six or 20 different clinical trials where they actually compared two different groups of people and randomized them. There isn't just one of these studies; there's a bunch, and you can statistically bring all of those clinical trials together in what's called a meta-analysis. In general, a meta-analysis is the highest level of evidence and the most likely to be believable.
Speaker 1: The next is the actual randomized controlled trial itself, and it is the gold standard for so much of what we do. Most drugs that are approved for the market by the FDA have gone through a randomized controlled trial. What does that mean? It means you've got two different groups. You start from the beginning and you randomize them. So you say: okay, these 50 people will get this drug, these 50 people will get placebo, and we'll see what happens. This is how we learned in randomized controlled trials that statins like Lipitor reduce your risk of heart attack, or, more recently, how much weight loss Ozempic and the other GLP-1s produce. Again, these were randomized trials: they took people, divided them into two groups at random, gave one group a drug and the other a placebo, and then observed what happened after that. So underneath meta-analyses, you now have randomized controlled trials.
Speaker 1: Underneath that you have observational studies. This is typically where you get two different groups of people and you follow them forward in time. For example: oh, there's this population, and some eat fish and some don't eat fish. Or some drink a lot of red wine and some drink very little red wine, and what happens? Which people do better? Now, the challenge with observational studies is that you have to adjust for other things. People who eat fish may not be the same as people who don't. People who eat fish may be eating fewer calories during the day. They may be exercising more. They may be eating their fish while drinking some red wine or white wine. So you have to adjust for differences, because this wasn't randomized. You didn't say: well, this group of people, I'm going to give you a fish diet, and this group won't get one, and I'm going to follow you forward in time; that's a randomized controlled trial. When you just grab people, see who did something or who didn't, and try to make some inferences about it, that's an observational study, and you definitely have to adjust for what are called confounders.
Speaker 1: Okay, lower on the hierarchy: a case series. A case series is when a doctor or an academic institution says: okay, I had 10 patients with knee pain, I gave them injections of platelets in their knee, and eight of them got better, and I'm going to tell you about those 10 people. That's a case series. There is no randomization, there's no placebo. It's just explaining what happened.
Speaker 1: So this is a lower level of evidence. Below that are guidelines. Guidelines are really experts coming together and saying: this is our understanding of what works and what doesn't. This is how you get recommendations about mammograms, or recommendations not to do whole-body MRI screening. And guidelines seem like, gosh, wouldn't that be the best thing, because these are experts looking at the evidence? The problem is that if we went back into history to 1491 and got a bunch of experts together and asked, well, is the world flat, they would have said yes. Columbus obviously proved that wasn't the case a year or so later. So guidelines are based on the current evidence, and that doesn't necessarily mean they are correct.
Speaker 1: And then the last in the hierarchy is just a single person's expert opinion. Now, if a study is higher on the hierarchy, like a meta-analysis, it's more likely to be believable than if it's just a case series or expert opinion. But there are poorly done randomized controlled trials, and there are really well done observational studies. So just because it's an RCT doesn't necessarily mean it's going to be more valid than an observational study. But in general, the higher up on the hierarchy you are, the more likely you are to believe it. So again, when you are reading your headline and reading the article and asking what type of study that article is based upon, the hierarchy can be helpful in deciding whether to believe the headline or not. Okay, so I've laid out a variety of frameworks and conceptual things. Now I think what's important is to talk about actual studies, and we're going to talk about one observational study and one randomized controlled trial.
Speaker 1: Now, observational studies are, I would say, the biggest problem today. If you look at the headlines, I would say a huge percentage of them are based on an observational study. And again, in observational studies you follow folks over time, you compare different groups, and you see what happens. So not long ago there was a very widely seen headline in lots and lots of news media, and it went something like this: intermittent fasting, which is a weight loss approach, carries a 91% higher risk of death from cardiovascular disease. Well, that's worrisome. So you read this headline and think: oh gosh, I've been doing intermittent fasting, or I've been reading about it and, boy, it seems like it's worth trying. But this headline says there's a 91% higher risk of death.
Speaker 1: So what did they do in this study? Again, this was an observational study with two different groups. At the beginning of the data collection, folks filled out a food diary and were basically asked: hmm, what did you eat yesterday, and when did you eat it? And then they divided these people into two groups: one who ate all their food in a fairly narrow time window (so maybe they didn't eat breakfast, and it was all between the hours of noon and 7 or 8 pm), versus a group of people that ate throughout the day, just like most people. So they found these two groups, then they looked forward in time over the course of eight years and found: oh my gosh, there were 91% more deaths in that intermittent fasting group, the group that had all their food in a narrow time window. Oh no, this is a problem. Okay.
Speaker 1: So before you throw out any idea of doing intermittent fasting, let's walk behind the headlines and begin to understand the study and whether to believe it. Okay. There were a number of concerns with this study. The first is that the study hasn't been published. It was a poster presented at a medical convention of the American Heart Association, so it hasn't come out in a publication and has not undergone rigorous peer review. So when you see a headline and it's based on something that hasn't even been published yet, all of a sudden I'm not very convinced that it's worth worrying about. All right. Well, the next concern I have is: are the data reliable or believable? So how did they do this study?
Speaker 1: At the beginning of this eight-year observation, they said to people: well, what did you eat yesterday, and what time did you eat it? And they did this on two different days. Well, we know from lots of other studies that food diaries and food recall are really, really bad. You know, if I asked you right now (and you can think about this): what did you eat yesterday, and how much of each food did you eat? Was it a quarter cup of cottage cheese or a full cup of cottage cheese? Did you eat it at eight in the morning or did you eat it at 10 am? When did you eat it?
Speaker 1: So food diaries are notoriously problematic, and what this study was built around was a food diary projected forward eight years. Now, what is the likelihood that what you ate eight years ago was what you continued to eat every day for the next eight years? Just because at this moment in time, at the beginning of when they collected data, maybe you didn't eat throughout the day and just ate during a very narrow period of time, what's the likelihood that was true the next week, the next year, and eight years later? So projecting forward one day of food diary for eight years seems really problematic to me. Next: were the comparison groups similar? We're trying to say that people who fasted or didn't eat throughout the day had a higher rate of heart attacks than people who ate throughout the day, and the implication is that the higher heart attack rate was directly attributable to the fasting.
Speaker 1: So now I scratch my head and say: well, wait a second, who tended to skip meals? Now, I should point out that this data was collected 15 to 20 years ago, long before anybody talked about intermittent fasting as a way to lose weight. So 20 years ago, why did people perhaps eat only during a narrow portion of the day? Well, that may be because they were working two or three jobs and didn't have a chance to sit down throughout the day and eat. Maybe they were just under a lot of stress and missed meals. Or maybe they had issues with finances and money. So fasting may not have been: oh, I'm doing this to be healthy. It may have just been an artifact of other things going on in their life. And these people who maybe were under high stress, maybe back then they were smokers, and if they didn't have time to eat, they probably didn't have time to exercise either. So that also says: wait a second, does this really make sense? When I step back and ask whether intermittent fasting causes an increase in heart attacks, my take is that this is a really flawed study and that what was observed was probably due to something else and not the fasting. So rolling this back up to the headline: easy to see the headline, easy to get scared. But if we drill in a bit further, we learn that this was so flawed on so many levels that I'm not going to pay attention to it.
Speaker 1: Okay, so this was what's called a database study. A database study means you analyze these existing databases. The granddaddy of all of them is called the Framingham Heart Study, which was started in the 1940s. They collected information on a whole group of people, basically cardiovascular risk factors, and then they followed these people over time to figure out who ended up developing heart disease. And from this wonderful database we learned that blood pressure and cholesterol are really critical risk factors for heart disease. There's also the NHANES database, which you may hear about. This was begun in the 60s with about 5,000 people; they survey them every year or every couple of years, monitoring 17 different conditions like diabetes and kidney disease, et cetera, while collecting risk factors and trying to put the two together. And then, most recently, there's something called the UK Biobank, done in the UK with 500,000 people. It's been going on for almost 20 years, and they're measuring things like your physical activity, your cognitive function, biochemistry, genetics, lab tests, and so on and so forth. The problem with these databases is that they are sitting there waiting to be analyzed. It's cheap to analyze them, it's quick, and it's almost irresistible to play with the data. So somebody might say: oh, I've got this interesting database and I have access to it.
Speaker 1: I wonder if eating apricots is related to breast cancer. You do an analysis and it's like: nah, no relationship. Well, okay, maybe it wasn't apricots; maybe it's peaches. Nah, that didn't show anything. Well, how about highly processed foods? Or whether certain gene types that we carry in our bodies raise your heart disease risk? Well, maybe I didn't find that. But how about your dementia risk? Oh, didn't find that. What about pancreatic cancer?
Speaker 1: This is called data dredging, or p-hacking, where you basically play with the data and try to see if you can find something that explains or predicts something else. The problem is that if you do enough of this testing, you're going to come up with something that looks positive just by chance. Statistically (we talked about this in episode 12, To Test or Not to Test), about 5% of blood tests will likely come back abnormal. Well, if you do 10 or 15 or 20 investigations of a database, you're going to find something abnormal and something positive. The problem is when people write up: oh, we found that apricots cause cancer, or peaches cause heart disease. You don't know how many times they played with the data before finding something that got published. And that relates to what's called publication bias, meaning we do lots of analyses but we don't necessarily write them all up and submit them for publication. If an analysis doesn't show anything, we probably won't write it up. If it looks exciting, like intermittent fasting increasing your risk of heart disease death, well, then you write it up and it gets a lot of attention. So unfortunately, we don't know how much the data was played with before they came up with something positive, and that's a really big problem. And because these databases are so common now, lots of people are playing with them and coming up with things that make great headlines. One of the other critical things about database analyses and observational studies is that they give you a correlation, meaning: oh, you know, peaches and cancer go together. But that doesn't mean that peaches cause cancer, and we'll talk more about this in future episodes.
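Here is a minimal sketch of the arithmetic behind that warning: if you run 20 analyses on data where nothing is truly related, the chance that at least one comes out "significant" at the usual 5% level is roughly 1 - 0.95^20, or about 64%. The simulation below uses made-up random numbers, not any of the databases mentioned in the episode.

```python
# A minimal simulation of the multiple-testing problem described above:
# run many analyses on data where nothing is truly related, and roughly 5%
# of them will look "significant" by chance alone.
# (Hypothetical illustration; not the actual NHANES or UK Biobank data.)
import random
import statistics

random.seed(42)

def fake_association_test(n=500):
    """Compare a made-up 'exposure' group to a 'control' group drawn from the
    same distribution, so any 'significant' difference is pure noise."""
    exposed = [random.gauss(0, 1) for _ in range(n)]
    control = [random.gauss(0, 1) for _ in range(n)]
    diff = statistics.mean(exposed) - statistics.mean(control)
    se = (statistics.pstdev(exposed) ** 2 / n + statistics.pstdev(control) ** 2 / n) ** 0.5
    z = diff / se
    return abs(z) > 1.96  # "significant" at roughly the 5% level

# Run 20 different "apricots, peaches, processed foods" style analyses.
n_tests = 20
false_positives = sum(fake_association_test() for _ in range(n_tests))
print(f"{false_positives} of {n_tests} null analyses came out 'significant'")

# Expected chance of at least one false positive across 20 tests:
print(f"Chance of at least one false positive: {1 - 0.95**n_tests:.0%}")  # about 64%
```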
Speaker 1: Okay, randomized controlled trials. They have been felt to be the gold standard; that's how drugs are approved, and they've given us huge insights, such as showing that medications can be as powerful as surgery for heart disease. But even a randomized controlled trial doesn't always give us the right answer. Well, there was a headline recently: meal replacement shakes lead to greater weight loss. This was a randomized trial with 60 obese individuals. One group got a meal replacement (this was like a shake for dinner), and that group got about 1,300 calories in their daily diet. The second group had a reduced-calorie lunch to bring their total down to around 1,300 calories, and the third group had normal eating. Now, what they found was that both of the groups that had fewer calories on a daily basis lost weight. That's no huge surprise. What was a surprise is that the ones that had the meal replacement for dinner lost 15 pounds, versus the others, whose smaller lunch was their way of reducing calories, who lost six pounds. So a huge difference, and huge differences in fat loss. This sounds really, really impressive.
Speaker 1: The problem is, when you read the discussion and limitations section in the actual article, the group that had the low-calorie lunch probably did not end up with lower calories, or at least not nearly as low as the other group. The group that had the shake for dinner probably did get close to 1,300 total calories in their diet, but the one that had the reduced-calorie lunch maybe didn't actually eat a whole lot less, and that's why the shake group did better: not because shakes are better, but because the shake group actually ended up with lower calories and the other group didn't. Now, this was a study done in China, and folks there have their own unique diet and their own patterns of how much they walk and how much they exercise. So even though this was a randomized controlled trial, there can be problems. Reading the discussion section of the actual article brought out some of the limitations, and if there had been an editorial, that could have been helpful too.
Speaker 1:This is a really important topic and you are likely to run across headlines daily or even weekly. I hope I've given you some tools or questions about how to read the headline in the article and how to look at the underlying study and, as I mentioned, your health type may influence how you approach these headlines, so please send me examples and I look forward to talking about this with you, getting feedback from you and continuing on our journey to live long and well. Thanks so much for listening to Live Long and Well with Dr Bobby. If you liked this episode, please provide a review on Apple or Spotify or wherever you listen. If you want to continue this journey or want to receive my newsletter on practical and scientific ways to improve your health and longevity, please visit me at drbobbilivelongandwellcom. That's, doctor, as in D-R Bobby. Live long and wellcom.