The video discusses the challenges of interpreting scientific findings and the prevalence of false or exaggerated claims in published research. It emphasizes that scientific research proceeds by trial and error, which inevitably produces some false findings that can mislead the public through media and social platforms.
Highlighting the importance of systematic reviews, the video introduces the concept through the story of Archie Cochrane, who advocated for research summaries and systematic evaluations to ensure the reliability of healthcare information. The speaker then shows how anyone can assess the reliability of health science information by asking three critical questions: about human studies, red herrings, and randomized controlled trials.
Key Vocabulary and Common Phrases:
1. fluke [fluːk] - (noun) - A stroke of luck; an unexpected, rare, or chance outcome. - Synonyms: (coincidence, accident, luck)
...what looked like a really interesting finding was actually just a fluke.
2. self-corrects [sɛlf kəˈrɛkts] - (verb) - Fixes its own mistakes over time, as a system or organization does. - Synonyms: (auto-corrects, rectifies, adjusts)
Well, science self-corrects and one of the main ways it does that is through something called a systematic review.
3. systematic review [sɪstəˈmætɪk rɪˈvjuː] - (noun) - A methodical and comprehensive review of existing research on a specific topic. - Synonyms: (comprehensive review, structured analysis, methodical evaluation)
He called for research summaries to be produced that compile all the available evidence on a specific question. This is what we now know as a systematic review.
4. red herring [rɛd ˈhɛrɪŋ] - (noun) - Something that misleads or distracts from the relevant or important issue. - Synonyms: (diversion, distraction, misleading clue)
Are we sure it's not a red herring?
5. randomized controlled trial [ˈrændəˌmaɪzd kənˈtroʊld traɪəl] - (noun) - An experiment in which participants are randomly assigned to different treatment groups to eliminate bias. - Synonyms: (RCT, clinical trial, experimental study)
Has there been a randomized controlled trial?
6. confounding [kənˈfaʊndɪŋ] - (noun) - A situation in which the relationship between two variables is distorted by an external factor linked to both. - Synonyms: (bias, interference, distortion)
In the context of medical research, what we're talking about here is confounding.
7. placebo [pləˈsiːboʊ] - (noun) - A harmless substance given to a patient in a clinical trial for comparison with an actual drug. - Synonyms: (dummy treatment, inactive substance, sugar pill)
The treatment group get yeast, the control group get a placebo.
8. clinical trial [ˈklɪnɪkəl traɪəl] - (noun) - A research study that tests how well new medical approaches work in people. - Synonyms: (medical trial, experiment, study)
And we know this because 90% of candidate drugs will fail when they get to the clinical trial stage
9. heaps [hiːps] - (noun) - A large quantity or number of something. - Synonyms: (piles, lots, loads)
The next day, he woke up to heaps of yeast.
10. culprit [ˈkʌlprɪt] - (noun) - A person or thing responsible for a fault or problem. - Synonyms: (offender, wrongdoer, perpetrator)
Fire is the culprit.
3 questions to ask before buying into health trends - Karen Dawe - TEDxBristol
Why Most Published Research Findings Are False. That's the title of a paper that came out when I was roughly halfway through my PhD. I spent about 18 months failing to get any of my experiments to work and I was ready to give up. But then I realized, to an extent, that's just how science works. Science investigates things nobody has investigated before. It's trial and error. So studies start small, maybe too small, and what looked like a really interesting finding was actually just a fluke. "Most" is maybe an exaggeration, but certainly some published scientific findings are wrong.
What does that mean for you? You currently have access to more health science information than ever before and more ways to consume it. TikTok, Facebook, YouTube. And I love that. So what's the problem? Well, the problem is it might just be wrong. We all know that feeling of seeing a headline that really resonates, but maybe sounds a bit too good to be true. How can you tell? I'm a scientist and health technologist. What I do is help companies compile scientific evidence to support their product and then figure out how reliable that evidence is. I work with medical devices for drug delivery. I work with software to help you get better sleep. I work with medicines for the most rare and severe types of seizures. I am not an expert in any of those fields and you don't need to be either. What I do is the science of science.
The science of science is a thing, and its principles can help you judge health science information in the media. So some scientific findings are wrong. What do we do about it? Well, science self-corrects, and one of the main ways it does that is through something called a systematic review. Now, the origins of the systematic review are maybe not what you'd expect. In the early 1930s, a young medical student called Archie Cochrane had a problem. In interviews he refers to it as a sexual dysfunction. It's the 1930s. You've got a sexual dysfunction. Archie did what any one of us would do: he went to Vienna to find Sigmund Freud. What Archie didn't know then was that he had a genetic condition, not a psychological one. Psychoanalysis didn't help, and he became very frustrated. He was seeing one of Freud's students who, he says, believed strongly in intuition,
while Archie couldn't help asking: why? What's the evidence for what you are saying? This experience in Vienna really got Archie thinking about evidence. And by the time he returned to medical school, he was kind of a troll. He says, "We started a sort of a club of individuals who would ask consultants: what's the evidence that your treatment has any effect? We found this great fun. There was so little evidence available." By the time he left medical school, he was obsessed with evidence, and it came to define his medical career. Among his many great contributions was the idea that single studies can't be relied upon to make healthcare decisions. He called for research summaries to be produced that compile all the available evidence on a specific question.
This is what we now know as a systematic review, and they are a required part of the treatment approval process. The most robust type of systematic review is called the Cochrane Review. Cochrane reviewers judge how good the evidence is using a tool which is essentially a huge technical questionnaire. And, no big deal, but it was developed right here in Bristol and is used globally, not just for Cochrane Reviews. It was co-created by the team that I used to work for, so I kind of felt that I was okay to mess with it a little bit. What I wanted to do was take it out of the hands of the professionals and turn it into something anyone can use to judge health science information in the media.
So what I did was take the core principles of Cochrane and boil them down to just three questions. Three questions that I want you to remember. Don't worry, I remember them; if they go, it's fine, and you're going to remember them too. I want you to remember them because I want you to think about them the next time you see a too-good-to-be-true headline or a weirdly compelling TikTok. Okay, so: has there been a study done in humans? Are we sure it's not a red herring? Has there been a randomized controlled trial? If the answer to any of these questions is no, then it's not bad science, not by any means. But it does mean there's a high risk that in time it'll turn out that the findings weren't reliable.
Before we get into it, we need to be aware that it will take conscious effort, because we want to believe. I want to believe. I have an autoimmune disease; I'm sick a lot; I'm on immunosuppressant medication; there's no cure. And the plan is for me to be on immunosuppressant medication for the rest of my life. So when I see a TikTok that says it's definitely going to cure me, I want to believe. I want it to be true. I don't want to question it. So it takes conscious effort to stop, think and ask: has there been a study done in humans? Are we sure it's not a red herring?
Has there been a randomized controlled trial? This first one's super easy and a real killer question. And once you start asking this question and listening out for the answer, it will filter out a lot of health claims. All medical breakthroughs have to start somewhere. And a lot of them start by growing cells in petri dishes in a lab. But that doesn't mean that all cells in petri dishes are destined to become great medical breakthroughs. A really good way to remember this is that cancer cells grown in a dish in a lab can be very effectively killed with a hammer. All kinds of things will kill cancer cells in petri dishes, but that doesn't make them a viable cure for cancer. And we know this because 90% of candidate drugs will fail when they get to the clinical trial stage. Mostly this is because once we put them into real, live humans, they just don't work in the way the evidence suggested they would. So if we ask this question, has there been a study done in humans? And the answer is no, we can say: okay, that's super interesting, but maybe it'll turn out to not be a reliable finding.
If we ask this question, has there been a study done in humans? And the answer is yes, we're going to take it up a notch. We're going to say: are we sure it's not a red herring? In the context of medical research, what we're talking about here is confounding. Confounding happens when something is related to both the health factor that we're interested in and the health outcome; it is causally related to both those things. It's quite a tricky concept, so I'm going to give you an example that's going to change the way you listen to health science information in the media.
Nice graph here. Very clear relationship between whatever's on X and whatever's on Y. Looking at this graph, the more X happens, the more Y happens. So if Y is something bad, this graph suggests we want to stop doing X. This is a real relationship. It is not a chance correlation. But what's key is the nature of this relationship. So let's see what our variables are. On our Y axis, we've got total cost of fire damage. And on our X axis, we've got number of firefighters at the scene. This is a real relationship, but it is not a direct causal relationship. It's a red herring.
Fire is the culprit. Fire causes both the total cost of fire damage and the number of firefighters at the scene. Now, we know what firefighters do; we understand that system. We often don't know what health factors do, and we don't necessarily understand the underlying system. This is why it's so important to ask: how was the study done? What was the study design? So what study design would avoid red herrings, confounding? That would be randomized controlled trials.
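To make that red herring concrete, here is a minimal sketch in Python (my illustration, not from the talk, with made-up numbers): fire severity acts as the confounder, driving both the number of firefighters and the cost of the damage, so the two correlate strongly even though neither causes the other. Comparing only fires of similar severity makes the correlation drop sharply.

```python
# Illustrative simulation of confounding (all numbers are invented).
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

severity = rng.uniform(0, 3, n)                        # the confounder: fire size
firefighters = 5 * severity + rng.normal(0, 0.5, n)    # bigger fire -> more crews
damage_cost = 100 * severity + rng.normal(0, 10, n)    # bigger fire -> more damage

# Naive look: firefighters and damage cost are strongly correlated.
print("overall r:", np.corrcoef(firefighters, damage_cost)[0, 1])

# "Control" for the confounder: compare fires of similar severity.
# Within each narrow severity band, only the independent noise terms
# are left to vary, so the correlation drops sharply.
edges = np.quantile(severity, np.linspace(0, 1, 21))
band = np.digitize(severity, edges[1:-1])
within = [np.corrcoef(firefighters[band == b], damage_cost[band == b])[0, 1]
          for b in np.unique(band)]
print("mean within-band r:", np.mean(within))
```

Randomized controlled trials go one step further: instead of measuring confounders and adjusting for them, they make the groups comparable by construction, which also handles the confounders nobody thought to measure.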
Last question. Has there been a randomized controlled trial? Randomization is the key to getting rid of red herrings, because then the red herrings occur as much in the treatment group as they do in the control group. At the end of the first year of World War II, Archie Cochrane found himself as a prisoner of war. As a medical officer, he was in charge of the health of around 8,000 British prisoners. Their diet was awful. Archie noticed everyone, including himself, was beginning to swell. They were getting very, very sick indeed.
Archie suspected it was thiamine deficiency. But Archie is a prisoner of war. He is not a doctor in a hospital. He can't just order up a bunch of yeast, which is what they need. Archie needs to convince the guards beyond all doubt that yeast is the treatment they need. Archie manages to get a small amount of yeast on the black market. He numbers the soldiers and he sends the odd numbers off to the treatment group, the even numbers off to the control group. The treatment group get yeast; the control group get a placebo, which in this case was vitamin C.
He might have been tempted to give the sicker patients the yeast, but then they're not randomized, and then yeast isn't the only difference between the groups. What if the yeast doesn't help because the sicker patients are too sick? And what if the sicker patients are sick because they also smoke? What if the sicker patients are more sick because of some fundamental genetic difference? When you're a prisoner of war trying to convince guards to buy you yeast, there can be no what ifs. And when we are spending money and resource on thousands of very sick patients in clinical trials, there can be no what ifs.
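To see why the what-ifs go away, here is a matching minimal sketch (again my illustration, with invented numbers): a hidden trait, smoking, makes some patients sicker, but coin-flip assignment spreads smokers evenly across the two groups, so the estimated treatment effect lands near the truth. Handing the treatment to the sickest patients instead breaks that guarantee.

```python
# Illustrative simulation of why randomization works (numbers invented).
import numpy as np

rng = np.random.default_rng(1)
n = 2_000

smoker = rng.random(n) < 0.3                          # hidden confounder
baseline = 5.0 + 3.0 * smoker + rng.normal(0, 1, n)   # smokers are sicker

TRUE_EFFECT = -2.0  # the treatment genuinely lowers the sickness score

def outcomes(treated):
    # Observed sickness score after (possibly) treating each patient.
    return baseline + TRUE_EFFECT * treated + rng.normal(0, 1, n)

# Randomized assignment (Archie used odd/even soldier numbers).
treated = rng.random(n) < 0.5
y = outcomes(treated)
print("randomized estimate:", y[treated].mean() - y[~treated].mean())
# Comes out near -2.0: smokers land in both groups in equal proportion.

# Non-randomized: give the treatment to the sickest half instead.
treated = baseline > np.median(baseline)
y = outcomes(treated)
print("biased estimate:    ", y[treated].mean() - y[~treated].mean())
# Lands far from -2.0: the groups now also differ in smoking and in
# baseline sickness, so the comparison no longer isolates the treatment.
```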
Within three days, there was a noticeable difference between the groups. Archie wrote up his findings and told the guards. The next day, he woke up to heaps of yeast. The whole camp got better because of what was essentially one of the first ever randomized controlled trials. It was even blinded: the patients didn't know if they were getting the treatment or the placebo. And this is another really important thing about randomized controlled trials: the expectations and hopes of both the patients and the researchers impact results. In trials that aren't double-blinded, the treatment effect can be exaggerated by as much as 17%.
Archie Cochrane changed the face of medicine by not accepting what he was being told, by asking tough questions, even of people of authority and experience. Ask tough questions even if people are wearing a white coat in their video, even if they have the word doctor in their username, even if they are stood on a red dot giving a TEDx talk. Ask tough questions. The best advice I can give you is that if you see a too-good-to-be-true headline or a compelling TikTok, go to the Cochrane online database. I'm not affiliated with Cochrane, by the way, I should just say. But at this time they are considered the gold standard in healthcare evidence, and they are used globally to make policy and healthcare decisions.
They may have already asked the tough questions for you, but if there isn't a review already on what you're interested in, we can all be more like Archie by asking the tough questions ourselves. Has there been a study done in humans? Are we sure it's not a red herring? Has there been a randomized controlled trial? Change how you listen to this information. Because if you're listening to something really interesting and the answer to all these questions is yes, then you might want to pay attention. Thank you very much.
SCIENCE, EDUCATION, HEALTH, SYSTEMATIC REVIEW, COCHRANE REVIEW, CRITICAL THINKING, TEDX TALKS