The video raises awareness of the rapid advances in AI-generated voice technology and its ability to mimic one of the most renowned voices in broadcasting, Sir David Attenborough. The hosts challenge viewers to discern between Attenborough's authentic voice and its AI-generated counterpart, highlighting how closely the AI can match his voice's nuances. It brings to light concerns about voice cloning being used without consent, creating ethical dilemmas for public figures whose voices are easily recognizable and revered worldwide.
Dr. Jennifer Williams, an expert in AI audio, discusses the serious implications of these voice cloning technologies. She explains that these technologies can replicate voices closely by scraping data from the internet, potentially resulting in unauthorized use for misinformation or nefarious purposes. Williams shares insights into the widespread creation and potential misuse of AI-generated voices, stressing the importance of distinguishing between creative uses and harmful applications.
Key Vocabulary and Common Phrases:
1. discern [dɪˈsɜrn] - (verb) - To perceive or recognize the difference between things. - Synonyms: (detect, identify, distinguish)
The hosts challenge viewers to discern between Attenborough's authentic voice and its AI-generated counterpart.
2. nuances [ˈnuːˌɑːnsɪz] - (noun) - Subtle differences or distinctions in expression, meaning, or response. - Synonyms: (subtleties, distinctions, shades)
... highlighting how closely the AI can match his voice's nuances.
3. revered [rɪˈvɪrd] - (verb) - To be deeply respected or admired. - Synonyms: (respected, esteemed, admired)
... creating ethical dilemmas for public figures whose voices are easily recognizable and revered worldwide.
4. misinformation [ˌmɪsɪnˌfɔrˈmeɪʃən] - (noun) - False or inaccurate information that is spread, regardless of intent to deceive. - Synonyms: (falsehoods, inaccuracies, untruths)
... resulting in unauthorized use for misinformation or nefarious purposes.
5. nefarious [nɪˈfɛəriəs] - (adjective) - Wicked or criminal in nature. - Synonyms: (wicked, evil, villainous)
Then of course there's the nefarious purposes of creating an actual voice...
6. endorse [ɛnˈdɔːrs] - (verb) - To declare one's public approval or support. - Synonyms: (approve, support, advocate)
...about war, politics and things that he has never said or may not ever endorse...
7. authority [ɔːˈθɔːrɪti] - (noun) - The power or right to give orders, make decisions, and enforce obedience. - Synonyms: (power, control, command)
... people recognize him as an authority, as a voice of truth...
8. concerning [kənˈsɜrnɪŋ] - (adjective) - Causing worry or anxiety; troubling. - Synonyms: (worrying, troubling, alarming)
...it's very concerning.
9. impersonating [ɪmˈpɜrsəˌneɪtɪŋ] - (verb) - Pretending to be another person for entertainment or deceit purposes. - Synonyms: (imitating, mimicking, parodying)
...AI just impersonating Sir David Attenborough?
10. regulatory [ˈrɛɡjələˌtɔri] - (adjective) - Relating to rules or laws made and enforced by an authority. - Synonyms: (governing, supervisory, administrative)
...raising awareness about the issue and just being aware of how the technology is developing and even thinking about legal frameworks and regulatory frameworks that will help protect people.
Sir David Attenborough says AI clone of his voice is 'disturbing' - BBC News
Now for a game of Guess Who. Can you tell the difference between one of the most famous voices in broadcasting, Sir David Attenborough, and his AI-generated clone? Well, here's a clip of the real Sir David talking about his new series, Asia. If you think you've seen the best the natural world has to offer, think again. There's nowhere else on earth with so many untold stories. Welcome then to Asia.
Well, now just take a listen to this. If you think you've seen the best the natural world has to offer, think again. There's nowhere else on earth with so many untold stories. Welcome then to Asia. So was that the same clip that we played twice, or was it AI just impersonating Sir David Attenborough? I wonder if we can just try and listen to that second clip again. Well, we can. Well, I mean, what do you think? It's incredibly close. Could you tell the difference? Those items you just heard were found on a website by some of our colleagues right here at the BBC. And there are several sites that offer AI-generated voices of the trusted broadcaster.
So the BBC also contacted David in light of all of this, and just have a look at his response. He says that, having spent a lifetime trying to speak what I believe to be the truth, I'm profoundly disturbed to find that these days my identity is being stolen by others, and greatly object to them using it to say whatever they wish. Well, this morning, one of those websites posted another clip clarifying its stance. Let's set the record straight. Unless Mr. Attenborough has been moonlighting for us in secret and under an assumed name with work authorization in the United States, he is not on our payroll. I am not David Attenborough. We are both male British voices, for sure. However, I am not David Attenborough, for anyone out there who might be confused.
Incredibly close, isn't it? I mean, you have to really double take. If I didn't know, I wouldn't know. Let's talk to Jennifer: Dr. Jennifer Williams, who researches AI audio at the University of Southampton. I mean, we're talking about this and the closeness with which those voices match. But there's a really serious issue behind all of this. And even some of my colleagues have had their voices used as well for different purposes, AI-generated voices. Just talk to us about how it can match so closely and what they're being used for.
So there's actually a few different ways that you get a voice clone that matches so closely. One would be to scrape the Internet of a target, for example Sir David Attenborough, collecting enough of his data to create a model of his voice and then, of course, putting words in his mouth. Another way is that it could potentially happen accidentally. So there are no safeguards in place to guarantee that a synthetic voice is uniquely different from a real person.
So by scraping the Internet, what kind of mechanisms? Who's doing it? We know that there are websites out there that are using these AI-generated voices to get their messages across for whatever reasons. But what kind of organizations do it? Surprisingly, a lot. So before this interview, I did a quick Google search and I found that actually there are tools that you could go to right now and get a clone of the voice. I don't know how they're making it. They're probably, of course, scraping the web, but anyone could make a clone of David Attenborough's voice. That's absolutely extraordinary.
And you can hear from his response when we told him about this, that he was disturbed. He's obviously upset about this. Just the implications of this going forward. Arguably, what could these different companies be using these voices for, especially trusted voices like Sir David's? I think some people probably see it as a creative outlet, so they may want to be doing something like humor or parody. Then of course there's the nefarious purposes of creating an actual voice and then presenting that as an authoritative figure for various misinformation or disinformation. But I think it's important to make the distinction between creative uses of voice cloning technology and these nefarious uses that present falsely.
How aware do you think the public are about the use of these AI-generated clones? It's becoming more and more commonplace, I think, to talk about voice cloning. And I don't think that we need to be in a state of fear and hide ourselves away from the Internet or from other conversations. But it is important to just be aware that this technology exists. It didn't exist several years ago, and it wasn't in the hands of everyone online for free. So I think raising awareness about the issue and just being aware of how the technology is developing and even thinking about legal frameworks and regulatory frameworks that will help protect people.
And Dr. Jennifer, just in terms of learning how to deal with this, I guess, arguably, what could we do? Because we know that the music industry has issues with this, especially AI-generated music content. Are there ways of protecting ourselves from it? So I recommend an approach called the SIFT method: stop, investigate the source, find other sources, and then really think about what the context is here. So anytime you see something that might be out of place, just stop and examine it. Think about whether this is supported by other types of evidence or other types of information sources. And then, what is the context? Is this a political advert? What is the topic?
I just want to play in the original clip again, and Dr. Jennifer, just bear with us. I would like to play that clip again, the original clip of Sir David talking about his new series. So just to reiterate, this is the real clip, the real Sir David speaking. Let's just listen in. If you think you've seen the best the natural world has to offer, think again. There's nowhere else on earth with so many untold stories. Welcome then to Asia.
So for absolute transparency, when we played the AI-generated clip, we indeed actually played the same clip again. That's how confused we were. And we were meant to know what we were doing. So I'm going to just now play in the AI-generated clip. Let's just have a little listen. Donald Trump has nominated Florida Congressman Matt Gaetz as the next attorney general, a move that has generated significant controversy due to Gaetz's legal history. And outshoot. The James Webb Space Telescope recently made a jaw-dropping discovery, catching sight of massive supermassive black holes from the early universe. NATO is preparing for the worst-case scenario, a large-scale evacuation of wounded troops in the event of a war with Russia.
So that, to absolutely stress, was the AI-generated clip of Sir David. And so firstly, Dr. Jennifer, just your reaction to how close those voice patterns sound? Well, it sounds like Sir David Attenborough to me when you play the second round of clips. I'm a little disgusted. I mean, this is very serious to think about. When you have a trusted voice like Sir David Attenborough who all around the world people recognize him as an authority, as a voice of truth and then to have words put in his mouth about war, politics and things that he has never said or may not ever endorse, it's very concerning.
Okay, well, Dr. Jennifer Williams, a researcher of AI audio at the University of Southampton, thank you very much for giving us your reaction and for bearing with us as we got those clips the right way round. And we appreciate your insight and analysis as well. Lots more, of course, also on our website.
TECHNOLOGY, INNOVATION, GLOBAL, AI-GENERATED VOICES, VOICE CLONING, ETHICS, BBC NEWS