ENSPIRING.ai: Hope has a plan - Maria Ressa - Nobel Prize Summit 2023
In the video, the speaker discusses the challenges posed by disinformation and misinformation, particularly focusing on how these phenomena are exploited by powerful entities to manipulate democratic processes. The talk emphasizes the importance of understanding the difference between misinformation, which can be unintentional, and disinformation, which is a deliberate act used to manipulate populations. This manipulation is exacerbated by the exponential growth of technology and by information warfare that attacks democracy on both individual and collective levels.
The speaker reflects on the role of journalists in this complex landscape, highlighting their crucial responsibility in preserving democracy by exposing untruths and holding the powerful accountable. Despite personal challenges and risks, including online abuse and physical threats, journalists are urged to stand firm. The narrative brings to light specific examples of historical events and contemporary challenges faced by journalists, emphasizing the importance of maintaining integrity in the face of widespread disinformation.
Key Vocabularies and Common Phrases:
1. disinformation [ˌdɪsˌɪnfərˈmeɪʃən] - (noun) - False information spread deliberately to deceive people. - Synonyms: (deception, falsehood, propaganda)
disinformation is when power and money uses the existing information ecosystem to insidiously manipulate the cellular level of our democracy, which is each of us.
2. asymmetrical warfare [ˌeɪsɪˈmetrɪkəl ˈwɔːrfɛər] - (noun) - A type of conflict where the parties have unequal military resources, and the weaker opponent uses unconventional tactics. - Synonyms: (irregular warfare, unconventional warfare, guerrilla warfare)
asymmetrical warfare. You heard about this a little bit, but this is what al Qaeda used to attack America, that we will come after America with death by a thousand cuts.
3. coded bias [ˈkoʊdɪd ˈbaɪəs] - (noun) - Implicit and systemic biases ingrained in AI systems due to the data used in their programming. - Synonyms: (algorithmic bias, systemic discrimination, computational bias)
Stop coded bias. If you were marginalized in the real world, if you’re a woman, LGBTQ, if you are from the global south, if you are marginalized, you are further marginalized.
4. microtargeting [ˌmaɪkroʊˈtɑːrɡɪtɪŋ] - (noun) - A marketing strategy that uses consumer data to target individuals with specific messages or advertisements. - Synonyms: (personalized marketing, targeted advertising, niche marketing)
microtargeting takes your advertising dollars and matches your weakest moment to a message.
5. generative ai [ˈdʒɛnərəˌtɪv eɪ.aɪ] - (noun) - A type of artificial intelligence designed to create content, such as text, images, or videos, that resembles human-generated content. - Synonyms: (creative AI, content-generating AI, AI-generated)
generative ai is not just exponential like the first one; it is exponential on top of exponential.
6. surveillance for profit [sərˈveɪləns fər ˈprɒfɪt] - (noun) - The practice of monitoring individuals for the purpose of generating revenue by collecting and selling personal data. - Synonyms: (data mining for profit, profitable surveillance, commercial monitoring)
Stop surveillance for profit, because all the fear, anger, and hate follow from it.
7. astroturfing [ˈæstroʊˌtɜːrfɪŋ] - (noun) - The practice of masking the sponsors of a message or organization to make it appear as though it originates from and is supported by grassroots participants. - Synonyms: (fake grassroots, false front, phony community support)
But the other part is astroturfing, to make this side.
8. information warfare [ˌɪnfərˈmeɪʃən ˈwɔːrfɛər] - (noun) - The use of information technology to gain a competitive advantage over an opponent, often involving attacks against information and information systems. - Synonyms: (cyber warfare, data warfare, digital warfare)
disinformation that leads to information operations and information warfare
9. behavior modification [bɪˈheɪvjər ˌmɑːdɪfɪˈkeɪʃən] - (noun) - The use of psychological methods to alter people's actions and attitudes. - Synonyms: (behavioral alteration, conduct modification, attitude adjustment)
We’ve talked about behavior modification, and it goes two ways.
10. dynastic [daɪˈnæstɪk] - (adjective) - Relating to a succession of rulers from the same family or line. - Synonyms: (hereditary, ancestral, monarchic)
His supporters were still there, and dynastic families helped bring out the vote.
Hope has a plan - Maria Ressa - Nobel Prize Summit 2023
Please don't make me cry before I have to speak for 20 minutes. Oh, my God. Thank you. Thank you so much. That was actually the very first thing I wanted to say. There are so many people in this audience who have helped us stay alive, stay out of jail. Thank you. Thank you. The only weapon a journalist has to fight back is to shine the light. Without you, it is impossible to do it. And this is actually what I hope we'll do in the next 20 minutes or so, right? To show you the micro and the macro.
You heard from Nobel laureates - what an incredible day. We have the context of everything. That is the problem. But I'm going to help focus it a little bit more, because the tech has gone exponential, exponential, and we're still moving at glacial speed. So I think the first step is: please use disinformation, not misinformation, because misinformation is like a game of telephone. It gets distorted. People make mistakes. We make mistakes. disinformation is when power and money uses the existing information ecosystem to insidiously manipulate the cellular level of our democracy, which is each of us. disinformation that leads to information operations and information warfare.
In the Philippines and in Southeast Asia, we use disinformation. This is my theory: in America and in the West, part of the reason you stay with misinformation is that you have a very powerful tech lobby that uses that word to whitewash what exactly is happening. So Robert said earlier, you know, that 23-second sound bite. Well, let me give you the history of the Philippines with this map, right? This map here, our history: 300 years in a convent, 50 years in Hollywood. One sentence. My ask for you in the next few minutes is really the same question we have had to confront. And I think it is the only way we will come out of this, because the window to act is closing at the worst of times.
It was very cathartic to write. When we went into lockdown, I didn't realize how exhausted I was, and I just kept writing. My editor cut half of what I wrote - he cut 200 pages. It came out around the same time as ChatGPT, actually, as generative ai came out. But since then, it's been translated into about 20 languages. There's the French, there's Japanese, there's Korean, there's Mandarin coming in June. And Ukrainian - I just signed, right? But I think it is this left brain, right brain that I tried to...
How will we move you to act? How can I move journalists to act? It's actually really simple. It's not about me. It's about you. It's about your courage. It's about your ability to look at the world, listen to all the problems and all the nuances of it, and then just say: this is what I stand for, because silence is complicity. I'm going to remind you: in the Nobel lecture in 2021, I talked about how the war isn't just Russia invading Ukraine. Yes, there's conventional war, but the war is in your pocket. It is a person-to-person defense of our values, of our democracy, and each of us has to win it, right? And the second one is this:
How does it work? Madeleine Albright used to call it slicing the salami. This is how fascism works. In my case, the two biggest stories I worked on were both about how they tested tactics to attack America in my country. The 9/11 attacks - when that happened, it was a memory for me. In 1995, the first pilot recruited by al Qaeda had been arrested in the Philippines: Abdul Hakim Murad. And there was an interrogation document in my closet that talked about a plot to hijack planes and crash them into buildings. He even named the buildings, he said: the World Trade Center, the Pentagon.
Then he added a building that hadn't been attacked: the Transamerica in San Francisco. Right. So that was the first. What's the second one? Cambridge Analytica. Americans had the highest number of compromised accounts. What's the country with the second-highest number of compromised accounts? The Philippines. So I use this phrase, death by a thousand cuts, because this is how we lose our rights. You know, it's like you're not going to stop because, oh well, we lost. We didn't get access, so we'll just stay quiet, because maybe this is where the fear is used against us. Maybe if we make too much noise, we won't be able to come in again next time.
Democracy is lost by giving up. You have to hold the line on our rights. And it's ironic: asymmetrical warfare. You heard about this a little bit, but this is what al Qaeda used to attack America - that we will come after America with death by a thousand cuts. It's ironic that the very tactic of that terrorism, asymmetrical warfare, is now being used against each of us. And then the last one, going from macro to micro, is us. We are standing on quicksand. Don't worry, I'll only be really bad for half of the time.
When I talk to you, this is what I suggest - this is what we did at Rappler. Imagine the worst case possible, whatever it is you're most afraid of. You hold it, you touch it, and you embrace it. You rob it of its sting, because if you do that, then we can move forward. Nothing can stop you. Embrace your fears. Now, with generative ai, man, it's off the scale, like all of the science fiction films that we've seen. It's worrisome. All right. So what's our ten-point action plan? Tomorrow we're going to have two hours to go over it with you, but here, I think, three buckets.
This is something that we pulled together. Nearly everyone who spoke to you today had bits and pieces of it. But the reason we also focus on disinformation is because it is the business plan, right? It is money. It is power. And this business plan is the first thing we need to stop to reclaim our rights: stop surveillance for profit. What that means is - and I'll show you some of this on this phone - if you're on any of the social media platforms, every single thing that you post, every data point that's collected about you, is used to create a model of you. That's what the tech companies will say, but what they don't tell you is this: replace that word model with clone.
They clone us. And then they say, because they used AI to do it, machine learning, that they own that clone. Did you give your permission? No. Stop surveillance for profit. That's the first step, because all the fear, anger, and hate follow from it. The second: stop coded bias. If you are marginalized in the real world - if you're a woman, LGBTQ, if you are from the global south - you are further marginalized. And what happens? Coded Bias was actually a film that was at Sundance the same time as A Thousand Cuts. And this was about a woman - her name is Joy, an MIT student. She was given an AI assignment, except that she couldn't do the assignment, because the AI wouldn't recognize her face, because she is a woman and she is Black. So what she did: she put a white mask on, and then she did the assignment. Stop coded bias. That's the second point.
The third is the part that has always been hard to explain. I think when news organizations had power, we never really explained the process: journalism as an antidote to tyranny, because only journalists are foolish enough to stand up to a dictator and say, you're wrong. I had a 26-year-old reporter stand up to President Duterte, who was towering over her, trying to bully her, and she kept asking in a very respectful way. This is what journalism does. And if you look at the trends, not only has democracy declined globally; journalists have had to pay more and more - jailed, attacked, killed - every year for the last decade. So we can't do this alone. Let me quickly go through this, because I can see I'm running short on time. It's all about data. And if I say this to our kids, they'll say it's really boring. But think data - not an Excel sheet, but you, your clone, right? You in data.
That is exactly what we're talking about. And that huge shift is phenomenal. It changed the entire system, but we didn't know it. The two buckets - you heard a little bit about this earlier. We've been talking about machine learning, the AI of machine learning. This is what builds our clone. What is that? All of our clones, if we are all on social media, are pulled together. That is the mother lode database that's used to microtarget. microtargeting is not the old advertising; microtargeting takes your advertising dollars and matches your weakest moment to a message.
It's almost like you went to a psychiatrist and told your psychiatrist your deepest, darkest secret, and the psychiatrist went out and said, who wants Maria's deepest, darkest secret? And sold it to the highest bidder, right? So that's the first one. And you heard from Christine: we haven't solved anything. No one has been held accountable. It still continues, and it is getting worse. The second bucket, generative ai, is not just exponential like the first one; it is exponential on top of exponential. And we played with this stuff at Rappler, because if you don't - again, that arms race - if you don't use it, are you going to be left behind? But the question really is: where are the governments that have abdicated responsibility for protecting the public sphere? Why are the journalists, the politicians - it's hard to govern in this day and age, right? - the scientists, the researchers, why are we left alone?
And I'll show you some of that stuff. We'll go back to generative ai. Let me quickly skim through this, because this is the problem. In 2014, content and distribution were separated. News organizations lost our power, and the tech companies - American tech companies - became the new gatekeepers. We're frenemies. You know, I really loved the tech. I drank the Kool-Aid. That was how we set up Rappler in 2012; we first set it up on a Facebook page, right? The second - and this is the original sin, if we're talking values - the systems that connect us all spread lies faster than facts, six times faster. And this is a 2018 MIT study. So if you're a journalist, you're handcuffed.
Lies laced with anger and hate spread. This is a distribution issue, right? In the Nobel lecture, I talked about how it is fueling the worst of humanity. And then finally, what has this become? information warfare. For your 2016 elections, you already have the data out, but nothing happened with the Mueller report. Look at all the footnotes in the Mueller report; it's really interesting. This is now a behavior modification system. This is warfare, right? So that's why you have to win it. Sorry, last slide before I take you through the solutions. We've talked about behavior modification, and it goes two ways. It pounds you. You heard Rana talk about being pounded, right?
And the end goal of all of that is to take your voice out. But the other part is astroturfing: to make this side - like, if you see part of the attacks against me or against Rana - make you disbelieve. This is why facts become debatable, right? So the three sentences I've said over and over make me feel like Cassandra and Sisyphus combined. Without facts, you can't have truth. Without truth, you can't have trust. Without these three, we have no shared reality. We have no democracy. Right?
I mean, imagine: you're here in the room with me, and we're in Washington, DC. But if I were to set up information operations and pound this part of America with a million tweets - they would all have blue verified checks - a million tweets that said this is happening in New Delhi, people outside would think it's happening in New Delhi. It comes in three layers: the personal, individual; the sociological, in groups, because we act differently in groups; and then the one that we haven't even begun to talk about, emergent human behavior. We cannot know what we are going to become when we are being pumped full of toxic sludge, when we cannot believe in the goodness of human nature. You have been very good to us, right? So I believe in the goodness, but it is so hard to do that in our information ecosystem.
Aside from the fact that it's addictive - you know, dopamine - our attention span has gone from 10 seconds to 3 seconds. All of this stuff - I worry about our kids. The surgeon general just announced that social media is harmful to our children. We kind of knew that a decade ago. You have to move faster. And the last part, the last two minutes, right? This is the big-picture stuff: what is happening all around the world.
And I'm going to reference both the Freedom House report and V-Dem from Sweden. When V-Dem pointed out that last year 60% of the world was under authoritarian rule, I thought, well, that includes India and China, so that's okay. 60%. This year, it's 72% of the world. You're getting to a tipping point where these kinds of trade sanctions are not going to work, because there will be more authoritarian countries than there will be democratic countries. I want to light a fire. Our window is closing. We, with our information ecosystem, are electing illiberal leaders democratically - I know it in my country - and they are crippling the institutions from within. You now know how easily that can happen. But they're not staying in their own countries. They're allying globally. Would Belarus be a democracy today if Russia hadn't come in to help?
So this is it. Our window is closing. I'm going to quickly go through this. Look, I wanted to show you what we lived through - I mean, Rana and Patrícia Campos Mello, who is also here, from Brazil. This is the first network map we did at Rappler to try to figure it out, to go to the facts: what are the networks that are sharing the crap? I can say that in the National Academy of Sciences. So this is the first generation, right? But this was the same network attacking journalists, opposition politicians, human rights activists. And it was so organized that they had messages out by demographic.
I'll show you some of the attacks that I got there. Memetic warfare, right? Everything about me - it's a good thing I'm not corrupt, because they couldn't find anything. But they went after things like the way I looked, my sex. So gender disinformation. I have eczema. Can you imagine? Eczema weaponized. You know, I have really dry skin. But take a look, it's not that bad - yet look what they did to all the photos. And then they made up a name for me. Don't drink anything right now. They called me scrotum face. Dehumanization. And this becomes dangerous, because that is the next step to violence. Online violence is real-world violence.
Right? I mean, America looks for reasons why the shootings are happening. We are being pumped with toxic sludge. It took like a month. My mom sent me this one. And for years, I didn't want to tell you about it. I didn't want to speak about it, because it makes you ashamed. But I realized if I didn't tell you, you wouldn't know. I mean, who wanted personalization? Who wanted our own separate realities? When I heard that in 2014, I thought: oh my God, that's going to be problematic. What happens to the public sphere? You know, I'm going to say something that's not quite politically correct. You know what they call us?
A place where different people believe in different realities? It's called an insane asylum. We're close. I mean, they still kept going, right? It took a few months before they took that down, but then a thousand cuts again. My skin is a little bit better than that, but a thousand cuts. This is public, right? It astroturfs. VOV PH is a Facebook page. And then they did this. Creative, right? But I'm not alone. And this is where UNESCO and The Chilling come in. This is a 300-page book: women journalists under attack, having to deal with things like this. And you can see the statistics. They are horrendous.
73% have experienced online abuse, 25% have gotten death threats, and of that, 20% have been attacked in the physical world, in the real world. Online violence is real-world violence, right? This is the first big-data case study, done by the International Center for Journalists, based in DC, with UNESCO. They took half a million attacks against me. I was getting 90 hate messages per hour. 60% was meant to tear down my credibility, 40% was meant to tear down my spirit. With your help, it didn't work. So let me fast forward and bring you back to DC. You heard about Brazil on January 8, but this is the same thing. This is the anatomy of how you change reality with information operations. It takes a few years - but it's been a few years, right?
So if you take a look: this is the violence on Capitol Hill. #StopTheSteal. The metanarrative was seeded - this is work by the Election Integrity Partnership - it was seeded on RT a year earlier, and mainstreamed by Steve Bannon on YouTube in August 2020. So we're getting closer to your election date. Then you had the superspreaders. QAnon dropped it October 7. And then President Trump came top down. It's the same thing that happened to us in the Philippines: bottom up, exponential lies. Journalist equals criminal. A year later, President Duterte said the same thing about me and Rappler in his State of the Nation address, and I immediately tweeted. And then a week later, I got my first subpoena, and they just kept coming. In 2018, the government tried to shut us down. In 2019, I had eight arrest warrants in about three months, and then two more followed. So, like Rana, I wound up spending more time with lawyers than with journalists. This is how it happens. You heard our Yale behavioral scientist talk about how you can take your memories and pull them out.
We are a country that overwhelmingly elected the only son and namesake of the man who was our dictator until 1986 - a kleptocrat who stole $10 billion in 1986 dollars. I didn't mention the name Marcos, did I? President Marcos, then and now. Right. We now have his son and namesake as our president. He was elected in two ways. The first was because of information operations that began in 2014. And the second is because, like in other countries around the world, the world doesn't change just like that: his supporters were still there, and dynastic families helped bring out the vote. All right, let me quickly go through the solutions I promised.
Solutions. How do you rebuild trust? At Rappler, our elevator pitch was: we build communities of action, and the food we feed our communities is journalism. Our three pillars - I like the anvil drop, I use that all the time - technology, journalism, and community. You are a powerful community. So let me take technology. In technology, legislation - and I joke that the EU is winning the race of the turtles. We can't wait more years. But now the Digital Services Act and the Digital Markets Act are kicking in this year. On May 31, there's a deadline for who gets access to real-time data. Please just go take a look and weigh in on this. In the long term, it's going to be education - we heard that from here. In the medium term, it will be legislation, and that's where it is. So we have to look at that.
The second is, I've given up on big tech, even though we still need big tech, but we're building our own platform, and it took us much longer because we were under attack. In our first year and a half, we spent a million dollars on legal fees. That is impossible for a little group like us - we're only 100 people strong. So Lighthouse will come out by Q3 this year, and this will give oomph to: how do you build communities of action? How do you have safe spaces to actually speak to each other, to debate, to do the thinking-slow part of both governance and democracy? The last one I want to tell you about is interesting, because this will be formally announced October 2, but I've been given the okay to tell you about it. The Institute of Global Politics is being launched this fall at SIPA, the School of International and Public Affairs at Columbia University.
It will be led by the dean of SIPA, Keren Yarhi-Milo, and Hillary Clinton. And the end goal is really - you know, again, don't go into the politics, because that's part of the cascading failure. The original failure is that lies spread faster. Remember that. When lies spread faster, we go into Stranger Things, you know, we go into the Upside Down. We're living in the Upside Down. That's where your politics - all our politics - become a gladiator's battle to the death. In my book, there's one algorithm that did that: the recommendation of friends of friends for the growth of your social network. It's chapter seven; it's called "How Friends of Friends Broke Democracy."
So, the Institute of Global Politics - our goal there is to bring engineers together with lawyers, with policy people, with scientists. And I'm excited about the IPIE. We need to pull our efforts together, but we need the short term. Let me do the short term: journalism. We have no business model. It's dead. Advertising is dead. microtargeting - for you in the big companies, you're using microtargeting because it's better ROI, right? So what does that mean for journalists? A year ago, we began the International Fund for Public Interest Media. I decided to co-chair it, along with Mark Thompson, the former president of The New York Times.
He's big and tall and a white male, and I'm short and little and a brown female. We raised $50 million in new money in a year. That is for those journalists, especially in the global south, who are putting a finger in the dam to stop it from falling on them and still doing their jobs. There are journalists there. But the incentive structure of our information ecosystem rewards bad journalism, right? So think about that. The second is CPJ, RSF, ICFJ - the global coalitions that have helped journalists who are under attack. The #HoldTheLine coalition - I know Courtney, you're here; she was one of the founders. To pull this together, we have to help journalists. Evan is in Russia, right? I mean, why do we have to sacrifice so much to try to give you the facts?
And finally, the last one. I told you I wasn't going to leave you depressed, right? This is Project Agos. The Philippines has been the third most disaster-prone nation globally, starting in 2013. So our very first effort in crowdsourcing was really climate change. We built a tech platform that we handed to the government, and we did it with help - the government accepted it. We have an average of 20 typhoons every year. This is Project Agos, which was something that worked from 2012 all the way to 2016. And what we did there: the tech platform was created for crowdsourcing, for everyone. Anyone who sees someone who needs help can use the hashtag, and it will go through.
And then the journalists - we did three phases: before, during, and after. How do you prepare for the typhoon? How do you respond while it's happening, and how do you recover? I'm going to show you something that actually happened in the Philippines, and this is the power of the crowd, if we're not manipulated. This is Project Agos: the typhoon, known internationally as Rammasun, intensifies as it moves closer to the Bicol-Samar area. So on this platform that we handed to the government - we do it with the government - you actually see the path of the typhoon, with these LiDAR maps that we had. It took eight months to get our Philippine government to actually release them to the public, and we did. So now, if you're in a landslide area, you know you're going to have to evacuate.
And these all came from citizens - photos and videos that they uploaded onto this platform, even as we followed the path of a storm. It was pretty incredible what journalists can do with our communities. And we did both face-to-face and remote, virtual. And the end goal is to work with government, right? Because you don't have to always fight government, and government can't do it alone, especially not in my country. So the third part is the response - that's where it is - and then recovery. So the recovery comes in, and we all watch this. And it included the Red Cross, which had 14,000 volunteers. The goal is to turn the thousands of deaths per typhoon into #ZeroCasualty.
So the Duterte administration forced this minister, who signed it with us, to resign. But now, under the Marcos administration, we are building this again. I have three cases left. I could still go to jail for a decade or so, but I hope - you have to hope, but you've got to have a plan. I'm going to end with this. This is, I think, something that we need in every democracy around the world with an election, because you cannot have integrity of elections if you don't have integrity of facts. So this was something we did for our May 2022 elections. We did it with the help of the Google News Initiative, and the data portal was pulled together by a San Francisco startup, Medanhe. And what we did is this whole-of-society approach, because in the medium term it's legislation; in the short term, it's just us.
So you have to take what is there with social media, the way it exists right now, and make it work your way, but don't lose your values. So how do we do that? Well, this is what we did: 16 news organizations working together for the very first time, because we tend to compete with each other. And those fact checks that we did - they're really boring and they don't spread. I bet you don't read them. So what we did was the mesh layer. This is the influencer marketing campaign for facts. We had a total of almost 150 different groups - the mesh layer. Their task every day was to take the identified fact checks and share them with their networks, but add emotion - and they couldn't use anger. And what we found from the data - and I will show you this - this includes the church.
The Philippines is Asia's largest Roman Catholic nation. It includes business - those businesses finally kicked in. And what we found was: which emotion spreads as fast as anger? Inspiration. Think about that, right? A little tougher, but inspiration spreads as fast as anger. The data pipeline that we collected here went all the way up to our academic partners - there were eight of them - and their goal every week was to tell us what metanarrative is being seeded, who is being attacked, who is benefiting. They went public before they did peer review, and that helped, right? That helped. The accountability layer is the law. How do you have integrity of facts if you don't have - sorry, let me say it again.
One more time: you cannot have rule of law if you don't have integrity of facts. So our lawyers finally kicked in. They were far more excited than we were - the journalists at the bottom of the pyramid, who were exhausted - and they filed more than 20 cases in three months to protect this pyramid. I want to show you some of the data, since we're at the National Academy of Sciences. Look at this stuff. I'm not on TikTok, but Rappler is. And there was a Creative Commons license, and everyone took what the news groups did and did it their way. Influencers across the way - please don't call a journalist an influencer.
And then we did this. We found from the data that if we did a daily influencer marketing campaign for facts - that we segregate... segregation is the wrong word - we identified four clusters that spread the information. And you can see here, right, we mobilized. What are boundary spanners? A boundary spanner would be Engville, between the Philippines and Norway, because I know Engville, and Engville is Norwegian. They were boundary spanners to our communities. So we looked at boundary spanners and we crafted a data-backed influence strategy, because it isn't about messaging, it's about distribution, and it's about using the design of the platforms today without losing your morals. This is what we did. We shifted them over.
Every single partner of ours - the nearly 150 - we all grew. We got stronger. Yeah, that "stronger together" actually did work. I'm going to end it here. The blue are the original partners. The orange are boundary spanners. The yellow is the ripple effect. So you can see, we succeeded in taking over the center of the Facebook information ecosystem. So while there's still no law, don't give up. We organize in a different way. Stop being a user and become a citizen. Identify what civic engagement means in the age of exponential lies. I'll leave you with this: the ten-point action plan. We have to turn that constitutional level into tactics. And let me just say the one thing: 2024 is a tipping point, to either fascism or democracy. And I don't use that word lightly. It is because we will tip at that point. And the three key elections we are looking at: Indonesia, the world's largest Muslim population; India - you heard from Rana - the world's largest democracy; and you here, the United States. Our tipping point. Please. Thank you.
Disinformation, Journalism, Global, Technology, Surveillance, Information Warfare, Nobel Prize