Socially Unacceptable

How to Protect Your Brand Against Misinformation with Ant Cousins

January 16, 2024 Prohibition PR

Prepare to navigate the murky and slightly terrifying waters of brand misinformation with Ant Cousins as we uncover the key strategies you can use to defend your brand's reputation. This episode of Socially Unacceptable is a treasure trove of insights, where a veteran of the IT and defence sectors joins us. Ant Cousins from Cision brings his rich experience, guiding us through the complex maze of misinformation that threatens all marketing campaigns and brand integrity in 2024. Listen as we carefully dissect the anatomy of misinformation, from political manipulation to AI-generated content, and learn how to fortify your brand against the onslaught of falsehoods.

The conversation then takes a technological turn, highlighting the power of AI in revolutionizing social listening and media monitoring. With Cision's advancements, including their acquisition of Factmata, we delve into the capabilities of contemporary tools to sift through vast amounts of social media content. Our discussion reveals how AI has grown to comprehend nuanced conversations, providing the accurate analysis of public sentiment that is critical for brand management. Ant shares his journey, reflecting on social media's evolution and its significant impact on communication and data exploitation, giving us a peek into the future of brand protection strategies.

Finally, we explore the tangible consequences misinformation can have on renowned brands and how they've responded to crises, ranging from the Disney World drinking-age fiasco to the misrepresentation of COVID-19 vaccine data. We examine the importance of preemptive narrative shaping and responsive action plans in mitigating the reputational damage misinformation can cause. We also look at Ant's personal experiences, including the intense challenges of leading Factmata; here he provides a candid backdrop, highlighting the importance of resilience in times of crisis. Join us for an episode that equips you with the knowledge to steer your brand through the ever-evolving landscape of digital communication and misinformation.

Would you like to know if your social media and content strategy is perfect for 2024? Book a free 15-minute brand discovery call here and we will help you grow your brand today. And if you like the show, please leave us a review; even just a thumbs up is appreciated. Come on, let us know you're there.

Follow Chris Norton:
X
TikTok
LinkedIn

Follow Will Ockenden:
X
LinkedIn

Follow Prohibition:

Website

LinkedIn

TikTok



Speaker 2:

Welcome to Socially Unacceptable, From F**k Up to Fame, the marketing podcast that celebrates professional mishaps, mistakes and misjudgments while delivering valuable marketing and life lessons in the time it takes you to eat your lunch.

Speaker 3:

Why are people doing this?

Speaker 4:

Money, right? It's mental. Yeah, money. It's the kind of content that drives you off of Facebook, off of those platforms where there is some moderation, onto their own sites and onto other places where they can monetize you. How big an issue is it? I think a few things we know: it's only going to get worse. We know we're in an election cycle, not just in the US but around the world. More than half the world's population is voting in elections this year, which is one real big driver of misinformation, because people have a political goal. And I'm walking with David Cameron to the tent, about to brief him on the lines, just to make him remember. It was all about civilian casualties; it was a really serious topic. And then on the way to the tent he's like, ah, you know, I haven't got time for this, I've got to do these other things instead.

Speaker 3:

What a git.

Speaker 4:

So I walked into the tent and I called across the interpreter. I said, like, David Cameron isn't coming. They just rush from the stations they're at, grab the cameras, grab the mics. Cameras are in my face, red lights are on, and I'm like, oh, they think I'm David Cameron. So I've got two choices now.

Speaker 3:

Welcome back to Socially Unacceptable, the only podcast for marketers and comms professionals that celebrates the biggest mistakes and helps you learn practical lessons from other people's misfortune, so you can grow your brand quicker. Have you ever carefully crafted a brilliant marketing campaign, poised to launch your brand into the stratosphere, only to see it crash and burn because of a single spark of misinformation or fake news? Well, you're not alone, because this is happening to so many brands in 2024. You've seen numerous examples of misinformation for brands such as Cadbury's, Disney, Coca-Cola, Tesco and even the BBC, and it's not just the big brands anymore. Today's show will equip you with the tools not just to survive but to thrive in an era of alternative facts and fake news. Join me and, as always, my business partner, Will Ockenden, and today's very special guest, Ant Cousins, as we dissect the anatomy of misinformation. We'll look at the tactics of online manipulators and we'll do our best to find a way forward for you and your brand.

Speaker 3:

In this week's episode, we speak to Ant Cousins, who is Executive Director of AI Strategy at Cision. He has a fascinating work history, having been involved in comms departments in the armed forces, the Ministry of Defence and even the Cabinet Office, and there are some brilliant mistakes in there. So, as always, sit back, relax and let's hear all about how you can avoid brand misinformation. Enjoy. Hi everybody, welcome back to Socially Unacceptable. This week we've got in the studio Ant Cousins from Cision. Hi Ant, welcome to the studio. Thank you for having me. So you've got a fascinating career and we're going to talk about brand misinformation in a minute. But before we get into that, because that's what this podcast is all about, I just want to hear about your experience in communications in the armed forces. You originally worked for the public sector, didn't you? Do you want to just tell us a bit about that? Because I've heard you've got some interesting stories around that.

Speaker 4:

Yeah, so my CV makes no sense to anybody. If you read it on paper, it doesn't make any sense at all. Starting off in IT, I did five or six years in straight IT development roles, building databases and things like that.

Speaker 4:

And then I got on a leadership scheme and they said, if you do one more job in IT, you're screwed; you're going to do something different. And I saw in a newspaper, this is how old I am, right, I saw in a newspaper it said, an MOD spokesperson said, and I was like, that seems like a cool job. How do I be an MOD spokesperson? And they're like, you're not ready for a press office. And I was like, I think I'm ready for a press office.

Speaker 4:

And I managed to wangle myself into an interview and got a job there, and this was at the height of the conflicts in Iraq and Afghanistan, so we were basically front-page news every day. I didn't realize at the time how lucky I almost was to end up in a press office where that's what you're dealing with all the time. Some people never really ever get a front-page story or a headline news item. So yeah, I joined the MOD press office in 2006, was there through 2008, and had roles there dealing with the media. The sex, drugs and rock and roll desk was my desk; so when guys in the armed forces do some things a bit wrong, when things go off the rails, I'd be the guy dealing with that kind of story. But I also ended up getting deployed to Afghanistan during the summer of 2008, which was a pretty tough time.

Speaker 4:

That was a really bad year for the armed forces, so it was cruel. I don't think I came back the same person that went out, if I'm honest. But there were some stories. I think the story you're referring to is the one that Darryl might have mentioned.

Speaker 3:

Yeah, so I said I'd give him a shout-out. So thanks for the message on Twitter, Darryl. He said I specifically had to ask you about this.

Speaker 4:

So the strange thing is, at the time, everyone asked me, what do you do for a living? And I was like, I work for the Ministry of Defence. And they were like, oh, so you're James Bond? And I was like, no. One, he works for the Foreign Office. But two, I'm closer to Johnny English than James Bond; most of my stories are more Mr Bean.

Speaker 5:

James Bond wasn't a PR man anyway, was he?

Speaker 4:

No, no, James Bond wasn't really in PR.

Speaker 3:

In London yeah.

Speaker 4:

So one of the things was, we had David Cameron coming out for a visit to Afghanistan, and we'd booked a live satellite link-up and got a load of journalists in from Kabul down to Lashkar Gah in Helmand province, which is where I was based, which is a long and arduous, risky journey for those journalists to take. We had them coming down for a member of the British Parliament, because this was just before he was prime minister, so he was still a member of parliament at the time. So he comes out and we've booked the slot, we've got the tent, all the journalists are in there, they're ready to go, and the satellite link-up is booked for live broadcast across Afghanistan.

Speaker 4:

I'm walking with David Cameron to the tent, about to brief him on the lines, just to make him remember. It was all about civilian casualties; it was a really serious topic. And then on the way to the tent he's like, ah, you know, I haven't got time for this, I've got to do these other things instead. And I was like, okay. What a git. So I'm like, well, okay, how do I deal with that? So I walked into the tent and called across the interpreter, and I'm bringing her across.

Speaker 4:

I said, like, David Cameron isn't coming, and she's looking at the fear in my eyes, because she's been dealing with the journalists this whole time. And the Afghan journalists, they just rush from the stations they're at, grab the cameras, grab the mics. They come to the flap of the tent, right, and the cameras are in my face, red lights are on, mics are in my face, and I'm like, oh, they think I'm David Cameron. So I tell the interpreter, tell them I'm not David Cameron. And she just starts interpreting their questions. So I've got two choices now, right: I either empty-chair it and walk away, and the headline is David Cameron walks away and blanks a bunch of journalists asking difficult questions.

Speaker 4:

Which is what he did, just to be clear, and which is what I didn't do. So, to make up for it, I just did the interview, and I think I knew the lines.

Speaker 3:

So did they quote you as, David Cameron says?

Speaker 4:

So when I got back to my desk there were a few red lights on the phone. So yeah, the ticker said David Cameron was there.

Speaker 5:

And how did you get out of it? Did you just say, terrible misunderstanding, I didn't explain myself?

Speaker 4:

I think I just left it, you know, because to the Afghans, even though I had a beard and a shaved head, they still thought I was David Cameron, and I figured they weren't going to do anything about it.

Speaker 3:

I mean, that's the only challenge I've got with David. I delivered the lines.

Speaker 5:

So, in his new role as Foreign Secretary, next time he goes out there, there's going to be a lot of confusion. He's going to do an interview and everyone's going to say, this isn't David Cameron. This isn't the David Cameron I remember. Where's that beard?

Speaker 3:

Yeah.

Speaker 5:

You've aged badly, David. Anyway.

Speaker 3:

Okay, so thanks for that. Great story, great story. So today's episode is all about brand misinformation. Your current title at Cision is now Executive Director of Strategy and AI. Is that correct? AI Strategy?

Speaker 4:

AI Strategy. When I came up with that title, I was thinking, well, at some point we're going to automate the job of creating the AI, but we still need someone to think about the AI we need, right? I just want to be the last person out before we turn off the lights.

Speaker 5:

So tell us a little bit about, I mean, we understand Cision here, but our listeners might not. Tell us a little bit about what Cision does and your role within Cision in an AI context.

Speaker 4:

Got it. So Cision is probably, depending on how you measure it, the largest media monitoring or social media listening company in the world, because Cision has a bunch of products, including PR Newswire, which I'm sure you're familiar with, and Brandwatch, which I'm sure you're familiar with. Cision itself has a media monitoring product called CisionOne, which we released in the summer of last year, plus a whole bunch of insights, briefings and things like that, which we do for some of the biggest companies in the world. So, size-wise, we're probably the biggest; we've probably got the biggest share of the biggest companies in the world.

Speaker 4:

My role, and this kind of came out of the acquisition of Factmata, the company I was running, is thinking about, okay, how do we apply AI to every part of this business? Not just to our products, but internally: to our insights guys, to engineering, to sales, to customer success. Every part of the business now has an opportunity to get more efficient, better and, frankly, do more meaningful work through AI. So I'm applying my ten-odd years of experience working in AI to that kind of problem now.

Speaker 5:

So let's take the social listening side of things briefly. I think we all understand the concept of social listening, so bring it to life a bit. In which ways can AI enhance what we're currently doing when it comes to social listening?

Speaker 4:

Okay, well, the biggest problem with social listening is the amount of content, right? And this is the problem not just for marketers but in PR: if you're trying to use social media, you need to make it strategic. So the biggest challenge is the volume of content, and that is something that AI is great at, right? It can churn through lots of content and give you some classifications, some categories, some trends, some insights. So I think that's a great use of AI, where humans don't have the time to get through that amount of content: just tell me what this means, tell me what is relevant about this to my brand or to what I'm interested in.

Speaker 3:

What fascinated me is, we've been doing social listening since it first ever came out, and I helped tweak a couple of the social listening platforms right at the very beginning. And the problem was always sentiment, because of how the computer codes it. If you say something is…

Speaker 5:

Shit hot is a good example. I didn't want to swear in the first two minutes, oh sorry.

Speaker 3:

We've done it now, so fuck it. Yeah, if you say something is shit hot, it thinks that's a negative rather than a positive. Has AI now solved sentiment, or are we still churning out social listening platforms that don't get it right and just guess?

Speaker 4:

I am pleased to say yes, we can now move beyond basic sentiment, right? Sentiment analysis is just over ten years old now, and we've been using it well beyond its purpose because it's all we've had; because of that we've almost normalized how bad sentiment is for any real, true measurement of outcomes. So there are many different ways we can do that. When I was at Factmata, we developed a kind of targeted sentiment model. It is still sentiment, but you can tell it a specific object to orientate the sentiment around. What I mean by that is, we actually had a test, which you can probably find online, where we looked at a tweet, and the quote of the tweet was, Excellent, I'll be voting no in any Scottish referendum.

Speaker 4:

Then exclamation mark. Now, a typical sentiment model would score that as positive: it ends in an exclamation mark, it starts with excellent, and there are no inherently negative-connotation words in there. But if what you really want to know is how a person is going to vote in a Scottish referendum, which is actually the kind of thing you'd want to know, then it's the polar opposite, like absolute negative, right? So that's something that our model can now detect, and that's going to radically change things. Pollsters are going to get a real boost, I think.
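The distinction Ant draws here, surface polarity versus sentiment oriented around a specific target, can be sketched in a few lines of code. This is a toy illustration only, not Cision's or Factmata's actual model; the word lists and the stance rules are invented for the example.

```python
# Toy illustration of generic vs. target-oriented sentiment.
# Not a real product model: word lists and stance rules are made up.
import re

POSITIVE = {"excellent", "great", "love"}
NEGATIVE = {"terrible", "awful", "hate"}

def generic_sentiment(text: str) -> str:
    """Naive bag-of-words sentiment: counts polarity words, rewards '!'."""
    words = re.findall(r"[a-z']+", text.lower())
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if text.strip().endswith("!"):
        score += 1
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

def stance_towards(text: str, target: str) -> str:
    """Toy target-oriented scorer: looks for an explicit voting stance
    expressed about the target, ignoring surface polarity words."""
    t = text.lower()
    if target.lower() not in t:
        return "no stance"
    if re.search(r"\bvoting no\b|\bagainst\b", t):
        return "negative"
    if re.search(r"\bvoting yes\b|\bsupport\b", t):
        return "positive"
    return "neutral"

tweet = "Excellent, I'll be voting no in any Scottish referendum!"
print(generic_sentiment(tweet))                      # surface polarity
print(stance_towards(tweet, "Scottish referendum"))  # stance on the target
```

On the referendum tweet, the naive scorer reads the exclamation mark and the word "excellent" as positive, while the target-oriented scorer finds the explicit "voting no" stance and returns negative, exactly the disagreement described above.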

Speaker 3:

So the way that you got to Cision is you sold your business, Factmata, right? Okay, tell us a bit about that then. What was that process, and how long did you have that business?

Speaker 4:

So that was originally founded by a guy called Dhruv Ghulati. With very helpful timing, I'd just finished a role for a startup, getting them set up and off the ground, and I was looking for my next challenge, but that was a bit far from communications, whereas that is my kind of core; I wanted to get back to communications. And the thing that drove me to it was that I was in Egypt just after the fall of President Mubarak. I was on the first diplomatic mission into Egypt to find out, who's left, who could we be friends with, what's the situation here? And throughout that period I saw, for me, the peak of what social media can do for the world, which is, they got democracy in a country which was non-democratic, just because they wanted it. They used social media to organize themselves faster than the security apparatus had ever been able to respond in the past, and to share their stories with the world and get support. Amazing. I was like, this social media is amazing, it's going to change the world. And this was 2010, when everything was positive.

Speaker 4:

And then, of course, over the course of that period, we saw Twitter was having money troubles; Facebook back then was having money troubles, like, how do we monetize? And they were moving more and more towards advertising, creating the toolset to support advertising, which was what Cambridge Analytica and other companies then started exploiting, harvesting information. So we were heading towards the nadir, and I think we hit the nadir like in the middle of the teens, and I was like, how do I get back to working on this problem? And Dhruv, a visionary, had seen that AI was nearly at the state where we could use it to help identify this false content, negative content, and trends in that space, to allow us to act more strategically to fix those kinds of narratives before they take hold and take root in people's minds, and we end up with all the social conflict, the COVID misinformation, the undermining of democracy and all the negative consequences. This is the most important mission in the world. That's why I wanted to get back into that.

Speaker 3:

Yeah, and when you use social media listening, like if we're doing a campaign for a client and the client's got a crisis going on. We've done various campaigns where clients have come to us and said, can you assess the situation? Can you tell us who the protagonists are, who is influencing whom? Now, I imagine that AI in that respect is really cutting through that, because in the Arab Spring uprising that you're talking about there, if they'd have had something that could spot the key protagonists and the people, or the influencers, not to use the generic term influencers, but actually the proper people that were influencing decisions and things that were taking place, that would be really powerful, right?

Speaker 4:

Yeah, for good or bad, right? And I'm sure we'll come onto this: AI is just a tool at the end of the day. What the company was doing when I took it over was trying to understand what is true and what is false. And I slightly changed that, to suggest that I don't think any one company should own the truth, and that it's not possible for any one company to own the truth, because no matter how hard you try, there is going to be bias in how you interpret it. So I changed the model to be more of a picks-and-shovels model. I was like, let's build a toolset which anybody can use to understand.

Speaker 4:

Okay, this might be false; based on all the research we've done in training the models, we think this might be false, but you need to take a call on it. We found this might be racist, but you need to take a call on it as a human. So we've basically built a toolset which allows people to flag that harmful content, because it can read a million pieces of content in a few seconds. It can flag those trends and tell you, rather than this post, this post and this post are harmful, that this post and this post are part of a trend: here's where that trend started, here's who started that trend, and here's who else is involved in that trend. That's what you need to make social strategic.
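The shift Ant describes, from flagging individual posts to flagging trends with an origin and a cast of participants, can be sketched roughly like this. The posts, account names, and the word-overlap similarity rule are all invented for illustration; a real system would use embeddings and far richer metadata.

```python
# Sketch of trend-level flagging: instead of labelling single posts,
# group similar posts into a "trend", then report where it started and
# who is involved. All data and the similarity rule are invented.

posts = [
    {"id": 1, "author": "@seed_account", "time": "09:00",
     "text": "BREAKING: Brand X lowers drinking age at its parks"},
    {"id": 2, "author": "@amplifier_1", "time": "09:20",
     "text": "brand x lowers drinking age?? unreal"},
    {"id": 3, "author": "@unrelated", "time": "09:25",
     "text": "lovely weather today"},
    {"id": 4, "author": "@amplifier_2", "time": "10:05",
     "text": "Can't believe Brand X lowers drinking age"},
]

def signature(text: str) -> frozenset:
    """Crude narrative signature: the set of longish words in the post."""
    return frozenset(w for w in text.lower().split() if len(w) > 3)

def cluster_trends(posts, min_overlap=3):
    """Group posts whose signatures share at least min_overlap words,
    keeping each cluster ordered by time so the origin comes first."""
    trends = []  # list of (seed_signature, [posts])
    for post in sorted(posts, key=lambda p: p["time"]):
        sig = signature(post["text"])
        for seed_sig, members in trends:
            if len(sig & seed_sig) >= min_overlap:
                members.append(post)
                break
        else:
            trends.append((sig, [post]))
    return trends

for seed_sig, members in cluster_trends(posts):
    if len(members) > 1:  # only multi-post clusters count as a trend
        origin = members[0]
        print(f"Trend started by {origin['author']} at {origin['time']}, "
              f"{len(members)} posts, authors: {[m['author'] for m in members]}")
```

Running this reports one multi-post trend, traced back to its earliest author, while the unrelated post stays out of it; that is the "here's where it started, here's who's involved" view rather than a post-by-post verdict.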

Speaker 5:

And when we talk about misinformation, a lot of it is at kind of a government level, typically around elections and things like that. An example of that which I found quite chilling: around the Trump election, apparently, there was misinformation, possibly from Russia, advertising the day of voting in black neighborhoods as the day after the actual day of voting, to make these black neighborhoods, who would have voted against Trump, miss their opportunity to vote. So it's quite frightening, isn't it?

Speaker 4:

And how sophisticated and how clever it is, allegedly. That's actually quite a simplistic attack, if it was true.

Speaker 5:

Well, give us an example of something more sophisticated than that we see at a governmental level.

Speaker 4:

Yeah, so the Russian misinformation has actually been in place for many, many years.

Speaker 4:

So, if you look at Ukraine: the strategic goal for Russia is still victory in Ukraine, however they've defined victory. If you look at the Telegram channels and the social media channels that are less well monitored and protected, and look at the misinformation happening there, they've exploited over ten years' worth of complex undermining of social structures, particularly in the US, to get to the point where they can now convince people that it's actually Nazis running Ukraine.

Speaker 4:

Right, and that happened through, take it through COVID: they were supporting, obviously, COVID misinformation theories. So you've got people in the US now thinking, if I believe the conspiracy theories on COVID, and I find a bunch of Telegram channels which agree with that, I'm going to trust those channels. And if those channels are also the channels that tell me the election was stolen, then all of a sudden I've got two data points for why I trust these channels. And if those channels now say that there are Nazis running Ukraine, well, they agree with my worldview on these two other major points, so I'm likely to believe them on this too. But that's years and years.

Speaker 5:

So very structured, strategic long term.

Speaker 4:

Totally. Yeah, I'm not going to say they didn't do that, but that's simplistic compared to the way they think about misinformation.

Speaker 3:

So people listening to this show are brand managers and marketers, and they're going to be saying, brand misinformation, is that really an issue for us? And like I've just said, we've seen various campaigns that we've been involved in, a lot of which were NDA'd to the high heavens. But there have been brand misinformation campaigns against Disney, Coca-Cola, Nike and Tesco; with COVID, there was some COVID vaccine misinformation. I wonder, what do you think brand managers and brand owners can do to minimize misinformation against their own brand? Because it is coming for anyone and everyone, isn't it?

Speaker 4:

It is, yeah. And how big an issue is it? I think a few things we know: it's only going to get worse, right, because the ability for generative AI to create more credible-sounding misinformation is now there, which it wasn't before. Previously it was quite easy to identify, because the way it was written was often pretty poor, I mean typos, the formatting, lack of references to sources; there were a number of kind of lexical similarities we could use to identify misinformation, which are now a lot harder to rely on, because generative AI can create quite credible-sounding content. So we know it's only going to get worse.

Speaker 4:

We know we're in an election cycle, not just in the US but around the world. More than half the world's population is voting in elections this year, which is one real big driver of misinformation, because people have a political goal. The challenge for brands is, I think, and this is my hypothesis, right, there's not as much research for this: Gen Z buy more based on beliefs and values than any other demographic group, or at least that was the case. My belief is that whilst they, with their affinity for social media, figured out they had the power, every group now realizes they have the power if they can get together in enough numbers. So everybody's now canceling everyone else for whatever they can find the evidence for. So everyone is wielding this power, but that power is based upon their understanding of the facts and their understanding of the world, which makes misinformation strategically more risky for brands. And it's even more risky seeing as there is another trend: every brand is becoming more political, and I think that is happening because we are demanding it of the brands we buy from; Gen Z buy more based on belief.

Speaker 4:

Zara, I don't want you using sweatshops in X or Y, right? The misinformation around Zara came out. Adidas had a similar risk on that kind of thing: if you track their supply chain far enough back, there are slaves in the supply chain in China, was the misinformation.

Speaker 3:

And very, very recently, the main story on the news: Boohoo misinformation on the labeling of their own products in the UK. Right?

Speaker 4:

So every brand, and they're going to get punished for that, right? People will undoubtedly make buying decisions because of that. So every brand is going to get politicized, to a point where they need to understand, what are the groups, what are their values, and how do they support me? What is the kind of misinformation that would cause those groups to stop buying from me? And they need to understand that now. My advice is, don't wait for the misinformation to catch you in the crosshairs. Get ahead of it and be clear on your values. Be clear on your ESG facts; get those publicized. Don't wait until after you're accused of something to then produce evidence to the contrary. Too late; you're on the back foot.

Speaker 4:

You're on the back foot. Get ahead of that.

Speaker 5:

But what's the driver? I mean, we'll come onto the practicalities in a minute, about how brands can actually protect themselves against this and be on the front foot. But, whether this is an obvious question or not, who's driving this? You know, the Coca-Cola example, we've got a Disney example. Is it Pepsi doing it against Coca-Cola? What's the agenda here? Why are people attacking brands? Is it just activists that don't agree with the brands, or don't agree with capitalism, or whatever it might be?

Speaker 4:

Yeah, the actors, it's really interesting. There's a lot of people making a lot of money from misinformation and disinformation. Alex Jones in the US didn't become a multimillionaire because of his charisma. So there's a lot of people making a lot of money from coming up with stories people will buy into.

Speaker 3:

Correct.

Speaker 4:

Right, so people are playing on people's fears. They're playing on their interests, their inherent beliefs and biases, their inherent challenges. They're playing on that to create content. It's not like they care at all about the content they're creating; they'll create whatever it takes to get clicks. And this comes back to the fundamental power of the internet, right? It's driven by advertising; it's driven by clicks and attention. So the number one reason is money.

Speaker 4:

The number two reason is, I think, political or social power. People build followerships, they build groups, and they can use that to achieve social, political or economic influence. If you can get a group of people to follow you, you can wield that power, and the way you get people to follow you is to create fear of other groups, fear of consequences, and anger. Fear and anger are typically the methods you'll find.

Speaker 5:

So I think you're…

Speaker 4:

Hello.

Speaker 5:

Brexiters. I mean, yeah, this is playing into the whole idea that we've never been more divided as a society, and it's so polarizing on social media now, isn't it? I've got an example, actually, to bring misinformation to life; I'm sure you've got some brand examples too. We were doing a little bit of research around some of the big examples of brand misinformation. This is a little bit old, but in 2022, somebody posted a TikTok video about Disney World claiming that they were going to lower the drinking age to 18, and it created a huge shitstorm in the media. It got millions and millions of views on TikTok, and Disney were dragged into this debate about how they could ethically lower the drinking age to 18 at Disney World. And it was completely falsified, and there were certain websites behind it which obviously got millions of views. The TikTok channel obviously got huge amounts of influence, and you would think, why? But I think you've answered that: it's about gaining community, isn't it? And it's about gaining some sort of power and followers.

Speaker 4:

Yeah, so I mean Disney is a really good example. They got caught in a number of different crosshairs. They had a real challenge a few years ago with the Don't Say Gay bill fiasco in Florida, and that brought out a number of different conspiracy theories. The challenge with these kinds of examples is a lot of NDAs. But they weren't our client at the time, so I can talk about this, although I'm pretty sure they may be a Cision client now, I'm not sure. So I need to be a little careful.

Speaker 5:

You'll find out when this goes live, won't you? I'll find out.

Speaker 4:

But a couple of years ago they had a few different misinformation issues. So, yes, there was that one. There were also allegations that they were involved in child trafficking, right, which is a classic one for Disney. It's like, we have access to children, so of course the easiest conspiracy theory to believe is the one that makes the most basic common sense to the people who are going to believe that kind of stuff. So they got caught in a number of different crosshairs. But the fundamental driver for a lot of those is that Disney is seeking, with its content, to be more inclusive, right, to balance the scales of many, many decades of a lack of inclusive content. And I think that's admirable, that's believable, that's understandable.

Speaker 4:

But there's a whole bunch of people who aren't happy with that, who feel like they're being disempowered, that they're being targeted and attacked for being in the group that was previously well supported. Star Wars is a great example of the same thing. So I think they're paying the price with that group, and that group happens to be one of the groups that is most happy to get online and just abuse and create. So, yeah, that's my personal view on the challenges for Star Wars. They've been targeting a specific group which is more than happy to take its abuse out on other companies. Any brands in the UK, then, that listeners will be used to?

Speaker 5:

You go. Have you got one? I've got an example.

Speaker 3:

Yeah, in 2021, Nike faced a backlash for advertising on websites. So this is the reverse: Nike was advertising on websites that were spreading COVID-19 misinformation and, although unintentional, the association with those platforms tarnished the brand and got health-conscious people talking about it and about misinformation during COVID. Wow. There was a guy that spoke at the PR conference down in Brighton, his name escapes me, but I'm hoping to get him on the show, because he was in charge of countering COVID-19 misinformation, minimising it, not spreading it, and it was fascinating. What's your experience of some of the worst examples of misinformation in the UK?

Speaker 4:

Yeah, so it's hard to say specifically for the UK, because I think a lot of those were global narratives, but we were tracking them. This was when I was at FactMatter. We were using our AI to detect those kinds of issues, and a good example we found, actually, this is a pretty sharp example now I think about it, was a narrative that started, I think, in the US, where people were taking authoritative death figures for COVID, taking the whole number and using that online, saying look, more people who have taken the vaccine have died than people who have not taken the vaccine. Now, of course that's true, because many more people have taken the vaccine, but they were leaving out the percentages and leaving out that context. So if you just look at that whole number, you go, oh my God, vaccines are killing people. That was their narrative. It came out of a few people who weren't followed by many people, not particularly interesting accounts in the US. Then there was a Scottish version, where they took numbers from a release specific to Scotland and did the same thing. They were effectively localizing and targeting that same global narrative, using local statistics to make it relevant and feel close to people, and we saw that kind of bubble around.

Speaker 4:

It bubbled around for a couple of months. Then it got picked up by a Texan lawmaker who had more followers. He gave it a bump and that narrative started to become established. But it still wasn't getting picked up by anyone, still wasn't being dealt with, wasn't being addressed; no one was coming out and explaining why it was untrue. And then an American influencer, Candace Owens, picked up on the same thing. She retweeted it. She had like 5.1 million followers, and after that it was the number one COVID conspiracy going for a period of three or four months. So hundreds of thousands, if not millions, of people had been exposed to that. You've probably heard of it. And it started in August, September of the year before it really picked up.

Speaker 3:

Well, the scariest example he gave in Brighton was in the UK. What they do is use tools like what you're talking about, social monitoring, to identify the key things to engage with, because there was that much misinformation around COVID-19. And one of the biggest stories, right at the peak of lockdowns, was that if you've got COVID-19, drinking boiling water kills the virus because it's so hot. And they were like, oh my God, this is a serious health risk, and they had to release a response. And it's just like, why are people doing this?

Speaker 4:

Money. It's mental, yeah, but it's money. I mean, the people who create that kind of content drive you off platforms like Facebook, where there is some moderation, onto their own sites and other places where they can monetize you without limit.

Speaker 5:

And on that, actually, I've got a stat here which is frightening, and I'm keen to get your view on it. Campaign Magazine did a study about brands unwittingly advertising on misinformation sites, and in the US, for every $2.16 spent on news websites, $1 of that is spent on misinformation. So almost half of online ad budgets is inadvertently spent on misinformation sites. Is that-?

Speaker 4:

I feel like we might have contributed to that study. That sounds familiar to some work we did a few years back. So yeah, it's a real problem. There's a really good bunch of folks, you'll find them on LinkedIn and Twitter, called Check My Ads. They're trying to shine a light on this problem, because the whole infrastructure supporting ad deployment doesn't really give you, as a brand, much control and insight into exactly where those ads are going; it's all controlled and automated. And there are a lot of people making a lot of money from fraudulently placing ads.txt files on their sites and grabbing the numbers, making their sites seem legit. There are a lot of different ways people are manipulating the system, again, just to make money. Some people look at this and think there's an ideological drive behind it, but what drives it, like everything else, is money.

Speaker 5:

So there's no grand sort of conspiracy or political agenda. It's basically a load of people wanting to make a load of money.

Speaker 4:

I think if it weren't making them so much money, we wouldn't be worrying about it as much as we are.

Speaker 3:

And what's scary now is that generative AI is just churning out content. It can just churn out tons and tons of content. Okay, so if I'm in a small marketing team, because there are a lot of people listening to this that will be working in small marketing teams, who are going, okay, this is really fascinating, because it is fascinating, but what do we do to check that everything is in order and we are as prepared as we can be for misinformation against my brand?

Speaker 4:

Yeah. So, like I said, definitely don't wait for it to happen, and there are a few things you can do before that. Pre-bunking has been proven to work. We did a lot of that in the early days of the Russian invasion of Ukraine. Knowing what the other side is going to claim helps you. So think about that: what are the risks to your business? And there are risks, depending on your industry. In fact, this is a really good use for ChatGPT. I've not tried it for this, but you could probably ask it to think about some of that kind of stuff, like what are the likely risks to your industry or to a brand. Think about those things; get your lines to take ready in advance.

Speaker 3:

Oh yeah, I'll do that.

Speaker 4:

The other thing is, if you are doing something which proves the accusation wrong, get that out there. Like I said, don't wait until someone accuses you of being a racist company, or non-inclusive, to then go, oh, but we've got these stats. Get that out there beforehand. Publish it. Get it on your website before the date of the accusation, because everything you say after the accusation is going to get seen through that lens. So there are plenty of things you can do before. Another one is to get your lawyers ready. Don't be afraid. We've seen this with the misinformation on election machines in the US. Suing people really does work.

Speaker 3:

And there's a particular ex-president that loves doing that, isn't there?

Speaker 4:

Right. So it works, it has an effect. Get your lawyers prepared. Know what your lines are going to be. Like, at what point are we going to take people to court when they accuse us of doing something that's not true, etc. We have laws in place; we have libel, right; we have other ways you can punish people for lying about your business.

Speaker 5:

Okay, so we've pre-bunked to the nth degree. We've published proactive content saying, you know what, you shouldn't drink boiling water, it's going to hurt your throat, or whatever it might be. We've got our lawyers teed up. But how do we actually know when misinformation is published about us as a brand? Is it just a case of suddenly the chief exec knocks on our door and says, what's this about? Or can we be much more proactive about it?

Speaker 4:

Yeah. So that's something where you either throw a lot of humans at it or, frankly, AI is the only way, because the challenge is that it often starts in small numbers and you have to spot it early. Honestly, it works like a virus. You have to spot it early, understand where it's happening and target it at the source. Who is driving it? If you can get those accounts taken down, great; do that as much as you can.

Speaker 4:

If you can flag that content, do that, but spotting it early is key, and for that I think there is only AI. That's something we developed at FactMatter, and there are other companies out there playing in that space. So AI is probably your only route to that. It's a classic case of AI can be used for good or for bad, but it is definitely being used by bad people, so we have no choice: we have to use it as a tool to defend ourselves.

Speaker 5:

So a lot of the work we do at Prohibition is in crisis management, and we do what we call next-generation crisis management, where we actually use social listening tools to track evolving narratives. We look at antagonists and how those conversations evolve, and often you've got a hundred conversations and you select the most influential before you engage. That's the approach we take, because you just can't engage with everybody. Is that the case with misinformation? Say there's an awful lot of false claims about, let's say, a utility company, or you work in FMCG. If there are 100, 200 false claims, how do you make a judgment on when and where to engage?

Speaker 4:

It's a really good question. I think it's going to vary slightly by sector, by industry, because in some industries you're right, there's a large number of very spurious claims which almost drown each other out, so you don't need to worry about those until they get to an account with a level of reach or popularity or followership that's going to cause you problems. So you might want to set some fairly standard bars: right, we get a hundred spurious claims a week, but until someone's got more than a thousand followers, that's our bar for taking action, and you might be able to build that over time. In other cases, in other industries, you might not have that much of a problem right now, in which case you need to spot everything and understand the relative risk of it.

Speaker 4:

But that's where I think the narrative aspect comes in. What you're doing is going through a huge amount of content and forming it into those narratives. Narratives change slightly over time. They can meld and morph into other areas. They can split and branch into different sub-narratives, as we saw with the COVID misinformation. Right, "COVID vaccines will kill you" became individual and targeted, like the Scottish version, and "people are dying because of so-and-so". So spotting the way the narratives branch, and mapping that, is another way you can get back to the source.
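The follower-bar triage Ant describes, ignoring spurious claims until the author's reach crosses a threshold tuned per industry, can be sketched roughly as follows. This is a minimal, hypothetical illustration: the data shape, field names and numbers are invented for the example, not taken from any Cision or FactMatter product.

```python
# Hypothetical sketch of the "follower bar" triage rule:
# spurious claims are ignored until their author's reach
# crosses a bar you tune for your industry.

FOLLOWER_BAR = 1_000  # example bar from the conversation

def claims_needing_action(claims, bar=FOLLOWER_BAR):
    """Return only the claims whose authors have enough reach to matter."""
    return [c for c in claims if c["followers"] > bar]

claims = [
    {"text": "spurious claim A", "followers": 40},
    {"text": "spurious claim B", "followers": 5_200},
    {"text": "spurious claim C", "followers": 990},
]

for claim in claims_needing_action(claims):
    print(claim["text"])  # only the high-reach claim surfaces
```

Only claim B clears the bar here; as Ant suggests, the bar itself would be reviewed and rebuilt over time as the volume of claims in your sector changes.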

Speaker 5:

And it sounds like this needs planning in. I mean, I remember when Chris and I, years ago, were writing crisis plans and so many clients didn't have a social media crisis plan. And you know, this is 12 years ago. We always used to say, look, social is absolutely part of your setup, and we'd meld the two together. But presumably it feels like we need misinformation clauses now in crisis plans. It feels like it's absolutely part of a proactive approach.

Speaker 3:

But that is different, I suppose, with misinformation, because when we do crisis plans for clients, we're looking at whether they've got a real issue or problem. Is it also an issue when people just make it up as well?

Speaker 5:

Yeah, I mean, would you typically include this in a traditional, I say traditional, brand's crisis plan? It's preemptive crisis management, in effect, isn't it?

Speaker 4:

Absolutely, because if you treat earned media and social separately, sometimes things start with a slow burn on social media and you've got to catch them before they flip, right, because once so many people have seen it, a journalist is going to go, I need to address this. So you need to have social media in your plan from a listening perspective and from an engagement perspective, so that when the journalist gets involved they see your voice and your point of view, because if all they have access to is the misinformation, it's going to colour their story. Then you've got the stuff that comes out directly from journalists, because they've had access to a source which might be misinformation, or maybe they break something which was actually untrue, and that causes a reaction on social media. So it can flip between the two.

Speaker 4:

The stuff we developed at FactMatter allows you to track across both earned and social and combine it into a single view. So for any narrative you can see, actually, this one started on Reddit, jumped to Twitter, then it was on both, and then it grew to the point where a journalist or a traditional news outlet got involved, and you can trace that over time back to: it was this individual post in this Reddit thread. So you've got to have the mapping in place. But it is hard labour. I'm glad you guys are putting a focus on it, frankly.

Speaker 3:

So I think we've covered brand misinformation quite a bit. I'm quite interested, given the show's strapline is fuck-ups to fame, I mean, I know you're very good at your job, so have you ever?

Speaker 4:

We've had five years.

Speaker 3:

Have you ever made any fuck ups in your career?

Speaker 4:

Other than doing an interview with David Cameron which was a great story.

Speaker 4:

Would I do that again? Probably not. So, on the way in, we were talking about the FactMatter journey. When I took over FactMatter, there wasn't a lot of money left. There was really good tech, but we needed to work out how to monetize it, productise it and get it to market. And this is pre-AI-boom, right, so it was difficult to get funding, which is not as difficult now: if you've got an AI startup, just slap "generative" on it and you 10x your funding potential.

Speaker 4:

So when I arrived there was a plan, and I think I did the best I could to get through the plan as fast as possible and get out to market. I won a few decent clients, but it just wasn't enough to prove product-market fit in order to get more investment. So we were in a position of really needing to sell the business to get some kind of outcome for the investors and for the team, and because I didn't want to see that technology go to waste; I wanted to see it being used. So we ended up in an acquisition discussion, which went through. But that whole period, from taking over the business to selling it, was the most stressful and difficult period of my life.

Speaker 3:

How long was that? That was a year, right?

Speaker 4:

A year and a bit, effectively, from taking over, and it was always being short of cash, always desperately trying to get the next sale, always trying to share progress and keep the team motivated and engaged, all of whom were super talented and could have got a lot more money anywhere else. But we all believed in the mission; we believed the technology was worthwhile. So we went into the acquisition conversation. I think I got my last paycheck in April of '22, and that was my last paycheck until we completed the sale in November of '22. That whole period was basically building up debt and putting myself through incredible stress. We'd just had a baby in October of '21, so in that period I had a six-month to one-year-old baby and a job that wasn't paying me any money at all, and everything hung on getting that sale to go through. And I believed that it would.

Speaker 4:

But in hindsight, and this is kind of the definition of success or a fuck-up, right, you could look on paper and go, I sold a business which was in a tough situation, great job, well done, and if you just look at the financials you could say that was a success. But you also have to look at the toll it took on me. I definitely didn't have this much gray hair going into that year. Look at the toll it took on my wife, on our relationship. I think you have to look back and judge it by those factors as well.

Speaker 3:

Would you do it again?

Speaker 4:

On any given day, I don't know the answer to that question. Maybe, maybe not. I was in Iraq, I was in Afghanistan, across the Middle East, during difficult times. No PTSD. That sale? PTSD. Sometimes the hair on the back of my neck stands up; I get emotional thinking about it, because it was by far the most difficult thing I've ever done.

Speaker 5:

So what did it feel like when it went through? Was there a sense of relief or just chronic tiredness?

Speaker 4:

You can't imagine the relief, really, because this is the fuck-up: I had no plan B. I was so committed and dedicated to making that happen, it was all I did, all day, every day, and you wouldn't believe the amount of effort that goes into selling a business, even a small one. So the relief was unbelievable. But in hindsight, like I said, we had an advisor guiding us through the process, and I said to him, that felt really, really hard. And he's like, yeah, that's the hardest acquisition I've seen in 20 years. That's not what you want to hear.

Speaker 3:

That's not what you want to hear. You want a nice swift one, don't you? Yeah.

Speaker 4:

So I think, in hindsight, on paper it was a good outcome for the business, but there was the toll it took, and this is the advice for anyone going through a similar position. I had that confidence it was going to happen, and I think you need that kind of confidence to be a CEO. If you're easily rebuffed or easily swayed, CEO isn't the role for you. You have to have that unbelievable, absolute dedication and commitment, and maybe that's what made it happen. But it also blinded me to the fact that a plan B would have made a lot of freaking sense, cultivating some job opportunities elsewhere. But that would have taken time, and I didn't spend that time.

Speaker 5:

But if you'd started to doubt yourself, the whole thing would have fallen apart.

Speaker 4:

It would have fallen apart from a financial perspective, from the business perspective, but it's a private limited company; there was going to be no fallback. I would have had a debt and I would have wasted time, but I would no doubt have got a job somewhere else, and I wouldn't have put that much stress on my wife. So again, it comes down to: is it a success or is it a fuck-up? On paper it might have been a success, but it was not a success from the perspective of my relationship, of looking after my child, of all those factors. That sounds like a particularly gritty one, and anyone that's thinking of selling a business will be having second thoughts.

Speaker 3:

What about your story about Ross Kemp? We like a Ross Kemp story.

Speaker 5:

We want a Ross Kemp story.

Speaker 4:

So that was so unlucky. In Helmand Province we had Ross Kemp filming documentaries, I think normally with the Royal Anglians, though I'm not sure if that was the original plan; it was the Paras that were over there mostly. He was based at one of the forward operating bases, just doing interviews with the guys and so on. It just so happened that at that particular forward operating base an Apache crew, right, one of the attack helicopters, was taking off. I don't know what happened; they managed to clip their runners on the wall on the way out of the base and then stacked the helicopter like a hundred feet outside the wall of the FOB. Both the guys were okay. They obviously got out and legged it back into the FOB and they were fine, but we'd just stacked a multimillion-pound helicopter.

Speaker 4:

It's hard to imagine now, but back then that would have been front-page news. You don't often lose helicopters; they're very expensive, they're very protected, and it would have been a massive PR win for the Taliban. So not only would it have been really embarrassing for us, there were real battlefield consequences: an emboldened Taliban using it to recruit more fighters, to convince the locals that they're more powerful, all those consequences.

Speaker 3:

Yeah, and what they would do, misinformation, to take it back to misinformation, is take a video of it crashing and make out like they did it themselves.

Speaker 4:

Exactly, a hundred percent, and they would have done that in every region, right, because that's what they were experts at. So, really bad, and Ross Kemp just happened to be fricking there.

Speaker 5:

With his producer, just to complicate it.

Speaker 4:

Oh look, an Apache stacking into the ground. So I get a call from the FOB: we've got a bit of a problem, we've crashed a helicopter. I'm like, oh, that's bad. Yeah, it's really bad. He's like, that's not the problem. Ross Kemp's here. I was like, oh shit. So I was thinking quick. Okay, let's talk to Ross Kemp, because he's here to film a documentary, right? And he might have been married at the time to the lady from, was it?

Speaker 3:

Rebekah Brooks.

Speaker 4:

So there's a real link there to real tabloid front-page news. But let's talk to him first and see if we can make a deal here. So we spoke, and actually Ross Kemp, such a legend, he's like, I'm here to film a documentary, I'm not here for news, right? That's not my job, and if I do that I lose credibility, I lose relationships, I lose trust. And he's really good with the troops, always.

Speaker 4:

Yeah, that's great to hear. And so I had that conversation and I was like, I trust him, I think he's going to stick to this line. I call the press office and go, look, we've got a bit of a problem, and Ross Kemp's there. Their immediate response is, we need to release this, we need to get ahead of it. And I was like, on balance, I think we'll be okay. I believe Ross Kemp. It was a massive call at the time, but it worked, because he didn't leak it. He absolutely kept his word. For all I know, this is the first time it's been discussed. Wow, an exclusive.

Speaker 3:

Yeah, an exclusive.

Speaker 4:

There was a claim by the Taliban about three months later about them downing a helicopter, but they placed it in a different region of Afghanistan, so we were like, no, that's not true at all.

Speaker 5:

Well, we'll get Ross on the next show.

Speaker 4:

This is the weird thing. Thinking back, was this in my memory? Did I actually say this? I think I said, could you hide the helicopter?

Speaker 3:

Could you hide the?

Speaker 4:

helicopter? Because we did. When I was sitting there, they're like, well, I think someone's got a parachute, because there's Paras, right. I think someone put a parachute over the top of it, just to hide it, you know, so people wouldn't get evidence of a crashed helicopter.

Speaker 5:

I see here.

Speaker 3:

No, I've got another story, I mean.

Speaker 5:

I'm enjoying these fuck-ups. It's a slightly higher scale than we normally get. Yeah.

Speaker 3:

What about the story where you had a little bit of an editing issue with the Sun?

Speaker 4:

Oh, oh man, that's a really painful example, right? So we had this constant problem. If you're a press officer, especially for a government department, you have this problem that no journalist trusts anything you say. It's such a combative relationship in many cases. Sometimes you get working relationships and an element of trust, but often it's quite combative, and that was the case with Tom Newton Dunn from the Sun, who became the defence editor; I think at the time he was just a journalist.

Speaker 4:

At the time I was in the press office, the press office for the MoD. So the way it works is, at the end of each evening you run through the lines for the stories that you think are going to break that night, and different press officers work on different areas, right? So they all work on their own lines, which are formatted into a single edition of lines to take, so that when you're doing the overnight shift, because it's a 24/7 press office, you've got all the lines you need to defend against any story that could come up. So that's how it works. I was on the overnight shift. Like a battle plan in itself. A hundred percent, and definitely in the middle of a war.

Speaker 4:

So it's routine, it's diligent. Honestly, I loved my time in the press office; it was really formative for me. But in that period we had a story, I can't remember the exact detail, but it was about the number of troops that leave the armed forces and suffer from some kind of mental illness. There was a whole number in there, and in my lines that number was something like 2,000.

Speaker 4:

So when I see the early edition of the cuts and I see the Sun story, and it's running 20,000, I'm like, they've made a massive mistake. I've got the numbers here: it's 2,000. I've got the lines from the policy experts. So I call up the desk, like, we've got a problem, I think you've made a mistake. You've run with 20,000 and I think it's wrong. They couldn't get hold of Tom to check the numbers with, and I couldn't get hold of the specialist press officer from my team. So we had this agreement, me and the guy running the desk, and they changed it. The next morning that story comes out at 2,000, only for me to find my colleague had forgotten to update the number in the lines from 2,000 to 20,000. The Sun had the right number, and I'd just nixed Tom's story completely.

Speaker 5:

He was furious, absolutely furious, and rightly so. Were you dreading that phone call from him? Oh, it's the worst, you know, when you get a bollocking from a journalist.

Speaker 4:

Yeah, and you deserve it.

Speaker 5:

Yeah, I've had a few of those in my time and it's not fun.

Speaker 4:

The realization was that not only did I mess up that story, I'd set back any relationship I had with Tom and any trust he might have had in the press office, because now he believed we were the kind of press office that would automatically just try and, you know, screw him over on a number. On my day one in the press office I was told, we never lie and we never say no comment, and I was like, that's tough. And I held to that, absolutely held to that.

Speaker 3:

But never say no comment? At the Ministry of Defence? Wow, that's quite impressive.

Speaker 4:

Yeah, there's a line when it comes to special forces operations: we don't comment on the details of special forces operations. That's a very specific kind of no comment. Otherwise, we'd always give some kind of line, because no comment just allows the journalist to create the story for you, right?

Speaker 3:

I ask everybody who comes on the show, now that you've done some interviews and you've talked about your mistakes, what you've learned and the fascinating misinformation side of things: if you were us, who would you next get on the show to talk about their failures and fuck-ups?

Speaker 5:

apart from David Cameron and Ross Kemp.

Speaker 3:

Lord Cameron, if you're listening, we are available.

Speaker 4:

That's interesting. But naming a person would suggest I think that person has fucked up a lot, right?

Speaker 3:

Just that they've got a good story to tell.

Speaker 4:

Honestly, not just because of the AI aspect, I'd go with Greg Matusky. He's just a great talker, he's a really fun guy, and he's definitely got a forward-looking vision of what's going to happen to the industry, and I think he's old and bold enough to have a fair few of those stories. So I'd probably go with Greg Matusky from Gregory FCA in the US.

Speaker 3:

And if our listeners want to find you online, where can they find you?

Speaker 4:

LinkedIn's my network of choice at the moment.

Speaker 3:

You can find me there, under the name my mother uses, my full name: Antony Cousins, without an H.

Speaker 5:

Yeah, cool, excellent. Well, thank you very much for that, massive value and massive entertainment, I think, for our listeners. That was just a brilliant podcast.

Speaker 3:

I think we both really enjoyed it, because we got him for an hour. To be honest, it was brilliant.

Speaker 1:

Thanks for coming up and seeing us. Thanks for having me.

Speaker 3:

So, Will, that was a fascinating bloke. What a guy. I mean, his work history was fascinating, his fuck-ups were brilliant. He had another one that I didn't have time to ask him about, and after he left the studio, as we were walking out, I said we're going to have to get him back on, because he's got a whole load of other stories to tell. Yeah, I agree, and I think it was a really interesting take on AI.

Speaker 5:

That's a little bit different to the ground we've covered before. I think a lot of the conversation in the industry at the moment is about things like the ethical implications of AI, or quite tactical applications when it comes to PR. But that was really insightful and actionable, and quite concerning for brands, this idea of misinformation, and I think it could impact any kind of brand. You don't have to be a Coca-Cola or a Disney; you could be an SME, or a smaller FMCG-type brand. But yeah, really, really interesting.

Speaker 3:

I mean, I suppose the interesting thing is, we've been doing crisis management all our careers, and we've been using social media listening since it came out, 2006, 2007, something like that, and we use it to identify the key protagonists, as we discussed in the interview. But now that AI can do it so much quicker and get right back to the source of where things are coming from, that is just going to save brand guardians and marketers out there so much more time to stop the disinformation. Because you have to make a decision, like you mentioned during the interview, Will, where you pick the top 10 conversations to engage with out of, you know, 100 conversations if you're a big brand. So that just tells me that everything's speeding up, if anything.

Speaker 5:

Yeah, and for me, the big takeaway I got from this was: just as you would be with crisis management, when it comes to misinformation, brands need to be prepared. It's about preempting those misinformation situations that might impact you and publishing content in line with them preemptively, so that when and if misinformation does happen, you're prepared and you've got a good story to tell. Otherwise, you're just on the back foot.

Speaker 3:

Thanks for listening to this episode of Socially Unacceptable. If you haven't done it already, please click subscribe, and drop us a comment if you want to let us know what you think of the show. And if you've got a question for any of our guests, feel free to send it through, and we will see you in the next episode.

Speaker 2:

Thank you for listening to Socially Unacceptable. Please remember to subscribe to the podcast and leave us a five-star review. Don't forget to follow us on social media: on Instagram, TikTok and LinkedIn at Prohibition PR, and on Twitter at Socially UA. We would love to hear some of your career fuck ups so we can share them on the show. For more information on the show, search Prohibition PR in your search engine and click on podcast. Until next time, please keep pushing the boundaries and embracing the socially unacceptable.

Avoiding Brand Misinformation in Marketing
Applying AI to Enhance Social Listening
The Impact of Misinformation on Brands
Brand Misinformation and Challenges Faced
Preventing and Responding to Misinformation
Ant's F*ck Up Number 1: The Toll of Selling a Business
Ant's F*ck Up Number 2: Crashed Helicopter and Media Relations