Democracy 101 – Mis-, Dis-, and Mal-information with Matteo Bergamini

Resource type: Podcast

Democracy 101 is the monthly podcast series from Cumberland Lodge giving a comprehensive introduction to democracy. In this episode, we speak to Matteo Bergamini, Founder and CEO of Shout Out UK, about what mis-, dis-, and mal-information are, what issues arise because of false information, and what can be done to combat it.

You can subscribe to Cumberland Lodge’s podcasts on Apple Podcasts, Spotify, SoundCloud, and other major podcast platforms.

You can find out more about the work of Cumberland Lodge here.

The views expressed in these podcasts are those of the speakers and do not necessarily reflect those of Cumberland Lodge.

Episode transcript

00:05 – 00:32

Munny Purba (MP)

Hello and welcome to Democracy 101, a monthly podcast series by Cumberland Lodge offering a comprehensive introduction to the workings of democracy and its significance in the world today. In this episode, we are joined by Matteo Bergamini, Founder and CEO of Shout Out UK, a multi-award winning social enterprise that provides impartial political and media literacy training and campaigns focused on democratic engagement and combating disinformation online.

00:32 – 00:54

MP

In this conversation, we discuss the rising spread of false and misleading information, the role of the media in regaining the public’s trust in good information, and strategies for building a holistic ecosystem to battle online echo chambers. Thank you so much, Matteo, for joining us for this podcast episode around mis- and disinformation. I’m really happy to have you here.

00:54 – 00:55

Matteo Bergamini (MB)

Thank you for having me.

00:55 – 01:13

MP

Of course. So we really try to start off getting to know about our guests, but also asking them an interesting question. So for you, I wanted to ask if you could think of a fictional character that represents either mis- or disinformation, what would it be?

01:13 – 01:17

MB

So, I mean, this is a bit of a retro one, but for me, it’d have to be Bugs Bunny.

01:17 – 01:18

MP

Oh yeah, why’s that?

01:18 – 01:19

MB

You know, from the Looney Tunes?

01:20 – 01:21

MP

Yes, of course.

01:21 – 01:35

MB

The reason why is that, if you remember the cartoons, and I used to be a huge fan when I was a kid, the character was really, really clever and would always be able to reframe the situation and make himself out to be the victim.

01:36 – 01:36

MP

Okay.

01:36 – 01:52

MB

And you know, he’s being hunted by the, I forget the other guy’s name, but he would always manage to one up him and kind of looking back on it as an adult, you kind of realise who the bully and the bullied were.

MP

Yeah.

MB

But that would probably be my go to.

01:52 – 02:06

MP

Okay. That’s a good way to start, because then I think we can get into thinking about what those terms mean, and then maybe that’s a good way to relate it back to Bugs Bunny. So what do the terms mis-, dis-, and mal-information actually mean?

02:06 – 02:30

MB

Sure. So misinformation is information that is inaccurate, factually inaccurate, made up, but is shared by the individual, not for a nefarious purpose. So, say you come across something that you genuinely believe to be true. You share it, but it isn’t. That is misinformation. And that is the majority of the false news we see, it’s stuff that people share because they genuinely believe it’s true.

02:30 – 02:49

MB

They’re worried about it. Maybe they haven’t done enough research around the subject or it feeds into their biases. You know, if you come across an article that, I don’t know, I like chocolate. If an article talks to me about the health benefits of chocolate, my want to disprove that article is going to be lower than if it were stating the opposite, and you might be more likely to share it for that reason.

02:49 – 03:21

MB

Disinformation is false information specifically created and shared for a nefarious purpose, and that could be to discredit an organisation, discredit an individual, mess with an election, could be a number of different things. And the best way of looking at disinformation isn’t by each individual piece of content, but actually disinformation is often a campaign. So it’s often a series of fabricated images or synthetic video content created specifically to target something and to create a mess in the info ecosystem, basically.

03:21 – 03:43

MB

It differs a little bit from propaganda in the sense that, you know, this kind of stuff has always existed, and historical propaganda has always been predominantly about making you believe an alternative narrative about something. So if the truth is A, it’s trying to make you believe B. Often what we see with disinformation is that it’s not necessarily attempting to make you believe an alternate narrative, although sometimes it is.

03:43 – 04:08

MB

But it’s more about making you believe that nothing is true. So it pumps out, you know, if the truth is A, it pumps out B, C, D, E, and so on, specifically to create this nihilistic view that nothing is true, that we can’t believe anything. Because that is just as damaging as believing an alternate narrative. And then mal-information, which, I would argue, is the most interesting and potentially the most controversial.

04:08 – 04:42

MB

It’s factual information that’s taken out of context, again with the specific purpose of harming someone. So it could be, for example, taking something that someone did 10 years ago and bringing it to light, despite it having no relevance to the things they’re doing now. It could be outing someone’s sexuality if they’re running for office. Something we saw when we did a project in Moldova, for example, around countering specifically Russian-inspired disinformation in the region, was a lot of mal-information coming out targeting a specific female candidate who wasn’t married but was living with a man. Moldova being quite a conservative society, this was deemed a problem.

04:42 – 04:54

MB

Again, it’s true. She was living with a man. She wasn’t married. But of course, as we know, that has absolutely nothing to do with your ability to run for office or to govern.

MP

Yeah, of course.

MB

So those would be the differences.

04:55 – 05:16

MP

Okay. So it’s quite a breadth there of different versions and different things that people could be seeing across multiple platforms, multiple sources. So I think it can be hard really to understand what to do with that. You know, where to go with it, where to understand what good information is. And I think we’ll talk a bit more about strategies later on.

05:16 – 05:33

MP

But thinking about the idea of good information and why mal-, mis-, and disinformation can be quite damaging: if we’re thinking about democracy, why is talking about this important, and why is this something that we need to be talking about when we’re thinking about strengthening our democracy?

05:33 – 05:54

MB

I mean, it’s everything to do with democracy; democracy can’t function if people can’t tell fact from fiction. You know, democracy, at its core, relies on an active and engaged electorate to hold their government to account and vote based on their interests. But that puts a heavy reliance on being able to find information that shapes your voting decisions.

05:54 – 06:14

MB

And it happens year on year. It doesn’t just happen just before an election. Quite often, people have their voting decisions, their lines drawn, years prior, from being fed consistent information. And if we get fed the wrong or false information, that can really hamper the electorate’s ability to hold the government to account on the things that are actually true.

06:14 – 06:38

MB

So countering dis-, mis-, and mal-information is paramount to a healthy democracy. And it’s something that, not just the UK but all over the world, all democracies are going through. And for us as an organisation at Shout Out UK, our big mission statement is that to have a healthy, active, and engaged democracy, you need to, one, give people an understanding of how to engage with their democracy, and that is political literacy.

06:38 – 06:51

MB

The barebone basics of how to engage with the structures and system. But then at the same time, we need to give people media literacy skills, and that is the critical thinking skills and emotional resilience to have debates, discussions and be able to tell what good information looks like.

06:51 – 07:12

MP

And thinking about critical media literacy and political literacy, what do you think is the state of it now? Do you think things need to change around that, and how can we make sure that we’re gaining that knowledge and understanding? What do you think, like I said, are some strategies, or what do you think the best way is for us to become more literate in these things?

07:12 – 07:37

MB

I would say, I mean, just to start off with your first point, I’d say that the state of political and media literacy in the UK is a bit of a postcode lottery. It depends on which school you’re in, which area you’re in, the level of support you get. And we don’t think that’s fair. At the moment, elements of both political and media literacy feature in the curriculum, either in citizenship, though most schools don’t do it because of the lack of funding, and then in PSHE, but there it’s rammed in with sex-ed and a bunch of other things.

07:37 – 07:58

MB

Arguably, this is way too important to just be rammed into another catch-all subject. As part of our role as the secretariat for the APPG on political literacy, which existed obviously before the dissolution of Parliament ahead of the General Election, we commissioned a report with the University of Sheffield, written by Dr James Weinberg, and in that report, I mean, there’s a bunch of different stats.

07:58 – 08:17

MB

And we looked at kind of parental attitudes and teacher attitudes to political and media literacy. And what was interesting was that although over 50% of teachers believe they had a duty and a want and a need to deliver this kind of work in schools, less than 1% of the teachers surveyed said that they had the skills, the knowledge, and the confidence to actually deliver it.

08:17 – 08:33

MB

So we’re kind of dealing with a two-pronged problem: one, obviously young people aren’t getting these kinds of skills, but they’re spending their time online on social media, where arguably you need them more than anywhere else. But two, we don’t necessarily have, at the moment, the staff in schools to deal with this problem, to actually deliver this kind of work.

08:34 – 08:48

MB

And obviously a lot of our work at the beginning, when we first started, was around direct delivery to young people. More and more, we’ve seen demand for CPD training, so training for teachers to then deliver this kind of work, which is actually, I’d say, 80% of our role now.

08:48 – 09:05

MP

So yeah, I mean, you mentioned there that there’s definitely a need for it, and a want for it, from teachers and students, really. And actually, if you think about us as individuals, every day we’re thinking about these questions. And you mentioned previously that, you know, we’ve always had forms of mis- and disinformation, you know, propaganda and things like that.

09:05 – 09:20

MP

But why, in the ecosystem we’re in now, do you find that it’s actually an issue that we’re having to talk about a lot more? I know we talked about social media, but what about the idea of algorithms and echo chambers? How does that feed into this idea as well?

09:20 – 09:37

MB

Definitely. I mean, there’s a number of reasons why we need to talk about this now. And I mean, arguably, as I said, like it’s always been a problem. I would say that technology has brought a lot of good to the world. But like with any tech, it can be used for great good and great evil, right? And I would say there’s a number of things that have changed now.

09:37 – 09:54

MB

I’d say one of the biggest changes is that the barrier to entry is a lot lower for spreading this kind of content. So now it costs you next to nothing to create a website that looks as swish as The Guardian’s, for example, or The Economist’s, and start pumping out content. Not least because AI can pretty much do most of the writing and design for you.

09:54 – 10:23

MB

So actually you could be quite easily pumping out thousands of articles on a regular basis, whereas before that production line would have cost a lot of money. So I’d say the barrier to entry is a lot lower. I would also say the other thing that’s changed, for better or for worse, the gatekeepers have lost control of information. You know in our parents’ day and our grandparents’ day, you got your information predominantly from legacy news media, from papers, the government, from potentially friends and family.

10:24 – 10:42

MB

But most people got their information from newspapers, which, you know, is a problem in and of itself, because you have certain organisations, businesses, controlling the news agenda. Now, they’ve completely lost control of the flow of information. The gatekeepers no longer exist. That means that we have to be our own gatekeepers, and we’re just not ready.

10:42 – 11:02

MB

We haven’t received the training and information required to be able to deal with that. And the final point I would say as well is that it’s not necessarily about telling people that mis- and disinformation exists, but about showing them what it is. So to give you an analogy, if I may: mushrooms, right? Mushrooms come in many different forms.

11:02 – 11:03

MP

Yes, I know mushrooms.

11:03 – 11:21

MB

I promise I’m going somewhere with this. But mushrooms come in different forms, right? And some are delicious and good to eat, and some aren’t, and that can land you in hospital.

MP

Yeah.

MB

Just because we know this doesn’t mean me and you after this podcast can go out and start foraging mushrooms.

MP

Right.

MB

Just because you know some are good and some are bad.

11:21 – 11:43

MB

Doesn’t mean you know how to find the good and the bad. And that’s kind of where we’re at with mis- and disinformation, that most people know that it’s a thing, but they just don’t know what that thing looks like. And that’s where our biases start to come into play. Because you look at something and you know, I’m a big fan say of chocolate.

11:43 – 12:07

MB

If I come across an article that tells me about the health benefits of chocolate, I’m going to be more likely to want to believe that to be true. So the effort I put into discrediting that article, and my research, is going to be lower than for an article that suggests the opposite. And obviously that internal bias around food is pretty harmless, but those biases can come into things like race, ethnicity and nationality, gender, sex, all the rest of it.

12:07 – 12:32

MB

So although we know this stuff exists, we don’t know how to look for it, so we rely on our biases to help guide us. But they’re often a problem because they are biases: they’re not based on fact, they’re normally based on emotional feeling and the way we grew up. And that is then exacerbated by social media echo chambers, because what we then end up doing is, take the example of chocolate or whatever, you end up on a social media site, could be any site.

12:32 – 12:54

MB

You start to look at content that feeds your bias, that feeds your narrative of the world, and then the algorithm feeds you more of that same narrative. So your version of the world becomes whatever your biases want it to be, whatever they want to make you angry about or happy about. So we’re all living in these kinds of little bubbles of what we think the world actually looks like.

12:54 – 13:14

MP

Yeah. And you can see how that could be detrimental to a healthy democracy, and also just to the general foundations of a society which is cohesive and really able to be respectful of each other. So yeah, I think it’s something that we should, as I say, definitely be talking about. Of course, that’s why we’re here, talking about it right now.

13:14 – 13:35

MP

But also, I think you mentioned there that we do need to gain more skills around this. But if we aren’t getting the training and we aren’t really finding the spaces that enable us to do that, are there any other ways that we as individuals are able to make sure that we’re noticing our own biases, to understand what’s going on in these ecosystems?

13:36 – 13:54

MB

Definitely. And I think it’s based off of something you said earlier about having kind of healthy conversations. And I think the problem often with social media is that it exemplifies or exacerbates certain issues. It can kind of put you in this bubble where everyone kind of thinks the same or is angry about the same thing

13:54 – 14:18

MB

that may or may not be true. And actually, one of the things that we can all do is have conversations with people offline, especially people that don’t agree with you. Now, that requires a healthy level of emotional resilience. Again, because we don’t have these conversations offline and tend to only have them online, we’re kind of losing the ability to listen to someone that disagrees with us without getting emotionally triggered by that conversation.

14:18 – 14:37

MB

And that’s again a problem, because democracy only functions if people are able to have healthy debates with each other about issues, even if they disagree with each other. So I’d say regain that by just having conversations with your friends and family, even if you don’t agree with them. When I say healthy conversations, I don’t mean screaming or, you know, using expletives across a dinner table.

14:37 – 15:03

MB

But I mean, just having conversations.

MP

Yeah.

MB

I also think we need to be done with this nonsensical idea, which sometimes gets brought up, that politics isn’t for the dinner table. Of course it’s for the dinner table. Everything’s political, and politics affects absolutely everything you want to do or are doing in life. So removing that and having conversations with real people means that we’re not looking for validation online, which I think is something that we can all realistically do.

15:03 – 15:27

MB

And the last one is probably a lot of introspective thought, because we all have biases. Anyone that says they don’t is either incredibly blind or lying, because bias is a part of human nature. It’s just reality. It’s built within us. It’s part of our lived experience. Recognising what those are. You don’t have to tell anyone that you have them, but recognising what those are means that when you come across something that does trigger one of your biases,

15:27 – 15:46

MB

you know, actually, maybe I should put in a little bit more work than I normally would to see if this thing is true, and if it is, be righteously angry about it all you want. But knowing that you’ve done your due diligence is so important. And you can only really do that if you understand your biases.

15:46 – 15:56

MP

So it’s much bigger than actually just getting a bit of training about deepfakes being blurry around the edges and things like that. It goes much further than that, you know, going back to…

15:56 – 15:59

MB

I mean, it is that as well, that does help, of course.

15:59 – 16:19

MP

It does indeed. And yeah, I think it’s, it’s a bigger conversation here that we should be having around why do people get so caught up or ride the wave of these bits of information that potentially are not true at all? You know, why is that happening? So it’s also about our psyche and our socialisation as well.

16:19 – 16:43

MP

So a lot more to think about than just one or two strategies and then we’ll be perfect, we’ll be fine. I think it’ll be key also to talk a little bit about the idea of fact checking, because of course, you know, we just had a General Election, but also going forward when we’re thinking about our politics, and about understanding what’s true and what’s not in what’s being said around policies and information.

16:43 – 16:52

MP

There’s the idea of fact-checking organisations and people verifying information. What is their role, do you think? Are they getting a bigger role now, and are they important?

16:52 – 17:19

MB

They are; fact-checking organisations are definitely important. I’ve known and had many conversations with Full Fact in my day, and they’re an amazing organisation, just one of many across the globe that are attempting to fact-check, verify, and kind of call out dis- and misinfo. The one thing I would say, though, is that I think their role is misunderstood, because at the moment fact checking is quite often branded as kind of the answer to everything and the answer to countering disinfo.

17:19 – 17:48

MB

It’s not, unfortunately. I’d say their role is very much useful for organisations like ours, because we can point to their content and to the processes of how they’ve gone about debunking something, use that quite often in our work, in our teaching, and point people to them as verifiable sources. But they are never going to be at the forefront of combating disinfo, because the people that believe and share this kind of content are not going to go and verify on a fact-checking website.

17:48 – 18:15

MB

They’re not going to go and check legacy media, let alone a fact-checking website. So I think we kind of misunderstand their role in general. And actually, I think more needs to be done on what we effectively call pre-bunking, which is the work that we do around helping people understand how to critically analyse the information they receive, so that you’re dealing with it before it becomes a problem. Because quite often, and you see this with almost every bit of fact checking that’s ever happened.

18:15 – 18:39

MB

The fact check, the verification, will always get fewer shares than the initial misinfo.

MP

Yeah.

MB

Partly because, you know, fiction is always going to be more exciting than the often quite dry reality of where the world is. And I think there are some really interesting, creative ways where pre-bunking and fact checking are kind of meeting in the middle, in a weird way.

18:39 – 18:49

MB

To give you an example, I don’t know if you remember, this is maybe about a month ago, only a few weeks ago, the police squad car that rammed into that cow twice in England.

18:49 – 18:51

MP

I don’t know. I actually didn’t know.

18:51 – 19:14

MB

The video went viral because it was a police car that rammed into a cow, twice. The video went viral and got covered by the media. But what was interesting was that this video obviously did the rounds on social media, on a variety of different platforms, and the accounts that were sharing it were crediting it to a variety of different police forces around the country.

19:14 – 19:48

MB

So some in London, for example, were crediting it to the Met Police, etc. So, essentially spreading misinformation; let’s assume they were doing it by accident. What was interesting was that on X, the social media platform, they brought in something called community notes. Now, there’s been a ton of debate around whether community notes will work or not. The idea is to give context to a specific bit of content, and what I saw, which was unique to that platform and I didn’t see on any others, was that when that video was being shared and misrepresented as being one police force or another, the community notes were able to link a statement from Surrey Police, which is

19:48 – 20:05

MB

the police force that was responsible for the action. And in the statement, they talk about the fact that, you know, the video only started at the ramming; actually, there were a bunch of different methods that were attempted before the police officer decided to take that quite drastic action. Now, whether they should ever have taken that drastic action or not is a different story altogether.

20:05 – 20:28

MB

But what was interesting was that it allowed the context and the facts not only to verify where that video came from, but to be placed on the tweets and the posts that were actively spreading the video with no context or with misleading context, which I thought was really interesting. So one of the big issues with fact checking is that the fact check is on their site, on their socials.

20:28 – 20:45

MB

But realistically, that’s not where the misinformation is being spread. So there’s a disconnect between where accounts are spreading the misinformation and the fact checkers. Whereas this puts the fact check, or the context, on the account, in the ecosystems where that misinfo is being spread, which is the only time I’ve ever really seen it.

20:45 – 20:53

MB

Now, obviously community notes are relatively new, so the long term impact is still to be seen. But I thought that was quite an interesting go between.

20:53 – 21:13

MP

Yeah, I think if the innovators of that are seeing it working, and you yourself recognise that it’s something that could be positive in actually countering mis- and disinformation at its heart and in the moment, it seems like that’s the way to go with it. And actually, that would be an interesting thing to keep track of.

21:13 – 21:36

MP

It seems like that would work. You’d think that if, in the moment, people are able to see the actual information to hand and not have to go elsewhere to find it, they’re more likely to be believing the truth. But then again, that’s another thing I’ve had, because recently I’ve been travelling the country, speaking to a lot of young people about democracy, about how they feel they need to be aided to engage with democratic processes better.

21:36 – 21:57

MP

And one of the big things that we’re talking about is trust. You know, who can I trust? I don’t know who to trust. And I think when it comes down to it, this whole discussion is around trust in the end, you know. It’s causing a lack of trust in people and institutions. And, and that’s something that is definitely an issue when it comes to democracy.

21:57 – 22:01

MP

So I wonder, what do you think the future of trust is?

22:01 – 22:24

MB

I mean, first of all, like I said at the beginning, this is the point of disinformation campaigns. It’s about creating chaos in the information ecosystem. It’s not necessarily about making you believe one specific alternate narrative. And I would say, in that respect, they’ve been very successful in breaking down trust and breaking down institutional understanding, probably, I’ll say, more so across the pond than for us.

22:24 – 22:45

MB

But the damage is definitely visible. And if you don’t trust the institutions, you don’t touch the democratic process, then what do you have? Democracies are hard won but incredibly fragile, and I think we don’t recognise that nearly enough. In terms of moving forward, I think we also need to be honest and say that we haven’t really helped ourselves.

22:45 – 23:16

MB

You know, the expenses scandal wasn’t that many years ago, as a minor example; the list of COVID contract scandals and everything else wasn’t that long ago. So realistically, we need to, first of all, look at the people that lead us, and they need to be exemplifying what good democracy looks like. And I think there needs to be a lot more done on that side, because quite often with disinformation, what you often see when you come across people that believe this kind of thing is: well, this happened, so why wouldn’t this other thing be true?

23:17 – 23:38

MB

And it’s that kind of narrative that could easily be avoided if we had a better standard of accountability in politics in general. The other element in terms of trust is, again, building up an understanding of what good journalism looks like, because for a lot of people, they don’t know what journalism looks like, they don’t really know the amount of effort that goes into creating good journalism.

23:38 – 23:58

MB

So actually demystifying that, which is part of what we do in media literacy, builds trust in those credible journalistic institutions, but also builds value, because at the moment there’s a massive financial crisis in the media sector, and actually showcasing why people should pay for good content matters. Good quality content is actually a need for democracy, to have that structure in place.

23:58 – 24:20

MB

But the last one, I’d say, in terms of the media, is that the media need to do better. There are so many instances, and I’m not even talking about the, you know, the hacking scandals that happened not too long ago, again with, what was it, the News of the World, for example, but also just the level of clickbait-style journalism that’s been growing, partly, I would say, due to financial constraints on those publications. But that’s not helping trust.

24:20 – 24:27

MB

So I think actually, if the media wants to be seen as a credible source of information, again, they need to do better.

24:27 – 24:53

MP

It’s for us, I suppose, also to hold people to account, as individuals, as voters, as members of the public. If you think of what democracy means, it’s people rule, right? And if we’re not seeing people ruling us well, or, you know, taking charge and leading by example, we should have a voice to say that we’re not happy. So I suppose it’s also on us, as, you know, the public, to stand up and do something about it if we’re not happy.

24:53 – 25:08

MB

Oh, 100%. But again, I’m always a little hesitant about blaming the public, because, you know, we have all these opportunities to engage in our democracy, and we do. We’re very, very fortunate to live in the country we live in. You know, there are other parts of the world where just having this conversation would get us into trouble.

25:08 – 25:29

MB

But if we’re not taught those opportunities that we have, those rights we have, those obligations we have, because there’s no right to our obligations, then in our world, they don’t exist. Like I remember when I was going through school, no one told me I could register to vote at 16. But then you were expected, at age 18, to suddenly have this light bulb moment of like, oh okay, this is everything I need to know about engaging with our democracy.

25:29 – 25:45

MB

It doesn’t work like that. It’s like your car breaking down and being told to go fix it without ever being trained to fix it. It’s like, yeah, there is a way to fix the car; that doesn’t mean you know how to do it without being taught to do it. And dealing with our democracy, even as an average citizen, is very much the same.

25:45 – 25:55

MB

We need to arm people with the tools to be able to make a better society if they want to. Or keep it the same if they like the way things are. But again, that needs to be provided for. We need to be taught this stuff.

25:55 – 26:21

MP

Yeah. So again, it comes down to that training, that literacy, and that understanding. So again, like I said, this discussion is not just about one or two key pieces of information or strategy that we can take away. It’s really understanding that this is a bigger-picture thing. You know, it will require a lot of time, actually, and a lot of effort on everyone’s part to really make it a positive and easily inhabitable environment.

26:21 – 26:31

MP

I think this will be a constant conversation. Hopefully it will be one that is moving in the right direction. I wonder if you’ve got any last bits to add or anything you want to say at the end?

26:31 – 27:01

MB

I think one last thing which kind of touches on this but goes a bit into the extremes, if I may. One of the things we’ve definitely seen in the last couple of years of work is the growing level of misogyny, or extremist misogyny as we call it, and conspiratorial thinking. And a lot of this is kind of the product of everything we’ve just been talking about: severe distrust in institutions, severe distrust in the legacy media, a feeling of voicelessness.

27:01 – 27:30

MB

And what was interesting: we did a podcast series for the US Embassy a few years ago, a series of episodes where we looked at all the different styles of conspiracy theories, and we interviewed people that had gone into that conspiracy theory and come out the other end. We then showed the raw recordings to psychologists and academics to try and understand if there’s a profile of a person that goes into these belief systems. We interviewed people that were QAnon supporters, people that used to be flat earthers, anti-vaxxers, you name it.

27:30 – 27:49

MB

We even interviewed someone that was in a satanic cult for a little while, which was a pretty interesting conversation. There were two things that came out of that. One, which was relatively scary, was that there was no real educational profile of someone that went in.

27:49 – 28:09

MB

So there’s often this idea that, you know, the higher the educational attainment you have, the less likely you are to believe it. And although there is some truth in that, it’s not a surefire way of making you immune. And actually, to give you an example, the anti-vaxxer we interviewed, both of his parents were academics. He is highly educated himself, and there was a series of events that led him down that path.

28:09 – 28:31

MB

So being educated, or being university educated, should I say, doesn’t necessarily make you immune. So that was kind of scary. The other part that we found was that, although there wasn’t a specific profile as it were, there were a couple of indicators that made someone more susceptible to being radicalised or believing conspiracy theories. The commonality was often feeling voiceless, so feeling powerless to make a change in society.

28:31 – 28:54

MB

So feeling like they’re on the peripheries, unable to make any change, and feeling like they’re either being left behind or left out. Now, I don’t know if any of that rings a bell from the last couple of months, but that’s something that was very, very loud and clear within the project. And the other aspect was a lack of emotional resilience and critical thinking.

28:54 – 29:06

MB

So when you mix these two together, that makes people a lot more susceptible to misinformation, to being manipulated into believing either an extremist ideology or just a general conspiracy theory of some kind.

29:06 – 29:26

MP

Some really good indicators there to understand how we can be led, through various factors, to a belief in these things. So it’s, I think, a really good thing to mention, because some people would say that would never be me, or there will never be anyone like that that I’m dealing with or in my circle. But actually, like you said, there’s no specific profile.

29:26 – 29:47

MP

We can’t determine when these bits of information will affect us and how they will affect us. So it’s less about being judgmental, in a sense, and more about really trying to understand the roots and the causes of this. Thank you. That was a really, really interesting conversation, and I hope it was helpful and interesting to our listeners as well.

MB

Thanks for having me.

29:47 – 30:01

MP

Thanks for listening. You can find out more about the work we do at Cumberland Lodge by visiting cumberlandlodge.ac.uk. You can also find us on LinkedIn, Instagram, Facebook, and TikTok at @CumberlandLodge.