Episodes

Friday May 30, 2025
David Spiegelhalter, Chair of the Winton Centre for Risk and Evidence Communication, University of Cambridge, addresses data-based claims and trust.
About David Spiegelhalter
"I'm a retired professor in the Statistical Laboratory at the University of Cambridge and Chair of the Winton Centre for Risk and Evidence Communication.
I'm a statistician, and my passionate interest is in the use of statistics in society: the way that they're communicated in the media, the way that they're used by policy, the way that the public understand them and, of course, all the fake news, misinformation, and the claims made on the basis of numbers."
Manipulating the numbers
We all know this phrase: "There’s lies, damned lies and statistics". I get so fed up when that gets wheeled out in interviews as an accusation against statisticians. But I can see the reason why that is used as a casual dismissal of claims based on statistics because numbers can be manipulated so much. As I’ve said, they do not speak for themselves. I can make a number look big or small if I want to.
Let me give you an example. We had this EU referendum in 2016, and one of the famous claims on the side of the big red bus was: "We send £350 million a week to the EU. Let’s spend it on the NHS". Now, that number is wrong, but I don’t care about that. It was a brilliant use of a claim because it provoked so much discussion. You could argue that that single claim almost won the referendum.
But let’s assume it’s right. Is that a big number? £350 million a week? There are 60 million people in the UK, so that’s about £6 a week. That’s about two cups of coffee per week, but let’s do it per day. That’s about 80 pence a day, about a euro, a bit more than a dollar. That’s about a packet of cheese and onion crisps. They could have put that on the bus: “We each send the EU the price of a packet of cheese and onion crisps every day”. If they had, I don’t think that would have won them the referendum.
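To make the reframing concrete, here is a minimal sketch in Python using the speaker's deliberately round figures (£350 million a week, 60 million people) rather than official statistics, showing how the same headline number shrinks when expressed per person per day:

```python
# Minimal sketch of the reframing described above, using the speaker's
# round figures rather than official statistics.
weekly_total_gbp = 350_000_000  # the claim on the bus, taken at face value
uk_population = 60_000_000      # rough round number used in the talk

per_person_per_week = weekly_total_gbp / uk_population
per_person_per_day = per_person_per_week / 7

print(f"Per person per week: £{per_person_per_week:.2f}")  # ≈ £5.83
print(f"Per person per day:  £{per_person_per_day:.2f}")   # ≈ £0.83, about 80p
```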
Key Points
• Numbers can be manipulated to look frightening or reassuring.
• Organisations and politicians need to demonstrate trustworthiness in order to earn trust.
• Transparent data-based claims should be accessible, comprehensible, usable and assessable.
• We can counter misinformation by “pre-bunking” it; be the first to tell people the misinformation and explain why it’s not true.

Friday May 30, 2025
Richard Evans, Regius Professor Emeritus of History at the University of Cambridge, examines the spread of conspiracy theories over the last 20 years.
About Richard Evans
"I'm Regius Professor Emeritus of History at the University of Cambridge.
I'm the author of a number of books, particularly on Nazi Germany, such as The Coming of the Third Reich, The Third Reich in Power, The Third Reich at War and The Third Reich in History and Memory. My latest book, The Hitler Conspiracies, is a study of conspiracy theories either allegedly influencing the Nazis or developed about them."
The spread of conspiracy theories
One of the things that concerns me as an historian is the spread of conspiracy theories in the last 20 years or so. Essentially, a conspiracy theory holds that whatever happens in the world, particularly in politics, is caused not by chance but by a small group of people behind the scenes manipulating what goes on. An obvious example is the assassination of President John F. Kennedy of the United States, which the contemporary investigation showed was carried out by one man, Lee Harvey Oswald. However, an enormous amount of conspiracy theorising alleges that that can’t have been possible, that it must have been a larger group of people in the CIA.
Key Points
• The mindset of conspiracy theorists is that chance events somehow don’t happen and that anyone who benefits from an event must be responsible for it.
• The rise of the internet and social media has encouraged conspiracy theories to spread because they bypass what you might call the gatekeepers of opinion formation.
• Since the economic crisis of 2008 to 2009, populist politicians have emerged in a number of different countries. One thing that unites them is their refusal to accept science.

Friday May 30, 2025
Daniel Pick, psychoanalyst and Professor of History at Birkbeck College, University of London, talks to us about subtle and obscure power.
About Daniel Pick
"I’m a psychoanalyst and Professor of History at Birkbeck College, University of London.
My research has explored various issues, questions and problems in modern history, politics, culture and the human sciences. I’m also interested in the history of psychoanalysis and the ‘psy’ professions."
Hidden persuasion
A critical set of anxieties centres upon hidden persuasion. During the Second World War and the Cold War, there is concern about its use in extreme and coercive situations. This includes entire populations in thrall to a dictator or party with control over their lives.
Naturally, the matter is explored in the post-war human sciences, as well as in popular culture. Dramatic movies feature prisoners who are captives of a totalitarian regime and are broken down, manipulated and ultimately subjected to brainwashing.
Key Points
• Hidden forces of persuasion have long been of cultural interest. In particular, several movies throughout the 1950s and 1960s highlighted this theme.
• While hidden persuasion may be associated with totalitarian societies, it has been effectively exercised in Western societies through advertising.
• Concerns around hidden persuasion remain highly relevant today as big data platforms allow for the manipulation of behaviour and political sentiment.

Friday May 30, 2025
David Runciman, Professor of Politics at the University of Cambridge, discusses whether the internet and social media help or harm democracy.
About David Runciman
"I’m a Professor of Politics at Cambridge University and a Fellow of the Royal Society of Literature.
My interest is in the history of politics and political ideas, and particularly democracy. Where does it come from? How different is our democracy from democracy in the past? What might it become in the future? In parallel to my research, writing and teaching, I hosted a weekly podcast called Talking Politics, with 300 episodes between 2016 and 2022."
Good or bad for democracy?
Is digital technology good or bad for democracy? What the age of the internet has shown us is that we often want to get a straightforward answer to a question about some new technology or some new social phenomenon.
But democracy is a complicated set of institutions, values and principles, and most complicated phenomena are good and bad for democracy at the same time. I think that’s definitely true of digital technology.
It would be a mistake to assume that it’s all bad. In some respects, digital technology, social networks and social media have been a wonderful enhancement of democracy because a core democratic principle is that everybody should have a voice. In non-democratic systems, authoritarian systems, individual voice is stifled and you have to be a certain kind of person with a certain kind of power or connection to be heard. In a democracy, the ideal is that you can be heard whoever you are. By getting rid of many of the barriers in the way of ordinary people having a voice, this technology has allowed much more access not just to information but to platforms from which people can communicate. And that is a plus for democracy.
But something can be good for democracy and bad for democracy at the same time. We also see that this same technology, in the same moment when it’s giving everybody a voice, is also concentrating power and concentrating voice and authority in narrower and narrower spheres. The platforms on which people are able to communicate are owned by a very, very small number of people, almost all of them men, most of them American, some of them Chinese, who are, relatively speaking, very unaccountable.
The platform that Mark Zuckerberg created, Facebook, gives many ordinary people real access to democratic forms of community. And at the same time, that platform is owned by one person, effectively, who makes the rules and is answerable to almost no one. And it’s the same platform.
So, the same thing can be both good for us and bad for us democratically.
Key Points
• Major social networks and digital platforms give many ordinary people real access to democratic forms of community, but each is effectively owned by one person who makes the rules and is answerable to almost no one.
• The digital information age favours speed and immediacy, but this means we can act impulsively rather than take the time to come to a collective response.
• Fake news has been around forever, but the digital age has enabled misinformation to spread and allowed people to look only for news and information that validate their pre-existing beliefs.

Friday May 30, 2025
Peter Pomerantsev, Senior Fellow at the London School of Economics, talks about how an abundance of information created a new reality.
About Peter Pomerantsev
"I’m a Visiting Senior Fellow at the Institute of Global Affairs of the London School of Economics and at the University of Johns Hopkins.
I research disinformation, hate speech and polarisation to try to work out what we can do about it."
Fighting for pluralism
In the 20th century, we had a dream. Those people who fought for democracy had a vision of what a democratic information space should be, and some risked their lives for it. They were fighting for freedom of expression against censorship. They were fighting for pluralism, having a varied and abundant range of media which they thought good for democratic debate. They all believed in a metaphor about the marketplace of ideas: that the best information would rise to the top.
Throughout the 20th century, dictatorships would try to suppress information through censorship and arrest people who tried to speak their minds. But in the 21st century, even authoritarian countries like Russia have much less censorship. We live in an age of what some academics term “information abundance”. You can even get through China’s internet firewall quite easily if you want to reach good sources of information. It’s incomparably easier to get different, better, accurate information than it was in the 20th century, but that has also brought a new set of problems: people feel very lost in this chaotic overabundance of information.
Often, when I travel through zones of intense information conflict, like in east Ukraine or in America, which is very similar to east Ukraine in some ways, I hear people say the same things: "We don’t understand what’s true or false anymore. We’re surrounded by this kind of deluge of information, disinformation, misinformation. We feel everybody’s biased. We feel everyone has an agenda. So, we just have to kind of follow our emotions, our gut instincts, to guide our way through."
Key Points
• The liberalisation and multiplication of media outlets and information platforms have led to an overabundance of information which confuses citizens around the world.
• Propagandists and other political actors use new technologies and social media tools to target micro-groups and spread Manichean ideology.
• The artificial division of “us and them” makes it easier for populists to push their message and reach their political goals.

Friday May 30, 2025
Peter Pomerantsev, Senior Fellow at the London School of Economics, discusses the information war and how we could win it.
About Peter Pomerantsev
"I’m a Visiting Senior Fellow at the Institute of Global Affairs of the London School of Economics and at Johns Hopkins University.
I research disinformation, hate speech and polarisation to try to work out what we can do about it."
The digital “Trojan horse”
This idea of the “information war” has become a very common metaphor through which we try to understand what’s going on in our new, digitally-driven information landscape. But I believe that the most dangerous part of “information war” is the very idea of information war. Let me explain what I mean.
There’s always been a tradition of militaries and secret services doing “psy-ops” – psychological operations – in order to confuse, dismay, undermine the opponent. That’s a long and glorious tradition. I think maybe the earliest example of this is the Trojan horse. The Trojan horse is one of the finest information operations ever launched in the history of warfare and a metaphor that we still use all the time when talking about the information space. And, obviously, with these new digital tools, someone in the military or in the secret service can do a lot more to perturb the other side. There are now bots, trolls, closed online spaces that can be used to destabilise an enemy, and we urgently need to adapt to that. Countries probably need a new military and security doctrine to make sense of these technological attacks that are not exactly military aggression.
For example, NATO hinges on the premise that all its member states will defend each other in case of a military attack. So, if tanks were ever to enter Estonia, the other member states would respond accordingly to defend their military ally; that’s quite straightforward. Now, say a member state suffers a cyber attack instead, a hit on its digital infrastructure: how does a military organisation account for this kind of information attack? It’s not as straightforward, because information is not actually warfare. There are a lot of problems in just defining this legally, let alone in trying to counteract it. This is part of an old story that has a new iteration. The other great difference from conventional warfare is that information attacks are no longer launched only by governments. The barrier to entry is very low: anyone can mount one of these operations. So that’s something serious that needs to be dealt with, and this is a legitimate way of thinking about “information war” and information war doctrine. It’s an important debate that is led by experts in the field.
Key Points
• “Information war” refers to a very real change in conventional warfare and military doctrine on digital attacks.
• “Information war” is also used as language that legitimises conspiracy theories.
• The only way to shield ourselves from the hysteria is to learn to take a step back and contextualise the information we find online.

Friday May 30, 2025
Sander van der Linden, Professor of Social Psychology in Society at Cambridge, explains why people have been duped by misinformation.
About Sander van der Linden
"I am a Professor of Social Psychology and Director of the Social Decision-Making Lab at the University of Cambridge.
I study the influence process, so I study how people are persuaded by information, and especially misinformation, and how we can help people resist persuasion when they don’t want to be persuaded. As part of that I’ve written a book, The Truth Vaccine: An Antidote to Fake News, where we break down the influence process."
The illusion of truth
Why do people fall for fake news? There are a multitude of reasons. One is what we call the illusory truth effect, which comes from research showing that if you expose people to a bunch of statements —some true and some false— and you repeat them over time, people are more likely to think that repeated statements are true even when they are false.
Unfortunately, a lot of fake news is often repeated by influential actors time and time again, which really causes it to stick in people's minds and memories. This goes back to classic propaganda. Germany’s minister for propaganda, who's credited with the big lie rule, said that if you “tell a lie big enough and often enough, eventually people will believe it”.
Key Points
• When influential people broadcast and amplify fake news, it sticks in our minds.
• Our political worldview and desire to belong to a particular “tribe” may make us more receptive to misinformation.
• Thanks to the internet, it’s never been easier to spread propaganda on a vast scale.

Friday May 30, 2025
Stephan Lewandowsky, Chair of Cognitive Psychology at the University of Bristol, explains the post-truth crisis and its implications.
About Stephan Lewandowsky
"I’m the Chair of Cognitive Psychology at the University of Bristol, and I research why some people support misinformation over science.
I am a cognitive scientist with an interest in computational modeling. I currently study the persistence of misinformation in society, and how myths and misinformation can spread, specifically those that rise in opposition to scientific fact, as we see with the lies and misconceptions surrounding climate change or vaccines."
Accepting the evidence
Most of us have grown up in democracies, and therefore, we’re used to living in a democracy. However, perhaps we have forgotten that democracies work well because citizens engage in reasoned, evidence-based debate to make decisions about our societies' future. That's what democracies are about. They aren’t just about voting but also about having a constructive deliberation about how to solve problems.
At the core of this process is the acceptance of evidence. Some things are true, and others are not supported by evidence. We have a responsibility to navigate this landscape while having constructive debates. However, misinformation is undermining that process.
Misinformation has always been around. I'm old enough to remember the weapons of mass destruction in Iraq in 2003. Although they didn't exist, as we now know, they were conjured up by governments to go to war and invade Iraq. So, misinformation has always been problematic.
Yet, not all misinformation is equal. I would argue that there has been a shift over the last five to ten years. Misinformation no longer attempts to convince us of a reality that might be false but, instead, tries to alter the state of reality itself, shifting it toward something completely different. The weapons of mass destruction and climate denial, for example, were misinformation attacks that dealt with reality, trying to convince us that something was true when, in fact, it wasn't.

Friday May 30, 2025
Barry Smith, Director of the Institute of Philosophy at the School of Advanced Study, explores the nature of explanation and human consciousness.
About Barry Smith
"I'm a professor of philosophy and Director of the Institute of Philosophy at the School of Advanced Study, University of London.
I'm a philosopher of mind and language, and I'm interested in how these systems help put us in touch with the world around us and with ourselves, and I'm especially interested in the senses and our sense of taste and smell."
What would satisfy us?
Imagine we have a complete science of the world that’s given us an account of the nature of reality, the nature of the mind and our engagement with the world. It seems to me that philosophers would still want to ask questions. There’s a gap that science doesn’t close. There’s something that the philosopher’s after when science is finished. And the question is, what would count as a philosophical explanation?
Wittgenstein was puzzled by and haunted by that question. We know what scientific explanations are. We know that hunger to really understand and put ourselves in the right place to get a grip on what’s going on in cell biology, or the brain, or even the chemical structure of the foods that we eat. But what is a philosophical explanation, and what would satisfy us?
Wittgenstein worries that sometimes philosophers are trying to do a sort of über-science. It’s as though we want not just the physics, but something beyond the metaphysics; we’re looking for the essence or the structure of reality, or language, or mental life. Wittgenstein begins to doubt that such a quest will deliver anything useful, but he thinks we still do seek explanation.
Key Points
• Wittgenstein thought that philosophical explanation is a matter of seeing things as they really are, instead of looking for underlying causes – and sometimes no explanation is needed.
• For Wittgenstein, the question of how we know whether someone else has consciousness is ridiculous – all we have to do is look at their behaviour to see the expression of the mind.
• Modern neuroscience has shown us that our sense of self is just that – a sense, which relies on the proper functioning of many different subsystems.

Friday May 30, 2025
Susan James, Professor of Philosophy at Birkbeck College, London, explores Spinoza’s philosophy of the embodied mind.
About Susan James
"I’m a professor of philosophy at Birkbeck College in London.Most of my work is about early modern philosophy, particularly the social and political aspects of philosophy in that period. My most recent book is called Spinoza on Learning to Live Together."
The relationship between the body and the mind
Spinoza is very interested in this question of the relationship between the body and the mind, and he approaches it at several levels. To understand the significance or the interest, perhaps, of his view, it’s useful to consider who his opponents are. He rejects the materialism of his contemporary Thomas Hobbes, who says that our thoughts are just motions – material things. Spinoza never doubts that bodily motions and thoughts are ontologically of quite different types and that one cannot be reduced to the other.
The other giant figure who looms in the landscape and who Spinoza opposes is Descartes, who famously thinks of the human being as a composite made up of two substances: a body and a mind. Spinoza rejects the idea that it’s enough to think of one’s body and one’s mind as these two separate entities which are somehow stuck together.
Key Points
• Spinoza rejects the idea that it’s enough to think of one’s body and one’s mind as two separate entities which are somehow stuck together.
• He says that the first idea that constitutes your mind is an idea of your body. The body is there from the start, and the existence of the mind depends on the existence of the body.
• Spinoza is emphatic that our body is affected by other bodies; the other bodies, the external world, are there from the start.