Paradigm

Lee McIntyre: Truth, lies, and Twitter


Lee is a Fellow at Boston University and a leading voice on the topics of Mis- and Disinformation and Science Denial.

He's the author of several books on these topics, as well as articles in the New York Times, Nature, Scientific American, and other respected publications. He’s also appeared as an expert on CNN and the BBC, and at the United Nations, NASA, and elsewhere.

We discuss:

  • The Flat Earther phenomenon

  • The Antivax movement

  • The rise of disinformation campaigns on Facebook and Twitter

  • Social media algorithms and attention-based business models

  • How individuals can protect themselves against disinformation

  • Conspiracy theories

… and other topics

Watch on YouTube. Listen on Spotify, Apple Podcasts, or any other podcast platform. Read the full transcript here. Follow me on LinkedIn or Twitter/X for episodes and infrequent social commentary.



Episode links

  • Guest website: https://leemcintyrebooks.com/

  • Books:

    • On Disinformation

    • How to Talk to a Science Denier

    • The Scientific Attitude

    • Post-Truth

    • Dark Ages

    • Respecting Truth


Timestamps

Timestamps for audio episode

0:00 Intro - disinformation

5:05 Flat Earther phenomenon

9:30 What is disinformation?

11:40 How big of a problem is disinformation?

16:40 What should we do when the truth sounds conspiratorial?

22:15 Origins in Big Tobacco

27:50 Amplification of disinformation

31:25 Is social media to blame?

34:30 How should social media companies be regulated?

46:30 Social media business models

50:15 Information diet - what is healthy?

56:45 How can individuals protect themselves against misinformation?

59:25 Book recommendations


Introduction: On Disinformation

A core theme of this podcast is the question of how we come to know what we think we know.

Today’s conversation with Lee McIntyre is about the architecture of our contemporary information environment itself, and what this means for the kinds of information we tend to be exposed to. As you’ll hear, our information ecosystem has developed several problems that in recent years have been spiralling out of control in a fairly drastic way that we’ve not really seen before. This has led to an explosion in the amount of mis- and disinformation people are exposed to. And this is happening on several levels all at the same time, with the various levels having a compounding effect on one another.

On one level there is the autonomous proliferation of misinformation on social media platforms. A growing number of people rely on social media for their information, which in and of itself is not a problem. These can be incredibly useful, rich and dynamic information sources. However, because the business models of social media platforms depend on attracting and retaining user attention at scale, they are designed to feed users individually curated and highly engaging content. And as a side effect of our biological and cultural evolution, our attention tends to be drawn towards more inflammatory and conspiratorial content, and content that confirms existing biases rather than challenging them. So this is the direction in which our social feeds are algorithmically curated, at a speed and scale that's vastly greater than anything we had even just a few years ago.

On another level there is the deliberate spreading of disinformation by malicious actors. In one sense this is not new. People have been lying and deceiving one another for thousands of years, and often as part of large, coordinated propaganda campaigns. But disinformation has never existed at anything remotely close to the scale, low cost, and high quality that’s become possible in the past few years. Today the barriers to generating extremely convincing disinformation and broadcasting it to millions of people online are close to nonexistent, and the number of people exploiting this fact has been growing extremely quickly. Lee and I discussed a few examples in today’s conversation, and I must say the figures are truly worrying.

With that context in mind, this episode is somewhat of a public service announcement. I don’t want to come across as scaremongering, and I certainly don’t want to perpetrate the same misbehaviour I’ve just been criticising by exaggerating the scale of this problem. But mis- and disinformation are very serious problems that are worth being aware of, so I hope you find this conversation useful.

Before we get going, if you’re finding this podcast valuable, please do share so that others may benefit as well. There’s a fast growing and increasingly active community of Paradigm listeners all around the world, and the reason for that is that listeners have taken the time to spread the word. Some have even kindly donated to the podcast, and for that I am deeply grateful.




Transcript

This transcript is AI-generated and may contain errors. It will be corrected and annotated with links and citations over time.

[00:04:16] Matt Geleta: I'm here with Lee McIntyre. Lee, thank you for joining me.

[00:04:19] Lee McIntyre: Thank you so much for having me.

[00:04:21] Matt Geleta: Lee, I want to have a conversation with you about our information environment and how it is we come to believe the things that we do. Uh, let's start off with conspiracy theories. Um, when it comes to conspiracy theories and science denial, the archetypical example one often hears is that of flat earthers.

You know, people who believe, or at least claim to believe, that the earth is flat. And, um, I know that you've done some research in this area and you've even taken part in an international, uh, flat earther conference. Um, could you tell me a little bit about that and maybe expand, you know, at its core, what is it about the way that these people see the world that enables them to believe something like the flat earth idea?

[00:05:04] Lee McIntyre: It's a very compelling two days at the Flat Earth International Conference, because what you realize when you go in is that they're very serious. Um, going into it, I'd had a number of friends say, well, these people are just trolling. They don't actually believe it. They're just having fun. But when you get there, you figure out that they're actually dead serious.

And they're also joyous. They're happy to see one another, because they've been fairly isolated in their own communities, and the, you know, the conference is as much about community as it is belief. And I think those two things are, you know, related in a way. I'm not sure I can account for what allows them to see the world the way that they do, except to say that virtually everyone that I spoke to had an origin story.

They had some story about how it was that they had become a flat earther, and it always involved a sort of a radical, um, lack of trust in other people. Uh, it was, maybe they'd had some sort of a breach of, uh, a breach of trust. Some, something had happened in their life. And once they believed that people couldn't be trusted, then they wanted to see how far that rabbit hole went.

And so it, uh, it unfortunately led a number of them to very seriously believe a conspiracy theory that every pilot, every scientist, every teacher, every government official in the world was in on the conspiracy about flat earth.

[00:06:40] Matt Geleta: Yeah.

I find it really interesting, this particular conspiracy theory, because, um, you know, for some theories, there, there's a clear motive behind them. Uh, you know, for example, there might be political motives. There is a, there is a reason that somebody wants to propagate a theory like this. And then for, for those cases, I don't find it very surprising that there are many adherents. But the, the flat earther case, I mean, I really can't think of what a, what a compelling motive would be for somebody to sort of promote this view.

Um, and so, so what is it about this particular theory, this idea, that has attracted so many people? What did you learn there?

[00:07:13] Lee McIntyre: I, I don't, I don't know. I mean, they don't get anything out of it. Um, uh, and so, I mean, I think they genuinely believe it. But, you know, if you're looking for sort of the material cause of it, you know, what, what would be behind it? I mean, it's hard. Because with climate denial, you sort of think, well, you know, that's something where maybe there's an economic interest or a political interest.

Even vaccine denial. You know, you'd think, well, you know, maybe there's something that they're, uh, that they're getting out of it. It doesn't seem that way for Flat Earth, at least, you know, to the contrary. It's always seemed to me that they're sacrificing a lot. A lot of them get kicked out of their church.

They lose family members. They lose jobs over this. Now, people can have that same thing happen to them over, you know, being anti-vax or climate denial. But let me put it this way. I've come to the conclusion that most science denial is caused by disinformation. That is, people don't just happen to believe these things; they're compelled to believe them because somebody is out there creating a falsehood.

Uh, and you know, lying to them. And, you know, they believe it, and then they go along for the ride. And it's the person creating the disinformation that's benefiting from it, not the person who believes it, who is the victim. You can see that with climate denial. You can see that with vaccine denial. There are nefarious people behind it, sometimes money, sometimes political power, who are, you know, profiting by this.

I don't know anybody who's profiting from Flat Earth. I don't know how it could possibly be the case that there's somebody behind it who's benefiting from having all these people believe that the Earth is flat. Maybe I'm naive, maybe it was there in front of my face, maybe it's hidden, I don't know. Or maybe my theory is just wrong and it's not always disinformation.

Or maybe this is just the anomaly. But Flat Earth in some ways seems unique. Uh, I've studied science denial for decades now. And, you know, evolution, climate denial, GMO denial, vaccine denial. I kind of know where those come from. Flat Earth? I don't.

[00:09:27] Matt Geleta: Okay. Well let's, uh, let's get to the topic of disinformation, which is the main topic of your most recent book, On Disinformation. Um, people will be very familiar with the concept of misinformation, which is basically just, you know, false or inaccurate info. Um, but disinformation on the other hand is knowingly false information.

It's information deliberately designed to mislead and obscure the truth, often with malicious purposes behind it. How does disinformation differ from, you know, good old fashioned lying? People have been lying for as, as, you know, long as one can remember. Um, but it feels like the word disinformation has only been in the public consciousness for, um, you know, a relatively short time. What's the difference between disinformation and lying?

Hmm.

[00:10:10] Lee McIntyre: Yeah. Well, disinformation is a type of lying. Not every lie is disinformation, because I think of disinformation as being a coordinated campaign of lying. It's where, you know, there are a host of lies that are strategically told toward a specific purpose. And I mean, you, you can find that, but it's not just, you know, random lying, or even just merely lying that, you know, benefits the person who's a serial liar.

Disinformation is, is warfare. Disinformation is the, the, you know, the weaponization of information. And it actually goes back about a hundred years. The modern term "disinformation" goes back to the 1920s, when V.I. Lenin appointed Felix Dzerzhinsky as his first director of the Cheka during the Russian Revolution. He wanted to find a cheap and effective way to fight back against the counter-revolutionaries, and he found it in the invention of, uh, you know, disinformation warfare. So lying has been around since human speech, I'm sure.

But, you know, lying can be opportunistic. You know, a lie can be random or careless. Disinformation is not like that. Disinformation is coordinated, organized, and planned as a strategic campaign.

[00:11:38] Matt Geleta: Yeah, I find it a little bit hard to get a gauge on how big of a problem the disinformation crisis or phenomenon really is. You know, you mentioned climate change earlier. When I look at something like climate change, I think there are metrics that people look at and the scientific consensus around how we're doing, you know, things like concentration of CO2 in the atmosphere.

But when it comes to, you know, measuring how much of my information diet is disinformation, I find this very hard. I really have no idea. And there's this general sense that it's a lot higher than we would like. Um, but I don't have a good way to gauge it. And my sense is that for many people, the feeling is the same.

How big of a problem is this?

[00:12:22] Lee McIntyre: You ask the question of the moment. I was just looking at my Twitter feed in which one of my philosophical colleagues was saying, what's all this alarmism about misinformation? I mean, yeah, it exists, but it's rare. And the people who consume it are the people who go looking for it. And it doesn't have that great of an effect, et cetera, et cetera.

Whereas I see disinformation as being much more insidious and widespread than that. Now you might think, well, of course, he's just got a book on it, he sees it everywhere. But, I mean, the reason that I wrote the book is because after doing a little bit of research and asking the question, how did denial get as big as it is?

How, how did this become so prevalent? I started to think about... the fact that deniers aren't born, they're made. They're created by somebody else wanting them to believe a falsehood. And so then the question comes, why would somebody want you to believe a falsehood? Now, I don't want to go too far into sounding like a conspiracy theorist myself, but what I can say is that there's evidence that disinformation is more widespread than we think.

That is, I can probably name things out there that you have heard, maybe not that you believe, but that you've heard, that you always wondered where it came from, but it turns out to be from a disinformation campaign. And I can give you an example of that if you're, if you're interested. Now, you've read my book, so you already know what I'm about to say, but the, the one, the best example that, uh, I, I use these days is this: most people have heard the idea, the false idea, the rumor, that there are microchips in the COVID vaccines.

And thousands of people died from that stupidity, right? Because they didn't take their, um, they didn't take their COVID vaccines, because they were worried about tracking microchips. Where did that come from? Uh, it came from Russia. It came from an intelligence campaign out of the SVR, which is the, the new version of the KGB, which perhaps didn't invent the idea, but they amplified it, which is where, you know, the, the real trick is.

Uh, in April 2020, there was a story in the Oriental Review, which is an online English-language publication, which said: any future vaccines developed in the West will have tracking microchips in them, courtesy of Bill Gates, who holds patent 666 on this technology. Then it said, share on Facebook, share on Twitter.

The following month, 28 percent of Americans thought there was something to this story. Now, what nobody knew, until much later, is where that idea came from. And it's now been shown through the American Defense Intelligence Agency, and was reported in the Wall Street Journal, that that's where it came from.

But here's the problem. People still don't know that. I use that example when I speak to large groups. If I can get a hundred people in a room, I'll say, well, how many of you have heard that there might be tracking microchips in the vaccines? Almost every hand goes up. And then I say, how many of you know where that came from?

I've only ever had a couple of people put their hands up, and they both worked in army intelligence. Now, the idea here, I mean, I didn't have to break into CIA headquarters to find this. It was in the Wall Street Journal, but it was behind a paywall and the cable news networks didn't pick it up. So many people don't know it.

So that's just one example of a, a very virulent, life-threatening, um, disinformation campaign that was right under our noses. And most people don't even realize it. And I think that that is also what's happening for the millions of people who believe Donald Trump about the 2020 election being stolen, and all sorts of other denialist beliefs that you could name. Where there's a denier, there's usually a disinformer. Again, flat earth may be the exception, but, uh, I, I think it's, it's more widespread than we think.

[00:16:38] Matt Geleta: Yeah, there is an issue here and I want to get to the topic of just the information environment and the architecture of our information systems in a little bit, but you know, there is an issue here with that particular example. Um, you know, the people who believe it, they're seeing it everywhere. They're seeing it on all the different social platforms.

They're, they're seeing it on the, whatever news channels that they subscribe to. Um, and, uh, then, you know, they hear it from you that there is potentially another story behind it. And I can imagine from that perspective, you know, your view and, and, you know, what turns out to be the correct view sounds conspiratorial.

It sounds like, Hey,

[00:17:16] Lee McIntyre: Doesn't it sound, it sounds conspiratorial, except it happens to be true. Now, that's what every conspiracy theorist would say. But I mean, there is evidence actually behind this. The, the Defense Intelligence Agency of the United States, uh, discovered this. I mean, uh, somebody could claim, well, they're lying.

That this is just part of a conspiracy. I mean, this is, you know, what, what always happens. But there are actually real conspiracies. I mean, there are sometimes conspirators and conspiracies. What marks off the difference between a real conspiracy and a conspiracy theory is that a conspiracy theory doesn't have any evidence, or it has minimal evidence, and then the person covers over it by saying, when there is no evidence, well, that just shows you how good the conspirators are. But I'm not doing that.

I'm saying where they can fact-check me. They can fact-check me in the, in the Wall Street Journal, uh, if you can get past the paywall. Uh, and also there, there were several other, uh, at the time, um, uh, news reports on this, which cited the Defense Intelligence Agency study.

[00:18:26] Matt Geleta: There is, um, you know, when it comes to the topic of, you know, updating one's, updating one's beliefs based on evidence, I know you've, you've framed the scientific attitude as one that is open to, to updating beliefs based on evidence.

Something that concerns me, I don't know if you come across this, but, um, you know, if, if one looks at the concept of Bayesian updating: so you have some prior belief about the world, you're exposed to new evidence, and on that basis you update your belief. One can actually show that depending on what your prior beliefs are, the same evidence can direct people in different directions. You know, and an example here would be: suppose you have two people, one of whom actively distrusts the information source, and one of whom trusts the information source.

Um, let's say it's the, the, the vaccine example you gave. Then in hearing information, you know, such as "vaccines are safe", as an example, that information can be interpreted in opposite directions and can actually further divide people's views.

So how does one get around that? In the case you just put forward, you know, you're, you're presenting evidence, and if one had all the time to do their own research to the same level as you had, they would get to the same conclusion. Um, but, you know, absent that time, all they've got is, you know, the evidence coming from you, and if they have different, uh, sort of prior beliefs about how trustworthy that evidence is, it could even serve to further divide.

How do you think about that issue?

[00:19:52] Lee McIntyre: Such a sophisticated question, and I'm not going to be able to solve the conundrum at the heart of statistical reasoning, uh, you know, right here on the spot. What I can say is that there is a fierce debate in the, uh, you know, in, in logic, you know, in, in people who think about Bayes and confirmation theory, you know, in the, in the philosophy of science about this.

And I'm one of the folks who think that... um, you're right that there is a sort of a, a sense in which, you know, the Bayesians will say, well, it doesn't really matter where you start, because once you've gathered enough evidence, that's all washed out. You know, whatever your prior hypothesis was, it will eventually be swamped by the evidence that you gather.

Okay, which I, I took to be the, the, um, criticism that you're making. But as you point out, depending on what that prior position is, that prior hypothesis, if it's perverse enough, that can really screw you up. So, you know, I'm, you know, I, I've read a number of people who have, um, you know, I mean, written whole books criticizing Bayesian, uh, Bayesian thinking. So, I mean, this question about prior probabilities, how much they should count, you know, this is, this is all a big debate going on in philosophy of science.

And in the scientific attitude, I'm, I'm relatively agnostic about how that gets solved, which is to say, I don't have a horse in that race. My claim is that evidence counts. There are competing theories in the philosophy of science about how you should count evidence. And, you know, the Bayesians are, you know, they've got a pretty big tent on this, but I don't think they're necessarily the last word on that, uh, on that question.

And I can, uh, point you, um, towards some other resources to read if you're, uh, if you're interested. I don't want to go searching on my bookshelves right now while we're, while we're on air, but there, there are some, uh, terrific, uh, resources. One is, uh, Deborah Mayo's, uh, book, uh, I, I forget the title of it, but Deborah Mayo is maybe one of the most important critics of Bayesian reasoning, who has some things to say that I, I find pretty interesting.
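[Annotation: the divergence Matt describes falls directly out of Bayes' rule once each listener models the source's reliability differently. A minimal sketch with hypothetical numbers, not figures from the episode:]

```python
# Two agents hear the same report ("vaccines are safe") but assign different
# likelihoods to the source making that report, depending on whether they
# trust it. All numbers here are illustrative assumptions.

def posterior(prior_safe, p_report_if_safe, p_report_if_unsafe):
    """Bayes' rule for a binary hypothesis: P(safe | report)."""
    numerator = prior_safe * p_report_if_safe
    denominator = numerator + (1 - prior_safe) * p_report_if_unsafe
    return numerator / denominator

# Agent A trusts the source: a safety report is strong evidence of safety.
a = posterior(prior_safe=0.5, p_report_if_safe=0.9, p_report_if_unsafe=0.2)

# Agent B distrusts the source: "they'd say that either way, especially if
# they were covering something up," so the report counts slightly against.
b = posterior(prior_safe=0.5, p_report_if_safe=0.6, p_report_if_unsafe=0.8)

print(round(a, 3))  # 0.818 -- belief in safety rises
print(round(b, 3))  # 0.429 -- belief in safety falls
```

Same prior, same report, opposite updates: the divergence comes entirely from how each agent models the source, with no fabricated evidence required.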

[00:22:13] Matt Geleta: Yeah, yeah, that sounds good. I'll link, uh, I'll link that resource in the show notes here. Um, you know, someone who you talk about in your work, um, and someone I actually spoke to a couple of weeks ago on the podcast, is, uh, Naomi Oreskes. And, um, we talked about her book on free market economics, The Big Myth.

Um, but we also touched on her earlier work on big tobacco

[00:22:35] Lee McIntyre: She's wonderful.

[00:22:36] Matt Geleta: Various propaganda

[00:22:36] Lee McIntyre: She is just

[00:22:37] Matt Geleta: Excellent. Really, really excellent work. Um, and obviously Erik Conway as well, um, her coauthor. Um, in your work, you, you reference them and you draw a connection between their work on sort of the big tobacco situation and the more modern-day phenomenon of science denial and, and even what you call reality denial.

Could you explain these phenomena and their links back to Big Tobacco?

[00:23:05] Lee McIntyre: I'm not going to reiterate Naomi and Eric's work, because you've already had them on, and I'm sure people have heard it. And they've done all the... You know, the brilliant spadework already to show that the tobacco strategy from 1953 that the executives came up with was followed by the climate deniers, acid rain, the ozone hole, all, you know, all the...

I mean, it was the blueprint for science denial for decades afterward. The piece of it that I picked up, the piece of it that, you know, I, you know, I was reading their work, and I was reading some cognitive scientists who were talking about, you know, the, the five, um, the, the, the five stop lights along the road to science denial, you know, the, the types of, uh, bad reasoning, the techniques in their reasoning, for the people who believe it. And I was thinking about all this, and it occurred to me: what the science deniers are doing following the tobacco strategy is the same strategy that Trump is following when he is denying the 2020 election. Let me, let me very quickly give you the, the five steps. This, this is due to, uh, the Hoofnagle brothers, uh, Mark and Chris, uh, originally, but it was also developed by, uh, Stephan Lewandowsky and, uh, and John Cook.

And it goes like this. Every science denier, uh, cherry-picks evidence, believes in conspiracy theories, engages in illogical reasoning, relies on fake experts, and thinks that science has to be perfect to be true, to be credible. Those five steps are, you know, individually necessary and jointly sufficient, to use a philosophical term, for science denial.

You know, you, you, you do not find a science denier who doesn't hit all five of those. And if somebody hits all five of those...

[00:25:10] Matt Geleta: um,

[00:25:11] Lee McIntyre: They're a denier. And it was the day that I was listening to some, I don't know how much I can curse on your broadcast, but let me just say some poppycock or bullshit on cable news about the Arizona ballot recount, uh, after, you know, Trump had made his claim about the fraudulent election and they were looking for bamboo in the ballots in Arizona because these were allegedly ballots that were flown in from China.

So there's your conspiracy theory. And they were cherry-picking out the ballots that they thought were questionable. There's cherry-picking. And engaging in illogical reasoning. And they were relying on fake experts. Uh, the, the company that they used to do what's now called the "fraudit", they, they were the fake experts.

And I, so as I was counting it up, I realized they hit all five steps. But the topic here is not science denial. It's something else. It's denial about an election. But it's the same. It's the same pathway. It's the same technique of reasoning, right? So the content is different. The technique is the same. And that's when I really got the idea to write on disinformation because I realized that there was another step in it.

Uh, Naomi Oreskes and Erik Conway paved the way for this. The Hoofnagle brothers paved the way for it. But I wanted to say, look, look, this is the same as what Trump's doing now. He's creating disinformation and he's engaging in a denialist campaign, maybe about something different. A different topic, maybe a different motive.

I mean, the tobacco industry wanted more money. The climate, you know, the fossil fuel companies, you know, climate denial was about money. Well, what Trump's doing is really maybe a little bit about money, but it's mostly about power. But I mean, the point is that a disinformer has an agenda. They have something that they want that is furthered by having other people believe their falsehood.

And that's what I saw, and that's when I, I don't think I coined the term, but that's when I realized reality denial was also a thing. And there, there was a straight line, I think. I think I even say this in the book. I think there's a straight line from the tobacco executives meeting at the Plaza Hotel in December 1953 to the steps of the Capitol on, uh, January 6th, 2021.

I think it's a straight line of a denialist campaign from there to there.

[00:27:52] Matt Geleta: So you mentioned in your book that for disinformation to be effective at scale, in, in these campaigns you mentioned from, you know, the big tobacco industry and elsewhere, three things need to be true. You know, you, you need to have creators of the disinformation. You need to have amplifiers that amplify the, uh, disinformation. And believers.

You have to have people who believe it on the other end. And when I look at disinformation from an historical lens and, and think about these three things in isolation, it seems to me that potentially there hasn't been a drastic change in, you know, the creator or believer categories.

Um, and I would love to get your thoughts on that, but what, what does seem true to me is that potentially the amplification category is very different today than it has been in the past. And maybe a lot of the large scale problems that we're seeing come down to this amplification step. Does that view seem right to you?

Is amplification to a large extent at the, at the heart of the disinformation crisis we're seeing?

[00:28:56] Lee McIntyre: I think it is. I mean, uh, there's, it's very hard to measure how many disinformers there are, but I mean, I wouldn't be surprised if it's the, you know, the, the, the same as it has been. But the, the thing that one notices now in the age of social media is that, you know, even one person who's a disinformer can have a pretty big microphone and do a lot of damage.

The statistic I found compelling is that in 2019, the Center for Countering Digital Hate found that 65 percent of the anti-vax propaganda on Twitter was due to 12 people. They call them the disinformation dozen. Now, so you might say, well, that's not that many people amplifying it.

And so maybe that hasn't changed that much, the number of amplifiers, either. Though I think that probably has. But with social media, the same number of amplifiers can get their message out to the four corners of the world, and that's the problem, right? It's, um, if you look back at what happened with the tobacco executives, they had to run a whole public relations campaign.

They had to pay for full-page ads in American newspapers. They had to, you know, go out and talk to newspaper editors and, and, uh, writers and, you know, get their word out. How hard is it now to get the word out? You go to Twitter or Facebook. If you've got a website... I mean, it's, it's so much easier now.

And, and I have to say, one thing that's probably changed is that the quality of the disinformation is higher. Uh, especially in an age in which we have, you know, AI. You know, there is more technical assistance. Um, one of the barriers to the creation of good disinformation is having people who can communicate in the language of, you know, your target audience, and, you know, that has been a barrier.

So, if you look back at some of the Russian memes around the 2016 election, they're, they're laughable. They've, they've got elementary grammatical mistakes. You, you could tell it's not who they say it is, yet they were effective enough. I think now probably the quality has gone up, or will go up very soon.

Um, but the corollary to what you just said is that if we're going to fight disinformation, the best possible way to fight it is to clamp down on the amplification. If that's what's making it worse, then if we clamp down on amplification, that will make it better.

[00:31:24] Matt Geleta: One of the most common amplifiers that will come to people's minds is that of social media. Um, and I've, I've read, you know, Pew Research released, uh, an article last year that says something like 50 percent of Facebook and Twitter users claim that they get a substantial portion of their news and information from these platforms.

And many people are aware that this could potentially be a problem for many reasons, but one key one is that the feeds people get are curated individually. That means individuals are getting a personalized view of what's happening in the world, and those views will not be the same across individuals.

And so that's one problem, but there are many others. So I would love to turn to the question of news media, social media specifically. To what extent do you think social media is one of the root problems, or lies at the heart of this disinformation crisis that we are experiencing?

[00:32:21] Lee McIntyre: It's a really important question, because disinformation goes way back. It was invented in the 1920s in Russia, as we've spoken about. But they didn't have the internet back then. And the thing with disinformation is that it's actually fairly useless unless it's amplified. The cost of getting disinformation amplified used to be high enough that it was just really hard for people to get the word out. They'd have the mimeograph sheets. If they could afford it, they'd take out an ad in a newspaper. But what do you do? Well, now it's easy. Now you just have to go on social media, and they will amplify it for you.

In 2019, the Center for Countering Digital Hate found that 65 percent of the anti-vax propaganda on Twitter was due to 12 people. So 12 people can do a lot of damage. And for this, you can blame those 12 people, and I think we should. But I think you can also blame the social media companies for not deplatforming them.

So one thing that they can do is content moderation. They can play whack-a-mole: as they see inauthentic accounts, take them down; as they see false messages, moderate the content. Put a little box below to say that this claim is disputed, or here's where you can look for more truthful information.

And they've done that sort of thing. What they're very reluctant to do is to deplatform people, and that's, I think, what they primarily need to do. The night before Elon Musk took over at Twitter, I checked and found that eight of that disinformation dozen I just spoke about were still on Twitter.

You can't blame that on Elon Musk; they were there the night before he took over. So why were eight of the 12 top disinformers on vaccines still on Twitter the night before he took over? It makes no sense to me.

[00:34:29] Matt Geleta: Yeah, but what about the counter-argument that says social media enables everything to be put out in the sunlight? If you've got somebody, or some body, or some system moderating content and moderating users of the platform, then that will introduce a certain political leaning or a bias of some sort, and the views that would otherwise get public scrutiny are no longer finding their way into the sunlight. And so if we could moderate content and moderate usage, then in many cases we have to make a call about which political leaning we choose. How do you think about that sort of counter-argument?

[00:35:09] Lee McIntyre: Well, it's a very popular one right now. Elon Musk himself has made it; others have made it. Basically the claim is one of free speech, of being against censorship. We don't need somebody deciding what's true and what's not true. Get it all out there. Let a thousand flowers bloom.

And the truth will out. We'll discover it. People, in this crowdsourced world, will figure out what's true and what's not. It doesn't work. And it doesn't work because in a polluted information environment, it's usually very difficult for people to figure out where that pollution is coming from.

I'll say a little bit more. I think that some of the arguments about censorship are disingenuous, because it's not as if we're saying that people with wrong opinions, or even opinions that they know to be false, can't express them, like we should throw them in jail or muzzle them. All we're asking is that those voices not be amplified. The social media companies don't have to amplify someone else's lie. And suppose you were a radical free speech proponent; suppose you thought that everybody had the right to speak. Well, this comes up sometimes in the United States, where we have the First Amendment and we find ourselves in a situation where the Ku Klux Klan wants a parade permit.

And we have to give it to them, because under the First Amendment we have to do that, even if we hate what they're going to say. What we don't have to do is go down to the rally and help them pass out their flyers. That's amplification. That's what Twitter is doing. So take this idea that all we need is to let everyone speak and the truth will rise to the top. Do we believe that in science? Do we want to just let the fraudsters in? Scientific fraud is a terrible thing, and I think it's worth thinking about why it's a terrible thing: it's because it pollutes the information stream where people are trying to figure out the truth. So what if somebody said, oh, but you're just censoring the fraudsters?

Just let them in. I'm not talking about censoring the gadflies or the oddballs with the radical theory that might be true. I'm talking about the people who have cheated and lied, who know that their theory is not true. Should we give them a voice at the table? I don't think there are very many people who would agree with that, because it would ruin science in the same way that I think disinformation threatens democracy.

One more thing: if Elon Musk believes so much in the importance of transparency, and that's why he wants to give everyone a voice, why doesn't he make his algorithms transparent? Why doesn't he have a board of governors who are not affiliated with Twitter, where you blind all user data and these experts can look at the algorithms and warn in advance: this one's dangerous, this one is going to kill people? As it is, we always have to wait for a whistleblower to come forward, and then we find out once the damage is done, too late. So if Elon Musk really believes in transparency, open up your algorithms, let the sun shine in, and let's find out what's in them.

[00:38:37] Matt Geleta: Yeah, through one lens I'm really sympathetic to that view, and I think a lot of people would be. But there's a different lens if you consider these businesses as businesses, and consider the business models they need to run in order to operate. A lot of these platforms, let's take YouTube for example, but basically all of them, in some sense work on a business model that relies on attention: either advertising, or they're selling something. And if you just play through the logic, suppose it's advertising: the platforms make money from advertising. This means they have to attract and retain user attention.

And this means that, inevitably, the content and the strategies that do this are going to proliferate. And unfortunately, coupled with our human psychology, that content does often tend to be conspiratorial and inflammatory. So it feels like, in one sense, if we're making this ask of these businesses, we are also putting them at a disadvantage.

Or, you know, we're asking them to go against their business models. I could think of, for example, candy companies. The product is candy; it's inherently unhealthy for people, yet this is also the product that people are choosing to purchase. And in a way, the sort of moderation you've just talked about could be considered analogous to banning certain types of candies. So how do you think about the business model underlying this whole social media issue?

[00:40:13] Lee McIntyre: I mean, businesses always complain about regulation. They don't want anything to stand in the way of making a profit. But why are businesses regulated? It's for fear of public harm. It's probably why there's no more cocaine in Coca-Cola. Could they make more money if they did that?

They probably could. But look, why are social media companies the only business that's protected from lawsuits for libel, slander, and defamation? Under Section 230 of the Communications Decency Act, they're protected from lawsuits if they leave something up that's horrible, that's defamatory, that's false.

They're also protected from lawsuits if they take something down that's perfectly innocuous but they just don't like. They have so much more freedom than other forms of media. Television is regulated, radio is regulated, newspapers are regulated. Take the same story that appears on Twitter: if it appeared in the New York Times, the New York Times could get sued.

So there's a real asymmetry, and of course nobody in business wants to be regulated, because they could make more profit without it. Imagine a cable television landscape that was unregulated. CNN has ratings problems; they have for years now. What if they started airing public executions? They could triple their ratings overnight.

But they don't, and I think the reason they don't is not necessarily some moral reason. It's regulation: that would not be allowed. But we don't have those sorts of regulations for social media. I think they're coming. Look, one thing that we could regulate is transparency, what you were just talking about.

Couldn't the U.S. Congress regulate Twitter, YouTube, and Facebook so that they had to have an independent panel of experts looking at their algorithms, to assess in advance whether they were causing more suicide amongst teenage girls, or whatever it is you wanted to check? They could regulate that without the government being fact-checker-in-chief, without having the government decide what was true or false, simply by regulating them in some way.

I mean, if you look at the laws surrounding all other media, TV, print, all of it, it's quite a bit compared to what they have in social media. So yeah, the cigarette companies complained when they had to start putting the labels on their packages too, but people still bought cigarettes.

[00:43:13] Matt Geleta: Do you think this argument you've put forward suggests that there is more of a problem of bad systems and incentives than of bad actors? Obviously it would be a combination of both, but I'm interested in your thoughts on just where that line lies. In a sense, if social media companies are currently in a situation where there isn't much regulation, I find it very hard to believe that they wouldn't behave somewhat in this way, or at least that the product wouldn't turn out somewhat in this way, simply because that is the direction the business model pulls in.

But I'm not sure to what extent this is also driven by bad actors. So what is your sense? Where does that line sit?

[00:44:01] Lee McIntyre: I think bad incentives create bad actors. You can shape human behavior through incentives. Isn't that what government does? Isn't that what law is? I'm thinking of that book Nudge, from a few years ago, about behavioral economics. You can nudge people in the direction of saving more for retirement by having automatic enrollment on your first day in a new job, and then you're free to opt out of it.

But if you're automatically enrolled, then you're more likely to leave the money where it is through inertia, and you'll save more. So it's perfectly okay to change the incentives to try to change the behavior. The question I have right now is: what's the incentive for social media companies to change?

What's the incentive for them to change their algorithms if they're tweaked for engagement? They're making a lot of money. And they'll also tell you, maybe rightly so, that whatever they do, they're going to get criticized. If they do more content moderation, they get criticized from the right. If they do less, they get criticized from the left.

They might feel like, well, we just can't win, so let's just make money. But they do have a social responsibility. There are genocides that have been caused by disinformation on Facebook. The Rohingya people were a famous example, and Mark Zuckerberg after the fact said, well, we're going to get more language and cultural content folks on this. Well, yes, too late. Why didn't they do it before? Because they weren't regulated; they had no incentive to do so. Look, people can play within the rules and still make a profit. But if there are no rules and they're making a profit anyway, why would they want that kind of regulation?

So I think this is a case where the incentive being lacking creates bad actors. It makes people do things that they wouldn't otherwise do. Would car companies have put seatbelts in cars without government regulation? I don't know the history of this, but my suspicion is that this was not an idea that the car companies came up with by themselves.

Now, maybe I'm wrong about that. Regulation is for the public good.

[00:46:31] Matt Geleta: Yeah, I think some people look towards this optimistic scenario in which the car companies, for example, do choose to do it on their own, because it leads to a better, more valuable end product for the user. And I think a lot of people think about business models for media and content in a similar way.

There's a proliferation of individual creators who are very thoughtful about their business models, for example. But what strikes me is that, at bottom, it's always going to be basically selling attention in some way. Even with a subscription to the Wall Street Journal, for example, you're still selling something that the users want.

And I wonder, in principle, whether it is possible to extract oneself from the human desire to look at inflammatory content, to look at conspiratorial content. Maybe that flips the question around to individuals, and how individuals should be thinking about exposing themselves to information, more than what we should be doing about the information system itself.

And so maybe let's turn to that topic, because I know you've put a lot of thought into this in your book. With that lens, what do you think individuals themselves should be doing to guard themselves against being exposed to mis- and disinformation?

[00:47:57] Lee McIntyre: The first thing they have to do is to realize that they're being exposed to disinformation. I think that's a threshold question. You said a minute ago that for over 50 percent of users at Facebook and Twitter, that's the main place they get their news. If you do a poll asking people whether they think they've been exposed to false information on Facebook or Twitter, something like 4 percent will say that they have, when the actual number is closer to 80 or 90 or even 100 percent. Yes, we've all been exposed to it; they just can't tell. So the first thing we have to do as individuals to fight back against disinformation, in the environment we currently live in, is to wake up to the fact that we're in an information war, that the people competing for our eyeballs are not all good actors acting in good faith, trying to get us to believe true things because it benefits us. And I'm not just talking about how we are the product of the free stuff on the internet, free websites, where they're getting something out of us. I'm talking about the fact that we are often duped. People are taken in, go down the rabbit hole, as it were, on ideas that were invented for someone else's profit.

That's the real danger with disinformation. The problem is that story I think I told you about earlier, where the claim about microchips in the vaccines was invented on a Russian troll farm. Do they care that people are not taking their COVID vaccines and dying, because they're worried about microchips in their vaccines?

I don't think they do. So that's the problem. I think that people have to be aware that they live in an environment in which that kind of thing not only can happen, but it's actually quite common. That's the first thing they can do.

[00:50:13] Matt Geleta: Yeah. When I put this into an analogy like diet or nutrition, for example, although nutrition science is not in a very sophisticated place, I think there is an awareness that the traditional diets in many countries are not very good for us, and people develop systems to protect themselves from them. They develop certain rules, and it's debatable what is really working, but certain things are known: highly processed foods are generally not good for you, very sugary foods, and so on.

And so individuals do construct their own lifestyles to protect themselves from overindulging in, and overexposing themselves to, these unhealthy types of diet. I wonder if there's an analogy to an information diet here as well. If an individual thinks about the information environment they create for themselves, once they've got that awareness you talked about, how does one construct an information environment that is, if not healthy, at least not killing us and overexposing us to these malicious things?

[00:51:25] Lee McIntyre: It's hard, isn't it? Because the inflammatory stuff is very titillating, and you want to look at it. You want to see it. And I'm talking about everybody. So how do you discipline yourself not to read a story that makes fun of a candidate you don't like?

[00:51:47] Matt Geleta: Hmm.

[00:51:48] Lee McIntyre: ...if you know that story is going to contain a few lies, or at least hyperbole? How do you resist, when that kind of thing is so easily available? And you're right, it's just like the diet problem. If you're trying to keep healthy, but you can't really afford good organic food, and the fast food with all the fat and salt and sugar is cheaper, and it tastes good and it's right there, you're more likely to grab it.

So again, one thing can be awareness. For the information diet you're talking about, I think one of the most dangerous things that has happened is that people have adopted the mindset that you can't trust any media, that they're all the same. They're not all the same. Because in that environment, you've heard people say, next year they're going to find out that steak is good for you, so I'll just eat it now.

It doesn't matter what I eat; whatever I eat, ten years from now they're going to find out that it's bad for me, so I'm just going to eat what I want. But if you do that in the information sphere, you end up watching Fox News and reading Breitbart, and you don't even know which channel the BBC is on, where there are good journalistic values and integrity.

So I guess part of my answer is trust. I mean, discipline is hard. It's hard to condition oneself to do the right thing, even when it's to our benefit. But again, people have to realize that you're either in control of your own beliefs or someone else is going to control them.

There are plenty of people who want to show you things that are not true, who would be perfectly happy to have you believe them, things that can sometimes take your life. I can't stop myself coming back to the diet example, because I just read something about a guy who claimed that he lived in Japan for 10 years and sort of ate anything that he wanted.

And he was not fat, his BMI was something like 19, which is pretty low.

[00:54:16] Matt Geleta: Hmm.

[00:54:16] Lee McIntyre: And then he moved back to the United States, and he just sort of fit in, and his BMI went up to 29. What he attributed it to, and I'm not saying this is true, was liquid sugar, the corn... what do they call it?

[00:54:41] Matt Geleta: High fructose corn syrup

[00:54:43] Lee McIntyre: Yes, high fructose corn syrup. That's in everything. In the United States, you just find it everywhere. Why do you find it everywhere? Well, Iowa is the first state in the primaries for both the Democratic and the Republican Party, and so a lot of politicians in the United States pander to Iowa.

That's why we have corn subsidies. That's why ethanol is a big thing here: corn has a big lobby in the United States. And the same goes for liquid sugar. So that's what he attributed it to. Now, I don't know if that's true. See, here I am: I don't even remember the source, or whether it was a reliable source.

All I remember is what he said. So I'm a perfect example of what you're talking about. Me, the guy who studies disinformation for a living, and I'm not disciplined enough to know whether that story I read was accurate or not. So, shame on me. And yet that's how easy it is not to be disciplined about it.

I should only watch the BBC, and then I would be in better shape. And I should only eat organic food and never eat any meat either. But it's really hard.

[00:56:08] Matt Geleta: Yeah, it reminds me a little bit of an interview I once listened to with Daniel Kahneman, who people will know wrote, or co-wrote, the book Thinking, Fast and Slow, and has done a lot of research on cognitive biases. I think he was asked: having done all this research, what are your strategies to protect yourself from these biases, and how much better have you gotten?

And I think his answer was that not only has he not gotten any better, he's probably gotten worse at a lot of them. It's fascinating.

[00:56:36] Lee McIntyre: It's funny, isn't it? I love that answer.

[00:56:44] Matt Geleta: Nonetheless at the at the sort of closure of your book you do present several practical steps uh for individuals um I think that it's it's very helpful and I should mention one thing that I appreciate about the book is it's Very concise and, uh, and readable, you know, it's not, uh, it's not good information buried within thousands of difficult to read pages.

It's very practical, so I appreciate that, and I would recommend that people who are interested take a look. Several of the rules you put down here are interesting. I want to dig into one in particular, and then I might ask if there are any others you'd want to comment on, but...

[00:57:24] Lee McIntyre: Sure.

[00:57:25] Matt Geleta: one that you put, I think it's rule number, I'm counting here, six, is to stop looking for facile solutions to disinformation.

Can you comment on that rule? What are these facile solutions that people are looking for, and how can we stop ourselves from doing this?

[00:57:43] Lee McIntyre: I think the problem with facile solutions is that they morph into wishful thinking, where we think somebody else is going to save us: we can count on the media, or it's our representatives' fault in Congress if only we elect the right people. One of the facile solutions is to say we just need better education, we need to teach critical thinking. And the only reason that's a facile solution is because they're right. We absolutely do need better education and more critical thinking, but we need it fast, and that's not sufficient to solve the problem.

I mean, we can't wait for the kids to grow up to save us. Now, some people say we don't have to wait for the kids to save us; we can do this now for adults, and that's absolutely right as well. My friend Andy Norman wrote a book called Mental Immunity, where he talks about how adults can learn to reason better, condition themselves, and even inoculate themselves against mis- and disinformation.

It's a very good book, and we should all follow its tenets. But that's not all we should do. We should not just count on ourselves to be resistant to disinformation; we should try to stop the amplification of disinformation. It would be like in a pandemic saying, well, try not to get sick. Yeah, good advice.

Try not to get sick. But why don't we also look for the vector of transmission that's making us sick? That's what we also need to do.

[00:59:25] Matt Geleta: Yeah, well good recommendation. I'll link that book to the show notes and that's a nice stepping off point to some of the questions I like to close with which relate to books. You know, personally throughout my life I've found that, um, books, you know, it's a, it's a bigger bar to getting a book out there than pretty much anything else and, and several other quality checks.

And so typically I think the information that one finds in a book is going to be of a much higher standard, and that's certainly my preferred place for getting information. So on the topic of books, the question I have for you is: which book or books have you most gifted to other people, and why?

[01:00:06] Lee McIntyre: going to change the question simply to say, which book have I recommended to people? Because, gifting books to people, I really I haven't, I'd have to say my own, and that's not the answer that you're looking for, right? Because that's self serving for a writer, but it's really true.

Most of the books that I give away are my own books. But let me take the spirit of the question. The book that I've recommended the most recently is actually free. You can get a PDF on the internet, or you can write and get a free copy from the publisher. It's called the Handbook of Russian Information Warfare, and it's published by NATO. It is a training manual for NATO soldiers and commanders to learn about disinformation tactics, specifically Russian disinformation tactics. Because even in the military around the world, they don't realize that we're in an information war with Russia, and have been for the last 20 years. Russia already considers itself to be in an information war with the West, about science, about democracy, about a lot of things.

I can't remember the name of the author, but if you just put that title in Google, it'll pop up and you can get a PDF immediately. And here's the thing: I wrote to NATO and asked, can I have a free copy? And they sent me one, postage paid. It's a very thin, readable, nice book that will light your hair on fire, because of the stories they tell in there. The person who wrote it is a NATO researcher, and every single thing in that book was public access; there's no classified material in it. But some of it he translated from the Russian, publicly available things in the Russian language. Very scary things. I won't spoil the plot of the book, because it's so short you could actually read it.

Just about as quickly as you could read mine. It's physically a little larger than my book, but it's small; there aren't that many pages in it. You could read it in about an hour and a half, and it will change your life in that hour and a half if you read it.

[01:02:45] Matt Geleta: Wow. Interesting recommendation. I think, uh, and NATO is about to be sending thousands of book copies around the world.

[01:02:50] Lee McIntyre: ha ha ha! They, they should. That, that should be the, that should be the, the point. Or people will go for the, for the PDF. It's a, um, it's a, it's a terrific... I forget who recommended it to me. But I, I, I like paper copies. I mean, I'm a writer. You can see behind, I've got all these books, you know, that I keep for myself.

I don't give those away. I like physical books because I mark them up, and that handbook is very dog-eared.

[01:03:20] Matt Geleta: Yeah. Fantastic. Well, for my non US listeners, I recommend getting the PDF and not shipping the paper over the world. But if you're in the US, which I think is about half of you at this point, um, go for it. Um, uh, second, last question. Um, So non fiction books, that's sort of where I focus and my guests are typically recommending non fiction books, but, uh, fiction is often a place of great value and, and other types of lessons and truths.

Have there been any fiction books that come to mind that have had a big influence on you or that you've particularly enjoyed?

[01:03:49] Lee McIntyre: I love to read thrillers. Um, I, I, I love to read fiction. When I'm on vacation, I take a John Grisham novel with me, you know, something, and it's now hard because I've kind of read them all, so I'm branching out to, to other folks, and in fact, I love thrillers so much that one time, I thought, this, this couldn't be that hard, I'm a writer, I'll try to write a thriller, took me 10 years, because I was very bad, as everyone is when they start, and I had to take classes, and you know, hire somebody to read my manuscript and tell me what was wrong with it, wrote it, tore it down, wrote it, tore it down 10 times.

And after 10 years, I published my first novel, which was a thriller. The reason I enjoyed that process, the reason I wanted to do it, is because I think you're absolutely right that some truths can only be told through fiction. Which is interesting, because fiction is made up. That's what a novel is: a book-length work of fiction, a made-up story. And I study disinformation and fiction and lies all the time, so why would I want to write a novel? I think it's because a novel changed my life, and that is George Orwell's 1984. That was my favorite book when I was 14 years old. It really changed my thinking about the world and what the world might be. So if you want to talk about my most gifted book, I did buy several copies of 1984 earlier in life and give them to people.

What I'm reading right now is Cormac McCarthy's last book. I didn't know it was his last book when I started it. And what happened is I basically stopped reading when he died, because I liked his stuff so much that it was kind of like that last box of candy: you realize if you eat it all up, there's no more candy.

He published two novels at once, The Passenger and Stella Maris. I read them out of order, because I got Stella Maris first, and now I'm about three quarters of the way through The Passenger, and I simply cannot bring myself to finish it, because when it's done, there's no more Cormac McCarthy.

There's no unfinished book by him. He finished two at once. So I'm kind of reading it a page at a time, like, you know, you enjoy the candies in the box, but you discipline yourself: one at a time. So I haven't read any Grisham in a long time, because I'm just spending months on the Cormac McCarthy.

[01:06:21] Matt Geleta: Yeah, wonderful. Uh, well, Lee, this has been a, this has been a really beautiful conversation. Two, two things before we close. First, could you say the name of the fiction book that you wrote? And then secondly, any final words for the, for the audience before we wrap?

[01:06:37] Lee McIntyre: I wrote two novels, because after you write one, you know how to write a novel, so then I wrote another one. My debut novel is called The Sin Eater, and, you know, if you don't like crime and mayhem, don't buy it, because it is, uh, it's a thriller. And the other book is called The Art of Good and Evil, uh, and it also, you know, same genre, it's a, uh, it's a thriller.

So, you know, I've, I've, uh, in my whole career, I've, uh, written, edited, you know, contributed to, you know, uh, things that have my name on them: sixteen books. Uh, those two are among my favorites because they were the hardest to write. I mean, there was just blood on every page, because writing fiction is so, so difficult.

Nothing in life had prepared me for it. I'll leave it to other people to judge whether they're very good. And my nonfiction outsells my fiction by, you know, a hundred times, but there are some people who like thrillers, so thank you for asking. And then the last question was...

[01:07:44] Matt Geleta: any final words for the audience?

[01:07:46] Lee McIntyre: If you have any curiosity about any of my other work, my events, uh, other shows I've been on, or you want to get in touch with me, please go to my website, leemcintyrebooks.com. It's, uh, it's got all my social media handles. Uh, you can see all my other books if there's anything that you want to, uh, buy.

You can see where I'm going to be speaking next, if I'm coming to, to your area. I was just in Australia, by the way. I'm sorry we didn't get a chance to, to meet. I was, uh, giving a talk at the University of Sydney, and, uh, so, uh, I'm, I'm sorry I, uh, missed you, uh, when I was, uh, when I was there, but, um, yeah.

[01:08:28] Matt Geleta: Oh, well, next time. Um, in any case, Lee, it has been, uh, it's been a great virtual conversation. Thank you so much for, uh, for making the time to speak to me.

[01:08:35] Lee McIntyre: Thank you so much. I really enjoyed our conversation.

Conversations with the world's deepest thinkers in philosophy, science, and technology. A global top 10% podcast by Matt Geleta.