"It's a kind of information that we find particularly appealing that people looking to fool us will use, but it can lead to innocent spreading"
Host Paul Brandus explores the concept of truth bias and its implications in today's society with guests Daniel Simons and Christopher Chabris, authors of the book "Nobody's Fool: Why We Get Taken In and What We Can Do About It." They discuss how our innate trust in familiar sources can make us vulnerable to misinformation and the importance of skepticism in evaluating information. The conversation delves into the challenges of focusing on only what is directly in front of us, potentially overlooking crucial context. Meredith Wilson, CEO of Emergent Risk International, joins the discussion to provide insights on how trust and skepticism play a crucial role in navigating the complex landscape of information consumption. The episode highlights the need for critical thinking and awareness in an age where information is constantly at our fingertips.
[00:02:06] Truth bias and deception.
[00:05:47] Calibrating trust and skepticism.
[00:09:50] Familiarity and trust on social media.
[00:12:25] Focusing on selective information.
[00:17:07] Trust in societal systems.
[00:21:29] Familiarity and trust in information.
[00:24:27] Human nature and information consumption.
Got questions, comments or ideas or an example of disinformation you'd like us to check out? Send them to paulb@emergentriskinternational.com. Subscribe wherever you get your podcasts.
Special thanks to our guests Daniel Simons and Christopher Chabris, our sound designer and editor Noah Foutz, audio engineer Nathan Corson, and executive producers Michael DeAloia and Gerardo Orlando. Thanks so much for listening.
Learn more about your ad choices. Visit megaphone.fm/adchoices
[00:00:00] Who Do You Trust?
[00:00:05] It was one of the more popular TV shows back in the 1950s.
[00:00:09] Who do you trust?
[00:00:10] Who do you trust?
[00:00:14] The host, by the way, was a young man named Johnny Carson.
[00:00:17] It was based on a series of questions that a couple was asked, and whether they trusted
[00:00:23] what their spouse or friend was telling them.
[00:00:26] It's an interesting premise and one that holds relevance today.
[00:00:31] We get a great deal of information today from those who are closest to us, friends, relatives,
[00:00:37] coworkers.
[00:00:38] We're inclined to believe what they share.
[00:00:40] It's healthy.
[00:00:41] It's also necessary.
[00:00:43] As the late educator Stephen Covey, author of The Seven Habits of Highly Effective People,
[00:00:49] put it, trust is the highest form of human motivation.
[00:00:53] He was right.
[00:00:54] This inclination is called our truth bias.
[00:00:57] But sometimes what others share with us, perhaps inadvertently, might not be accurate.
[00:01:03] It could be misinformation or it could be disinformation.
[00:01:13] I'm Paul Brandus and that's the name of this award-winning podcast series, Disinformation,
[00:01:18] a co-production of Evergreen and Emergent Risk International, or ERI, a global risk advisory
[00:01:25] firm.
[00:01:26] As usual, I'll be joined by ERI's Chief Executive Officer, Meredith Wilson, in a few minutes.
[00:01:33] The issue of truth and trust is one subject of a fascinating book.
[00:01:38] It's called Nobody's Fool: Why We Get Taken In and What We Can Do About It.
[00:01:44] I spoke with the co-authors, Daniel Simons, a psychology professor at the University of
[00:01:49] Illinois, and cognitive scientist Christopher Chabris; he's a professor with the Geisinger
[00:01:55] Research Institute in Pennsylvania.
[00:01:58] The discussion began with their explanation of so-called truth bias.
[00:02:03] The first voice here is Chris.
[00:02:06] So truth bias is, simply put, a default assumption that we all make that whatever we're being
[00:02:13] told, or what we're seeing, or information that's being conveyed to us, is true, as opposed
[00:02:18] to just automatically assuming everything is false, or for that matter not even assigning
[00:02:24] a truth value, either true or false, to information, but just remaining uncertain about it.
[00:02:30] So our habit of thinking that things are true sort of automatically is the first thing
[00:02:36] that makes us susceptible to misinformation, disinformation, other kinds of deception.
[00:02:41] And it's not a dumb thing to have a truth bias because if you didn't have a truth bias
[00:02:46] and you stopped and checked every little bit of information that was coming at you from
[00:02:49] all channels all day long and so on, you'd never get anywhere in life.
[00:02:53] But people can exploit the truth bias to sort of get information in, perhaps before
[00:02:58] you've had a chance to check it or be skeptical about it or to question it deeply.
[00:03:03] I mean, if somebody tells me, for example, if I say, well, what time is it,
[00:03:07] and they say it's two o'clock, I'm inclined to believe them.
[00:03:11] I'm not going to necessarily challenge that.
[00:03:15] At what level, though, should we begin to challenge or be skeptical about something
[00:03:21] that someone says?
[00:03:23] I think there are a couple of times.
[00:03:25] One is when we know the consequences to be big.
[00:03:28] So you're making a big investment or you're making some decision about who you're
[00:03:32] going to vote for or things that have direct and large consequences potentially for you.
[00:03:39] Those are times when you should check a bit more.
[00:03:42] But another time you should check is when you're passing along information.
[00:03:47] So one of the things we tend to do is when we get something say on social media,
[00:03:52] a friend of ours on Facebook or on Instagram or someplace posts something
[00:03:56] that sounds interesting to us and sounds right to us,
[00:03:58] we tend to not think as critically about it.
[00:04:01] We tend to accept it as true and we don't stop to ask,
[00:04:04] wait, is that really true before passing it along?
[00:04:07] And that's often how unintentionally we spread misinformation.
[00:04:12] Are we simply too trusting?
[00:04:15] Well, it really depends on the situation.
[00:04:18] Like I think overall we are probably not too trusting.
[00:04:21] On average, we're probably trusting about the right amount.
[00:04:23] And one reason I say that is, as a species, we've kind of gotten this far
[00:04:27] with some kind of truth bias and we're still here.
[00:04:29] So it can't be too far off, but it does make us in certain situations too trusting.
[00:04:35] And there are people who are aware of our truth bias,
[00:04:38] whether explicitly, because they've looked up the concept,
[00:04:40] or just implicitly, because they've learned
[00:04:44] what they can get away with and what they can't.
[00:04:46] It does make us vulnerable to those people.
[00:04:51] The real trick I think is calibrating your amount of distrust
[00:04:55] and the amount of effort you put into checking things.
[00:04:58] You just mentioned talking to Leon Panetta.
[00:05:02] There's a great example in all the accounts of the raid
[00:05:07] that killed Osama bin Laden of being vulnerable to overconfidence
[00:05:12] and truth bias and believing what you want to believe,
[00:05:15] but beforehand actually checking it
[00:05:18] and having a red team look into alternative scenarios.
[00:05:22] So one of the ways to fight truth bias is sort of generate alternatives,
[00:05:26] generate questions, estimate your uncertainty
[00:05:28] as opposed to just believing or not believing.
[00:05:30] A lot of people there said, well,
[00:05:31] we think there's a 70% chance that he'll be there when we go
[00:05:34] or we think there's only a 40% chance.
[00:05:36] All of those techniques are just not things we do in our everyday life.
[00:05:39] Like we don't stop and engage in all that stuff before we make every decision,
[00:05:42] but we should for the important ones, right?
[00:05:45] Well, that's a difficult distinction, though,
[00:05:49] when things move so quickly.
[00:05:51] You mentioned that we need to calibrate our thinking,
[00:05:54] or the way we evaluate things, from time to time.
[00:05:59] To me, that does not seem like an exact science.
[00:06:02] I mean, it's just very amorphous.
[00:06:04] I mean, how do you do that?
[00:06:05] What have you learned about how and when we should do that?
[00:06:09] Well, there are some times when it's really obvious
[00:06:11] that it's not worth your time.
[00:06:13] So for example, you go to the grocery store
[00:06:16] and there's a sign that says organic apples.
[00:06:19] And you're probably best off just believing
[00:06:22] that those actually are organic apples.
[00:06:24] You're probably not going to go out to the orchard
[00:06:25] and watch to make sure that they've only used organic fertilizers
[00:06:28] for the last 10 years, right?
[00:06:30] That's just not something you can do.
[00:06:32] So there are times when it just doesn't matter.
[00:06:34] Or for example, you go to the grocery store
[00:06:36] and it's possible that the grocery store
[00:06:38] is adding pennies to every item when you check out.
[00:06:41] So is it worth checking every item's
[00:06:44] price on the receipt
[00:06:46] against the price on the shelf?
[00:06:48] Probably not unless you're really in financially dire straits
[00:06:51] and you have to count every penny.
[00:06:53] Probably not worth your while
[00:06:54] because the consequences aren't that big.
[00:06:56] So there are some cases when it's obvious
[00:06:58] that you probably don't want to spend a lot of time
[00:07:00] and effort being cynical.
[00:07:02] What's obvious is a subjective thing to each individual.
[00:07:06] Exactly, yeah.
[00:07:07] People may vary in this, right?
[00:07:08] Some people will want to be very careful
[00:07:10] to make sure that they don't get taken
[00:07:12] for even small amounts of money
[00:07:13] and other people are going to be calibrated
[00:07:15] the other way and say, well, you know
[00:07:16] if I lose a little bit, okay.
[00:07:17] The end points, the extremes are really pretty easy.
[00:07:20] And if you're investing, you're buying a house,
[00:07:23] you want to make sure that you're not buying a bad house.
[00:07:26] It's a lot of money for everybody.
[00:07:28] If you're buying groceries and it's pennies,
[00:07:31] probably not worth your time.
[00:07:33] But it's the intermediate cases,
[00:07:34] where you don't quite see the consequences from the start
[00:07:38] and where things can ramp up
[00:07:40] and become much more consequential,
[00:07:43] that's where it becomes a more challenging problem
[00:07:45] to calibrate well.
[00:07:46] Part of the problem, like you mentioned,
[00:07:48] is speed and distraction, right?
[00:07:50] Like those two things may make it hard
[00:07:52] for us to actually evaluate
[00:07:54] how important it is to get this right.
[00:07:56] How likely is it that someone might be misleading me
[00:07:59] or have a desire to mislead me, right?
[00:08:02] When we're just scrolling through social media
[00:08:04] we don't necessarily think
[00:08:06] what are the odds that someone's trying to deceive me
[00:08:08] by posting this
[00:08:09] or by making sure it gets into my feed
[00:08:10] or something like that.
[00:08:13] There's no panacea, but you know, slowing down
[00:08:17] and I think consciously thinking about the idea
[00:08:19] that it's okay to be uncertain
[00:08:21] about whether something is true or not.
[00:08:22] You don't have to have a belief in everything, right?
[00:08:24] You don't have to put a truth value in everything.
[00:08:26] You could just say, I don't know
[00:08:27] or maybe or whatever
[00:08:29] and people have done studies where they
[00:08:32] generally conclude that people do want
[00:08:35] to pass on true information.
[00:08:36] People don't wanna pass on false information.
[00:08:38] People don't want to be taken in by false information.
[00:08:40] I think a lot of the time though
[00:08:41] they just don't spend the time or effort
[00:08:44] for valid reasons, you know
[00:08:46] to actually think about that
[00:08:48] or just to stay uncertain
[00:08:49] and not pass along stuff that they're not sure is true.
[00:08:54] On that point, Chris, that's an interesting observation.
[00:08:58] It's no secret that, for example,
[00:09:01] on social media, Facebook,
[00:09:03] I'm not singling them out,
[00:09:04] but Facebook and other platforms
[00:09:07] where people share things,
[00:09:09] the phenomenon is if you get something
[00:09:12] from a friend or coworker or relative,
[00:09:15] somebody you know and tend to trust,
[00:09:19] you're more likely to believe it,
[00:09:23] to place a greater value in whatever it is
[00:09:25] that they are sending you,
[00:09:28] and not necessarily give it any additional thought.
[00:09:32] Tell me about that phenomenon,
[00:09:34] and maybe it's a subset of that:
[00:09:37] is that sort of an environment
[00:09:39] that scammers, for lack of a better word,
[00:09:44] can take advantage of?
[00:09:45] Is it taking advantage of the inherent trust we have
[00:09:47] in those who are closest to us?
[00:09:50] Yeah, in fact it's what we call a hook.
[00:09:52] It's a kind of information that we find
[00:09:54] particularly appealing
[00:09:56] that people looking to fool us will use
[00:09:58] but it can lead to
[00:09:59] innocent spreading of misinformation as well.
[00:10:02] So that's the hook we call familiarity
[00:10:05] which makes a lot of sense.
[00:10:07] Again, in most of our daily lives
[00:10:09] we should be more trusting
[00:10:10] with people we're really familiar with
[00:10:12] people we've known for a long time
[00:10:14] if they deceived us all the time
[00:10:16] they probably wouldn't be people we hung out with anymore.
[00:10:18] So we should have good trust
[00:10:21] of people who are highly familiar to us.
[00:10:23] One of the challenges with social media
[00:10:25] is that we have a lot of friends in scare quotes
[00:10:28] who we might not know all that well.
[00:10:31] We might have a huge following,
[00:10:33] a number of followers on social media
[00:10:34] or a number of friends on social media
[00:10:36] who pass stuff along,
[00:10:37] and we see their content all the time
[00:10:39] but we don't necessarily know
[00:10:40] how well calibrated they are
[00:10:43] and we have this tendency
[00:10:45] when we have a friend who provides information to trust it
[00:10:48] which is a reasonable thing
[00:10:49] to have developed over time to do
[00:10:53] but we tend to do that maybe when it's not merited.
[00:10:55] It's the whole basis of using celebrities
[00:10:57] in advertisements, right?
[00:10:58] These are people who are familiar
[00:10:59] we're used to seeing all the time
[00:11:01] and we maybe trust them more than we should
[00:11:03] but in social media that's amplified
[00:11:05] because people are sharing information
[00:11:07] that they think you would like
[00:11:08] and be interested in
[00:11:09] and we kind of take it as true
[00:11:12] more when it comes from somebody we know.
[00:11:15] Again, my guests: Daniel Simons,
[00:11:17] a psychology professor at the University of Illinois,
[00:11:20] and cognitive scientist Christopher Chabris;
[00:11:23] he's a professor with the Geisinger Research Institute
[00:11:26] in Pennsylvania.
[00:11:28] Their book is called Nobody's Fool:
[00:11:30] Why We Get Taken In and What We Can Do About It.
[00:11:34] In their book they wrote about how
[00:11:36] and again this is a very human thing
[00:11:39] very understandable
[00:11:40] how we tend to focus on things
[00:11:42] that are directly in front of us.
[00:11:45] That means though that we can often overlook
[00:11:48] additional data or context
[00:11:51] that can help convey a richer
[00:11:53] more accurate understanding of something.
[00:11:55] If those things are peripheral
[00:11:58] again not directly in front of us
[00:12:00] how can we notice it?
[00:12:02] Chris answered this one first.
[00:12:04] We call this problem the problem of focus
[00:12:06] and of course focus is a good thing
[00:12:07] because when we focus on something
[00:12:09] we're able to do a lot more with it
[00:12:10] we can understand it better
[00:12:11] we can process it more deeply
[00:12:13] we can think more detailed thoughts about it
[00:12:15] but when we're focusing on one thing
[00:12:17] or a set of things
[00:12:19] other things might get no thought at all
[00:12:21] no attention at all
[00:12:22] might not enter our consideration at all.
[00:12:25] And of course marketers and sales people
[00:12:27] and influencers will often be aware of this
[00:12:30] and use it to direct our attention
[00:12:31] towards let's say only their success stories
[00:12:34] and their success stories
[00:12:35] if we pay too much attention to them
[00:12:37] we won't even think about
[00:12:37] well how often did they fail?
[00:12:39] What did their worst customer engagements look like?
[00:12:42] What are those people saying?
[00:12:44] Or stock picks, anything like that
[00:12:48] it's I think a general problem of disinformation
[00:12:50] by the way also that you can disinform people
[00:12:53] by telling them only true things.
[00:12:56] So if someone fact checks
[00:12:59] a bunch of true stories about
[00:13:01] let's say immigrants who committed crimes
[00:13:03] all those stories could be true
[00:13:05] but if no one tells you any stories
[00:13:06] about crimes that were not committed by immigrants
[00:13:08] or about immigrants who didn't commit crimes
[00:13:10] or about non-immigrants who didn't commit crimes
[00:13:14] you're missing almost all of the relevant data
[00:13:16] to evaluate the meaning of what you've been shown
[00:13:19] and then you could be massively disinformed
[00:13:21] by completely true but unrepresentative information
[00:13:24] because that's all you paid attention to.
[00:13:26] But taking something just completely out of context.
[00:13:28] Yeah, we're just giving you a partial view
[00:13:30] which of course is what happens
[00:13:33] in a negotiation where one side has more information
[00:13:35] or any sort of performance where a magician
[00:13:38] will not tell you everything that they're going to do.
[00:13:41] And con artists of course present exactly what you want to see
[00:13:43] and not the information they don't want you to see.
[00:13:47] So that happens a lot even in completely innocent ways.
[00:13:51] People will pass along the examples
[00:13:53] that you see on your social media feed
[00:13:54] are probably the things that you agree with
[00:13:56] and you're not seeing the counter examples
[00:13:58] because nobody's sharing those.
[00:13:59] You mentioned advertising a minute ago.
[00:14:03] Advertising is often crafted in a way
[00:14:08] that appeals to our desires, our expectations,
[00:14:11] makes us want to go out and try a new product
[00:14:14] or go to some destination or something.
[00:14:17] That is a very powerful thing.
[00:14:20] We want to believe that these products are better,
[00:14:26] or that this restaurant is better than something else;
[00:14:28] it makes us want to, you know, tell me about that.
[00:14:31] We want to believe the message
[00:14:32] that we're being exposed to, don't we?
[00:14:36] Well, often as you mentioned expectations, advertisers
[00:14:40] and many other communicators are well aware
[00:14:43] of people's expectations.
[00:14:45] And if you receive a message that you were expecting
[00:14:49] to get, you're more likely to believe that it's true.
[00:14:52] So sometimes advertising and other communications
[00:14:55] can work well on a principle of surprise.
[00:14:57] And often those are the ones that, you know,
[00:14:59] sort of maybe, like, get the awards for a clever
[00:15:00] Super Bowl ad because something really surprising happened.
[00:15:02] But that's an award for aesthetics,
[00:15:04] not necessarily for success in sales, right?
[00:15:07] You know, like that's an artistic achievement in a way.
[00:15:09] Even outside of advertising, let's say in science
[00:15:14] which is our field, sadly there's fraud in science.
[00:15:17] There are scientists who don't really deserve the name
[00:15:20] who actually just sort of create, you know,
[00:15:24] fraudulent data, fraudulent papers and so on.
[00:15:26] But they don't create fraudulent papers
[00:15:28] that claim to have discovered, you know,
[00:15:30] a new planet orbiting like right next to Earth
[00:15:33] or something like that.
[00:15:34] They create discoveries, they fabricate discoveries
[00:15:36] which are sort of exactly the next thing
[00:15:38] that people in their field would expect to be discovered
[00:15:40] or would expect to be shown true
[00:15:42] or supports a theory that everyone already believes
[00:15:44] or something like that.
[00:15:45] They satisfy their audience's expectations
[00:15:48] in much the same way that advertisers,
[00:15:50] politicians and other communicators do
[00:15:52] by understanding sort of what people are predicting
[00:15:55] is gonna happen and then showing them,
[00:15:56] hey, that's exactly what happened.
[00:15:59] Not a new planet orbiting right next to Earth
[00:16:01] but something, you know, more modest than that
[00:16:03] but of value to them to convince us of.
[00:16:06] It's just a slightly newer and fresher product,
[00:16:09] slightly better than what was there before
[00:16:10] and that's enough to make them say,
[00:16:12] oh yeah, maybe that's right.
[00:16:17] Let's take a short break; when we come back, we'll chat
[00:16:19] with Meredith Wilson of Emergent Risk International.
[00:16:24] This series on disinformation is a co-production
[00:16:26] of Evergreen Podcasts and Emergent Risk International,
[00:16:30] a global risk advisory firm.
[00:16:32] Emergent Risk International.
[00:16:33] We build intelligent solutions
[00:16:35] that find opportunities in a world of risk.
[00:16:46] Coming up on 5 Minute News, I'm Anthony Davis.
[00:16:50] You might think it's partisan
[00:16:52] because maybe it's critical of one side or the other
[00:16:55] but it's not, it's just the truth
[00:16:57] and I think that's also something that's kind of unusual
[00:16:59] for Americans listening to the radio or to podcasts
[00:17:03] because the news landscape in the States
[00:17:06] has been so partisan for so many decades
[00:17:09] so 5 Minute News is verified, truthful, independent,
[00:17:14] unbiased and essential world news daily.
[00:17:19] Hello, this is Gary Girod,
[00:17:21] welcoming you to check out the French History Podcast.
[00:17:24] Our main show covers the history of France
[00:17:26] from the first humans until present.
[00:17:29] If you like Mike Duncan's The History of Rome
[00:17:31] and wanted a similar program covering the land of beauty,
[00:17:35] culture and love, we are exactly that.
[00:17:38] We also host world-renowned scholars
[00:17:40] who have delivered guest episodes on their specialties
[00:17:43] including 18th century pirates,
[00:17:46] revolutionary booksellers in 20th century Paris,
[00:17:50] the special friendship between the Marquis de Lafayette
[00:17:53] and Thomas Jefferson and numerous others.
[00:17:56] Learn what you love and listen
[00:17:58] to the French History Podcast today.
[00:18:08] Welcome back, let's bring in Meredith Wilson now.
[00:18:10] She's Chief Executive Officer
[00:18:12] of Emergent Risk International.
[00:18:14] I asked her about the so-called truth bias
[00:18:17] that Daniel and Chris wrote about in their book,
[00:18:20] the thesis being that we're inclined,
[00:18:22] and this is healthy, to believe what others tell us.
[00:18:26] Oh, I think that's absolutely true.
[00:18:28] You know, you think about particularly
[00:18:30] if we're just speaking about Americans
[00:18:33] and we can talk about other countries too
[00:18:36] but when you think about the US
[00:18:39] and you think about how many of our systems
[00:18:44] and even our economic system, our government,
[00:18:49] so much of that is based on trust
[00:18:51] and in order for it to work, you have to trust people
[00:18:55] and over the past six, eight decades,
[00:19:01] we got pretty good at trusting each other
[00:19:04] which is one of the reasons that things work
[00:19:06] and back in the day, we used to trust the media
[00:19:11] for the same reasons.
[00:19:13] The double-edged sword there
[00:19:14] is if you are constantly questioning everything,
[00:19:17] you tend not to trust people.
[00:19:19] So yeah, I think there is a truth bias
[00:19:22] and I think that there is a real dilemma
[00:19:27] in terms of how to manage that effectively
[00:19:31] when you want a society that trusts people,
[00:19:35] you want people to trust people
[00:19:38] but if you become too trusting
[00:19:40] particularly in the current environment,
[00:19:42] how much are you missing?
[00:19:44] So skepticism is warranted
[00:19:47] but the question is, if I ask you what time it is
[00:19:51] and you say, well, it's two o'clock,
[00:19:53] I'm inclined to believe you.
[00:19:56] I'm not gonna presume that you're lying about the time.
[00:20:00] The question then is, sort of, at what point,
[00:20:03] how serious does the issue have to be
[00:20:05] before we do become skeptical about what we're being told?
[00:20:10] You've told me the time, okay, I believe that,
[00:20:12] but farther down the line as the issues get more serious
[00:20:16] when do we start to become more skeptical
[00:20:19] or when should we become more skeptical
[00:20:21] about what we're exposed to?
[00:20:23] That's a good question.
[00:20:25] I think again, if you look internationally
[00:20:29] and you look at some of the countries
[00:20:31] where corruption runs rampant
[00:20:34] where people are far more likely to enrich themselves
[00:20:38] and their families, also more likely
[00:20:40] to hire family members, a lot of that is due
[00:20:45] to those societies having very, very weak trust
[00:20:48] in each other.
[00:20:51] And so healthy skepticism is important
[00:20:56] and critical thinking is important.
[00:20:59] And when we talk about disinformation,
[00:21:02] a lot of what we really need to see in people
[00:21:05] is a certain level of skepticism
[00:21:07] in the information that is presented to them
[00:21:11] and a good process for determining what is true
[00:21:17] and what is not true.
[00:21:18] And if I had the exact answer to how to fix that,
[00:21:22] I'd be a very rich person right now
[00:21:24] but there has to be a balance, right?
[00:21:27] You have to be able to trust that the person
[00:21:30] that's coming up to the traffic light across from you
[00:21:33] is gonna stop.
[00:21:34] Otherwise, you would sit there all day long.
[00:21:37] You have to be able to trust that,
[00:21:39] when you ask somebody for directions
[00:21:41] that they're gonna give you decent directions
[00:21:44] or you'll never ask anybody anything.
[00:21:46] I guess the directions thing isn't as important now
[00:21:49] because we have GPS, but it's a hard question to answer.
[00:21:55] They also talk about something called familiarity.
[00:21:59] I don't think this is exactly new
[00:22:02] but it's a very powerful concept in terms of media
[00:22:07] and how we read and see and hear things today.
[00:22:10] And basically that is that if someone we know
[00:22:15] and trust and like, a coworker or relative
[00:22:18] or someone like that, shares something with us,
[00:22:22] sends something to us, we are more inclined to believe it
[00:22:27] because of our views toward that person.
[00:22:31] In other words, we can drop our guard
[00:22:34] which can make us more susceptible to false narratives.
[00:22:37] A very human instinct, very understandable
[00:22:40] but it also makes us potentially vulnerable.
[00:22:44] How serious an issue is that?
[00:22:46] And if you think it is a serious issue
[00:22:48] how can we overcome it?
[00:22:50] The concept of familiarity is definitely not new
[00:22:53] and it's something that as a society
[00:22:58] we've definitely learned how to capitalize on.
[00:23:00] So whether it's advertisements and public figures
[00:23:06] and endorsements, we definitely know how to capitalize on that.
[00:23:11] And there is definitely an aspect of disinformation
[00:23:14] where that's exactly what's happening.
[00:23:17] If you look at sort of information warfare 101
[00:23:21] involving people who are familiar to the people
[00:23:25] that you're targeting is very much a methodology
[00:23:30] that's in use today
[00:23:31] but has been in use for a long, long time.
[00:23:33] How to overcome it is a harder question.
[00:23:37] And I think probably the beginning of that
[00:23:41] is educating people and helping them understand
[00:23:46] how that happens and how that works.
[00:23:48] But can we overcome it completely?
[00:23:52] I don't know.
[00:23:53] One issue that they mentioned here
[00:23:56] is sort of the peripheral issue,
[00:23:59] and what they mean by that is we tend to focus on
[00:24:03] things that are directly in front of us
[00:24:06] and in doing so, when we look at things that are right
[00:24:09] in front of us, we can overlook say additional data
[00:24:12] or context that's on the periphery
[00:24:15] that can help convey a richer,
[00:24:17] more accurate understanding of something.
[00:24:19] But again, if they're not right in front of us
[00:24:21] we might not notice it.
[00:24:24] So we kind of cheat ourselves out of a fuller understanding.
[00:24:28] How can we notice things if they're not in front of us?
[00:24:31] Well, one tool that we have in the analytic field
[00:24:34] because this is an analytic bias too,
[00:24:37] where you tend to look for confirming information for things
[00:24:41] but oftentimes fail to look for disconfirming information.
[00:24:45] So when you are researching a topic,
[00:24:50] people will find those little details
[00:24:54] that very succinctly support their case
[00:24:58] and analysts are trained to do that.
[00:25:02] But as you said, if you don't very specifically
[00:25:06] look for the information
[00:25:08] that might tell a different story
[00:25:10] there's a very good chance you will not see it
[00:25:12] because you're not looking for it.
[00:25:13] So having that sort of gut check,
[00:25:17] is there something out there that disconfirms my theory?
[00:25:20] Is there something out there that is counter
[00:25:25] to what I think is accurate?
[00:25:29] That's a good methodology.
[00:25:32] Whether or not people who are supporting
[00:25:36] a particular politician or supporting
[00:25:38] a particular policy are interested
[00:25:41] in finding that information is a whole other question
[00:25:44] with a whole other answer.
[00:25:47] Yeah, well that's the issue.
[00:25:48] Dan and Chris say, look, people should take the time
[00:25:53] to explore more, to ask more questions
[00:25:55] but human nature is such
[00:25:57] that, not to call people lazy,
[00:26:00] but people aren't necessarily gonna take the time
[00:26:02] to do that. They're just gonna eat up
[00:26:04] what's in front of them
[00:26:05] and that's part of the problem.
[00:26:07] It depends on whether they're incentivized to do so
[00:26:10] and/or whether it speaks to their own interests.
[00:26:15] But there's also the way
[00:26:17] that we consume information now,
[00:26:19] in this sort of scrolling of small bites of information
[00:26:23] that are just kind of running past us.
[00:26:26] It's also a question of whether they are cognitively aware
[00:26:30] that they're taking in that information
[00:26:32] and actually thinking it through
[00:26:35] because we're all prone to do things now
[00:26:37] like watching TV and reading on our phones at the same time.
[00:26:41] And so part of it too is that the focus
[00:26:45] that we put on that information sometimes
[00:26:47] is not enough for us to be putting enough brain power
[00:26:50] behind whether or not it's something
[00:26:53] we should believe or not.
[00:26:54] Does that make sense?
[00:26:55] Yes, it does.
[00:27:00] Thanks to Daniel Simons and Christopher Chabris;
[00:27:03] their book, highly recommended:
[00:27:05] Nobody's Fool: Why We Get Taken In
[00:27:08] and What We Can Do About It.
[00:27:10] Our sound designer and editor Noah Foutz,
[00:27:13] audio engineer Nathan Corson,
[00:27:15] executive producers Michael DeAloia
[00:27:18] and Gerardo Orlando.
[00:27:20] And on behalf of Meredith Wilson,
[00:27:22] I'm Paul Brandus.
[00:27:23] Thanks so much for listening.
[00:27:25] Music
[00:27:34] Faith in the news media has been challenged,
[00:27:37] making it even harder to get stories told.
[00:27:40] The Friday Reporter Podcast was created
[00:27:42] to help audiences better understand the media
[00:27:46] by hosting journalists who will answer the questions
[00:27:48] to which we need answers.
[00:27:50] Join me every Friday to hear more.


