The Intentions of the Adversary
Disinformation · May 21, 2024 · 00:32:23


"You have a direct pipeline, particularly with social media and with individuals. Essentially, all you're doing is figuring out who you want to target."


In this podcast episode, host Paul Brandus discusses the growing concerns surrounding the upcoming presidential election, focusing on the threats posed by disinformation, false narratives, and foreign interference. He interviews Marek Posard, a military sociologist, and Brady Roberts, the COO of Emergent Risk International, to delve into the potential perfect storm of risks facing the election infrastructure. They examine the challenges of combating disinformation, the impact of AI on spreading false narratives, and the critical thinking and media literacy education needed to empower individuals to discern fact from fiction. Brandus underscores the importance of proactive measures, scenario-based planning, and a whole-government approach to safeguard against the influence of disinformation in the upcoming election.


[00:02:36] A potential perfect storm.

[00:05:06] Election system vulnerabilities.

[00:08:56] Adversaries exploiting division in elections.

[00:14:09] Impact of artificial intelligence.

[00:17:17] Deepfakes and election influence.

[00:19:35] Election falsehoods in narrow-margin states.

[00:25:54] Education on critical thinking.

[00:31:02] Teaching media literacy to youth.


Got questions, comments or ideas or an example of disinformation you'd like us to check out? Send them to paulb@emergentriskinternational.com. Subscribe wherever you get your podcasts. Special thanks to our guests Marek Posard and Brady Roberts, our sound designer and editor Noah Foutz, audio engineer Nathan Corson, and executive producers Michael DeAloia and Gerardo Orlando. Thanks so much for listening.


[00:00:00] There is concern tonight as election officials prepare for primaries and the presidential

[00:00:10] election about the unprecedented number of threats they are facing just for doing their

[00:00:14] job.

[00:00:15] One looming threat as America's presidential election nears is security at polling places.

[00:00:22] Can it be maintained?

[00:00:24] This is no abstract concern, with that ABC News report noting that the number of threats to election

[00:00:30] workers has surged.

[00:00:32] As November approaches, however, the threat to people is but one concern.

[00:00:38] There are also worries about election infrastructure and reputational concerns about public confidence

[00:00:45] in the overall electoral process.

[00:00:48] An important new study on these three risks, again human, infrastructure, and reputational,

[00:00:54] says that together they constitute a potential perfect storm.

[00:00:59] And weaving its way among these worries is the issue of false narratives also known as

[00:01:05] disinformation.

[00:01:06] I'm Paul Brandus and that's the name of this podcast series, Disinformation, a co-production

[00:01:17] of Evergreen Podcasts and Emergent Risk International, a global risk advisory firm.

[00:01:24] The Rand Corporation, the Los Angeles-based think tank, minces no words with its report

[00:01:30] on our coming election, indulge me in reading the first few sentences of that report.

[00:01:36] Quote, foreign governments interfered in the 2020 U.S. presidential election while some

[00:01:42] domestic leaders alleged election fraud before voting even began.

[00:01:48] Assertions that the election was stolen gained so much traction that as certification

[00:01:54] began a crowd rallied and attacked the U.S. Capitol causing damage, injury and death.

[00:02:01] Messages discrediting the election results and the agencies and officials investigating

[00:02:06] the riot and election related offenses have continued unabated particularly on social

[00:02:12] media.

[00:02:13] Unquote.

[00:02:14] But what about now?

[00:02:16] One of the authors of the study, Marek Posard, is a military sociologist and also a faculty

[00:02:22] member at the Rand Graduate School of Public Policy.

[00:02:26] Here's part of our conversation.

[00:02:29] Your report on America's upcoming election is disturbing on, well, lots of fronts.

[00:02:36] You say a perfect storm is brewing.

[00:02:40] Give our listeners if you could a brief summary of that perfect storm and then we'll go

[00:02:45] into the details.

[00:02:47] I think there's kind of three examples here.

[00:02:49] The first is that we do have this hyper partisanship in our country right now.

[00:02:56] And you have both parties jockeying for a very small number of votes in many of these swing

[00:03:02] states.

[00:03:03] You have the potential for a crisis that is completely or seemingly completely unrelated

[00:03:09] to the election such as an attack on critical infrastructure that could occur.

[00:03:14] There are parties that are trying to amplify various crises for their own political benefit.

[00:03:20] And then you have these adversaries that are looking for these tactical opportunities that

[00:03:24] they can jump in and they can amplify falsehoods.

[00:03:27] They can try to gin up the population and these can all converge at the right moment

[00:03:34] right being from the perspective of an adversary to really kind of create a lot of lasting

[00:03:40] problems in the 2024 election in my humble opinion.

[00:03:44] So there are domestic concerns.

[00:03:47] There are foreign based concerns.

[00:03:50] Let's talk about some of these in detail.

[00:03:52] You classify them in three general areas physical threats, human threats, reputational

[00:03:59] threats.

[00:04:00] If you could, discuss them within the context of disinformation, since this is a podcast

[00:04:07] on disinformation. First, the physical threats. Tell me about those.

[00:04:13] So the truth of the matter is that because we have a decentralized election system these

[00:04:18] physical threats which are really anything that is tangible that relates to our election

[00:04:24] infrastructure.

[00:04:25] So facilities, equipment, information storage, voting machines, ballots.

[00:04:31] What's nice here in terms of the U.S. is that we have a decentralized election system

[00:04:35] so it's actually very difficult to systematically target physical aspects of our election infrastructure.

[00:04:44] The same goes for human what we call human assets or human parts of this infrastructure.

[00:04:49] These are the people that actually run our elections, government contractors, non-government

[00:04:54] partners, volunteer poll workers and again this is very decentralized so those are

[00:04:59] hard things to actually systematically target.

[00:05:02] So we have this third asset class that we call reputational assets.

[00:05:06] These are public perceptions of physical and human assets in our election system.

[00:05:12] There's an interesting paradox with our election system: it's decentralized, so it's

[00:05:16] really hard to systematically target physical and human assets but that makes it really

[00:05:21] easy to target reputational assets.

[00:05:24] It's really easy in the U.S. to make people think an election was stolen because all

[00:05:27] you got to do is find a few one-off examples of where something maybe not right happened

[00:05:33] and then you can amplify them.

[00:05:35] And I think that's where this issue of mis- and disinformation comes in: you can use

[00:05:40] one-off examples, amplify them to kind of essentially gin up the population.

[00:05:46] Now we saw in 2020 not to delve into recent history but there were obviously plenty

[00:05:54] of examples where people took one little discrepancy or minor anomaly in one voting system in one

[00:06:04] county or precinct or something and blew it up. But there have been scores of court challenges,

[00:06:12] court investigations, these kinds of things.

[00:06:16] None of them have really shown that there has been widespread or even

[00:06:23] minor election fraud, and yet these views continue to persist.

[00:06:30] Why is that?

[00:06:32] Why do these false narratives live on years after the fact?

[00:06:37] Why does that happen?

[00:06:39] Well, at RAND we have this broader research portfolio called Truth Decay and there's been

[00:06:43] this broader trend I think across our society of the decline of facts and objective

[00:06:49] analysis to take these issues head-on.

[00:06:52] I think there's a broader social headwind that occurs plus there's this issue, a larger

[00:06:56] issue of distrust of the government that I think is brewing across our country.

[00:07:01] The issue with elections in particular though is that when you find those one-offs and amplify

[00:07:08] them we do have a judicial process that the United States uses to essentially adjudicate

[00:07:15] these facts.

[00:07:17] What I am most concerned about is our adversaries getting into the mix.

[00:07:21] And so I led RAND's work on election or foreign election interference in 2020 for

[00:07:25] Governor Newsom in the state of California and one thing that we found was the Russians

[00:07:30] in particular would find targets of opportunity and amplify and essentially

[00:07:35] recycle our partisanship at scale.

[00:07:42] The Russians in particular. Which brings us to Vladimir Putin, speaking in the Kremlin

[00:07:47] earlier this month after being sworn in for another six-year term as Russia's president.

[00:07:53] It is hardly news that Russia's information warfare efforts are robust, widespread and

[00:07:59] increasingly sophisticated, but what makes them even more effective, the RAND study says,

[00:08:05] is that those efforts are taking full advantage of our own weaknesses and divisions

[00:08:11] in other words what we're doing to ourselves.

[00:08:14] In this regard the Russians are hardly alone.

[00:08:18] And so in many cases it's not that Russia or China or these other countries are doing this;

[00:08:22] what's happening is they're waiting for us to kind of essentially create a tactical

[00:08:27] opportunity that they exploit, and then they can amplify it further.

[00:08:31] So we're doing it to ourselves and then our adversaries essentially exploit it.

[00:08:35] And I think that's what happens in many cases with our elections where there might be

[00:08:39] some one-off case there might be a court case that one is trying to have adjudicated

[00:08:45] and then our adversaries are going to jump in the mix and start trying to amplify this stuff online

[00:08:50] or in other mediums.

[00:08:51] They're just piling on to things that we are doing to ourselves.

[00:08:56] Oh yeah. I mean, this issue of partisanship, and actually broader truth decay in our society,

[00:09:01] are really tactical opportunities for adversaries.

[00:09:04] It is a Christmas gift to the Russians.

[00:09:07] It is a gift to the Chinese and the Iranians and other countries that are trying to harm our democracy.

[00:09:13] And you say rather disturbingly that these are not individual silos that the seemingly

[00:09:21] unrelated threats could happen simultaneously.

[00:09:25] Tell me more about that how might that unfold what should we be looking for?

[00:09:31] I think well the key thing we should be looking for here is how one type of seemingly

[00:09:37] disconnected threat could suddenly relate to another threat.

[00:09:40] So if there's an attack on our critical infrastructure such as our utility companies

[00:09:45] and there is a partisan reaction and suddenly you start seeing it grow.

[00:09:51] We do have to ask ourselves why it is growing.

[00:09:53] Is it actually homegrown in terms of the reaction to some type of crisis which could be

[00:09:58] an attack on critical infrastructure.

[00:10:00] It could be a hurricane.

[00:10:02] It could be a cyber attack. Or are our adversaries trying to amplify this further?

[00:10:08] And I think one thing that we're not particularly prepared for is having multiple adversaries

[00:10:13] jumping in at the same time.

[00:10:15] And so if there is some national crisis or a regional crisis

[00:10:19] that may affect the ability for us to carry out an election in a state or locality

[00:10:23] and then suddenly you have Russian trolls online.

[00:10:27] You might have Iranian operations operating separately to really just kind of mishmash this crisis.

[00:10:34] Are we going to be in a position to be able to adjudicate it accordingly essentially

[00:10:40] and say what do we need to do to get this done to carry out our elections

[00:10:43] or are we going to essentially just self-consume ourselves during this crisis?

[00:10:47] At a recent summit between Russian President Putin

[00:10:52] and Chinese President Xi Jinping, Xi said there are no bounds to our relationship

[00:10:59] meaning a military cooperation, intelligence cooperation, economic cooperation on and on and on.

[00:11:07] Is there any evidence that you have seen that they are coordinating their efforts

[00:11:13] to interfere with our election in any of the ways that you have described?

[00:11:18] So I haven't actually looked at that question specifically.

[00:11:21] So I don't want to speak to whether or not that's happening.

[00:11:24] I will say as a hypothesis it wouldn't be super surprising.

[00:11:29] These are very cheap operations to carry out.

[00:11:32] You don't need to invest in a 10-year weapons system

[00:11:35] and dump hundreds of millions of dollars to an R&D to carry this stuff out.

[00:11:40] It really is a matter of you have a direct pipeline,

[00:11:44] particularly with social media and with individuals.

[00:11:47] So essentially all you're doing is figuring out who you want to target

[00:11:50] and you pump out content.

[00:11:52] So it would be not super surprising to say the least if Russia and China were somehow coordinating,

[00:11:57] either explicitly or implicitly.

[00:12:00] And that very well could just be that there might be a tactical opportunity

[00:12:03] that Russia finds and they exploit

[00:12:05] and then China jumps in their own way,

[00:12:08] not necessarily coordinating every step of the way,

[00:12:11] but again just finding that opportunity and being able to communicate it.

[00:12:15] You don't need to do a whole lot to gin people up, particularly in this type of election cycle

[00:12:20] where I would suspect we're going through a political realignment.

[00:12:24] And so it's a lot of exploitable opportunities to say the least.

[00:12:27] And to your earlier point about our election system being decentralized,

[00:12:32] you really don't have to do that much.

[00:12:35] And in the case of Russia and China,

[00:12:38] you really only have to look at these handful of swing states

[00:12:42] that are going to determine the election.

[00:12:45] It's conceivable they could target just two or three states

[00:12:50] that they think are going to make a difference and just focus on those.

[00:12:54] So talk about asymmetric warfare; a very tiny effort

[00:12:58] could actually make a huge difference.

[00:13:01] Exactly.

[00:13:02] And I think the key is, is that they're not going to...

[00:13:05] I don't think it's really a huge payoff

[00:13:07] to necessarily try to hack our voting machines

[00:13:10] or try to turn individuals who are local election workers.

[00:13:14] Because it's such a decentralized system,

[00:13:16] there's a lot of different standard operating procedures across states.

[00:13:19] But you're right, there's a few states where vote margins are very narrow.

[00:13:24] And if you can find a crisis that somehow relates to

[00:13:27] or is happening inside that state,

[00:13:30] it's not that hard to gin people up

[00:13:32] and then you start casting doubt more broadly on the election system

[00:13:36] based on one or two examples or one or two crises

[00:13:39] that are occurring in a swing state

[00:13:41] that could potentially directly relate to kind of the national outcome

[00:13:45] because of our electoral college.

[00:13:47] It's an opportunity for the Russians and the Chinese,

[00:13:50] the Iranians and others.

[00:13:51] Let's shift if we could just for a minute

[00:13:53] to artificial intelligence.

[00:13:55] This is something that is far more top of mind

[00:13:59] than it was in 2020.

[00:14:02] What's the impact of that in 2024,

[00:14:06] relative to four years ago?

[00:14:09] So I'd hypothesize here.

[00:14:11] I would probably suspect that it's just going to pollute the information space.

[00:14:15] Well, I think there's two things.

[00:14:18] One, it's going to pollute the information space

[00:14:20] because it's just going to create more bullshit on the internet.

[00:14:23] And when you have more crap on the internet,

[00:14:25] it's difficult for individuals to disentangle

[00:14:28] what is true from what is a falsehood,

[00:14:31] because you might have images and video that look very, very realistic.

[00:14:36] The second thing is it allows our adversaries

[00:14:39] to scale their operations relatively easily.

[00:14:41] So these are already cheap operations to run.

[00:14:44] You don't need a lot of money to stand up a server

[00:14:47] and maybe stand up some individuals to produce content,

[00:14:51] but this will allow you to automate that.

[00:14:53] So essentially you're reducing the costs

[00:14:56] to enter into this kind of operation.

[00:14:58] And we already have a lot of crap on the internet,

[00:15:00] and now you're going to have AI producing more crap,

[00:15:03] especially polluting the information space,

[00:15:05] making it harder for regular citizens

[00:15:07] to make an informed decision based on whatever policy issue

[00:15:10] is popping up in a discussion on a given day.

[00:15:13] Now, you and I are speaking in early May.

[00:15:17] The general election is six months away,

[00:15:20] but we've already gone through the primaries and so forth.

[00:15:24] In terms of false narratives, disinformation,

[00:15:27] AI and all the rest,

[00:15:29] what have you already seen this year that makes you worry?

[00:15:34] Well, I think you've seen these videos pop up periodically

[00:15:40] of President Trump and President Biden

[00:15:44] in some cases where they're doing things that they didn't do.

[00:15:47] They're just deep fakes.

[00:15:49] And I think that's the thing that could potentially be problematic.

[00:15:52] Not so much that that's going to destroy our democracy

[00:15:55] and I don't think AI is this existential risk personally.

[00:15:58] But what I do think could happen is if you time it correctly,

[00:16:02] you could take a crisis and you can exploit it.

[00:16:05] And all you got to do is get that one viral video

[00:16:08] and then get the news cycle running

[00:16:10] and have it get picked up.

[00:16:12] And I think that's going to be really particularly concerning.

[00:16:15] If our adversaries get the timing right

[00:16:18] and they get the right piece of content

[00:16:20] that hooks onto a percentage of the electorate

[00:16:23] and it goes viral,

[00:16:24] it's really hard to get ahead of that story.

[00:16:27] What a bunch of malarkey.

[00:16:29] You know the value of voting Democratic when our votes count.

[00:16:32] It's important that you save your vote for the November election.

[00:16:35] And deep fakes are getting better every day

[00:16:38] like this fake robo-call

[00:16:40] of President Biden urging New Hampshire Democrats

[00:16:43] to skip that state's primary.

[00:16:46] You claim to have found evidence of corruption

[00:16:48] and deceit in Trump's data center.

[00:16:50] Yeah, right.

[00:16:51] Crooked Hillary tried that s*** too and failed miserably.

[00:16:54] Donald Trump has also been a target of deep fakes

[00:16:57] like this clip, again artificially manufactured.

[00:17:00] The problem, of course,

[00:17:02] is that for the average person looking at

[00:17:04] or listening to something,

[00:17:06] it may be difficult to discern fact from fiction.

[00:17:09] What can be done to bolster our defenses

[00:17:12] against this sort of thing?

[00:17:14] I put that question to Mr. Posard.

[00:17:17] So, I mean, the truth of the matter is

[00:17:19] that outside of Washington, D.C., Americans are living their lives.

[00:17:22] They've got mortgages to pay,

[00:17:24] they've got kids to take care of.

[00:17:27] They're not thinking about these issues as deeply

[00:17:30] as maybe folks inside the D.C. bubble.

[00:17:32] And so I think sometimes people, you know,

[00:17:35] there should be a concern that this may sway the electorate.

[00:17:38] We also often understand that

[00:17:41] other people have their lives to live.

[00:17:43] So they're not going to be thinking about this

[00:17:45] as closely as maybe folks inside the Beltway are.

[00:17:48] A little bit of planning can go a long way.

[00:17:50] And so one thing that we recommend is scenario-based planning

[00:17:53] where we work with local, state and federal governments.

[00:17:56] This becomes important because of this mishmash of crises

[00:18:00] that could really just blow up in our face,

[00:18:03] we can start trying to plan ahead in terms of

[00:18:05] who's going to do what if something hits

[00:18:07] and we can get creative with these scenarios.

[00:18:10] One thing I can tell you, for example,

[00:18:12] with misinformation from my work in 2020 was

[00:18:16] you might be far right or far left,

[00:18:18] but nobody wants Russia reaffirming their beliefs.

[00:18:21] We found that in our focus groups and interviews

[00:18:23] in 2020 a month before the election.

[00:18:26] You might believe in the Second Amendment

[00:18:28] and you believe that the Democrats are going to

[00:18:31] take your guns away, but you don't want Russia to do that.

[00:18:34] The same thing for those on the far left.

[00:18:36] So I think a lot of times it's figuring out

[00:18:38] getting ahead of this stuff and figuring out

[00:18:40] what one is going to do quickly as a whole

[00:18:42] government approach and meeting people where they're at

[00:18:45] and having a kind of strategy up front.

[00:18:48] Because right now I think we just respond once it happens

[00:18:50] and that's going to be a recipe for disaster, I think.

[00:18:53] That's an interesting point being reactive

[00:18:56] as opposed to being proactive,

[00:18:59] and you outlined a couple of things that can be done.

[00:19:03] But is that going to be enough, in your view, to prevent?

[00:19:07] I'm thinking about some kind of October surprise,

[00:19:11] the proverbial October surprise that tends to pop up

[00:19:15] every election cycle.

[00:19:17] Are we doing enough to prevent that?

[00:19:19] I suppose it's hard to answer that,

[00:19:21] but give me your forecast.

[00:19:24] I think we don't know until it happens, right?

[00:19:27] But I would argue that these proactive measures

[00:19:29] across state, local, tribal and territorial governments

[00:19:32] is going to be key.

[00:19:33] And I would go a step further.

[00:19:35] This is not in the actual report,

[00:19:37] but there's a handful of states where the vote margins

[00:19:41] are pretty narrow and it turns out those are the states

[00:19:44] where a lot of election falsehoods really gain traction

[00:19:46] because there's a reasonable chance that even a small

[00:19:50] irregularity could change the outcome in terms of who gets

[00:19:54] what electoral votes.

[00:19:55] No one's claiming, or at least there aren't popular claims, that people are stealing

[00:20:00] elections in the state of Maryland

[00:20:02] or the state of California.

[00:20:05] But there are claims in these states that we already know

[00:20:08] have narrow vote margins.

[00:20:10] And so what I think needs to be done in many cases

[00:20:13] is further investment in those states to increase the

[00:20:16] barriers for falsehoods

[00:20:19] that are going to gain traction.

[00:20:21] And again, we do the scenario-based planning.

[00:20:23] We work with local state federal governments

[00:20:25] and we identify the states that we pretty much are

[00:20:27] sure are going to have a very narrow vote margin

[00:20:29] to begin with and invest in those states.

[00:20:31] I think we can do a lot to tamp down on some of these

[00:20:34] election falsehoods.

[00:20:38] Let's take a short break here when we come back

[00:20:40] and chat with Brady Roberts, the Chief Operating Officer

[00:20:43] of Emergent Risk International.

[00:20:47] This series on disinformation is a co-production

[00:20:50] of Evergreen Podcasts and Emergent Risk International,

[00:20:53] a global risk advisory firm.

[00:20:55] Emergent Risk International.

[00:20:57] We build intelligent solutions that find opportunities

[00:21:00] in a world of risk.

[00:21:10] Welcome back. Let's bring in Brady Roberts,

[00:21:12] the Chief Operating Officer of Emergent Risk International.

[00:21:17] Brady, this RAND report shows that AI's increasing capabilities

[00:21:21] are such that they have, quote,

[00:21:25] the potential to significantly increase

[00:21:28] the persuasive power of disinformation, unquote.

[00:21:33] Is that persuasive power in your view accelerating faster

[00:21:37] than our ability to discern between fact and fiction?

[00:21:41] I would argue, yes, today it is.

[00:21:46] The ability for groups, for organizations,

[00:21:50] for individuals to create content for whatever the purpose,

[00:21:54] whether it's for a legitimate business purpose

[00:21:57] or whether it's for influencing voters in the election

[00:22:02] or selling narratives in the media

[00:22:09] or simply creating content as an individual in social media,

[00:22:13] the ability for us to be able to create content

[00:22:17] and whatever narrative that we want to choose to create

[00:22:20] has rapidly, rapidly expanded and increased

[00:22:25] and becomes so much easier now.

[00:22:28] And so that does lead to so many different narratives out there.

[00:22:31] And absolutely, I think your average consumer of content,

[00:22:36] your average American,

[00:22:38] definitely is bombarded with far more information

[00:22:40] than they ever have been before

[00:22:42] and they only have so much time to read and to absorb.

[00:22:47] So therefore their ability, and we're talking about

[00:22:49] the average layman here,

[00:22:53] their ability again to discern fact from fiction,

[00:22:57] you were saying, is almost impossible

[00:23:00] given just the sheer scale of things that they're exposed to.

[00:23:06] Absolutely, especially when you consider the fact that

[00:23:09] so many laymen are still receiving information

[00:23:14] via social media, which is algorithmically driven,

[00:23:21] meaning for business purposes it's producing content

[00:23:24] or showing you content that you want to see alongside ads

[00:23:28] that you've hovered over recently.

[00:23:33] And so it's driving that content

[00:23:35] and so that's what you're seeing

[00:23:36] and so it's not giving you different sides of any equation,

[00:23:40] it's tending to give you more of one side

[00:23:42] and that makes it very hard, I think, for almost anyone

[00:23:46] really to discern what is fact and what is fiction:

[00:23:48] am I seeing both sides

[00:23:49] am I able to kind of evaluate a certain piece of information?

[00:23:53] It's also quite easy to use large language models, these LLMs

[00:23:57] to quickly generate what Rand calls

[00:24:00] and I'll quote again from their report

[00:24:02] eloquent false and misleading claims about

[00:24:05] significant topics in the news.

[00:24:08] These are obviously going to be election related issues

[00:24:12] Ukraine for example school shootings, immigration and so forth

[00:24:16] in your judgment what can be done about this

[00:24:19] we just agreed that disinformation is proliferating

[00:24:23] at a scale beyond the average person's ability to comprehend

[00:24:28] so what can we do about it?

[00:24:30] And that is such a broad question

[00:24:32] and I'll take that back to my perspective

[00:24:35] as someone who has spent their entire career

[00:24:37] in the intelligence field.

[00:24:40] As an intelligence analyst

[00:24:42] we're taught first and foremost to use critical thinking

[00:24:46] and we're taught

[00:24:49] what critical thinking is and how to apply it

[00:24:52] to anything that we're evaluating in order to make judgments on them,

[00:24:56] and the first principle of critical thinking

[00:25:00] is objectivity.

[00:25:02] In other words being able to look at

[00:25:04] look at a piece of information

[00:25:06] and think about and investigate and determine

[00:25:11] how factual it might be

[00:25:13] and whether or not that piece of information

[00:25:17] could have bias behind it.

[00:25:20] And so as someone that is an intelligence professional

[00:25:24] this world that we live in today

[00:25:26] the information that I myself am bombarded with

[00:25:29] it's not so scary because I understand

[00:25:31] that anything I look at

[00:25:33] anything I view, anything that I read

[00:25:35] could have a motivation behind it

[00:25:38] and it's up to me to be able to sort that out

[00:25:41] look at that source, evaluate

[00:25:43] think objectively and move on

[00:25:45] to make my own personal decision

[00:25:47] in whatever it is I'm making a decision about today.

[00:25:49] I believe that

[00:25:51] what can we do about this?

[00:25:53] You know, I'll get back to a comment

[00:25:55] I've made in a previous

[00:25:57] episode of this podcast

[00:26:00] I think we start with education

[00:26:04] taking this tradecraft

[00:26:06] that we as intelligence analysts

[00:26:08] and intelligence professionals

[00:26:10] understand that underpins our career profession

[00:26:13] and teaching that

[00:26:15] and increasingly teaching that at a younger age

[00:26:18] in this country.

[00:26:20] When we go back to thinking about those old civics courses

[00:26:22] that once upon a time so many of us went through

[00:26:25] I think we need to look at

[00:26:27] whether or not we should be teaching

[00:26:29] in the schools at an early age

[00:26:31] especially as kids are being

[00:26:33] introduced to social media

[00:26:35] this idea of critical thinking

[00:26:38] and looking at that from the lens of

[00:26:40] what does this mean for

[00:26:42] how we think about our government

[00:26:44] what we think about the world around us

[00:26:46] how we think about how that's going to make us

[00:26:50] better participants

[00:26:52] in our own US democracy

[00:26:56] as we become adults

[00:26:58] and looking at that through a civics lens.

[00:27:00] Brady, I don't want to ask about

[00:27:03] specific clients but from

[00:27:06] general observation for businesses in general

[00:27:11] what are they most concerned about again

[00:27:14] six months out to the election

[00:27:16] a lot of issues in play that could affect them

[00:27:19] what are you hearing in general

[00:27:22] that appears to be of concern with them

[00:27:25] and again within the parameters of what we're talking about here

[00:27:28] which are these false narratives

[00:27:30] sure I think number one

[00:27:32] across the board

[00:27:35] there are concerns about

[00:27:37] the potential for

[00:27:40] another round of

[00:27:43] demonstrations if not violence

[00:27:46] things that could potentially impact business

[00:27:51] or you know lead us towards

[00:27:54] another round of mass demonstrations

[00:27:57] across the country

[00:27:59] or simply violence

[00:28:02] I think some of our clients are also just worried about

[00:28:05] just reputation issues in general

[00:28:09] if they're perceived as being

[00:28:12] you know tied to one side of the political equation

[00:28:15] or the other what might that mean for their business

[00:28:18] One of the final things I just wanted to ask about

[00:28:21] you know we worry about the Russian

[00:28:24] efforts Chinese efforts

[00:28:27] in this area a RAND report says that

[00:28:30] adversaries like them and others

[00:28:33] can increasingly benefit from technology

[00:28:36] that effectively obscures their foreign origins

[00:28:39] in other words we try and trace the breadcrumbs

[00:28:42] of these false narratives but they're getting awfully good

[00:28:45] the RAND report says at covering up

[00:28:48] those breadcrumbs

[00:28:50] what are your thoughts on kind of

[00:28:53] overcoming that it seems like the bad guys are always

[00:28:56] a step ahead of the good guys

[00:29:00] today's LLMs make the ability to create content

[00:29:04] that sounds like

[00:29:07] any voice you want it to sound like

[00:29:09] so incredibly easy

[00:29:11] it can also pull from more localized information

[00:29:14] to really give it a contextual flavor

[00:29:17] that makes that content

[00:29:20] really hard to discern

[00:29:22] I think back for example to

[00:29:24] the very earliest days of phishing emails Paul

[00:29:27] you know we used to teach people

[00:29:30] that when you're looking for a potential phishing email

[00:29:33] the first thing you look for are typos

[00:29:36] spelling errors or does this sound like it actually comes from

[00:29:39] in the case of an English email

[00:29:42] a native English speaker and if it doesn't

[00:29:45] that's your first red flag

[00:29:47] today of course all of that goes away and this is not something

[00:29:50] we worry about anymore because phishing emails

[00:29:53] read so much more eloquently now than they ever did

[00:29:56] I think the same principle applies to when we think about

[00:29:59] any sort of content whatsoever

[00:30:02] it's very easy to make something sound very legitimate

[00:30:05] as though it's coming from, in this case we're talking about

[00:30:08] the US making it sound as though it comes from

[00:30:11] a member of the American audience

[00:30:14] and so that alone much less looking at where the sourcing

[00:30:17] of that comes from but that authenticity

[00:30:20] that you can give content makes it increasingly difficult

[00:30:24] and I think the RAND report is absolutely right

[00:30:26] in that it will make the layman reader

[00:30:29] the layman viewer much more susceptible

[00:30:32] to this content that seems so authentic

[00:30:35] I was hoping for some reassurance from you Brady

[00:30:38] and I'm not getting that

[00:30:40] well you know Paul

[00:30:42] this is where I go back to I think it's about

[00:30:45] you know teaching looking at this

[00:30:48] from a long term strategic objective

[00:30:51] in the United States and teaching

[00:30:53] our young population

[00:30:56] what it means to look at this material

[00:30:58] what it means to think critically about it

[00:31:00] what it means to identify your sourcing

[00:31:02] and think through where that sourcing might come from

[00:31:05] and understand motives

[00:31:09] behind a given piece of information

[00:31:11] I think that's a good start

[00:31:13] I will say

[00:31:15] if you want one piece of good news Paul

[00:31:17] and I say this not as an intelligence professional

[00:31:19] but I say it as a parent raising kids of different ages right now

[00:31:22] and watching their media literacy grow

[00:31:25] I do think that our younger population

[00:31:28] in this country really is taking on head first

[00:31:31] this concept of media literacy

[00:31:34] and they're thinking about that in ways that we never would have before

[00:31:37] thanks to Marek Posard a military sociologist

[00:31:42] and faculty member at the RAND Graduate School

[00:31:45] of Public Policy

[00:31:47] and Brady Roberts chief operating officer

[00:31:50] of Emergent Risk International

[00:31:53] sound from ABC News

[00:31:55] our sound designer and editor Noah Foutz

[00:31:58] audio engineer Nathan Corson

[00:32:00] executive producers Michael DeAloia

[00:32:03] and Gerardo Orlando

[00:32:05] and on behalf of Meredith Wilson

[00:32:07] the CEO of Emergent Risk International

[00:32:10] I'm Paul Brandus

[00:32:12] thanks so much for listening