MARTHA MINOW: I'm Martha Minow. And it is with gratitude to the Berkman Klein team and to Urs Gasser, particularly, that I say, welcome to a discussion that I promise will have answers-- at least suggestions-- as well as, oh my gosh! What do we do? So "fake news" is a phrase that, now, no one is quite sure what it means. But we're all worried about it. And we will spend a little bit of time talking about, what do we mean by it? What has it come to mean? But we're going to spend most of the time together talking about, what are tools that are available or could be made available to help people sort through the floods of information and the democratization of access to information that makes it very hard to know what's true and what's not true.

And then, of course, there's not anything new at all about propaganda and lies. We've always had them. Now we just have more access to them. So one of my favorite cartoons shows, in the antique world of Xerox machines, someone going to the Xerox machine, making a copy of something, and saying, send it to the world! Now, at the time, that was a funny cartoon. But now, with the internet and digital possibilities, anybody can send anything, basically, to the world. And I think that's the context that we're going to be addressing.

I will say something briefly about each person when I introduce them. And I'm immediately turning to J. Nathan Matias, who is very importantly involved in the Berkman Klein Center for Internet and Society here. And he's also involved in the MIT Center for Civic Media. And he's a PhD candidate at MIT. And he's going to kick us off. What do we mean when we say fake news? What do you mean?

J. NATHAN MATIAS: So when people think about fake news, we often look back to that moment when Craig Silverman at BuzzFeed did this amazing report about Macedonian teenagers who were creating fake articles and earning thousands of dollars a month. In fact, one of my favorite fake news headlines is, quote, "After election, Obama passes executive order banning all fake news outlets." Which, of course, was itself fake news.

But the reality is much more complex. It's much more common to see something like a recent Breitbart article entitled "California's recipe for voter fraud on a massive scale." There's recent work by Yochai Benkler and the folks at the Media Cloud team here at Harvard that shows that often what we get are powerful political entities creating information that has, maybe, a kernel of truth, but it's really disinformation. They mix truths with familiar falsehoods and logics of the paranoid to make something that is not just believable but something that, maybe, when you go to Google, because they're the only people writing about it, you might feel like you're fact checking it. Because you see 10 other links from similarly connected organizations saying the same thing, even though it's something closer to disinformation. It goes beyond what can actually be claimed. Because in the case, for example, of California, their motor voter laws are similar to what other states have already implemented. And there's not really been evidence that those kinds of things lead to voter fraud.

So there's this problem where we have a wide variety of disinformation. And people are concerned about how that information spreads on social media. There are fears about filter bubbles. There are fears about the use of algorithms, whether it's Google Search or whether it's Facebook's news feed, that might influence how these things spread. And in my research, I've done work to help understand what we as citizens can do and what the public can do to better understand those algorithms and influence how they work for the better.

MARTHA MINOW: Great. We're going to hear more about that soon. So An Xiao Mina is an expert on memes. And she's a writer who looks at global internet and network creativity. And here, as a fellow at the Berkman Klein Center, she's studying language barriers in the technology stack, because the interest in diverse communities is a big development in her work. She leads the product team at Meedan, which is building digital tools for journalists and translators. And she co-founded Civic Beat, a research collective focused on the creative side of civic technology. What do you mean by fake news?

AN XIAO MINA: So I think, when we think about fake news, often-- this is my perspective as a product manager working with journalists-- often, in these communities, we always use air quotes: "fake news," "fake news." And in many ways, this is an implicit acknowledgement that this phrase has come to mean so many things that it has become almost meaningless. It's an umbrella term for so many other words, other phenomena. So the problem of fake news starts to seem intractable, because it has such a diffuse meaning.

And I really appreciate Claire Wardle's breakdown of fake news. She looks at different types of fake news, anything from satire and parody to misleading content to really manipulated content and then fully fabricated content, and then also breaks down different motivations, everything from parody to the goal of punking to actually spreading propaganda. And when we look at these different techniques, when we really break down fake news, we can start to think about different strategies and different techniques for addressing the wide variety of problems under this umbrella. So I think there's a different range of strategies for when an Onion article becomes cited as fact-- which is a frequent phenomenon, especially in global contexts where a newspaper outside of the US might misunderstand the context of The Onion and then cite it as news-- versus our strategies for dealing with state-sponsored propaganda botnets. So as we break down these different motivations and techniques, it also helps us think about breaking down our strategies.

The other thing about these frameworks around fake news is the very phrase "fake news" itself. It orients us towards truth and falsehood, when often the reason that things spread is not about truth or falsehood but about affirmation. We talk about the internet as an information superhighway. It's one of the early metaphors for the internet. In many ways, it's like an affirmation superhighway. People are looking for validation of perspectives and deeper cultural logics. So I tend to agree with researcher Whitney Phillips. Her framework suggests that we think about it as folkloric news or folk news, because it orients us less towards truth and falsehood, which is still important, and more towards motivations for sharing and participation and how that reinforces deeper cultural logics.

And I guess that's-- my third point here is, as we think about solutions for this fake news problem, it's also thinking about short-term and long-term solutions. In product management, we often think about, what is the immediately addressable problem versus what is the long-term issue here? And this issue around cultural logics, I think,