Fake News Consumers - Computerphile

  • We're interested in sort of the intersection between humans and computers.

  • You know, how we interact with them.

  • And as part of my role, meeting various researchers, I met with Christian and we immediately identified some natural research interests that we shared.

  • It's been an interesting encounter between computer scientists (that's certainly not me) and sociologists.

  • We've been working on this in an interdisciplinary fashion, combining descriptive surveys of how people consume news with, ah, very interesting experiments.

  • These have been very much driven by the computer scientists involved in this, which have tried to capture and tap into what people do when they come face to face with fake news.

  • One thing we're interested in from a computer science perspective is the history of information, which we refer to as its provenance.

  • The idea is that you can look at the veracity of information.

  • The idea actually stems from wine, sort of wine bottles: looking at where was that wine made, where were the grapes from, where was it manufactured, where has it come from?

  • So you have that history, this whole understanding of the wine. When you apply that principle to information, you can start to look at, okay, how has information been manipulated, particularly on the Web and so on?

  • And something like Wikipedia is a great example of that.

  • How sources have changed, how the content has changed. Taking that idea and matching it with Christian's work around nationalism, society and so on, fake news cropped up almost immediately, which is, you know, how does information, and fake information, affect society as a whole?
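
To make the provenance idea above concrete, here is a minimal Python sketch of recording an article's history as a chain of sourced events, the way a wine bottle's origin can be traced. The field names and example entries are hypothetical illustrations, not the researchers' actual system.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ProvenanceEvent:
    """One step in an item's history: who touched it, when, and how."""
    timestamp: datetime
    agent: str        # person or organisation responsible
    action: str       # e.g. "created", "edited", "republished"
    source: str       # where the content at this step came from

@dataclass
class Article:
    title: str
    history: List[ProvenanceEvent] = field(default_factory=list)

    def record(self, agent: str, action: str, source: str) -> None:
        self.history.append(ProvenanceEvent(datetime.utcnow(), agent, action, source))

    def provenance_trail(self) -> str:
        """Human-readable 'where has this come from?' summary."""
        return " -> ".join(f"{e.action} by {e.agent} ({e.source})" for e in self.history)

# Example: trace how a claim was picked up and altered on its way into a feed.
a = Article("Cows are fed Skittles by farmers")
a.record("local paper", "created", "interview with farmer")
a.record("aggregator.example", "republished", "local paper")
a.record("viral-page", "edited headline", "aggregator.example")
print(a.provenance_trail())
```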

  • Fake news to me looks pretty straightforward. You look at it and either it's got good sources or it hasn't; it's making some wild claim or it isn't.

  • So how did you guys go about it?

  • We did a couple of things, really. So first of all, we wanted to get a sense of people's understanding, an understanding of what people think news is, first and foremost.

  • I mean, news is different for everyone.

  • I mean, some people are interested in celebrities, some people are interested in politicians and so on.

  • So, to do that, we sent out a survey to various people, 200 or so people.

  • I think we got a good sense of what people understand news to be for them, so that gives us the bigger picture.

  • Then, to really delve deeper into it, we started to look at doing some qualitative work: interviews with people, to understand how they make sense of these things in the real world.

  • So what we did was we put together a fake Facebook page populated with a variety of different news articles.

  • Some of them were real.

  • Some of them were fake, from different news sources and so on.

  • And we asked people to look through that news feed as if it were their own, digest the information, and try to identify what the fake news in that space is.

  • Now, people make a lot of decisions very quickly in this digital space.

  • Imagine yourself on a mobile phone.

  • You'd be like Yep, yep, Next, Next, next.

  • It's a Snapchat kind of consumption of things. So really, to break that down a little bit, we had to make use of eye-tracking technology.

  • The idea behind that is that if we can track someone's gaze on the computer, we can look at what information, and where on the screen, they are digesting, as a part of that breakdown analysis of their news consumption.

  • What it allowed us to do was to see, okay, well, people have a tendency to look at the title of the news article first and foremost, and you can see they're making a kind of initial judgment at that point in time.

  • Is this true or not?

  • What's my feeling about it?

  • Cows are fed skittles by farmers.

  • Is that true or not?

  • Instantly, I'd feel, you know, skepticism about that.

  • Whereas if it said Trump has done something ridiculous again, that's probably more believable at that point in time. You'd say, okay. If I have some doubt about that, what people tend to do then is look at the source of the article, which is the link just above.

  • And if it was something from the BBC, people would typically say, yeah, okay, I believe that. If it was something from a source they didn't recognise, they'd think, well, that could be fake, like satire or clickbait, in the sort of words people used at that point.

  • Now, there were cases where another micro judgment was made, which is where people would say Okay, well, I'm not sure about the article.

  • I'm not sure about the source.

  • They'll go into the article to look for more information.

  • And that's when they'll start to look at things like dates, the content, the way it's written, the author and so on.

  • So you can start to really break down this kind of order, almost criteria, that people are instantly using to make judgments about these things.
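
One way to recover that ordering from raw gaze data is to map each gaze sample onto a named area of interest (title, source, body) and record the order of first visits. Below is a minimal sketch under assumed, made-up screen coordinates; the AOI layout and the sample points are purely illustrative, not the study's actual setup.

```python
from typing import Dict, List, Optional, Tuple

# Each area of interest is (x_min, y_min, x_max, y_max) in screen pixels (hypothetical layout).
AOIS: Dict[str, Tuple[int, int, int, int]] = {
    "title":  (100, 100, 700, 140),
    "source": (100, 145, 300, 165),
    "body":   (100, 170, 700, 600),
}

def aoi_for(x: float, y: float) -> Optional[str]:
    """Return the name of the area of interest containing the point, if any."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def first_visit_order(gaze: List[Tuple[float, float]]) -> List[str]:
    """Order in which the viewer first landed on each area of interest."""
    order: List[str] = []
    for x, y in gaze:
        name = aoi_for(x, y)
        if name and name not in order:
            order.append(name)
    return order

# Example: a viewer who checks the headline, then the source link, then the body text.
samples = [(400, 120), (420, 118), (150, 150), (300, 400)]
print(first_visit_order(samples))   # ['title', 'source', 'body']
```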

  • This all sounds kind of very logical, very kind of ordered.

  • Do you think everybody does all those things?

  • People do it in different ways, but I think for the most part people do.

  • And they don't necessarily realise they're doing it. And actually, some people would look at the source first in an article as a part of their natural consumption of this information.

  • Others then would skip over things broadly very quickly.

  • But I believe based on the data we've got, that people are making these micro judgments quickly.

  • This is a video we recorded of some of the eye tracking that we did.

  • So this is the Facebook page. Let's play it now.

  • You can see that the dot moving around is where someone's gaze is, and the circle gets bigger the longer they look at something. You can see this person looking around the screen, looking at the source.

  • Their eyes are jumping around the place. All the while this is happening, we're using what's called a think-aloud protocol, and that's where we're asking someone to look through this feed and talk out loud about what they're doing.
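
The replay described above (a moving dot whose circle grows with dwell time) can be reproduced from fixation data in a few lines. Here is a minimal sketch using matplotlib; the coordinates and durations are invented for illustration, not recorded data from the study.

```python
# Draw each fixation as a circle whose size grows with how long the viewer dwelt there.
import matplotlib.pyplot as plt

# (x, y, duration_ms) fixations in screen coordinates (made-up values)
fixations = [(400, 120, 850), (150, 150, 400), (300, 400, 1200), (500, 450, 300)]

xs = [f[0] for f in fixations]
ys = [f[1] for f in fixations]
sizes = [f[2] / 2 for f in fixations]           # longer dwell -> bigger circle

plt.figure(figsize=(6, 4))
plt.plot(xs, ys, color="grey", linewidth=0.8)   # saccade path between fixations
plt.scatter(xs, ys, s=sizes, alpha=0.5)
plt.gca().invert_yaxis()                        # screen coordinates: y grows downward
plt.title("Gaze replay: circle size ~ fixation duration")
plt.show()
```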

  • Some real stuff mixed in?

  • Yes, yes, we tried to make it as convincing as possible. Consuming media in this way is quite a private thing in a way, right?

  • It's something you do on your own.

  • There's no one around watching you do it.

  • And so, in the first instance, to ask someone to talk through the sort of logic of their thinking as they're looking through this stuff is kind of strange.

  • It was a bit odd, really.

  • People aren't used to talking about it.

  • Actually, there's almost a level of discomfort that comes from talking about these things, because you're being challenged on something that you're not normally challenged about.

  • For example, someone said to me: well, when I think about this, I immediately think, yeah, I trust the BBC as a news source.

  • But when someone asks me why I trust that as a news source, I think to myself, I'm not sure, really.

  • I kind of just do. I've kind of trusted it for a long period of time.

  • But I've never actually sat down and thought, why is it that I trust that as a news source? And that kind of question, people struggled with a lot.

  • So, turning to where this is going: what is it that this research, do you think, will lead to? Where are we going with this?

  • There's a couple of things, really.

  • I mean, the first is that we can look at and understand how people, as a community, verify news.

  • So, you know, social media is a shared space. People can comment on things, and actually a lot of people's trust in the news that they get comes from a friend having shared it, or friends having commented on something.

  • So it's kind of this trust thing. But I guess where we're moving with this is really around how technology can support people in this space.

  • And it's on two fronts.

  • The first, as we mentioned earlier, is the idea of having algorithms that can assess a news article based on a set of criteria, whether it's the source or the content of it, and then label it with some kind of, I don't know, fake indicator: a fake index, um, some level of trust or confidence in that thing.

  • The question in that space is: what algorithms do you trust?

  • You have the same problem that you do with news sources, which is that there will be different types of algorithm that work in different ways. How do you know which one to trust?

  • Secondly, then, how do you present that information back to people?

  • Is it appropriate to have that kind of indicator slapped onto a news article in your newsfeed?

  • There's already tons of stuff going on in there; do you really want something else?

  • But how would you represent that?

  • Is it some kind of percentage?

  • Red, yellow, green?

  • I don't know, some kind of indicator on that.
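
To show roughly what such an indicator could look like, here is a minimal sketch of scoring an article against a handful of criteria and mapping the score onto the red/yellow/green style label mentioned above. The criteria, weights and thresholds are entirely hypothetical; they are not an algorithm proposed by the researchers.

```python
def credibility_score(article: dict) -> float:
    """Return a 0-1 confidence that the article is trustworthy (hypothetical weights)."""
    score = 0.0
    score += 0.4 if article.get("source_known_reputable") else 0.0
    score += 0.3 if article.get("author_named") else 0.0
    score += 0.2 if article.get("cites_primary_sources") else 0.0
    score += 0.1 if not article.get("sensational_headline") else 0.0
    return score

def traffic_light(score: float) -> str:
    """Map the score onto a red/yellow/green indicator."""
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "yellow"
    return "red"

article = {
    "source_known_reputable": False,
    "author_named": True,
    "cites_primary_sources": False,
    "sensational_headline": True,
}
s = credibility_score(article)
print(f"{s:.0%} -> {traffic_light(s)}")   # e.g. "30% -> red"
```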

  • The second consideration, then, is sort of the more social side of things, which is: how do we support and enable people to continue, as a crowd, as a group, to verify these things?

  • So often, in news articles where there's an element of doubt, you look at the conversation and people are piling in with different sources. People are having that conversation, and that's kind of another way to verify what's happening.

  • How would you enable that community to continue to do that?

  • And how do you embed that into the news article itself as well?

  • So you could almost imagine having some kind of algorithmic indicator, but also some kind of social indicator that says, yeah, 20% of people who have read this article believe it to be true.

  • How does that then influence someone's consumption of that media?
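
As a small illustration of combining the two signals just described, the sketch below blends a hypothetical algorithmic score with the crowd signal ("what fraction of readers who rated this believe it?"). The weighting is an assumption made up for the example, not something from the research.

```python
def crowd_belief(believe_votes: int, total_votes: int) -> float:
    """Fraction of raters who believed the article; neutral 0.5 if nobody has voted."""
    return believe_votes / total_votes if total_votes else 0.5

def combined_indicator(algo_score: float, believe_votes: int, total_votes: int,
                       algo_weight: float = 0.6) -> float:
    """Blend the algorithmic score with the social signal (hypothetical weighting)."""
    social = crowd_belief(believe_votes, total_votes)
    return algo_weight * algo_score + (1 - algo_weight) * social

# 20% of the readers who rated the article believed it; the algorithm scored it 0.3.
print(f"{combined_indicator(0.3, believe_votes=20, total_votes=100):.2f}")  # 0.26
```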

  • Fake news isn't really anything new.

  • It's just the speed at which it can travel that's new, you know.

  • When you were a child in the playground, one of your mates would tell you, you know, such-and-such has happened, and then you'd make a judgment call on whether you believed that story based on who was telling you and where they'd heard it from. But it's technology that's causing it to be a bit of a problem, and also, by the sounds of it, the fact that there are agents out there trying to make things change in various places.

  • Do you think this is inherently about educating people?

  • Can we do anything at the source with technology?

  • Um, it's not that we can find the source, necessarily.

  • Perhaps that's the problem.

  • I don't know.

  • I mean, I think social media is an inherently social platform, and I think you've got to rely on people to govern that. I'm not sure how much, really, technically, you can attack the source of those things. The problem is that technology, as you say, is enabling these things.

  • And not only is it easier and faster to share information, but I, or anyone, can create very convincing-looking fake news articles.

  • So I'm not sure how you govern that.

  • I think it's got to be people-driven governance.

  • I believe Facebook have actually hired people to physically look at these sites and postings and things, so they're going back to people to do this.

  • Do you think that machine learning and all sorts of, you know, clever computer algorithms could actually do some of this stuff?

  • But I think they definitely could.

  • I mean, you know, there's plenty of opportunity through text analytics to gauge what the content is saying.
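
One common way to "gauge what the content is saying" is a supervised text classifier. Below is a minimal sketch using scikit-learn; the tiny labelled headline set is invented purely for illustration, and a real system would need far more data and more careful features than bag-of-words.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Made-up training examples: headlines labelled "real" or "fake".
headlines = [
    "Scientists publish peer-reviewed study on crop yields",
    "Local council approves new library funding",
    "Doctors HATE this one weird trick that cures everything",
    "SHOCKING: celebrity secretly a lizard, insiders claim",
]
labels = ["real", "real", "fake", "fake"]

# TF-IDF features fed into a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(headlines, labels)

print(model.predict(["Miracle cure doctors don't want you to know about"]))
```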

  • What's interesting is: do those people that are reviewing those articles have any biases themselves? Does the organization behind them have any biases?

  • And when you think about how much control, in a way, that organization has over the information that people are presented with, it gets a bit worrying.

  • So in that sense, yes, there should definitely be a place for machines to do that, because they could be somewhat unbiased.

  • So is this one a heat map?

  • Yes, you can see lots of jumping around the screen.

  • Was this using the same kit? Is it the same gaze data?

  • So it's just a different representation.
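
The heat-map view is just another rendering of the same gaze samples, for instance as a 2D histogram of where gaze points fall. Here is a minimal sketch with synthetic points clustered around a headline and a source link; the data is made up for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
# Synthetic gaze samples clustered around a headline area and a source link area.
xs = np.concatenate([rng.normal(400, 40, 300), rng.normal(150, 20, 150)])
ys = np.concatenate([rng.normal(120, 15, 300), rng.normal(150, 10, 150)])

plt.hist2d(xs, ys, bins=40, cmap="hot")   # bin the samples into a 2D histogram
plt.gca().invert_yaxis()                  # screen coordinates: y grows downward
plt.colorbar(label="samples per bin")
plt.title("Gaze heat map: hotter = more dwell")
plt.show()
```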

  • So some of it is just kind of almost the logistics of needing to scroll the page down, so they look at the scroll bar.

  • And actually, a lot of the time people don't really realize what they are looking at and what they're not looking at.

  • So not only did this inform us about the sort of decisions they're making and the order they're making them in, but it's also a prompt for us to talk to them about them.

  • Did you realize you were doing this?

  • And so were you watching this in real time?

  • Yes.

  • Yeah, we were watching in real time as they were consuming it and talking about it. And what that would let us do then, as a part of the conversation, is prompt them and say, okay, well, I noticed what you just did there. Did you realize that you clicked the link first? And so on.

  • It helped really draw out what people are actually thinking in those moments.

  • The first thing they see is the Facebook page we put together.

  • Then we just said, okay, well, what you want to do is look through the page as you naturally would, as best you can.

  • Yes.

  • What was interesting is that a lot of the time people will pause to look up and talk to you, and their gaze moves back and forth. Actually, one of the challenges with eye tracking is maintaining that level of accuracy.

  • A lot of the eye trackers we've used are very expensive, upwards of 30 grand.

  • You know, and a lot of the time people have to put their chin on something that keeps their head still and so on.

  • The device we used is more of an affordable eye tracker.

  • Actually, it wasn't about accuracy at the level of, are they looking at this word or that word; it's not about that.

  • It's just about the order in which they do things, which meant the accuracy of this was enough for us.

  • And really, it was secondary to our primary goal, which was to understand what they think and what they talk about.

  • One thing that was apparent to me, just looking at that just then, is how quickly the eye moves between things. Have you been able to slow that down and just do a bit more of an in-depth look at it?

  • Ah, not yet. But we will do that. Yeah, we'll use the video, slow it right down.

  • Then you'll get that point-to-point order of where people are looking.

  • I wonder how much of it is conscious.

  • No, not all of it.

  • I don't think so; I suspect a lot of it is unconscious. Yeah: my friends have shared it, therefore it's true, rather than: I'm in the newsagent's, that paper is one I trust, that one is one I don't trust.

  • This is Facebook, right?

  • But it's my friends who have shared this. The medium mattering more than the message, or the medium establishing a framework within which certain decisions become more likely than others.

  • That's certainly my sense.

  • And, if I'm honest, that was my sense of most things at the outset of it.
