  • This is a story involving Facebook
  • and a company called Cambridge Analytica.
  • Which I know sounds like a Harry Potter spell
  • that does your homework, but...
  • it's actually a data analytics company
  • dedicated to one thing:
  • figuring out how to manipulate you at all costs.
  • NEWSMAN: Beginning in 2014, many Facebook users were paid
  • to take a personality test funded by Cambridge Analytica,
  • agreeing to give up some personal data.
  • But what they didn't know--
  • at the same time, the company was scooping up
  • all of their friends' private information, too.
  • So a survey that started with about 270,000 people
  • ultimately collected more than 50 million profiles.
  • All right, I'm sorry, but that's some bullshit, right?
  • Because your friend took this dumbass quiz,
  • this company you've never heard of got access to your account?
  • Yeah. It's like your friend boned someone,
  • and then you get the STD.
  • It's like, "What the (bleep) man."
  • "It was totally worth it." "Not for me."
  • (applause)
  • Now...
  • Now, you might be saying,
  • "What do I care if Cambridge Analytica got my Facebook data?
  • "I don't mind that people know
  • "that I like Ben Affleck's back tattoo.
  • I think it brings out his eyes."
  • But the truth is, the truth is, in the wrong hands,
  • our data can be used to do some pretty sinister things.
  • The level of what can be predicted about you,
  • based on what you like on Facebook,

  • is higher than what your wife would say about you,
  • than what your parents or friends can say about you.

  • Cambridge Analytica will try to pick whatever mental weakness
  • or vulnerability that we think you have,
  • and try to warp your perception of what's real around you.
  • Okay, now, that should scare you.
  • Because if you've seen movies, you know that when the person
  • with crazy hair gets stressed out,
  • something really bad is going down.
  • It's like, "They hacked into the main frame.
  • I wouldn't believe you but you got purple dreadlocks."
  • And this was really bad, because, sure,
  • some people might say this is just like advertising.
  • It sounds just like advertising, right?
  • They try to get you to buy something
  • by tugging at your emotions.
  • But this is ten levels above that,
  • because traditional advertisers
  • don't know who you are, personally.
  • Like, imagine if Samsung knew from Facebook data
  • that you lost your dad last week,
  • so they put a message on your feed that their new phone
  • could contact your dad on the other side.
  • You would be way more likely to buy that phone.
  • It would tug at your heartstrings. You'd be like,
  • (crying): "Dad, is that you?
  • "Oh, my God, Dad, is that you?
  • "Can you tell me where you left the keys for the Camaro, please?
  • "I, um, yeah, I can't find them.
  • All right, bye. I love you."
  • Like, they could get to you.
  • And we know, we know that Cambridge Analytica
  • got people's data from Facebook.
  • We know that they figured out how to use this data
  • to manipulate people.
  • What you may not know is who they gave all that power to.
  • TV REPORTER: The data firm hired by Donald Trump's
  • presidential election campaign
  • used secretly obtained information
  • from tens of millions of unsuspecting Facebook users
  • to directly target potential American voters.
  • The entire operation centered around deception,
  • false grassroots support, and a strategy
  • that seems to border on electronic brainwashing.
  • You see, using Cambridge Analytica's tools,
  • Trump's campaign figured out a way to manipulate people,
  • or as they called it, "electronic brainwashing."
  • Which also happens to be the name of my favorite
  • Daft Punk album. Yeah.
  • It's the one with that song that goes like,
  • (electronic-like vocalizing)
  • No, no. It's the other one. The one that goes,
  • (electronic-like vocalizing)
  • Is that the-- No, no. It's the one where it's like,
  • (electronic-like vocalizing)
  • Yeah, that one. That one.
  • Here's an example, here's an example
  • of how the Trump campaign used Cambridge Analytica's tools.
  • Cambridge Analytica figured out that the phrase
  • "drain the swamp" made people angry
  • at career politicians, and this would make them want
  • to vote for Donald Trump, and I'm not making this up.
  • Trump told us this himself.
  • It was a term that was actually given to me.
  • Usually, I like to think them up myself,
  • but this was given to me.
  • But they had this expression, "drain the swamp."
  • And I hated it.
  • I thought it was so hokey.
  • I said that is the hokiest. Give me a break.
  • I'm embarrassed to say it.
  • And I was in Florida,

  • where 25,000 people were going wild.

  • And I said, "And we will drain the swamp."
  • The place went crazy.
  • I couldn't believe it.
  • Yeah, neither could we.
  • You know, you always think it's unrealistic
  • when Bond villains reveal their entire scheme,
  • and then you see this, and you're like, yeah.
  • (mimics Trump): "And you see, Mr. Bond, unless someone finds
  • "the hidden switch under my castle,
  • "no one will be able to stop the bomb.
  • That's why-- Oh, he's gone. Oh, no."
  • (normal voice): So thanks to Cambridge Analytica,
  • Trump knew "drain the swamp"
  • would drum up anti-establishment votes.
  • People who might have never voted before.
  • But here's the thing, don't get it twisted.
  • They might be able to use these tools
  • to push you in a certain direction,
  • but they couldn't completely trick you
  • into voting for Donald Trump.
  • And you know how we know this?
  • Because of this:
  • We haven't spoken about the fact that Ted Cruz,
  • who was also a presidential candidate,
  • also used Cambridge Analytica.
  • His campaign was a disaster.
  • Yeah.
  • All the electric brainwashing in the world
  • can't make people like Ted Cruz.
  • All of it.
  • -All of it. -(cheers and applause)
  • Like, you could hypnotize someone.

  • You could be like, "You're a dog." (barks)

  • You're a chicken. (gobbles)
  • You like Ted Cruz. I'm not hypnotized.
  • This is bull. Like, this...
  • Hypnosis doesn't even work, man.
  • Basically, Trump didn't create new fears in people, right?
  • He found a way to appeal to fears and desires
  • that already existed, you know?
  • And they used Facebook,
  • in the same way that Facebook will be like,
  • "Hey, remember your friend Steve from high school?"
  • Except this time it was like, "Hey, remember
  • how you're scared of brown people? Yeah."
  • And I'm gonna be honest with you.
  • The fact that Donald Trump used Cambridge Analytica's tools
  • isn't the worst thing that happened here.
  • Every politician will use the tools
  • at their disposal to get votes.
  • Obama did a similar thing himself.
  • My problem is with Facebook.
  • They need to be held accountable,
  • because not only did they turn a blind eye
  • to Cambridge Analytica using this data,
  • but they also didn't tell their users that this was happening.
  • At the same time, though,

  • it's our responsibility to be vigilant.

  • Like, in the year 2018,
  • you just have to assume everything you click online,
  • everything you watch, every website you visit,
  • will be collecting data on you,
  • and that data will be used eventually
  • to try and sell you something.

  • Even the people in the places you trust,

  • they're all just trying to sell you something.
  • Never forget that.
  • And now, a word from our sponsors.


Electronic Brainwashing: Cambridge Analytica's Sinister Facebook Strategy | The Daily Show

Published by Samuel on March 22, 2018