
Translator: Lilian Chiu · Reviewer: Yanyan Hong

My relationship with the internet reminds me of the setup to a clichéd horror movie. You know, the blissfully happy family moves in to their perfect new home, excited about their perfect future, and it's sunny outside and the birds are chirping ... And then it gets dark. And there are noises from the attic. And we realize that that perfect new house isn't so perfect.

When I started working at Google in 2006, Facebook was just a two-year-old, and Twitter hadn't yet been born. And I was in absolute awe of the internet and all of its promise to make us closer and smarter and more free. But as we were doing the inspiring work of building search engines and video-sharing sites and social networks, criminals, dictators and terrorists were figuring out how to use those same platforms against us. And we didn't have the foresight to stop them.

Over the last few years, geopolitical forces have come online to wreak havoc. And in response, Google supported a few colleagues and me to set up a new group called Jigsaw, with a mandate to make people safer from threats like violent extremism, censorship, persecution -- threats that feel very personal to me because I was born in Iran, and I left in the aftermath of a violent revolution. But I've come to realize that even if we had all of the resources of all of the technology companies in the world, we'd still fail if we overlooked one critical ingredient: the human experiences of the victims and perpetrators of those threats.

There are many challenges I could talk to you about today. I'm going to focus on just two.

The first is terrorism. So in order to understand the radicalization process, we met with dozens of former members of violent extremist groups. One was a British schoolgirl, who had been taken off of a plane at London Heathrow as she was trying to make her way to Syria to join ISIS. And she was 13 years old. So I sat down with her and her father, and I said, "Why?" And she said, "I was looking at pictures of what life is like in Syria, and I thought I was going to go and live in the Islamic Disney World." That's what she saw in ISIS. She thought she'd meet and marry a jihadi Brad Pitt and go shopping in the mall all day and live happily ever after.

ISIS understands what drives people, and they carefully craft a message for each audience.

Just look at how many languages they translate their marketing material into. They make pamphlets, radio shows and videos in not just English and Arabic, but German, Russian, French, Turkish, Kurdish, Hebrew, Mandarin Chinese. I've even seen an ISIS-produced video in sign language. Just think about that for a second: ISIS took the time and made the effort to ensure their message is reaching the deaf and hard of hearing. It's actually not tech-savviness that is the reason why ISIS wins hearts and minds. It's their insight into the prejudices, the vulnerabilities, the desires of the people they're trying to reach that does that. That's why it's not enough for the online platforms to focus on removing recruiting material. If we want to have a shot at building meaningful technology that's going to counter radicalization, we have to start with the human journey at its core.

So we went to Iraq to speak to young men who'd bought into ISIS's promise of heroism and righteousness, who'd taken up arms to fight for them and then who'd defected after they witnessed the brutality of ISIS's rule. And I'm sitting there in this makeshift prison in the north of Iraq with this 23-year-old who had actually trained as a suicide bomber before defecting. And he says, "I arrived in Syria full of hope, and immediately, I had two of my prized possessions confiscated: my passport and my mobile phone." The symbols of his physical and digital liberty were taken away from him on arrival.

And then this is the way he described that moment of loss to me. He said, "You know in 'Tom and Jerry,' when Jerry wants to escape, and then Tom locks the door and swallows the key and you see it bulging out of his throat as it travels down?" And of course, I really could see the image that he was describing, and I really did connect with the feeling that he was trying to convey, which was one of doom, when you know there's no way out.

And I was wondering: What, if anything, could have changed his mind the day that he left home? So I asked, "If you knew everything that you know now about the suffering and the corruption, the brutality -- that day you left home, would you still have gone?" And he said, "Yes." And I thought, "Holy crap, he said 'Yes.'" And then he said, "At that point, I was so brainwashed, I wasn't taking in any contradictory information. I couldn't have been swayed." "Well, what if you knew everything that you know now six months before the day that you left?" "At that point, I think it probably would have changed my mind."

Radicalization isn't this yes-or-no choice.

It's a process, during which people have questions -- about ideology, religion, the living conditions. And they're coming online for answers, which is an opportunity to reach them. And there are videos online from people who have answers -- defectors, for example, telling the story of their journey into and out of violence; stories like the one from that man I met in the Iraqi prison. There are locals who've uploaded cell phone footage of what life is really like in the caliphate under ISIS's rule. There are clerics who are sharing peaceful interpretations of Islam. But you know what? These people don't generally have the marketing prowess of ISIS. They risk their lives to speak up and confront terrorist propaganda, and then they tragically don't reach the people who most need to hear from them. And we wanted to see if technology could change that.

So in 2016, we partnered with Moonshot CVE to pilot a new approach to countering radicalization called the "Redirect Method." It uses the power of online advertising to bridge the gap between those susceptible to ISIS's messaging and those credible voices that are debunking that messaging. And it works like this: someone looking for extremist material -- say they search for "How do I join ISIS?" -- will see an ad appear that invites them to watch a YouTube video of a cleric, of a defector -- someone who has an authentic answer. And that targeting is based not on a profile of who they are, but of determining something that's directly relevant to their query or question.
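The targeting logic described above -- keying on the query itself rather than on a profile of the searcher -- can be sketched in a few lines. This is a minimal illustration of the idea, not Jigsaw's implementation; the keyword phrases and video titles are hypothetical.

```python
from typing import Optional

# Hypothetical mapping from keyword phrases to curated counter-narrative
# videos (defectors, residents, clerics -- the "credible voices").
COUNTER_CONTENT = {
    "join isis": "Interview with a defector: what the 'caliphate' was really like",
    "caliphate life": "Cell-phone footage from residents living under ISIS rule",
    "jihad meaning": "A cleric's talk on peaceful interpretations of Islam",
}

def redirect_ad(query: str) -> Optional[str]:
    """Return a counter-narrative video if the query matches a targeted
    phrase; otherwise show no ad. Note the decision uses only the query,
    never stored information about the person searching."""
    normalized = query.lower()
    for phrase, video in COUNTER_CONTENT.items():
        # Require every word of the phrase to appear in the query, so
        # "How do I join ISIS?" matches the "join isis" campaign.
        if all(word in normalized for word in phrase.split()):
            return video
    return None

print(redirect_ad("How do I join ISIS?"))
# A query with no extremist intent triggers no ad:
print(redirect_ad("weather in London"))  # None
```

The design choice the talk emphasizes is visible here: there is no user model at all, only query relevance, which sidesteps profiling while still reaching people at the moment they are asking.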

During our eight-week pilot in English and Arabic, we reached over 300,000 people who had expressed an interest in or sympathy towards a jihadi group. These people were now watching videos that could prevent them from making devastating choices. And because violent extremism isn't confined to any one language, religion or ideology, the Redirect Method is now being deployed globally to protect people being courted online by violent ideologues, whether they're Islamists, white supremacists or other violent extremists, with the goal of giving them the chance to hear from someone on the other side of that journey; to give them the chance to choose a different path.

It turns out that often the bad guys are good at exploiting the internet, not because they're some kind of technological geniuses, but because they understand what makes people tick.

I want to give you a second example: online harassment. Online harassers also work to figure out what will resonate with another human being. But not to recruit them like ISIS does, but to cause them pain. Imagine this: you're a woman, you're married, you have a kid. You post something on social media, and in a reply, you're told that you'll be raped, that your son will be watching, details of when and where. In fact, your home address is put online for everyone to see. That feels like a pretty real threat. Do you think you'd go home? Do you think you'd continue doing the thing that you were doing? Would you continue doing that thing that's irritating your attacker?

Online abuse has been this perverse art of figuring out what makes people angry, what makes people afraid, what makes people insecure, and then pushing those pressure points until they're silenced. When online harassment goes unchecked, free speech is stifled. And even the people hosting the conversation throw up their arms and call it quits, closing their comment sections and their forums altogether. That means we're actually losing spaces online to meet and exchange ideas. And where online spaces remain, we descend into echo chambers with people who think just like us. But that enables the spread of disinformation; that facilitates polarization.

What if technology instead could enable empathy at scale?

This was the question that motivated our partnership with Google's Counter Abuse team, Wikipedia and newspapers like the New York Times. We wanted to see if we could build machine-learning models that could understand the emotional impact of language. Could we predict which comments were likely to make someone else leave the online conversation? And that's no mean feat. That's no trivial accomplishment for AI to be able to do something like that. I mean, just consider these two examples of messages that could have been sent to me last week: "Break a leg at TED!" ... and "I'll break your legs at TED."

(Laughter)

You are human, that's why that's an obvious difference to you, even though the words are pretty much the same. But for AI, it takes some training to teach the models to recognize that difference.
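The gap between those two messages can be made concrete. Below is a toy sketch that assumes nothing about the real Perspective model: a single hand-written contextual cue (violence directed at "you") separates the idiom from the threat. It is exactly this kind of signal that a trained model has to learn from labeled examples, since raw word counts for the two sentences are nearly identical.

```python
import re

# Hedged stand-in for a trained classifier: Perspective is an ML model,
# not a regex. This rule only illustrates the contextual cue -- a violent
# verb aimed at the reader ("your") -- that word-count features miss.

def looks_like_threat(text: str) -> bool:
    t = text.lower()
    # Match a violent verb followed shortly by a second-person target.
    return re.search(r"\b(break|hurt|kill)\b.{0,20}\byour?\b", t) is not None

print(looks_like_threat("Break a leg at TED!"))           # False: idiom, no target
print(looks_like_threat("I'll break your legs at TED."))  # True: aimed at "you"
```

A real model generalizes this far beyond one pattern, but the point stands: the signal is in who the words are aimed at, not in which words appear.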

The beauty of building AI that can tell the difference is that AI can then scale to the size of the online toxicity phenomenon, and that was our goal in building our technology called Perspective. With the help of Perspective, the New York Times, for example, has increased spaces online for conversation. Before our collaboration, they only had comments enabled on just 10 percent of their articles. With the help of machine learning, they have that number up to 30 percent. So they've tripled it, and we're still just getting started. But this is about way more than just making moderators more efficient. Right now I can see you, and I can gauge how what I'm saying is landing with you.

You don't have that opportunity online. Imagine if machine learning could give commenters, as they're typing, real-time feedback about how their words might land, just like facial expressions do in a face-to-face conversation.

Machine learning isn't perfect, and it still makes plenty of mistakes. But if we can build technology that understands the emotional impact of language, we can build empathy. That means that we can have dialogue between people with different politics, different worldviews, different values. And we can reinvigorate the spaces online that most of us have given up on.

When people use technology to exploit and harm others, they're preying on our human fears and vulnerabilities.
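That "feedback as you type" loop can be sketched as a pre-submit check: score the draft, and surface a nudge before it is posted, the way a listener's face would mid-conversation. The scoring heuristic, cue list, and message strings below are hypothetical stand-ins; a real deployment would query a trained model rather than a keyword list.

```python
# Illustrative cue list -- NOT a real lexicon from any production system.
HOSTILE_CUES = {"idiot", "stupid", "shut up", "hate you"}

def toxicity_score(draft: str) -> float:
    """Crude stand-in for a model score: fraction of hostile cues found,
    capped at 1.0. A trained model would return a calibrated probability."""
    t = draft.lower()
    hits = sum(cue in t for cue in HOSTILE_CUES)
    return min(1.0, hits / 2)

def pre_submit_feedback(draft: str, threshold: float = 0.5) -> str:
    """Nudge shown in the comment box before posting, if the draft
    crosses the threshold; otherwise let it through silently."""
    if toxicity_score(draft) >= threshold:
        return "Your comment may come across as hostile. Post anyway?"
    return "OK to post."

print(pre_submit_feedback("I think your argument misses a step."))
print(pre_submit_feedback("Shut up, you idiot."))
```

The threshold is the interesting knob: too low and the nudge becomes noise people dismiss, too high and it only fires on comments a moderator would remove anyway.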

If we ever thought that we could build an internet insulated from the dark side of humanity, we were wrong. If we want today to build technology that can overcome the challenges that we face, we have to throw our entire selves into understanding the issues and into building solutions that are as human as the problems they aim to solve. Let's make that happen.

Thank you.

(Applause)


TED | Yasmin Green: How technology can fight extremism and online harassment
