  • Democracy is easy.

  • It's like stealing ice cream from a baby.

  • I genuinely love the process of manipulating people online for money.

  • We just want to predict your future behaviours.

  • These videos are all deepfakes.

  • Synthesised content created using artificial intelligence.

  • Fake, fake, disgusting news.

  • Deepfakes will make for even more complicated arguments about what is fake news and what is real.

  • And if seeing is no longer believing, the very real question is could deepfakes weaken democracy?

  • Democracy just doesn't work if people don't believe in it.

  • So the deepfake artworks used artificial intelligence and machine-learning technologies to kind of hack the bodies, if you like, of famous celebrity influencers.

  • Bill Posters is the artist behind these deepfake videos known as the Spectre Project.

  • Spectre is almost too powerful to comprehend.

  • Two of the main questions we wanted to explore with the Spectre Project are what it feels like when our personal data is used in unexpected ways by powerful tech companies, and how, as a result, that can change our understandings of today.

  • To test Facebook's response, Bill posted the deepfake videos on Instagram, a social-media platform owned by Facebook.

  • The company downgraded the videos' visibility.

  • Spectre showed me how to manipulate you into sharing intimate data about yourself and all those you love for free.

  • But that didn't stop this fake clip of Facebook boss Mark Zuckerberg going viral.

  • That showed the potential for spreading disinformation online through deepfakes.

  • A danger that's likely to increase as long as tech companies and politicians remain unsure how to deal with it.

  • The power of deepfakes is an area of great concern whilst these technologies exist in what is essentially a regulatory black hole.

  • Image manipulation is already exploited by autocratic regimes.

  • It's a dark art that goes back to Joseph Stalin who made his enemies disappear.

  • AI today is capable of making deepfake videos like this where comedian Bill Hader morphs into Tom Cruise.

  • As the technology advances, the danger is that deepfakes will be used to mislead voters in democratic countries.

  • If you take away those tools that enable us to be able to sort out what's real from what's not, you make very poor decisions.

  • Aviv Ovadya is the founder of Thoughtful Technology Project.

  • He worries about another problem: that deepfakes could be used as an excuse to help politicians escape scrutiny.

  • You have the corrupt politician being able to say, "Oh yeah, that video of me, that was fake."

  • That brings us into a world where people won't know what they can trust.

  • He believes the ultimate threat from deepfakes could be that more and more people opt out of democratic politics.

  • A phenomenon he calls "reality apathy".

  • Reality apathy is when it's so hard to make sense of what's happening, people just sort of give up.

  • Democracy just doesn't work if people don't believe in it.

  • So what can be done to fight back?

  • A group of scientists at Cambridge University are having a go.

  • They have developed a computer game to teach people how to spot disinformation.

  • So in the game people essentially step into the shoes of a fake news producer, and build their way up to a fake news empire by spreading fake content online.

  • Dr Sander van der Linden, the game's designer, believes it will help people to distinguish fact from fiction.

  • So your goal is to get as many followers as possible while maintaining your online credibility.

  • So you can't be too ridiculous.

  • And the first badge in the game is about impersonating other people online.

  • And of course one example that we've talked about is deepfakes.

  • So in the game we test people before and after, and at the beginning we found that people are duped by a lot of these techniques, but once they played the game they've become resistant and are able to identify them later on.

  • Dr van der Linden's team have drawn inspiration from preventative medicine in their hunt to cure fake news.

  • So just as you inject someone with a severely weakened dose of a virus to trigger antibodies in the immune system, you can do the same with information.

  • People can essentially create mental antibodies and become immune against fake news.

  • And essentially everyone is their own bullshit detector.

  • Today I'm president, not because I'm the greatest, though probably I am.

  • Deepfake technology means that faking videos is becoming as easy as faking words and photos.

  • Until people learn to look at video with a more critical eye, there's a danger that deepfakes could be used to undermine democracy.
