  • Hello.

  • Today I'm going to be talking to you about a new technology that's affecting famous people.

  • - Remember when Obama called Trump a dipshit? - "Complete dipshit."

  • Or the time Kim Kardashian rapped, "Because I'm always half naked"?

  • Or when Arnold Schwarzenegger impersonated himself?

  • "Get out of there! There's a bomb in there! Get out!"

  • Deepfake. Deepfake. Deepfake.

  • You gotta be kidding!

  • This is a deepfake, too.

  • I'm not Adele.

  • But I am an expert in online manipulation.

  • So "deepfake" is a term used to describe video or audio files that have been created using artificial intelligence.

  • My favorite is probably Lisa Vanderpump.

  • It started as a very basic face-swapping technology.

  • And now it's turned into film-level CGI.

  • There's been this huge explosion of, "Oh my goodness, we can't trust anything."

  • Yes, deepfakes are eerily dystopian.

  • And they're only going to get more realistic and cheaper to make.

  • But the panic around them is overblown.

  • In fact, the alarmist hype is possibly more dangerous than the technology itself.

  • Let me break this down.

  • First, what everyone is freaking out about is actually not new.

  • It's a much older phenomenon that I like to call the weaponization of context, or shallowfakes, made with Photoshop and video editing software.

  • "There's so many needs . . ."

    有許多的需求。

  • How about the time Nancy Pelosi appeared to be drunk while giving a speech?

  • "But you never know."

  • "with this president of the United States."

  • Turns out that video was just slowed down to 75% speed.
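
To make concrete how low the bar for this kind of shallowfake is: a slowdown like that takes a single command with free tools. Below is a minimal sketch in Python driving ffmpeg; it assumes ffmpeg is installed and on the PATH, and the file names are hypothetical.

```python
# Sketch: re-time a clip to 75% speed, the same edit as the slowed-down speech.
# Assumes ffmpeg is installed and on PATH; the file names are hypothetical.
import subprocess

def slow_to_75_percent(src: str, dst: str) -> None:
    """Stretch video timestamps and slow the audio to 75% of the original tempo."""
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", src,
            # setpts=PTS/0.75 stretches video timestamps (longer = slower);
            # atempo=0.75 slows the audio to match while preserving pitch.
            "-filter_complex", "[0:v]setpts=PTS/0.75[v];[0:a]atempo=0.75[a]",
            "-map", "[v]", "-map", "[a]",
            dst,
        ],
        check=True,
    )

slow_to_75_percent("speech.mp4", "speech_slow.mp4")
```

No machine learning is involved, which is exactly the speaker's point about simple manipulations.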

  • "It was very, very strange."

    「這非常奇怪。」

  • You can have a really simplistic piece of misleading content that can do huge damage.

    一條非常簡單的誤導訊息,也可以造成巨大的傷害。

  • For example, in the lead-up to the midterms, we saw lots of imagery around this caravan of people who were moving towards the U.S.

    舉例來說,在美國期中選舉的預備階段,網路上出現了許多類似移民車隊前往美國的照片。

  • This photo was shared with captions demonizing the so-called migrant caravan at the U.S.-Mexico border in 2018.

    這張照片於 2018 年在網路上流傳,以文字詆毀那些所謂的在美墨邊界的移民車隊。

  • But a reverse image search showed it was actually Pakistani refugees in Greece.
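
For the technically curious: reverse image search rests on image fingerprints that survive resizing, recompression, and added captions; one common building block is a perceptual hash. Here is a small sketch of that idea using the Pillow and ImageHash Python libraries, with hypothetical file names and an illustrative distance threshold.

```python
# Sketch of the perceptual-hash idea behind reverse image search:
# visually similar images get nearby 64-bit fingerprints, so a recycled
# photo can be matched to its original despite light edits.
# Requires: pip install Pillow ImageHash. File names are hypothetical.
from PIL import Image
import imagehash

original = imagehash.phash(Image.open("greece_refugees_original.jpg"))
viral = imagehash.phash(Image.open("caravan_post.jpg"))

# Subtracting two ImageHash values yields the Hamming distance between
# the fingerprints; a small distance suggests the same underlying photo.
distance = original - viral
print(f"hash distance: {distance}")
if distance <= 10:  # illustrative threshold, not a calibrated one
    print("Likely a reused or lightly edited copy.")
```

Real reverse image search services work at a much larger scale, but the matching idea is roughly the same.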

  • You don't need deepfake A.I. technology to manipulate emotions or to spread misinformation.

  • This brings me to my second point.

  • What we should be really worried about is the liar's dividend: the lies and actions people will get away with by exploiting widespread skepticism to their own advantage.

  • So, remember the "Access Hollywood" tape that emerged a few weeks before the 2016 election?

  • "When you're a star, they let you do it."

  • "You can do anything."

  • Around that time, Trump apologized, but more recently he's actually said, "I'm not sure if I actually said that."

  • When anything can be fake, it becomes much easier for the guilty to dismiss the truth as fake.

  • What really keeps me awake at night is less the technology.

  • It's how we as a society respond to the idea that we can't trust what we see or what we hear.

  • So if we are fearmongering, if we are hyperbolic, if we are waving our hands in the air, that itself can be part of the problem.

  • You can see where this road leads.

  • As public trust in institutions like the media, education, and elections dwindles, democracy itself becomes unsustainable.

  • The way that we respond to this serious issue is critical.

  • Partly this is the platforms thinking very seriously about what they do with this type of content and how they label it.

  • Partly it's the public recognizing their own responsibility.

  • And if you don't know 100%, hand on heart, "This is true," please don't share, because it's not worth the risk.
