  • How did you first find out that there were deepfakes of you?

  • My husband actually told me.

  • Because he is friends with Ashton Kutcher.

  • So he actually told him like, “Oh, by the way, there are these things called deepfakes

  • and your wife is one of them."

  • Deepfakes use machine learning to fabricate events that never happened

  • like Bill Hader shape-shifting:

  • And I said, Seth Rogen was like, “It was amazing!

  • He has like, a bike track in his back yard!

  • It's phenomenal.”

  • And I did a Seth Rogen impression

  • And it was like I did a magic trick, Tom Cruise was like, “Ahoooh!”

  • And there's growing concern that in the wrong hands this technology

  • can pose a very real national security threat.

    "會對國家安全構成非常真實的威脅"

  • could impact the 2020 election

    "可能影響2020年選舉"

  • could become a real and present danger to our democracy, Dana.

    "可能會對我們的民主構成真正的威脅 Dana"

  • But the most pressing, daily threat of deepfakes isn't politics.

  • It's porn.

  • People are hijacking women's faces to make porn videos they never consented to be in.

  • Which is why this kind of deepfake is harmful, even when people know it's not real.

  • I was just shocked.

  • Because this is my face, it belongs to me!

  • In a September 2019 survey, researchers at Deeptrace found that of the deepfake videos

  • they could identify online, 96% were pornographic.

  • And that more or less 100% of these videos are women and are not consensual.

  • Pornhub and other big porn streaming sites have policies banning deepfakes,

  • though they don't seem to be enforcing them.

  • Mostly, these videos show up on separate sites dedicated to this kind of abuse.

  • It's not just celebrities anymore.

  • Not that celebrities feel any less pain from these videos.

  • But the phenomenon is evolving at a rate where we're seeing deepfake pornography increasing

  • in number and an increasing number of victims as well.

  • I was at work and I got an email on my phone.

  • In fact, I can even bring up the email.

  • 25th of May, 4:18 p.m.

  • "F.Y.I. There is a deepfake video of you on some porn sites.

    "F.Y.I.在一些色情網站上有你的深層偽造視頻。

  • It looks real."

  • I remember sitting down receiving that email.

  • I think it was like you're frozen for that moment of time.

  • It was depicting me having sexual intercourse, and the title of the video had my full name.

  • And then I saw another video that was depicting me performing oral sex.

  • Noelle is an Australian law graduate.

  • She's not a celebrity.

  • Someone took photos she shared on social media and first photoshopped them into nude images

  • then graduated to deepfake videos.

  • What's happening in these videos is a specific kind of digital manipulation,

  • and it's unlike older face-swapping filters you might have used.

  • Those tools let you put your face into your friend's head, but you still controlled it

  • -- a sort of video Photoshop --

  • transferring both your facial features and your expressions.

  • But deepfakes can take the facial features alone,

  • and animate that face with the expressions of someone else.

  • Tom Cruise was like, “Ahoooh!”

    "湯姆-克魯斯就像..." "Ahoooh!"

  • This is what makes deepfake porn videos so invasive.

  • The creator takes away a victim's control of her own face

  • and uses it for something she never wanted.

  • Transferring a mask of someone's facial features requires training a piece of software

  • called an “autoencoder” on hundreds of images of the same face from different angles

  • and in different lighting with different expressions until it learns what they all have in common.

  • That volume of images has long been available of celebrities, but increasingly it exists of...

  • anyone.

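The passage above describes the core mechanism: an autoencoder is trained on many images of one face until it learns the features they all share, and that face can then be animated with someone else's expressions. As a rough illustration only, and not anything shown in the video, here is a minimal sketch of the shared-encoder, two-decoder autoencoder commonly used for this kind of face swap, assuming PyTorch; the image size, layer sizes, and the train_step helper are hypothetical placeholders.

```python
# Minimal sketch of the shared-encoder / per-person-decoder autoencoder idea.
# Assumes PyTorch; all shapes and constants here are illustrative, not a real pipeline.
import torch
import torch.nn as nn

IMG = 64  # assume 64x64 RGB face crops, aligned and scaled to [0, 1]

class Encoder(nn.Module):
    """Compresses a face crop into a small latent vector -- the 'what they all
    have in common' that training on hundreds of images is meant to capture."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent vector; one decoder is trained per identity."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder, two identity-specific decoders: the shared encoder learns
# pose, expression, and lighting, while each decoder learns to render one face.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

def train_step(faces_a: torch.Tensor, faces_b: torch.Tensor) -> float:
    """One reconstruction step; faces_a / faces_b are (N, 3, 64, 64) crops of person A / B."""
    optimizer.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    # Smoke test with random tensors standing in for aligned face crops.
    print("toy loss:", train_step(torch.rand(4, 3, IMG, IMG), torch.rand(4, 3, IMG, IMG)))

# After training, the "swap" is routing one person's frames through the other person's
# decoder: decoder_a(encoder(frame_of_b)) renders A's facial features with B's expressions.
```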
  • If you're someone not even with a very intense social media presence, but just a presence online,

  • you have a few pictures.

  • Maybe there's a video of you from a party or something like this.

  • You have so many training images to take from that.

  • At the moment, you do still need a fair bit of data to make a convincing deepfake.

  • But as the technology is improving, we're needing less data and the tools are becoming increasingly

  • accessible and user-friendly.

  • And there's a growing infrastructure for deepfakes.

  • It's not just about posting videos, it's also about forums discussing how to make them,

  • how to target certain individuals.

  • In fact, the term “deepfake” originated as the name of a subreddit for swapping

  • celebrity faces onto porn stars.

  • Reddit banned the page, but the conversation just moved to other sites.

  • You have almost directories about, "Okay, you want to make a deepfake of a certain celebrity.

  • Here are adult performers that will best suit that."

  • There's a lot of money to be made from selling custom deepfakes.

  • Users pay for deepfakes of specific celebrities

  • or even women they know personally.

  • And they discuss whether all of this is legal.

  • Some think they can protect themselves by identifying the videos as fake.

  • But that's not true.

  • If you live in the US and someone makes porn with your face, you can sue the creator

  • whether or not they've marked it as a deepfake.

  • What is true is, it's very difficult to actually do that.

  • You'd need to pay to bring a lawsuit, with no certainty you'd be able to secure a punishment

  • or even find the creator, let alone stop the video from spreading.

  • Some people on these sites question the morality of what they're doing.

  • And disagree about how they'd feel if it happened to them.

  • But it probably won't happen to them.

  • That's the point.

  • Taking a woman's face and putting it into this context is part of a long history

  • of using sexual humiliation against women.

  • You know, we're having this gigantic conversation about consent and I don't consent.

  • So that's why it's not OK.

  • Even if it's labeled as, "this is not actually her."

  • It's hard to think about that.

  • This is probably one of the most difficult things because

  • fake porn and my name will be forever associated.

  • My children, my future children, will have to see things.

  • My future partner will have to see things.

  • And that's what makes me sad.

  • I just, I wish that the Internet were a little bit more responsible and a little bit kinder.
