How a Deepfake Almost Ruined My Political Career | Cara Hunter | TED

  • You're a little whore and we've all seen your little video.

  • That was the text message that was sent to me in April of 2022.

  • I'm sitting in my grandmother's living room on what was her 90th birthday, surrounded by family and friends, as my phone blows up with messages from strangers right across the country who say they have seen a video of me engaging in hardcore pornographic activity with a man.

  • I knew this was impossible.

  • Just three weeks out from my election, I felt as though my career was crumbling before my very eyes.

  • My heart pounded, my mind raced, sweat beaded on my skin, and then I watched the video and my worst fear was realized.

  • Although this woman in the video was not me, she looked exactly like me, impossibly like me, eerily like me.

  • I had so many questions running through my mind.

  • Was this AI?

  • Was it not?

  • Who made this?

  • How did they make it?

  • Why did they make it?

  • So I did what anyone would do, and I approached my local police service to ask for advice, for guidance, and really, where did I go from there?

  • But they informed me that they wouldn't have the cybercrime technology to assist in finding out where this video came from, and it was from that moment I knew that I was on my own.

  • Now, to set the stage, as you can probably tell, I'm from Ireland, and to be exact, I'm from Northern Ireland, which is an even smaller place.

  • We have just 1.8 million people, very similar in size to Vienna.

  • So you can imagine a rumor of this sinister nature, particularly in the world of politics, can go very far very fast, and that old saying, seeing is believing, began to haunt me.

  • And in the weeks leading up to my election, this video, this false video, was shared thousands and thousands of times across WhatsApp.

  • And attached to this video were photos of me at work, smiling, campaigning, building a sense of trust with my constituents.

  • And as the weeks went on, messages flooded in faster and faster, and they were of a very vile and sexual nature.

  • Ding, we've all seen your little video.

  • Ding, you should be ashamed of yourself.

  • Ding, ah, now I see how you got your position in politics.

  • It was very difficult.

  • I've been in politics since the age of 23, and at this point I had been in it for about four to five years. And I'm from Northern Ireland, which is a post-conflict society, still very deeply divided.

  • So I anticipated challenges, I anticipated disagreements, I even anticipated attacks.

  • It's politics after all.

  • But what I did not anticipate was this moment.

  • This was different.

  • This was the moment where misogyny meets the misuse of technology, and even had the potential to impact the outcome of a democratic election.

  • And the sad thing for me was this lie spread so far, so fast, that even my own family started to believe it.

  • Some would say that they'd heard it at a golf club, others would say they heard it at the bar, and of course, some even said they heard it in a locker room.

  • A really good example of how far this went was people that I knew my entire life would pass me in the street without whispering a word, people like school teachers, people I had a sense of trust with and, you know, an affinity with, and that was really hard.

  • It felt like overnight, I was wearing a scarlet letter.

  • And as things moved on, and we're about two, three weeks out from the election, I kept receiving messages, and it got wider and wider, it was global.

  • Not only was I receiving messages from Dublin and from London, but I was also receiving messages from Massachusetts, Manhattan, and I was getting so many follows on my political social media, predominantly from men, hungry for more of this scandal.

  • And this intersection of online harms impacting my real life was something I found utterly strange and surreal, but it got to the point where I was recognized on the street and approached by a stranger who asked me for a sexual favor.

  • And it was just, for me, it was like in the blink of an eye, everything had just changed.

  • And it was utterly humiliating.

  • I didn't want to leave the house, and I had turned notifications off in my phone, just so I could kind of catch my breath.

  • But this wasn't ideal in the lead up, of course, to an election.

  • And for me, I think the purpose of this false video was to do just that.

  • But what hurt the most for me was sitting down with my father and having to explain to him this strange, surreal situation.

  • My father is an Irishman, completely disconnected from tech.

  • And so having to explain that this horrific situation was an entire fabrication was very hard to do.

  • This was this strange moment where the online world met my life, my reality.

  • Not only did it have the power to ruin my reputation, it had the capacity to change the outcome of a democratic election.

  • And you know, for years, I spent so much time building trust with my constituents.

  • I mean, we all know how much people like politicians, and you know, we're as likable as the taxman.

  • So for me, it was hard.

  • It was really hard because it was years of hard work.

  • You know, I'm so passionate about my job, and this video, this complete falsehood, had the ability to just undermine years of hard work in mere seconds.

  • But instead of succumbing entirely to victimhood, I asked myself today, you know, where do we go from here?

  • And how can AI evolve to prevent something like this happening again?

  • Not only has it happened to me, but we want to future-proof and ensure that this doesn't happen to the women of tomorrow.

  • How can we, you and I, people who care about people, ensure that this is a tech for good?

  • How can we, the policy makers, the creators, the consumers, ensure we regulate AI and things like social media, putting humans and humanity at the center of artificial intelligence?

  • AI can change the world.

  • In fact, as we've heard today, it already has.

  • In a matter of seconds, people who speak completely different languages can connect and understand one another.

  • And we've even seen the Pope as a style icon in a puffer jacket.

  • So some really important uses right there.

  • But then in my case as well, we can also see how it is weaponized against the truth.

  • And good examples of this would be art that appears like reality, AI-generated reviews unfairly boosting certain products, and things like chatbot propaganda.

  • And then politically speaking, we've seen over the years, deepfakes of Nancy Pelosi slurring, Joe Biden cursing, and even President Zelensky asking his soldiers to surrender their weapons.

  • So when AI is used like this to manipulate, it can be a threat to our democracy.

  • And the tech is becoming so advanced that it's hard to differentiate fact from fiction.

  • So how does AI interfere with politics?

  • And for us as politicians, what should we be worried about?

  • Could truth and democracy become shattered by AI?

  • Has it already?

  • Well, to dive a little deeper here, I think firstly, we need to talk about the concept of truth.

  • Without truth, democracy collapses.

  • Truth allows us to make informed decisions.

  • It enables us to hold leaders accountable, which is very important.

  • And it also allows us as political representatives to create a sense of trust with our citizens and our constituents.

  • But without that truth, democracy is vulnerable to misinformation, manipulation, and of course, corruption.

  • When AI erodes truth, it erodes trust, and it undermines our democracy.

  • And for me, in my experience with a deepfake, I've seen what a fantastic distortion tool deepfakes really are.

  • So how can we safeguard democracy from this ever-advancing technology?

  • It's becoming ever harder to distinguish between real and synthetic content.

  • And politically, what role does AI play in the future?

  • And I can't talk about politics without talking about media as well.

  • They're undeniably linked, they're intertwined.

  • And I think journalism has its own battle here as well.

  • From AI algorithms boosting articles unfairly, to clickbait headlines, and then also moments where they can manipulate the public as well.

  • But politically speaking, we've seen AI-tailored political messaging influencing voters.

  • We've seen it adding to existing bias.

  • And definitely, I think we all have that aunt that's on Facebook and kind of believes anything.

  • So for me, as a politician, I think it's really important we dive a little deeper into the relationship of AI, journalism, and media.

  • But it also puts us at risk of creating a more divided and reactionary society, because falsehoods can create a lot of reaction.

  • And for myself, coming from Northern Ireland, which is that post-conflict society, I do have concerns about how it could shape our political landscape and other places across the globe.

  • Sadly, this deepfake video is not the only instance of me having experienced abuse with AI.

  • Just six months ago, I received 15 deepfake images of myself in lingerie, posing provocatively.

  • And I thought to myself, here we go again.

  • And you know, I spoke with some other female politicians, thankfully, where I represent, we have more women getting into politics.

  • But I had a really good conversation with them, and it's around, if this is happening to you now, what happens to me tomorrow?

  • And I think this really strikes at the heart of the climate of fear that AI can create for those in public life.

  • And I don't blame women, it's very sad, I don't blame women for not wanting to get into technology, to come forward.

  • So it's so important that we safeguard it.

  • What also concerned me was the position of elderly voters, perhaps their media literacy, their comprehension of this technology, people who don't use the internet, who perhaps are not aware of AI and its many, many uses.

  • So that was really concerning as well.

  • But it doesn't have to be this way.

  • We can be part of the change.

  • For me and my video, I still don't know, to this day, who did this.

  • I can imagine why they did it, but I don't know who did it.

  • And sadly, for me, and for others across the globe, it still wouldn't be considered a crime.

  • So from being the enemy of fair, transparent, good-natured politics, to warfare, to international... But it doesn't have to be this way.

  • I feel passionately that AI can be a humanistic technology with human values that complements the lives that we live, to make us the very best versions of ourselves.

  • But to do that, I think we need to embed ethics into this technology, to eliminate bias, to instill empathy, and make sure it is aligned with human values and human principles.

  • Who knows?

  • Our democracy could depend on it.

  • And today, I know we have some of the brightest minds in this room.

  • I heard some of the talks earlier, and I'm certain of that.

  • And I know each and every one of you have weight on your shoulders when looking to the future.

  • But I also know each and every one of you want to see this tech for good.

  • And what gives me hope is witnessing the movement right across the globe to see this hard journey begin, to regulate this ever-advancing technology, which can be used for bad or for good.

  • But perhaps we, each and every one of us in this room, can be part of finding the solution.
