[This talk contains mature content]

Five years ago, I received a phone call that would change my life. I remember so vividly that day. It was about this time of year, and I was sitting in my office. I remember the sun streaming through the window. And my phone rang. And I picked it up, and it was two federal agents, asking for my help in identifying a little girl featured in hundreds of child sexual abuse images they had found online. They had just started working the case, but what they knew was that her abuse had been broadcast to the world for years on dark web sites dedicated to the sexual abuse of children. And her abuser was incredibly technologically sophisticated: new images and new videos every few weeks, but very few clues as to who she was or where she was.
And so they called us, because they had heard we were a new nonprofit building technology to fight child sexual abuse. But we were only two years old, and we had only worked on child sex trafficking. And I had to tell them we had nothing. We had nothing that could help them stop this abuse.
It took those agents another year to ultimately find that child. And by the time she was rescued, hundreds of images and videos documenting her rape had gone viral, from the dark web to peer-to-peer networks, private chat rooms and to the websites you and I use every single day. And today, as she struggles to recover, she lives with the fact that thousands around the world continue to watch her abuse.

I have come to learn in the last five years that this case is far from unique.
How did we get here as a society? In the late 1980s, child pornography -- or what it actually is, child sexual abuse material -- was nearly eliminated. New laws and increased prosecutions made it simply too risky to trade it through the mail. And then came the internet, and the market exploded.
The amount of content in circulation today is massive and growing. This is a truly global problem, but if we just look at the US: in the US alone last year, more than 45 million images and videos of child sexual abuse material were reported to the National Center for Missing and Exploited Children, and that is nearly double the amount the year prior. And the details behind these numbers are hard to contemplate, with more than 60 percent of the images featuring children younger than 12, and most of them including extreme acts of sexual violence. Abusers are cheered on in chat rooms dedicated to the abuse of children, where they gain rank and notoriety with more abuse and more victims. In this market, the currency has become the content itself.
It's clear that abusers have been quick to leverage new technologies, but our response as a society has not. These abusers don't read user agreements of websites, and the content doesn't honor geographic boundaries. And they win when we look at one piece of the puzzle at a time, which is exactly how our response today is designed. Law enforcement works in one jurisdiction. Companies look at just their platform. And whatever data they learn along the way is rarely shared. It is so clear that this disconnected approach is not working. We have to redesign our response to this epidemic for the digital age.

And that's exactly what we're doing at Thorn. We're building the technology to connect these dots, to arm everyone on the front lines -- law enforcement, NGOs and companies -- with the tools they need to ultimately eliminate child sexual abuse material from the internet.
Let's talk for a minute --

(Applause)

Thank you.

(Applause)
Let's talk for a minute about what those dots are. As you can imagine, this content is horrific. If you don't have to look at it, you don't want to look at it. And so, most companies or law enforcement agencies that have this content can translate every file into a unique string of numbers. This is called a "hash." It's essentially a fingerprint for each file or each video. And what this allows them to do is use the information in investigations or for a company to remove the content from their platform, without having to relook at every image and every video each time.
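As a rough sketch of what that fingerprint-and-match step can look like, here is a minimal Python example. The hash set and file paths are placeholders, not any real list, and it uses a plain cryptographic hash (SHA-256), which only catches exact byte-for-byte copies; platforms in practice commonly rely on perceptual hashing (for example Microsoft's PhotoDNA) so that re-encoded or slightly altered copies still match. The matching logic, though, is the same idea.

```python
import hashlib
from pathlib import Path

# Placeholder set standing in for a database of hashes already confirmed
# as abuse material (real lists are maintained by organizations such as
# NCMEC and are not public); the value here is just an example string.
KNOWN_ABUSE_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def fingerprint(path: Path) -> str:
    """Compute a SHA-256 'fingerprint' of a file without interpreting it."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MB chunks
            digest.update(chunk)
    return digest.hexdigest()


def should_block(path: Path) -> bool:
    """True if the upload matches a known hash and can be pulled down and
    reported without anyone having to view the content again."""
    return fingerprint(path) in KNOWN_ABUSE_HASHES
```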
The problem today, though, is that there are hundreds of millions of these hashes sitting in siloed databases all around the world. In a silo, it might work for the one agency that has control over it, but not connecting this data means we don't know how many are unique. We don't know which ones represent children who have already been rescued or need to be identified still. So our first, most basic premise is that all of this data must be connected.
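To make the counting problem concrete, here is a toy illustration, not Thorn's actual system; the silo names and hash values are made up. Each holder only sees its own totals, and only the connected view shows how many files are truly unique.

```python
# Three hypothetical silos, each holding some known hashes.
agency_a = {"hash_01", "hash_02", "hash_03"}
agency_b = {"hash_02", "hash_04"}
platform_c = {"hash_03", "hash_04", "hash_05"}

silos = [agency_a, agency_b, platform_c]

naive_total = sum(len(s) for s in silos)   # 8 -- double-counts the overlap
unique_files = set().union(*silos)         # the connected view
print(naive_total, "records across silos, but only",
      len(unique_files), "unique files")   # 8 vs. 5
```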
There are two ways where this data, combined with software on a global scale, can have transformative impact in this space. The first is with law enforcement: helping them identify new victims faster, stopping abuse and stopping those producing this content. The second is with companies: using it as clues to identify the hundreds of millions of files in circulation today, pulling it down and then stopping the upload of new material before it ever goes viral.
Four years ago, when that case ended, our team sat there, and we just felt this, um ... deep sense of failure, is the way I can put it, because we watched that whole year while they looked for her. And we saw every place in the investigation where, if the technology would have existed, they would have found her faster. And so we walked away from that and we went and we did the only thing we knew how to do: we began to build software.
So we've started with law enforcement. Our dream was an alarm bell on the desks of officers all around the world so that if anyone dare post a new victim online, someone would start looking for them immediately. I obviously can't talk about the details of that software, but today it's at work in 38 countries, having reduced the time it takes to get to a child by more than 65 percent.

(Applause)
And now we're embarking on that second horizon: building the software to help companies identify and remove this content.

Let's talk for a minute about these companies. So, I told you -- 45 million images and videos in the US alone last year. Those come from just 12 companies. Twelve companies, 45 million files of child sexual abuse material. These come from those companies that have the money to build the infrastructure that it takes to pull this content down. But there are hundreds of other companies, small- to medium-size companies around the world, that need to do this work, but they either: 1) can't imagine that their platform would be used for abuse, or 2) don't have the money to spend on something that is not driving revenue. So we went ahead and built it for them, and this system now gets smarter as more companies participate.
Let me give you an example. Our first partner, Imgur -- if you haven't heard of this company, it's one of the most visited websites in the US -- millions of pieces of user-generated content uploaded every single day, in a mission to make the internet a more fun place. They partnered with us first. Within 20 minutes of going live on our system, someone tried to upload a known piece of abuse material. They were able to stop it, pull it down and report it to the National Center for Missing and Exploited Children. But they went a step further, and they went and inspected the account of the person who had uploaded it: hundreds more pieces of child sexual abuse material that we had never seen.

And this is where we start to see exponential impact. We pull that material down, it gets reported to the National Center for Missing and Exploited Children and then those hashes go back into the system and benefit every other company on it. And when the millions of hashes we have lead to millions more and, in real time, companies around the world are identifying and pulling this content down, we will have dramatically increased the speed at which we are removing child sexual abuse material from the internet around the world.

(Applause)
But this is why it can't just be about software and data; it has to be about scale. We have to activate thousands of officers and hundreds of companies around the world if technology will allow us to outrun the perpetrators and dismantle the communities that are normalizing child sexual abuse around the world today. And the time to do this is now. We can no longer say we don't know the impact this is having on our children.
The first generation of children whose abuse has gone viral are now young adults. The Canadian Centre for Child Protection just did a recent study of these young adults to understand the unique trauma they try to recover from, knowing that their abuse lives on. Eighty percent of these young adults have thought about suicide. More than 60 percent have attempted suicide. And most of them live with the fear every single day that as they walk down the street or they interview for a job or they go to school or they meet someone online, that that person has seen their abuse. And the reality came true for more than 30 percent of them. They had been recognized from their abuse material online.
This is not going to be easy, but it is not impossible. Now it's going to take the will, the will of our society to look at something that is really hard to look at, to take something out of the darkness so these kids have a voice; the will of companies to take action and make sure that their platforms are not complicit in the abuse of a child; the will of governments to invest with their law enforcement for the tools they need to investigate a digital-first crime, even when the victims cannot speak for themselves.
This audacious commitment is part of that will. It's a declaration of war against one of humanity's darkest evils. But what I hang on to is that it's actually an investment in a future where every child can simply be a kid.

Thank you.

(Applause)