[HELICOPTER BLADES WHIRRING]
[EXPLOSION]
This is a war that’s being uploaded
on the internet every day.
Evidence of crimes against humanity and war crimes.
Fifty-eight countries urged the United Nations
to refer these war crimes to the International Criminal
Court.
Content moderation is kind of like the referee
of the internet –
calling what’s in and out of bounds and removing
inappropriate posts.
Let’s say I’m on a YouTube binge
and I see a video containing child sexual abuse material.
I report it to YouTube.
That’s called flagging.
The video then gets sent to content
moderators whose job it is to review flagged videos.
And they then decide whether to leave it up or remove it
from the platform.
Well, it got complicated in 2017.
That year, YouTube started relying much more
on machine-learning technology to flag content.
“No amount of people that we can hire
will be enough to review all of the content.
We now have A.I. systems that can identify and take down
99 percent of the Al Qaeda- and ISIS-related content
in our system before someone, a human, even flags it to us.
I think we need to do more of that.”
Now, Facebook, YouTube and Twitter
get a huge amount of content uploaded
to their sites every day.
YouTube alone gets 300 hours of video uploaded per minute.
These companies are under a ton
of pressure from governments to quickly
get rid of the harmful stuff.
But it’s become a double-edged sword,
and essential human rights content
is getting caught in the net.
The bottom line is, computers may
be good at detecting violence but they’re just not
as nuanced as humans.
They’re not as good at figuring out
whether a video is ISIS propaganda
or vital human rights documentation.
[PENSIVE MUSIC]