Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I call "the coded gaze," my term for algorithmic bias. Algorithmic bias, like human bias, results in unfairness. However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. Let me show you what I mean.

(Video) Joy Buolamwini: Hi, camera. I've got a face. Can you see my face? No-glasses face? You can see her face. What about my face? I've got a mask. Can you see my mask?

Joy Buolamwini: So how did this happen? Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? Well, when I'm not fighting the coded gaze as a poet of code, I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. So in the morning, if I wanted to feel powerful, I could put on a lion. If I wanted to be uplifted, I might have a quote. So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask.

Unfortunately, I've run into this issue before. When I was an undergraduate at Georgia Tech studying computer science, I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem.

Not too long after, I was in Hong Kong for an entrepreneurship competition. The organizers decided to take participants on a tour of local start-ups. One of the start-ups had a social robot, and they decided to do a demo. The demo worked on everybody until it got to me, and you can probably guess it. It couldn't detect my face. I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.

So what's going on? Why isn't my face being detected? Well, we have to look at how we give machines sight. Computer vision uses machine learning techniques to do facial recognition. So how this works is, you create a training set with examples of faces. This is a face. This is a face. This is not a face. And over time, you can teach a computer how to recognize other faces. However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.

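To make that concrete, here is a minimal sketch of the kind of generic face detection described above, assuming OpenCV's stock pretrained Haar-cascade model (the talk does not name the specific software, so this library choice and the input filename are illustrative assumptions, not her actual stack):

```python
# Minimal face-detection sketch with OpenCV's pretrained Haar cascade.
# The pretrained model's training set -- not this code -- determines
# which faces it can find.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("webcam_frame.jpg")          # hypothetical webcam frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # cascades expect grayscale

# Returns one bounding box per detected face; a face too far from the
# training set's norm simply yields an empty result -- a silent failure.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"faces found: {len(faces)}")
```

Nothing in that code is malicious; the exclusion lives entirely in the data the detector was trained on, which is why a white mask could "fix" the problem in the video.
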
But don't worry -- there's some good news. Training sets don't just materialize out of nowhere. We actually can create them. So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity.

Now you've seen in my examples how working on social robots was how I found out about exclusion with algorithmic bias. But algorithmic bias can also lead to discriminatory practices. Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal.

Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. You might have seen this on Facebook. My friends and I laugh all the time when we see other people mislabeled in our photos. But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties.

Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. In her book, "Weapons of Math Destruction," data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. So who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison.

So we really have to think about these decisions. Are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes. So what can we do about it?

Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. It really starts with people. So who codes matters. Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? On the technical side, how we code matters. Are we factoring in fairness as we're developing systems? And finally, why we code matters. We've used tools of computational creation to unlock immense wealth. We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought. And so these are the three tenets that will make up the "incoding" movement. Who codes matters, how we code matters and why we code matters.

So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. We can also start to create more inclusive training sets. Imagine a "Selfies for Inclusion" campaign where you and I can help developers test and create more inclusive training sets. And we can also start thinking more conscientiously about the social impact of the technology that we're developing.

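As one hedged sketch of what such an audit might look like: run the same off-the-shelf detector over a demographically labeled benchmark and compare detection rates per group. The file paths and group labels below are hypothetical placeholders, not a real dataset:

```python
# Sketch of a per-group bias audit for a face detector. The benchmark
# entries are placeholders; a real audit needs a large labeled dataset.
from collections import defaultdict
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

benchmark = [  # (image path, group label) -- hypothetical examples
    ("imgs/face_001.jpg", "darker-skinned"),
    ("imgs/face_002.jpg", "lighter-skinned"),
]

hits, totals = defaultdict(int), defaultdict(int)
for path, group in benchmark:
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    totals[group] += 1
    if len(detector.detectMultiScale(gray, 1.1, 5)) > 0:
        hits[group] += 1

for group, total in totals.items():
    print(f"{group}: detected {hits[group]}/{total}")
# A persistent gap between groups is the coded gaze made measurable.
```
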
To get the incoding movement started, I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. On codedgaze.com, you can report bias, request audits, become a tester and join the ongoing conversation, #codedgaze.

So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change.

Thank you.

(Applause)

But I have one question: Will you join me in the fight?

(Laughter)

(Applause)