Translator: Joseph Geni Reviewer: Krystian Aparta
Chinese translator: Wilde Luo Reviewer: NAN-KUN WU

There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one has really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry.

And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does." So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.

And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.

(Laughter)

But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot?

And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.

In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield.

Now, what would cause a hardened military officer and someone like myself to have this response to robots? Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us.

So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.

(Laughter)

It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.

(Laughter)

And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.

(Laughter)

So from this, but from many other studies, we know, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.

Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals.

Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given them a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.

And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous.

But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.

This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while.

And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.

Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?

(Laughter)

Because robots plus capitalism equals questions around consumer protection and privacy.

And those aren't the only reasons that our behavior around these machines could matter. A few years after that first initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.

(Laughter)

And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."

(Laughter)

And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.

(Laughter)

So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.

Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots.

But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?"

Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog? And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles?

We don't know ... But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us.

And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves. Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.

Thank you.
(Applause)