聽力課堂TED音頻欄目主要包括TED演講的音頻MP3及中英雙語文稿,供各位英語愛好者學習使用。本文主要內(nèi)容為演講MP3+雙語文稿:為什么我們和機器人有情感上的聯(lián)系,希望你會喜歡!
【演講者及介紹】Kate Darling
凱特·達林(Kate Darling)——機器人倫理學家,研究人類與機器人之間的關系。
【演講主題】為什么我們和機器人有情感上的聯(lián)系
【中英文字幕】
Translation by psjmz mz. Reviewed by Xinran Bi.
00:13
There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down. It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots. And this one has really cool technical features. It had motors and touch sensors and it had an infrared camera. And one of the things it had was a tilt sensor, so it knew what direction it was facing. And when you held it upside down, it would start to cry. And I thought this was super cool, so I was showing it off to my friend, and I said, "Oh, hold it up by the tail. See what it does." So we're watching the theatrics of this robot struggle and cry out. And after a few seconds, it starts to bother me a little, and I said, "OK, that's enough now. Let's put him back down." And then I pet the robot to make it stop crying.
大概10年前的一天,我讓一個朋友把一個小恐龍機器人倒過來拿著。這個機器人是我訂購的一款叫做Pleo的玩具,我對此非常興奮,因為我一直都很喜歡機器人。這個機器人有很酷的技術特征。它有馬達和觸覺傳感器,還有一個紅外攝像頭。它還裝有一個傾斜傳感器,所以它知道自己朝著什么方向。當你把它倒過來,它會開始哭泣。我覺得這點非常酷,所以我展示給我朋友看,我說:“抓著尾巴把它提起來,看看它會怎樣?!庇谑俏覀兛粗@個機器人戲劇性地掙扎、哭喊。幾秒鐘后,我開始感到有點不安,于是我說,“好了,差不多了,我們把它放回去吧?!比缓笪覔崦鴻C器人,讓它停止哭泣。
01:19
And that was kind of a weird experience for me. For one thing, I wasn't the most maternal person at the time. Although since then I've become a mother, nine months ago, and I've learned that babies also squirm when you hold them upside down.
這對我來說是一種奇怪的經(jīng)歷。首先,我那時還不是個很有母性的人。盡管從那以后,也就是9個月前,我已經(jīng)成為了一個母親,我還了解到,當你把嬰兒倒過來抱時,嬰兒也會扭來扭去地掙扎。
01:33
(Laughter)
(笑聲)
01:35
But my response to this robot was also interesting because I knew exactly how this machine worked, and yet I still felt compelled to be kind to it. And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot? And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room, that in a world where we're increasingly integrating robots into our lives, an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.
但我對這個機器人的反應也非常有趣,因為我確切地知道這臺機器的工作原理,然而我仍然忍不住想要善待它。這個觀察激起了我的好奇心,讓我在過去10年里不斷探尋。為什么我會去安慰這個機器人?我發(fā)現(xiàn),我對待這臺機器的方式不僅僅是我起居室里的一個尷尬時刻,在一個我們正越來越多地將機器人融入生活的世界里,像這樣的本能可能真的會產(chǎn)生后果,因為我發(fā)現(xiàn)的第一件事情是,這並非只是發(fā)生在我身上的個例。
02:20
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines. And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs, and every time it stepped on a mine, one of the legs would blow up, and it would continue on the other legs to blow up more mines. And the colonel who was in charge of this testing exercise ends up calling it off, because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield. Now, what would cause a hardened military officer and someone like myself to have this response to robots?
2007年,《華盛頓郵報》報道稱,美國軍方正在測試一種拆除地雷的機器人。它的形狀就像一只竹節(jié)蟲,用腿在雷區(qū)上行走,每次踩到地雷時,它的一條腿就會被炸掉,然后繼續(xù)用其他腿去引爆更多的地雷。負責這次測試的上校后來取消了這個測試,因為他說,看著這個機器人拖著殘破的身軀在雷區(qū)掙扎行走,實在太不人道了。那么,是什么導致了一位久經(jīng)沙場的軍官和像我這樣的人對機器人有這種反應呢?
03:04
Well, of course, we're primed by science fiction and pop culture to really want to personify these things, but it goes a little bit deeper than that. It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us. So people will treat all sorts of robots like they're alive. These bomb-disposal units get names. They get medals of honor. They've had funerals for them with gun salutes. And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.
不可否認,我們都被科幻小說及流行文化所影響,很想將這些東西擬人化,但其中的原因還要更深一層。事實表明,我們天生就傾向于把意圖和生命投射到物理空間中任何在我們看來能自主運動的物體上。所以人們像對待活物一樣對待各種各樣的機器人。這些拆彈機器人有自己的名字。它們能獲得榮譽勛章。人們?yōu)樗鼈兣e行了葬禮,並用禮炮向它們致敬。研究還發(fā)現(xiàn),我們即便對非常簡單的家居機器人也會這樣,比如Roomba吸塵器。
03:41
(Laughter)
(笑聲)
03:42
It's just a disc that roams around your floor to clean it, but just the fact it's moving around on its own will cause people to name the Roomba and feel bad for the Roomba when it gets stuck under the couch.
它只是一個在你家地板上四處游走、進行清掃的圓盤,但僅僅因為它能夠自己移動,人們就會給Roomba取名字,當它卡在沙發(fā)下面時,還會替它感到難過。
03:53
(Laughter)
(笑聲)
03:55
And we can design robots specifically to evoke this response, using eyes and faces or movements that people automatically, subconsciously associate with states of mind. And there's an entire body of research called human-robot interaction that really shows how well this works. So for example, researchers at Stanford University found out that it makes people really uncomfortable when you ask them to touch a robot's private parts.
我們可以專門設計機器人來喚起這種反應,利用眼睛、面孔或動作這些人們會自動地、在潛意識中與心智狀態(tài)聯(lián)系起來的特征。有一整個叫做人機交互的研究領域表明,這種方法的效果的確非常好。比如,斯坦福大學的研究者發(fā)現(xiàn),當你讓人們觸摸機器人的私處時,他們會感到很不舒服。
04:20
(Laughter)
(笑聲)
04:22
So from this, but from many other studies, we know, we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.
從這個以及更多其他研究中,我們知道人們會對這些栩栩如生的機器給他們的線索做出反應,即使他們知道它們只是機器。
04:34
Now, we're headed towards a world where robots are everywhere. Robotic technology is moving out from behind factory walls. It's entering workplaces, households. And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces, I think that maybe the best analogy we have for this is our relationship with animals. Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship. And throughout history, we've treated some animals like tools or like products, and other animals, we've treated with kindness and we've given a place in society as our companions. I think it's plausible we might start to integrate robots in similar ways.
我們正邁入一個機器人無處不在的世界。機器人科技正在走出工廠的圍墻。它們正在進入工作場所和家庭。隨著這些能夠感知、自己做決定和學習的機器進入這些共享空間,我認為一個最好的類比就是我們和動物的關系。幾千年前,我們開始馴養(yǎng)動物,我們訓練它們?yōu)槲覀児ぷ?、作?zhàn)和陪伴我們。縱觀歷史,我們把有些動物當作工具或產(chǎn)品使用,對另一些動物,我們則善待它們,在社會中給予它們同伴的位置。我認為我們很可能會開始以類似的方式接納機器人。
05:22
And sure, animals are alive. Robots are not. And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything. But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces, we need to understand that people will treat them differently than other devices, and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with, that can be anything from inefficient to dangerous. But in other cases, it can actually be useful to foster this emotional connection to robots. We're already seeing some great use cases, for example, robots working with autistic children to engage them in ways that we haven't seen previously, or robots working with teachers to engage kids in learning with new results. And it's not just for kids. Early studies show that robots can help doctors and patients in health care settings.
當然,動物有生命。機器人沒有。根據(jù)我與機器人專家共事的經(jīng)驗,我可以告訴各位,我們距離能產(chǎn)生感情的機器人還很遙遠。但我們會對它們產(chǎn)生感情,這一點很重要,因為如果我們嘗試把機器人整合進這些共享空間,就需要懂得人們會把它們與其他設備區(qū)別對待,而且在有些場景下,比如那個士兵對一起工作的機器人產(chǎn)生情感依戀的例子,這輕則導致效率低下,重則帶來危險。但在其他場景下,培養(yǎng)與機器人的情感聯(lián)系可能非常有用。我們已經(jīng)看到了一些很好的應用場景,比如機器人以我們前所未見的方式與自閉癥兒童互動,或者機器人與老師合作,在幫助孩子們學習方面取得新的成果。而且這並不只適用于兒童。早期的研究發(fā)現(xiàn),機器人可以在醫(yī)療保健領域幫助醫(yī)生和病人。
06:26
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients. It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot, and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care." And this is a really common response, and I think it's absolutely correct, because that would be terrible. But in this case, it's not what this robot replaces. What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots, because people will consistently treat them more like an animal than a device.
這是帕羅(PARO)嬰兒海豹機器人。它被用于療養(yǎng)院,陪伴老年癡呆癥患者。它已經(jīng)面世有一陣子了。我記得若干年前,在一次聚會上跟一個人講到這個機器人,她的反應是,“哦,天哪。太可怕了。我無法相信我們給人們的是機器人護理,而不是人類護理?!边@是一個非常普遍的反應,我認為這種想法完全正確,因為那樣確實會很糟糕。但在這個例子中,這個機器人替代的並不是人類護理。它替代的是動物療法,用在我們無法使用真正的動物、但可以使用機器人的場合,因為人們會始終把它們更多地當成動物而不是設備來看待。
07:16
Acknowledging this emotional connection to robots can also help us anticipate challenges as these devices move into more intimate areas of people's lives. For example, is it OK if your child's teddy bear robot records private conversations? Is it OK if your sex robot has compelling in-app purchases?
承認這種與機器人的情感聯(lián)系,也能幫助我們在這些設備進入人們生活中更私密的領域時預見到挑戰(zhàn)。比如,如果你孩子的玩具熊機器人會錄下私人對話,這樣可以嗎?如果你的性愛機器人內(nèi)置了讓人難以抗拒的應用內(nèi)購買,這樣可以嗎?
07:34
(Laughter)
(笑聲)
07:36
Because robots plus capitalism equals questions around consumer protection and privacy.
因為機器人加上資本主義,就等于消費者保護和隱私方面的問題。
07:43
And those aren't the only reasons that our behavior around these machines could matter. A few years after that first initial experience I had with this baby dinosaur robot, I did a workshop with my friend Hannes Gassert. And we took five of these baby dinosaur robots and we gave them to five teams of people. And we had them name them and play with them and interact with them for about an hour. And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.
這些還不是我們對待這些機器的行為之所以重要的唯一原因。在我與這只小恐龍機器人有了最初那次經(jīng)歷的幾年后,我和朋友漢內(nèi)斯·加瑟特開展了一次研討會。我們拿了5個小恐龍機器人,把它們分給5隊人。我們讓他們給機器人取名字,和它們一起玩耍互動大約一個小時。然后我們拿出一把錘子和一把短柄斧,讓他們?nèi)フ勰ズ蜌⑺肋@些機器人。
08:13
(Laughter)
(笑聲)
08:17
And this turned out to be a little more dramatic than we expected it to be, because none of the participants would even so much as strike these baby dinosaur robots, so we had to improvise a little, and at some point, we said, "OK, you can save your team's robot if you destroy another team's robot."
這個結果比我們預想的更有戲劇性,因為甚至沒有一個參與者願意動手打一下這些小恐龍機器人,所以我們只好臨場應變。后來我們說,“好吧,你可以保住你們隊的機器人,但前提是把其它隊的機器人毀掉?!?
08:35
(Laughter)
(笑聲)
08:37
And even that didn't work. They couldn't do it. So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them." And this guy stood up, and he took the hatchet, and the whole room winced as he brought the hatchet down on the robot's neck, and there was this half-joking, half-serious moment of silence in the room for this fallen robot.
即便這樣也沒用,他們下不了手。所以最后,我們說,“我們將要毀掉所有的機器人,除非有人用短柄斧劈向其中一個。”有個人站了起來,他拿起短柄斧,當他把斧頭砍向機器人的脖子時,整個房間的人都不禁瑟縮了一下,然后整個房間為這個倒下的機器人陷入了半開玩笑、半嚴肅的默哀。
09:02
(Laughter)
(笑聲)
09:03
So that was a really interesting experience. Now, it wasn't a controlled study, obviously, but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal, where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects. So instead of choosing something cute that people are drawn to, we chose something more basic, and what we found was that high-empathy people would hesitate more to hit the HEXBUGs.
那真是一次有趣的體驗。顯然,它不是一個對照實驗,但它引出了我后來在麻省理工跟帕拉什·南迪和辛西婭·布雷西亞爾做的研究,我們讓來到實驗室的人們砸碎這些像昆蟲一樣栩栩如生地四處移動的HEXBUG電子甲蟲。我們沒有選擇那種惹人喜愛的可愛東西,而是選擇了更基礎的東西,結果我們發(fā)現(xiàn),富有同情心的人在下手打這些HEXBUG時會更加猶豫。
09:34
Now this is just a little study, but it's part of a larger body of research that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots. But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?" It's: "Can robots change people's empathy?" Is there reason to, for example, prevent your child from kicking a robotic dog, not just out of respect for property, but because the child might be more likely to kick a real dog?
這只是一個小小的研究,但它是一個更大范圍研究的一部分,這些研究開始表明,人們的同情心與他們對待機器人的行為之間可能存在某種聯(lián)系。但我對即將到來的人機交互時代的問題並不是:“我們會對機器人產(chǎn)生同情心嗎?”而是:“機器人會改變?nèi)祟惖耐樾膯幔俊北热缯f,我們是否有理由阻止孩子去踢一只機器狗,不只是出于愛惜物品,而是因為這個孩子可能因此更容易去踢一只真的狗?
10:11
And again, it's not just kids. This is the violent video games question, but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen. When we behave violently towards robots, specifically robots that are designed to mimic life, is that a healthy outlet for violent behavior or is that training our cruelty muscles? We don't know ... But the answer to this question has the potential to impact human behavior, it has the potential to impact social norms, it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws. Because even if robots can't feel, our behavior towards them might matter for us. And regardless of whether we end up changing our rules, robots might be able to help us come to a new understanding of ourselves.
并且,這不只適用于兒童。這是一個關于暴力游戲的問題,但這個問題上升到了一個全新的水平,因為這種出于本能的物質(zhì)性行為要比我們對屏幕上的圖像反應更強烈。當我們對機器人,對專門設計來模擬生命的機器人表現(xiàn)出暴力行徑時,這是暴力行為的健康疏導還是在培養(yǎng)我們實施殘忍行徑的力量?我們還不知道… 但這個問題的答案有可能影響人類行為,它有可能影響社會規(guī)范,可能會啟發(fā)我們制定對特定機器人能做什么和不能做什么的規(guī)則,就類似于我們的動物虐待法。因為即便機器人不能感知,我們對待它們的行為也可能對我們有著重要意義。不管我們是否最終會改變我們的規(guī)則,機器人也許能幫助我們對自己有一個全新的認識。
11:15
Most of what I've learned over the past 10 years has not been about technology at all. It's been about human psychology and empathy and how we relate to others. Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield, or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms. They're reflections of our own humanity.
我在過去10年中學到的東西大部分跟技術無關,而是關于人類心理學、同情心,以及我們如何與他人相處。因為當一個兒童友好地對待Roomba時,當一個士兵試圖拯救戰(zhàn)場上的機器人時,或者當一群人拒絕傷害小恐龍機器人時,這些機器人就不只是馬達、齒輪和算法。它們映射出了我們的人性。
11:46
Thank you.
謝謝。
11:47
(Applause)
(掌聲)