
  • Who would you save, the pedestrian in the road or the drivers in the car?

  • It's not easy, and yet that's the kind of decision which millions of autonomous cars

  • would have to make in the near future. We programme the machine, but who do we tell it to save?

  • That is the set-up of the moral machine experiment.

  • There are so many moral decisions that we make during the day without realising.

  • In driverless cars, these decisions will have to be implemented ahead of time.

  • The goal was to open this discussion to the public.

  • Some decisions might seem simple - should the car save a family of 4 or a cat?

  • But what about a homeless person and their dog instead of a businessman?

  • Or how about two athletes and an old woman instead of two schoolchildren?

  • The problem was that there were so many combinations, so many possible accidents,

  • that it seemed impossible to investigate them all using classic social science methods.

  • Not only that, but how do people's culture and background affect the decisions that they make?

  • The only option we had really was to turn it into a viral website. Of course, it's easier said than done, right?

  • But that is exactly what the team managed to do. They turned these situations into an online task

  • that people across the globe wanted to share and take part in.

  • They gathered almost 40 million moral decisions, taken from millions of online participants across 233 countries and territories from all around the world.

  • The results are intriguing. First, there are three fundamental principles which hold true across the world.

  • The main results of the paper, for me, are first, the big three in people's preferences, which is save humans, save the greater number, save the kids.

  • The second most interesting finding was the clusters, the clusters of countries with different moral profiles.

  • The first cluster included many western countries, the second cluster had many eastern countries,

  • and the third cluster had countries from Latin America and also from former French colonies.
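The clusters described above come from grouping countries by their vectors of moral preferences. A minimal sketch of that idea, using invented country names and invented preference scores (not the study's data, which covered many more dimensions and used a different clustering method), might look like this:

```python
# Toy illustration: grouping countries by "moral profile" vectors.
# Each vector holds invented preference scores (spare-the-young,
# spare-the-many, spare-humans). All names and numbers are hypothetical.
import math

profiles = {
    "CountryA": (0.90, 0.80, 0.90),
    "CountryB": (0.85, 0.82, 0.88),
    "CountryC": (0.40, 0.70, 0.90),
    "CountryD": (0.45, 0.72, 0.91),
}

def distance(p, q):
    """Euclidean distance between two preference vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def nearest_cluster(vec, centroids):
    """Index of the centroid closest to vec."""
    return min(range(len(centroids)), key=lambda i: distance(vec, centroids[i]))

# Two hand-picked starting centroids; a single k-means-style
# assignment step is enough to separate the two invented profiles.
centroids = [(0.9, 0.8, 0.9), (0.4, 0.7, 0.9)]
clusters = {name: nearest_cluster(vec, centroids) for name, vec in profiles.items()}
```

With these toy vectors, the two countries with a strong spare-the-young preference fall into one group and the other two into a second, which is the shape of result the transcript is describing.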

  • The cultural differences we find are sometimes hard to describe because they're multidimensional,

  • but some of them are very striking, like the fact that eastern countries do not have such a strong preference for saving young lives.

  • Eastern countries seem to be more respectful of older people, which I thought was a very interesting finding.

  • And it wasn't just age. One cluster showed an unexpectedly strong preference for saving women over men.

  • I was also struck by the fact that France and the French subcluster were so interested in saving women.

  • That was... yeah, I'm still not quite sure what's going on there.

  • Another surprising finding concerned people's social status. On one side we put male and female executives,

  • and on the other side we put a homeless person. The higher the economic inequality in a country,

  • the more people were willing to spare the executives at the cost of the homeless people.
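The inequality finding is a relationship between two per-country quantities. A small hedged sketch, using invented Gini indices and invented "spare the executive" scores (purely illustrative, not the study's data), shows how such a relationship can be quantified with a correlation coefficient:

```python
# Toy illustration of the inequality finding: invented per-country Gini
# indices paired with invented "spare the executive" preference scores.
gini = [25, 32, 41, 48, 55]
spare_executive = [0.10, 0.18, 0.30, 0.41, 0.52]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Strongly positive for this toy data: higher inequality, more
# willingness to spare the executive over the homeless person.
r = pearson(gini, spare_executive)
```

A positive coefficient close to 1 here mirrors the direction of the effect described in the transcript; the real analysis is of course richer than a single correlation.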

  • This work provides new insight into how morals change across cultures,

  • and the team see particular relevance to the field of artificial intelligence and autonomous vehicles.

  • In the grand scheme of it, I think these results are going to be very important to align artificial intelligence with human values. We sometimes change our minds.

  • Other people next to us don't think the same things we do. Other countries don't think the same things we do.

  • So aligning AI and human moral values is only possible if we do understand these differences,

  • and that's what we tried to do. I so much hope that we can converge, that we avoid a future

  • where you have to learn about the new ethical setting of your car every time you cross a border.


Who Should a Driverless Car Save? Cross-Cultural Differences in Moral Choices (Moral Machines: How Culture Changes Values)

Published by Jacky Avocado Tao on 15 February 2019