  • How do we think about self-driving cars?

  • The technology is essentially here, and we now have to take a technology in which machines can make a bunch of quick decisions, oftentimes quicker than we can.

  • Ah, that could drastically reduce traffic fatalities, could drastically improve the efficiency of our transportation grid, and help solve things like the carbon emissions that are causing the warming of the planet.

  • But Joi made a very elegant and simple point, which is: what are the values that we're going to embed in the cars, if in fact we're going to get all the benefits of self-driving cars, and how do we make the public comfortable with it?

  • Now, some of it is just that right now the overriding concern of the public is safety.

  • Alright.

  • The notion of essentially taking your hands off the wheel. But as Joi pointed out, there are going to be a bunch of choices that you have to make. A classic problem being: if the car is driving and you can swerve to avoid a pedestrian who maybe wasn't paying attention, it's not your fault.

  • But if you swerve, you're going to go into a wall and might kill yourself.

  • And how do you make the calculations about odds and airbags and speed and all that, so that you can have that machine make that decision?

  • Uh, but that's a moral decision, not just a pure utilitarian decision.
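The "calculation" the speakers gesture at can be sketched as a toy expected-harm comparison. Everything below — the `expected_harm` helper, the probabilities, the mitigation weights — is a hypothetical illustration, not anything a real autonomous-vehicle system uses; it only makes concrete the point that choosing the weights is itself the moral decision:

```python
# Toy sketch of the swerve-vs-stay decision described above.
# All numbers are invented for illustration.

def expected_harm(p_collision: float, severity: float, mitigation: float) -> float:
    """Expected harm = chance of collision x severity of outcome,
    reduced by mitigating factors such as airbags or lower speed."""
    return p_collision * severity * (1.0 - mitigation)

# Option A: stay on course -- likely to strike the pedestrian.
stay = expected_harm(p_collision=0.9, severity=1.0, mitigation=0.0)

# Option B: swerve into the wall -- the risk shifts to the passenger,
# but airbags and braking mitigate part of the harm.
swerve = expected_harm(p_collision=0.8, severity=1.0, mitigation=0.5)

# A purely utilitarian rule just picks the smaller expected harm.
decision = "swerve" if swerve < stay else "stay"
print(decision)  # prints: swerve
```

Whoever sets `severity` and `mitigation` — does a passenger's life weigh the same as a pedestrian's? — is writing the values into the car, which is exactly the question raised in the conversation.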

  • And who's setting up those rules?

  • Do we have broad consensus around what those rules are?

  • That's going to be important.

  • We're going to have the same set of questions when it comes to medicine.

  • Um, we have invested heavily in thinking about precision medicine, or individualized medicine: thinking about how the combination of the human genome and, uh, computer data and a large enough sample size can potentially arrive at a whole host of cures.

  • Parkinson's, Alzheimer's, cancer.

  • There are a whole bunch of interesting choices that we're going to have to make as we proceed in this, because the better we get at it, the more predictive we are about certain genetic variations having an impact.

  • How we think about insurance, how we think about medical pricing, who gets what, when, how. Is that something that we're going to hand over to an algorithm, and if so, who is writing it?

  • So these are going to be unavoidable questions.

  • And I think that Joi is exactly right: making sure that the broad public, that's not necessarily going to be following every single iteration of this debate, still feels as if their voice is heard and they're represented.

  • The people in the room being mindful of a range of equities, that's going to be really important.

  • And what is the role of government in that context, as we start to get into these ethical questions?

  • Well, my instinct is, initially, the role is as a convener, of course.

  • The way I've been thinking about the regulatory structure, as AI emerges, is that early in a technology a thousand flowers should bloom, and the government should have a relatively light touch: investing heavily in research, making sure that there is a conversation between basic research and applied research and companies that are trying to figure out how to apply it.

  • A good example of where this has worked pretty well, I think, is in predicting the weather.

  • You've got big data, really complex systems.

  • The government basically said, hey, we've got all this data, and suddenly a whole bunch of folks are gathering around, working with the National Weather Center and developing new apps.

  • And we've actually been able to predict an oncoming tornado three to four times faster than it used to be.

  • That saves lives.

  • That's a good example of where the government isn't doing all the work initially, but is inviting others to participate.

  • As technologies emerge and mature, then figuring out how they get incorporated into existing regulatory structures becomes a tougher problem.

  • And the government needs to be involved a little bit more.

  • Not always to force the new technology into the square peg that exists, but maybe to change the peg.

  • And one of the things that we're trying to do, for example, is to get the Food and Drug Administration, the FDA, to redesign how it's thinking about genetic medicine.

  • When a lot of its rules and regulations were designed for a time when, you know, it was worried about heart stents. Uh, this is a very different problem.

  • So: basic research, government convening to make sure the conversations are happening, ensuring transparency.

  • But as things mature, making sure that, uh, there is a transition and a seamless way to rethink regulations as the technology develops.

  • And as Joi pointed out, making sure that the regulations themselves reflect a broad-based set of values, because otherwise, if it's not transparent, we may find that it's disadvantaging certain people, certain groups, or that the public is just suspicious of it.

  • I can say one thing about that.

  • So it ties to two things.

  • So one is, when we did the car trolley problem, I think we found that most people like the idea that the driver or the passenger could be sacrificed to save many people, but they would never buy that car.

  • And that was sort of the short version of the result.

  • The other related thing, which is, I don't know if you've heard of the neurodiversity movement, but, if we solve autism, let's say.

  • And Temple Grandin talks about this a lot.

  • She says that, you know, Mozart and Einstein and Tesla would all be considered autistic if they were here today.

  • I don't know if that's true, but there may be something to it, about the spectrum.

  • So if we were able to eliminate autism, um, and make everyone, you know, normal, I bet a whole swath of MIT kids would not be the way they are.

  • And you know, you probably wouldn't want Einstein as your kid, as somebody who was in Cambridge at the Harvard Law School.

  • I didn't want to echo that stereotype, but some of the brilliant kids are kind of on the spectrum.

  • And I think one of the things that's really important, whether we're talking about autism or just diversity broadly: one of the problems, I think, is that allowing the market and each individual to decide, okay, I just want a normal kid and I want a car that's going to protect me, is not going to lead to a maximum for the societal benefit.

  • And I think that, whether it's government or something else, we can't just have this be market-driven.

  • And I think a lot of these decisions are going to be this way.

  • I think that's a great point.

  • And it actually goes to the larger issue, um, that we wrestle with all the time around AI, and science fiction taps into this all the time.

  • Part of what makes us human are the kinks. They're the mutations, the outliers, the flaws that create art or the new invention, right?

  • We have to assume that if a system is perfect, then it's static. And part of what makes us who we are, part of what makes us alive, is that it's dynamic.

  • And we're surprised.

  • One of the challenges that we'll have over time is to think about where are those areas where it's entirely appropriate for us just to have things work exactly the way they're supposed to, without surprises.

  • So airline flight might be a good example where, you know, I'm not that interested in having surprises.

  • If I have a smooth flight every time, I'm fine.

  • Right?

  • Yeah.
