  • positive outcomes.

  • But there are certainly some risks.

  • Certainly, we've heard from folks like Nick Bostrom, concerned that AI's potential could outpace our ability to understand it.

  • What about those concerns?

  • And how do we think about that, moving forward, to protect not only ourselves but humanity at scale?

  • So let me start with what I think is the more immediate concern.

  • That's a solvable problem, but we have to be mindful of it.

  • And that is this category of specialized AI.

  • If you've got a computer that can play Go, a pretty complicated game with a lot of variations,

  • developing an algorithm that simply says "maximize profits on the New York Stock Exchange" is probably within sight.

  • And if one person or one organization got there first, they could bring down the stock market pretty quickly.

  • Or at least they could, um, you know, raise questions about the integrity of the financial markets.

  • Um, an algorithm that said, go figure out how to penetrate the nuclear codes in a country and figure out how to launch some missiles.

  • That's their only job.

  • It's very narrow.

  • It doesn't require a superintelligence.

  • It just requires a really effective algorithm that then teaches itself.

  • Then you got problems.

  • So, so part of, I think, my directive to my national security team is, um, don't worry as much yet about machines taking over the world; do worry about the capacity of either non-state actors or hostile actors to penetrate systems.

  • And in that sense, it's not, um, it is not conceptually different, or different in a legal sense, than a lot of the cybersecurity work that we're doing.

  • It just means that we're gonna have to be better, because those who might deploy these systems are going to be a lot better.

  • Now, I think, as a precaution, all of us have spoken to folks like Elon Musk who are concerned about, uh, the superintelligent machine.

  • There's some prudence in thinking about benchmarks that would indicate some general intelligence developing on the horizon.

  • And if we can see that coming over the course of three decades, five decades, you know, whatever the latest estimates are, if ever, because there are also arguments that this thing is a lot more complicated than people make it out to be.

  • Then, you know, future generations, our kids or grandkids, are gonna be able to see it coming and figure it out.

  • Um, but I do worry right now about specialized AI.

  • I was on the West Coast.

  • Some kid who looked like he was 25

  • shows me a laptop.

  • He says, well, this is not a laptop, it's an iPad.

  • He says, this is the future of radiology, right?

  • And he's got an algorithm that is teaching itself sufficient pattern recognition that over time it's gonna be a better identifier of disease than a radiologist would be.

  • And if that's already happening today on an iPad invented by some kid at MIT,

  • then, you know, the vulnerability of a lot of our systems is gonna be coming around pretty quick, and we're gonna have to have some preparation for that.

  • But Joi may have worse nightmares. I generally agree.

  • I think the only caveat is, I would say, there are a few people who believe that general AI will happen, at some fairly high percentage chance, in the next 10 years, people who are smart.

  • So I do think it's worth keeping aware, but the way I look at it is that there's a dozen or two different breakthroughs that need to happen for each of the pieces, so you can kind of monitor it.

  • And you don't know exactly when they're gonna happen, because they're, by definition, breakthroughs.

  • And I think it's kind of, when you think these breakthroughs will happen, you just have to have somebody close to the power cord. But I'm completely with the President that, in the short term, it's going to be bad people using AIs for bad things; it will be an extension of us.

  • And then there's this other meta thing which happens, which is a group of people.

  • So, so if you look at all of the hate on the Internet, it's not... one person doesn't control that, but it's a thing.

  • It is pointed.

  • It points at things.

  • It's definitely fueling some political activity right now, but it's kind of got a life of its own; it's not even code, it's a culture, and you see that also in the Middle East, right?

  • That's why it's so hard to prevent, because it actually gets stronger when you attack it. And to me, what's curious and interesting is going to be the relationship between an AI, say, a service that runs like that.

  • And then you throw in Bitcoin, which is the ability to move money around by a machine anonymously, and so, to me, it will be this weird thing.

  • And again, this is where I think it could be embedded: if you gave this sort of mob more tools, because they are actually fairly, uh, coordinated in their own peculiar way. And the good side is, you can imagine.

  • You know, I was talking to some politicians, like Michael Johnston in Colorado.

  • He's trying to figure out how we can harness these things to inform and engage citizens. So, so to me, the trick is: if you suppress it because of fear, the bad guys will still use it. What's important is to get the people who want to use it for good, communities and leaders, and figure out how to get them to use it, so that that's where we start to lean.

  • Yeah, this may not be a precise analogy.

  • Traditionally, when we think about security and protecting ourselves, we think in terms of, we need armor or walls against swords, blunt instruments, etcetera.

  • And increasingly, um, I find myself looking to medicine and thinking about viruses.

  • Antibodies, right?

  • How do you create healthy systems that can ward off destructive elements in a distributed way?

  • And that requires more imagination.

  • And we're not very good at it yet.

  • Part of the reason why cybersecurity continues to be so hard is because the threat is not a bunch of tanks rolling at you, but a whole bunch of systems that may be vulnerable to a worm getting in there.

  • It means that we've got to think differently about our security, make different investments that may not be as sexy, but actually may end up being as important as anything.

  • And part of the reason I think about this is because I also think that what I spend a lot of time worrying about are things like pandemics.

  • You can't build walls in order to prevent, you know, the next airborne, uh, lethal flu from landing on our shores.

  • Instead, what we have to do is be able to set up systems, to create public health systems in all parts of the world, quick triggers that tell us when we see something emerging.

  • Make sure we've got quick protocols.

  • Systems that allow us to make vaccines a lot smarter.

  • So if you think, uh, if you take that model, a public health model, and you think about how we can deal with, uh, the problems of cybersecurity, a lot of that may end up being really helpful in thinking about the AI threats.

  • And just one thing that I think is interesting is, when we start to look at the microbiome, there are microbes everywhere.

  • There's a lot of evidence to show that introducing good bacteria to fight against the bad bacteria is the strategy, and not to sterilize.

  • And I think that, well, Sunny and Bo, like, when I walk them on the South Lawn, some things I see them do that way.

  • Researchers are just finding that actually opening windows in hospitals, instead of sterilizing, may actually limit disease, so we have to rethink what clean means. And it's similar whether you're talking about cybersecurity or national security.

  • I think that the notion that you can make strict borders, or that you could eliminate every possible pathogen, is difficult.

  • And I think, I think that in that sense you're in a position to be able to see medicine and cyber, and, so, they're distributed threats.

  • But is there also the risk that this creates a new kind of arms race?

  • Look, I, I think there's no doubt that developing international norms, rules, protocols, verification mechanisms around cybersecurity generally, and AI in particular, is in its infancy.

  • Um, and part of the reason for that is, as Joi identified, you've got a lot of non-state actors who are the biggest players.

  • Part of the problem is that identifying who's doing what is much more difficult. If you're building a bunch of ICBMs, we see them; uh, if somebody's sitting at a keyboard, we don't. And so we've begun this conversation.

  • A lot of the conversation right now is not at the level of, you know, dealing with really sophisticated AI, but has more to do with, essentially, states establishing norms about how they use their cyber capabilities.

  • Part of what makes this an interesting problem is that the line between offense and defense is pretty blurred.

  • Um, you know, the truth of the matter is, and part of the reason why, for example, there's a debate here about cybersecurity: who are you more afraid of, Big Brother and the state, or the guy who's trying to empty out your bank account?

  • Part of the reason that's so difficult is that if we're going to police this Wild West, whether it's the Internet or AI or any of these other areas, then by definition the government's gotta have capabilities, and if it's got capabilities, then they're subject to abuse.

  • And, uh, at a time when there's been a lot of mistrust built up about government, that makes it difficult.

  • And when you have countries around the world who see America as the preeminent cyber power, now is the time for us to say, we're willing to restrain ourselves if you are willing to restrain yourselves.

  • The challenge is that the most sophisticated state actors, Russia, China, Iran, uh, don't always embody the same norms or values that we do. But we're gonna have to surface this as an international issue in order for us to be effective, because it's, effectively, a borderless problem, and ultimately all states are gonna have to worry about this.

  • It is very short-sighted if a state thinks that it can develop, uh, super capacities in this area without some 25-year-old kid in a basement somewhere figuring that out pretty quick.
