ロボット工学者が解説、未来を予測したロボット映画の矛盾点 | The Breakdown | WIRED.jp
(A robotics engineer breaks down the contradictions in robot movies that predicted the future)

Transcript (subtitles auto-generated by AI)
  • I don't want to be human.

  • What am I?

  • Hi there. I'm Chris Atkeson.

  • Chris is a professor at the Robotics Institute at Carnegie Mellon University.

  • Today, I'll be breaking down clips from movies and TV about artificial intelligence and robotics.

Out-of-control robots: I, Robot

  • Detective! What are you doing?

  • You said they've all been programmed with the three laws.

  • Isaac Asimov was one of the earliest science-fiction writers who focused on robots, and he came up with this scheme of the three laws.

  • Yeah, I know the three laws.

  • Your perfect circle of protection. Robots don't hurt humans.

  • But doesn't the second law state that a robot has to obey any order given by a human being, except if it causes a violation of the first law, right?

  • But the third law states that a robot can defend itself, unless it causes a violation of the second law... and the first law.

  • We have 1,000 robots that will not try to protect themselves if it violates a direct order from a human, and I'm betting one who will. Got you... get the hell out of here.

  • What am I?

  • You saw a robot that somehow had its three laws disabled, and that left it with an existential crisis: "What am I?" Which is sort of similar to: what am I supposed to do next if I don't have any guiding purpose?

  • And if it were the case that robots ran on what we call expert systems, or sets of laws, that might actually be a reasonable way to program robots.
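As a toy illustration of that rule-based style, here is a minimal sketch; the function and argument names are hypothetical, not any real robot API:

```python
# Behavior as explicit, ordered rules -- the "expert system" style described
# above. All names here are hypothetical illustrations.

def allowed(action, harms_human, human_order, endangers_self):
    """Return True if `action` is permitted under Asimov-style ordered rules."""
    if harms_human:                 # First Law: never harm a human
        return False
    if human_order is not None:     # Second Law: obey humans...
        return action == human_order
    if endangers_self:              # Third Law: ...otherwise protect yourself
        return False
    return True

# The I, Robot scene hinges on the ordering: a robot told "get out of here"
# must obey, even though staying put would be self-preservation.
print(allowed("stay", harms_human=False, human_order="leave", endangers_self=False))   # False
print(allowed("leave", harms_human=False, human_order="leave", endangers_self=False))  # True
```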

  • You know what they say, laws are made to be broken.

  • But nowadays we're actually programming robots in a very different way: by giving them lots of training examples and having them essentially learn parameters in formulas so that they do the right thing.

  • So we have this mismatch between logical rules and numbers in a formula.
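A minimal sketch of the contrast: instead of writing rules, tune numbers in a formula until the outputs match training examples. Toy data only; no real robot-learning library is implied:

```python
# "Programming" by examples: learn parameters w, b in the formula y = w*x + b.
examples = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # (sensor reading, desired command)
w, b = 0.0, 0.0

for _ in range(2000):                # gradient descent on squared error
    for x, y in examples:
        err = (w * x + b) - y
        w -= 0.01 * err * x
        b -= 0.01 * err

print(round(w, 2), round(b, 2))      # ~2.0 and ~1.0: learned, not hand-written
```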

  • So in robot movie after robot movie, they're obsessed with the robots actively turning against the humans and starting to kill humans.

  • And I get it, the goddamn robots, John.

  • Far more likely is the robot will screw up. It will make a mistake, and that mistake will have bad consequences.

  • It's a robot. It doesn't need a motive. It just has to be broken.

Deactivating an android: Blade Runner

  • Should I really shoot the crazy robot that's coming to attack me? Yeah, shooting a robot is potentially a pretty good way to stop a robot.

  • Now, there might be parts of its chassis where you shoot a bullet through and it doesn't do anything: it doesn't cut any wires, doesn't open up a hydraulic-fluid hose. So you might have to shoot it a bunch of times.

  • There are other ways to disable a robot that might not damage it in the same way. For example, there's something called an electromagnetic pulse, which will fry all its circuits but leave the mechanics intact.

  • Blade Runner is a fantastic movie. It has this incredible vision of the future, which, by the way, we've already caught up with.

Learning from imitation: Terminator

  • Hey, buddy. Got a dead cat in there or what?

  • You're going to program robots to have conversations and dialogue, and in many, many situations you can anticipate what happens next. But sooner or later the robot's going to face a situation that's new, where it doesn't know what to say.

  • Dead cat in there?

  • You know, the robot's going to have to wing it. In this clip, a screen came up, English words were there, and it sort of moved a cursor down and picked one. That's only there for the audience of the movie; the robots aren't going to do that.

  • You know, it's all electronics, little transistors going blip, blip, blip, and it's going to make the decision that decides our fate in a microsecond.

Programming: the new Westworld

  • Did you see it? No, give it a second. She'll do it again. Her finger: that's not standard.

  • Occasionally in Westworld you see a black box that looks like a fancy keyboard, on which they're sort of programming various sub-behaviors. Let's call them primitives.

  • There's been this controversy of: do you build up behaviors from a twitch here and a twitch there, or do you have a few fundamental behaviors and then combine them?

  • You must have... I slipped it in there without telling anyone.

  • It turns out you can learn a lot faster if you combine these fundamental behaviors rather than adding up a lot of twitches. He calls them reveries.
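A sketch of that "primitives vs. twitches" idea, with hypothetical primitives (not Westworld's actual system): chaining a few building blocks is a far smaller search space than learning every joint twitch:

```python
# Hypothetical motion primitives -- reusable building blocks of behavior.
def reach(target):  return [f"move hand toward {target}"]
def grasp():        return ["close fingers"]
def retract():      return ["pull hand back"]

def pick_up(obj):
    # Combining a few fundamental behaviors: learning which primitives to
    # chain is far easier than learning every low-level twitch from scratch.
    return reach(obj) + grasp() + retract()

print(pick_up("cup"))
```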

  • The old gestures were just generic movements. These are tied to specific memories.

  • And, this is a spoiler: things in Westworld go bad because they couldn't completely wipe the memory of a robot.

  • The memories are purged at the end of every narrative. They're still in there, waiting to be overwritten. He found a way to access them.
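That plot point mirrors how deletion often works in real storage systems: a "purge" can just mark space as free, leaving the old bytes readable until something overwrites them. A toy sketch with hypothetical names:

```python
# "Purging" by bookkeeping: the data survives until it is overwritten.
storage = {"slot1": "memory of narrative #42"}
free_list = []

def purge(slot):
    free_list.append(slot)          # cheap wipe: just forget the slot is in use...

purge("slot1")
print(storage["slot1"])             # ...but the old memory is still there to find
storage["slot1"] = "new narrative"  # only an overwrite truly destroys it
```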

  • It turns out in complex machines you've got lots of different kinds of memory in lots of different kinds of places. For example, the CPU is going to have a little bit of memory in it. It will also have something called cache memory. We have what's called a memory hierarchy: there's fast memory and then there's slower memory, like a subconscious.

  • A hooker with hidden depths.
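A sketch of that hierarchy; the latencies are commonly quoted orders of magnitude, not measurements of any particular machine:

```python
# Rough, commonly quoted access times for each level of the memory hierarchy.
hierarchy = [
    ("CPU registers", 1e-10),   # ~0.1 ns
    ("L1/L2 cache",   1e-9),    # ~1 ns: the "fast memory"
    ("Main memory",   1e-7),    # ~100 ns
    ("SSD / disk",    1e-4),    # ~100 microseconds: the slow "subconscious"
]
for name, seconds in hierarchy:
    print(f"{name:13s} ~{seconds * 1e9:10.1f} ns per access")
```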

  • If you totally fried the machine, all you'd end up with is a broken robot. So you can't totally wipe the machine.

  • It's the tiny things that make them seem real.

Robotics lab: Making Mr. Right

  • Okay, you look at that scene and you say: why is chemistry happening in a robotics lab?

  • The future of robotics is one where we make materials, using chemistry to make soft materials. So that actually looks very similar to what goes on in some of my colleagues' labs at Carnegie Mellon.

  • I thought showing you these tapes might help to make you more familiar with the drawing.

  • We video-record all our experiments; in fact, often when we press an "on" button to say "do something," that simultaneously turns the cameras on.

  • Programming the android takes him just so far. The rest must be learned.

  • Watching video feedback is very helpful to debug behavior.

  • We had some difficulty with those gross motor functions before we modified his cerebral-muscular coordination.

  • Okay, so you just saw some mumbo jumbo. He's mixing things we'd say about a human who had a disease ("we modified his cerebral-muscular coordination") with a robot that doesn't know how to walk ("gross motor functions").

  • Oh, my robots actually fell down much more than their robot did.

Robot malfunction: Austin Powers

  • Mm hmm.

  • Can you design an input to a robot that will cause it to malfunction or crash? Only Austin Powers can do that. Yeah, baby!

Multi-agent robotics: Minority Report

  • One area of research in robotics is what we call multi-agent robotics, which is to get a bunch of robots to work together.

  • Mm hmm.

  • We model that on humans working together, like a sports team, or when we're searching for somebody lost in the woods.

  • The small robots in this clip would seem very smart and complicated. The dream in robotics is: can we make a lot of stupid, cheap robots that, by working together, get the job done?
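A toy model of that dream: random-walking searchers on a corridor, where adding more cheap robots finds a hidden target in fewer steps. Purely illustrative:

```python
import random

def search_steps(n_robots, length=100, seed=1):
    """Steps until any random-walking robot lands on a hidden target."""
    rng = random.Random(seed)
    target = rng.randrange(length)
    robots = [rng.randrange(length) for _ in range(n_robots)]
    for step in range(1, 1_000_000):
        robots = [(r + rng.choice((-1, 1))) % length for r in robots]
        if target in robots:
            return step
    return None

for n in (1, 10, 100):
    print(f"{n:3d} robots -> target found in {search_steps(n)} steps")
```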

  • What we saw in this clip is they were using some kind of imaging radar to figure out where people are likely to be. Tom Cruise hid in the bathtub; the water shields him from the radar, as you saw. The water also tries to hide his thermal signature: he's hotter than everything else around him, and any kind of infrared imaging would have found him. That's why he dumped ice in the water as well.

  • But you know, if the robots had any kind of camera, they could have just looked down, and there's Tom Cruise. So I think we're going to have to do a little better than that if we're going to hide from the robots.

Thermal vision: Westworld

  • The Westworld clip is using what we call thermal imaging, far-infrared imaging, where you essentially can see an image of the temperature of things that are out there.

  • When the human got next to something that was much hotter, the movie tried to suggest that that would hide the human. That's actually probably not the case, because the human isn't any less hot, and as long as the image doesn't do what we call "bloom" and the whole thing saturate, that robot should have been able to see the human just as well.
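A sketch of that point with toy numbers: a warm human next to a much hotter object stays visible unless the sensor clips ("blooms"):

```python
import numpy as np

scene = np.full((5, 5), 20.0)   # 20 C background temperatures
scene[2, 1] = 37.0              # the human
scene[2, 3] = 400.0             # a much hotter object nearby

# Ideal sensor: the human is still a clear 17-degree bump over background.
print(scene[2])

# A clipping 8-bit sensor scaled so 255 counts = 40 C: the hot object
# saturates, but the human's pixel is untouched -- still easy to spot.
counts = np.clip(scene / 40.0 * 255.0, 0, 255).astype(np.uint8)
print(counts[2])
```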

  • Mhm.

  • Thanks.

Human limitations: Battlestar Galactica

  • The five of us designed you to be as human as possible.

  • I don't want to be human. I want to see gamma rays. I want to hear X-rays, and I want to... I want to smell dark matter.

  • It's highly unlikely robots are going to complain about their bodies. What they might complain about is our crappy computational hardware.

  • I can't even express these things properly.

  • A lot of it gets back to how we build computers. Right now, for largely historical reasons, we separate out the thinking part, we'll call that the processor, and the memory part. And in order to really think about things, we've got to move everything in the memory part into the processor so it can process it. And it turns out that's really slow. And if you want to save stuff, you've got to move it back, and that's really slow.
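A back-of-the-envelope sketch of that bottleneck; the bandwidth and compute figures are illustrative round numbers, not any machine's specs:

```python
# Adding two big vectors: the arithmetic is trivial, the data movement is not.
flops_per_sec = 1e12    # what the processor could compute (illustrative)
bytes_per_sec = 1e11    # what the memory bus can deliver (illustrative)

n = 10_000_000
compute_time = n / flops_per_sec              # n additions
move_time = 3 * 8 * n / bytes_per_sec         # 3 arrays of 8-byte floats moved

print(f"compute {compute_time*1e3:.2f} ms vs data movement {move_time*1e3:.2f} ms")
```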

  • If I'm so broken, then whose fault is that? It's my maker's fault.

  • Google has something called a TPU, which they're optimizing to run something called neural networks, and they're building bucketloads of these things. They're very, very different from standard computers.
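A sketch of why such chips specialize: a neural-network layer is essentially one big multiply-accumulate, which is exactly what dedicated matrix hardware streams through. Tiny illustrative layer only:

```python
import numpy as np

x = np.random.rand(1, 64)     # input activations
W = np.random.rand(64, 32)    # learned parameters
y = np.maximum(x @ W, 0.0)    # one layer: matrix multiply + ReLU

# 64*32 multiply-adds for one tiny layer; real models do billions of these.
print(y.shape, 64 * 32, "multiply-accumulates")
```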

  • But I know I want to reach out with something other than these prehensile paws.

  • I do believe that the way we build computers now is going to change completely in many ways.

  • I'm a machine, and I could know much more.

  • If the robots want to help us do a better job at that, more power to them.

Robotics lab: Rising Sun

  • Why do we have guys in gold suits? I don't know why you'd wear gold suits.

  • You'd certainly, in a clean room, wear some kind of suit so your dandruff doesn't ruin the chips.

  • They look ridiculous.

  • Jim! How are you, Captain Connor?

  • The people making the movie said we've got to get us some cool-looking robots.

  • But I've been told by reliable sources that only you have the next generation of technology to do this kind of work.

  • So they actually reached out to what was then the Leg Lab, which was at that time at M.I.T. You're looking at early robots from the lab that eventually became Boston Dynamics.

  • I'm getting out to a lot more Dodger games lately.

  • The robot on the boom, I believe, was the Uniroo, which was a kangaroo-like robot. And the 3D Biped is quite famous. It was, you know, one of the first robots that didn't have a sort of protective system to keep it from falling down.

  • Well, your reliable sources are wrong.

  • And it did a lot of amazing things: walk, run, it even did flips. In contrast, the robot you saw earlier inside that building was on a boom, and it runs in a circle.

  • What will they think of next?

Design: Bicentennial Man

  • Have you given any thought whatsoever as to what age you'd like to be?

  • Officially, I am 62 years old.

  • Mhm. Let's take off 25 years. What do you say? 15? 20? Perfect.

  • They used soft materials to make the face of the robot. Soft materials typically do what we call "creep" over long periods of time. The animatronic figures, such as the Disney presidents and whatnot, all begin to sag.

  • Just keeping you on your toes.

  • I don't have any toes.

  • A big problem with robot skin is after a while it gets worn: it gets cuts, it sags. It's actually a big problem, because if it's loaded up with sensors and wires, it's a million-dollar piece of skin, and fixing it is a big problem. But otherwise that's pretty much how you make a robot face: you have some mechanism and you cover it with some soft, gooey stuff.

Origami: Transformers

  • Mhm.

  • Mhm.

  • There are a lot of roboticists out there who want to build origami robots, robots that can reconfigure themselves by folding. That turns out to be a great way to build robots.

  • Another area of robotics is more like traditional origami, where you have a flat sheet and a folding pattern, and that makes a three-dimensional robot.

  • Why is it a good idea to start things from the flat sheet? You can use any printing technology you want to lay down things like surfaces or materials.

  • But most of these folding robots are on a smaller scale, for structural reasons. So they're on the meter or centimeter scale, rather than the scale of hundreds or tens of meters.
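A geometric sketch of the flat-sheet idea: a single valley fold along a crease line lifts part of a 2D sheet into 3D. Illustrative only:

```python
import numpy as np

# Corners of a flat unit square, sitting in the z = 0 plane.
corners = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], float)

def fold(points, crease_x=0.5, angle=np.pi / 2):
    """Rotate everything right of the crease line x = crease_x up out of the plane."""
    out = points.copy()
    right = points[:, 0] > crease_x
    dx = points[right, 0] - crease_x
    out[right, 0] = crease_x + dx * np.cos(angle)
    out[right, 2] = dx * np.sin(angle)
    return out

print(fold(corners).round(2))   # two corners now stand upright at z = 0.5
```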

  • Big guys. Big guys with big guns.

Robotic insects: Black Mirror

  • Colony collapse disorder. We still don't know what's behind it. Bees themselves were virtually extinct.

  • So what our ladies do is effectively stand in for them.

  • This clip is very close to the truth, in that people are trying to build robot insects right now. They can build things that are almost as small as a bee, and that can fly.

  • We simply set the behavior and leave them to it.

  • They need rudimentary pattern recognition in order to locate compatible flora, navigate... They even construct these hives themselves.

  • They reproduce? How?

  • Each hive is a replication point.

  • I was entertained by the hives, which had these sort of squarish honeycombs. It's basically like a 3D printer.

  • They also had a little bit of confusion about reproduction.

  • They create duplicates of themselves, create more hives, and spread out exponentially.

  • They seem to suggest that there's this big 3D printer that's churning them out, which I wouldn't really call reproduction; I'd call that production.

  • A shame.

  • It's necessary. The alternative would be an environmental catastrophe. These were dying out.

  • Are we going to have to make robots in big factories in the future, or are we going to find a similar way to just grow new robots? It's a big issue for robots.

  • Didn't expect to find myself living in the radiation.

Chernobyl

  • Alright, let's take this easy. Forward one meter. Reverse. One swim.

  • Could you lose the signal?

  • It's not the signal. It's the vehicle. It's dead.

  • The problem with high-energy radiation is you have chips, circuits, which rely on every part working. A high-energy particle comes in and does damage: it knocks out a trace or two and the whole thing stops working. And they should have known that that was going to happen.
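A toy illustration of that failure mode: a single flipped bit corrupting a stored value.

```python
# One high-energy particle strike can flip one stored bit.
command = 100                  # intended motor command: move 100 mm
flipped = command ^ (1 << 7)   # a particle strike flips bit 7
print(command, "->", flipped)  # 100 -> 228: the rover misbehaves or halts
```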

  • I'm a little surprised at that. That robot was never going to work.

  • An implication of this is: why does military electronics cost so much? They anticipate high levels of radiation.

  • If you're fighting in the midst of a nuclear war, that amount of gamma radiation penetrates everything. The particles literally shred the circuits in microchips apart.

  • All their equipment needs to be rad-hard as well. It means the circuit is much more likely to keep working in the presence of high-energy radiation.
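One classic hardening trick (among shielding and special fabrication, and not necessarily what the speaker has in mind) is redundancy with majority voting; a minimal sketch:

```python
# Triple modular redundancy: compute three copies, outvote a single upset.
def voted(a, b, c):
    return a if a == b or a == c else b   # majority of three

healthy1, healthy2, upset = 100, 100, 228  # one copy corrupted by radiation
print(voted(healthy1, healthy2, upset))    # 100: the fault is masked
```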

  • And that makes it fantastically expensive.

  • I'm surprised at how good the old stuff was at predicting what's going to happen.

  • A lot of roboticists hate Hollywood. I love Hollywood. You know, give me more robot movies. It's very inspirational. Let's go out there and watch more movies.

  • Mm hmm.

