  • It's safe to say that our private information isn't private anymore.

  • New technologies are collecting data to be sold or shared between companies in sometimes questionable ways.

  • Y'all really think that "Black Mirror" isn't gonna happen?

  • Well, it's not going to happen; it's already happening.

  • Let's start in China, where employers are monitoring their employees' brain waves⏤oh, yeah, you heard that right, monitoring their brain waves.

  • To be clear, they're not attempting to read the workers' thoughts, but their emotions.

  • Factories, state-owned enterprises, and sections of the Chinese military are placing wireless sensors in employees' hats that record and transmit data similar to an electroencephalogram, or EEG.

  • By analyzing the incoming sensor data, AI models can detect anomalies that might indicate a spike in anger, depression, or anxiety.

  • This system is said to help employers find out who's stressed, modulate break times, and increase productivity, in turn, spiking company profits by an estimated ¥2 billion since 2014.

  • This tech is being used elsewhere, too, like assessing fatigue in high-speed train drivers and monitoring patients in hospitals.

  • Sure, it would be dope to find out if your significant other is really fine when they say "I'm fine" after a fight.

  • But how do you regulate something like this?

  • If emotional data is mineable, what happens if companies nefariously use it to abuse their power?

  • I listen to a lot of SZA and Drake⏤I'm emotional; please don't use my emotions against me.

  • China has a social credit score, a clout score based on your criminal record, donations to charity, loyalty to political parties, how many video games you buy, and even your friends' social credit scores.

  • This is just like "Black Mirror" series three's "Nosedive", where everyone has a score based on social interactions.

  • The Chinese government claims it's trying to build "trust" with this score, but its implications can be more sinister.

  • For instance, in 2016, a man was denied a plane ticket because a judge deemed a court apology "insincere" and placed him on a blacklist, tanking his score.

  • "Insincerity" is hella subjective, so how would we regulate for everyone's opinions?

  • Finally, China is using all this information to make you into a precog.

  • They're literally trying to predict political instability using feeds from surveillance cameras, phone usage, travel records, and religious orientation.

  • Extrapolating the negative consequences, this taps into personal data and can unfairly target groups based on prejudice, specifically the Uyghur and other predominantly Muslim populations.

  • And let's just say you protest this state-sponsored measure.

  • That affects your social credit score, which then can deny you things like plane tickets and jobs, keeping you trapped by the system.

  • Tracking every arena⏤personal, professional, recreational, political, etc.⏤is dangerous, especially in the United States, where we value life, liberty, and the pursuit of happiness.

  • Like we don't already know that the government is in our webcams, Siris, and Alexas.

  • Than... thank you?

  • It's pretty spooky to think about how the systemic issues we're already grappling with pretty hard in this society, such as all of these biases, could be magnified by technology we've already developed.

  • America has a lot to deal with right now, so maybe we should sit this tech out.

  • All of these tools can have a pro-social end goal, but it's too soon to tell if the ends justify the means.

  • Data will continue to be collected on us⏤that's for sure.

  • But, with few regulatory systems in place, we gotta keep an eye on this new tech that's already just chillin' here and stop pretending that this is all happening in some distant dystopian future.

  • Not only is AI gonna run the world someday, but it's already being used to predict the next global pandemic.

  • Wanna find out how? Bet you do.

  • Check out this video right here.

  • Thanks for watching, subscribe to Seeker, and come back for new videos every day so you can watch your computer as much as it's watching you.
