  • It's safe to say that our private information isn't private anymore.

  • New technologies are collecting data to be sold or shared between companies in sometimes questionable ways.

  • Y'all really think that "Black Mirror" isn't gonna happen?

  • Well, it's not going to happen; it's already happening.

  • Let's start in China, where employers are monitoring their employees' brain waves⏤oh, yeah, you heard that right, monitoring their brain waves.

  • To be clear, they're not attempting to read the workers' thoughts, but their emotions.

  • Factories, state-owned enterprises, and sections of the Chinese military are placing wireless sensors in employees' hats that record and transmit data similar to an electroencephalogram, or EEG.

  • By analyzing the incoming sensor data, AI models can detect anomalies that might indicate a spike in anger, depression, or anxiety.

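The video doesn't describe the detection method. A minimal sketch of how "spike" detection over streaming sensor readings can work is a rolling z-score check: flag any reading that lands several standard deviations away from that worker's recent baseline. The band-power input, window size, and threshold below are illustrative assumptions, not details of the deployed system.

```python
from collections import deque
import statistics

def zscore_anomalies(readings, window=60, threshold=3.0):
    """Yield (index, value, z) for readings far from the rolling baseline.

    `readings` is any iterable of floats, e.g. per-second EEG band-power
    values; `window` and `threshold` are illustrative guesses that a real
    deployment would have to tune per person.
    """
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) >= 10:  # wait for a minimal baseline first
            mean = statistics.fmean(recent)
            stdev = statistics.pstdev(recent) or 1e-9  # avoid divide-by-zero
            z = (value - mean) / stdev
            if abs(z) > threshold:
                yield i, value, z  # candidate "spike" in the signal
        recent.append(value)

# Simulated band-power stream with one artificial spike injected at i=80.
stream = [1.0 + 0.05 * (i % 7) for i in range(100)]
stream[80] = 3.5
for i, value, z in zscore_anomalies(stream):
    print(f"i={i}  value={value:.2f}  z={z:.1f}")
```

On this simulated stream only the injected outlier is flagged; real EEG data would need per-band filtering and a per-person baseline, which is exactly the tuning the reporting glosses over.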

  • This system is said to help employers find out who's stressed, modulate break times, and increase productivity, in turn, spiking company profits by an estimated ¥2 billion since 2014.

  • This tech is being used elsewhere, too, like assessing fatigue in high-speed train drivers and monitoring patients in hospitals.

  • Sure, it would be dope to find out if your significant other is really fine when they say "I'm fine" after a fight.

  • But how do you regulate something like this?

  • If emotional data is mineable, what happens if companies nefariously use it to abuse their power?

  • I listen to a lot of SZA and Drake⏤I'm emotional; please don't use my emotions against me.

  • China has a social credit score, a clout score based on your criminal record, donations to charity, loyalty to political parties, how many video games you buy, and even your friends' social credit scores.

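How those factors get combined isn't public, but mechanically a score like this reduces to a weighted sum plus a cutoff, as in the hypothetical sketch below. Every factor name, weight, and the blacklist threshold is invented for illustration.

```python
# Hypothetical composite score: all factor names, weights, and the
# cutoff are invented for illustration; the real system's inputs and
# formula are not public.
WEIGHTS = {
    "criminal_record": -40.0,      # offenses push the score down
    "charity_donations": 5.0,
    "party_loyalty": 10.0,
    "video_game_purchases": -1.0,
    "friends_avg_score": 0.2,      # your network drags you up or down
}
BLACKLIST_CUTOFF = 50.0

def social_score(factors, base=100.0):
    """Weighted sum over whichever known factors are present."""
    return base + sum(WEIGHTS[k] * v for k, v in factors.items() if k in WEIGHTS)

def can_buy_plane_ticket(score):
    # The gating described later in the video: one bad mark (say, an
    # apology a judge rules "insincere") tanks the score, and services
    # are denied below the cutoff.
    return score >= BLACKLIST_CUTOFF

person = {"criminal_record": 0, "charity_donations": 2,
          "party_loyalty": 1, "video_game_purchases": 30,
          "friends_avg_score": 90}
score = social_score(person)               # 100 + 10 + 10 - 30 + 18 = 108.0
print(score, can_buy_plane_ticket(score))  # 108.0 True
```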

  • This is just like "Black Mirror" series three's "Nosedive", where everyone has a score based on social interactions.

  • The Chinese government claims it's trying to build "trust" with this score, but its implications can be more sinister.

  • For instance, in 2016, a man was denied a plane ticket because a judge deemed a court apology "insincere" and placed him on a blacklist, tanking his score.

  • "Insincerity" is hella subjective, so how would we regulate for everyone's opinions?

  • Finally, China is using all this information to make you into a precog.

  • They're literally trying to predict political instability using feeds from surveillance cameras, phone usage, travel records, and religious orientation.

  • Extrapolating the negative consequences, this taps into personal data and can unfairly target groups based on prejudice, specifically Uyghurs and other predominantly Muslim populations.

  • And let's just say you protest this state-sponsored measure.

  • That affects your social credit score, which then can deny you things like plane tickets and jobs, keeping you trapped by the system.

  • Tracking every arena⏤personal, professional, recreational, political, etc.⏤is dangerous, especially in the United States, where we value life, liberty, and the pursuit of happiness.

  • Like we don't already know that the government is in our webcams, Siris, and Alexas.

  • Than... thank you?

  • It's pretty spooky to think about how the systemic issues we're already grappling with pretty hard in this society, such as all of these biases, could be magnified by technology we've already developed.

  • America has a lot to deal with right now, so maybe we should sit this tech out.

  • All of these tools can have a pro-social end goal, but it's too soon to tell if the ends justify the means.

  • Data will continue to be collected on us⏤that's for sure.

  • But, with few regulatory systems in place, we gotta keep an eye on this new tech that's already just chillin' here and stop pretending that this is all happening in some distant dystopian future.

  • Not only is AI gonna run the world someday, but it's already being used to predict the next global pandemic.

  • Wanna find out how? Bet you do.

  • Check out this video right here.

  • Thanks for watching, subscribe to Seeker, and come back for new videos every day so you can watch your computer as much as it's watching you.


A Chinese "Black Mirror" plot? The Chinese government claims to monitor brain waves to boost productivity (China's New Surveillance Tech Monitors Workers' Brainwaves)

Published by Misaki on August 5, 2022