【TED】Alessandro Acquisti: What will a future without secrets look like?

Translator: Bert Chen / Reviewer: Kuo-Hsien Chiang

I would like to tell you a story connecting the notorious privacy incident involving Adam and Eve, and the remarkable shift in the boundaries between public and private which has occurred in the past 10 years.

You know the incident. Adam and Eve, one day in the Garden of Eden, realize they are naked. They freak out. And the rest is history.

Nowadays, Adam and Eve would probably act differently.

[@Adam Last nite was a blast! loved dat apple LOL]

[@Eve yep.. babe, know what happened to my pants tho?]

We do reveal so much more information about ourselves online than ever before, and so much information about us is being collected by organizations. Now there is much to gain and benefit from this massive analysis of personal information, or big data, but there are also complex tradeoffs that come from giving away our privacy. And my story is about these tradeoffs.

We start with an observation which, in my mind, has become clearer and clearer in the past few years: that any personal information can become sensitive information. Back in the year 2000, about 100 billion photos were shot worldwide, but only a minuscule proportion of them were actually uploaded online. In 2010, on Facebook alone, in a single month, 2.5 billion photos were uploaded, most of them identified. In the same span of time, computers' ability to recognize people in photos improved by three orders of magnitude.

What happens when you combine these technologies together: increasing availability of facial data; improving facial recognizing ability by computers; but also cloud computing, which gives anyone in this theater the kind of computational power which a few years ago was only the domain of three-letter agencies; and ubiquitous computing, which allows my phone, which is not a supercomputer, to connect to the Internet and do hundreds of thousands of face metrics there in a few seconds?

Well, we conjecture that the result of this combination of technologies will be a radical change in our very notions of privacy and anonymity.

To test that, we did an experiment on the Carnegie Mellon University campus. We asked students who were walking by to participate in a study, and we took a shot with a webcam, and we asked them to fill out a survey on a laptop. While they were filling out the survey, we uploaded their shot to a cloud-computing cluster, and we started using a facial recognizer to match that shot to a database of some hundreds of thousands of images which we had downloaded from Facebook profiles.

By the time the subject reached the last page of the survey, the page had been dynamically updated with the 10 best-matching photos which the recognizer had found, and we asked the subjects to indicate whether he or she found themselves in the photo. Do you see the subject? Well, the computer did, and in fact it did so for one out of three subjects. So essentially, we can start from an anonymous face, offline or online, and we can use facial recognition to give a name to that anonymous face, thanks to social media data.
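The matching step described above, comparing one webcam shot against a gallery of hundreds of thousands of profile photos and returning the closest faces, is easy to sketch today. The sketch below is a minimal illustration using the open-source face_recognition library, not the system the researchers actually ran on their cluster; top_matches and the gallery format are assumptions for the example.

```python
# A minimal sketch of the face-matching step, assuming the open-source
# face_recognition library (not the researchers' actual system).
import numpy as np
import face_recognition

def top_matches(probe_path, gallery, k=10):
    """Return the k gallery faces closest to the face in the probe image.
    `gallery` is a list of (name, 128-d embedding) pairs, precomputed
    from downloaded profile photos."""
    image = face_recognition.load_image_file(probe_path)
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        return []  # no face detected in the webcam shot
    names = [name for name, _ in gallery]
    vectors = np.array([vec for _, vec in gallery])
    distances = face_recognition.face_distance(vectors, encodings[0])
    best = np.argsort(distances)[:k]  # smaller distance means a better match
    return [(names[i], float(distances[i])) for i in best]
```

In the study, the ten best matches were simply shown back to the subject, whose confirmation is what turns a statistical match into an identification.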

But a few years back, we did something else. We started from social media data, we combined it statistically with data from U.S. government Social Security records, and we ended up predicting social security numbers, which in the United States are extremely sensitive information.

Do you see where I'm going with this? So if you combine the two studies together, then the question becomes: can you start from a face and, using facial recognition, find a name and publicly available information about that name and that person, and from that publicly available information infer non-publicly available, much more sensitive information which you link back to the face? And the answer is, yes, we can, and we did. Of course, the accuracy keeps getting worse.

[27% of subjects' first 5 SSN digits identified (with 4 attempts)]
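For context on why such predictions were possible at all: before the Social Security Administration randomized assignment in 2011, the number's structure itself leaked information, since the first three digits depended on the state of issuance and the middle two advanced in a documented sequence over time. The sketch below is a toy illustration of that idea only; the lookup table and the groups_in_use model are hypothetical stand-ins, not real SSA data or the authors' actual method.

```python
# Toy illustration: enumerate plausible first-5-digit SSN prefixes for a
# person whose state and approximate birth date are known from social
# media. AREA_NUMBERS and groups_in_use are hypothetical stand-ins.
AREA_NUMBERS = {"MA": range(10, 35), "PA": range(159, 212)}  # illustrative only

def candidate_prefixes(state, birth_year, groups_in_use):
    """Yield 'area-group' prefixes worth guessing. `groups_in_use` is a
    model mapping (area, year) to the group numbers likely being issued
    then, estimated from public records such as the Death Master File."""
    for area in AREA_NUMBERS.get(state, []):
        for group in groups_in_use(area, birth_year):
            yield f"{area:03d}-{group:02d}"
```

Ranking such candidates by estimated likelihood yields a short list of guesses, which is why the slide reports success within four attempts.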

But in fact, we even decided to develop an iPhone app which uses the phone's internal camera to take a shot of a subject, then upload it to a cloud and do what I just described to you in real time: looking for a match, finding public information, trying to infer sensitive information, and then sending it back to the phone so that it is overlaid on the face of the subject, an example of augmented reality, probably a creepy example of augmented reality. In fact, we didn't develop the app to make it available, just as a proof of concept.

In fact, take these technologies and push them to their logical extreme. Imagine a future in which strangers around you will look at you through their Google Glasses or, one day, their contact lenses, and use seven or eight data points about you to infer anything else which may be known about you. What will this future without secrets look like? And should we care?

We may like to believe that the future with so much wealth of data would be a future with no more biases, but in fact, having so much information doesn't mean that we will make decisions which are more objective.

In another experiment, we presented to our subjects information about a potential job candidate. We included in this information some references to some funny, absolutely legal, but perhaps slightly embarrassing information that the subject had posted online. Now interestingly, among our subjects, some had posted comparable information, and some had not. Which group do you think was more likely to judge our subject harshly? Paradoxically, it was the group who had posted similar information, an example of moral dissonance.

Now you may be thinking, this does not apply to me, because I have nothing to hide. But in fact, privacy is not about having something negative to hide.

Imagine that you are the H.R. director of a certain organization, and you receive résumés, and you decide to find more information about the candidates. Therefore, you Google their names, and in a certain universe, you find this information. Or in a parallel universe, you find this information. Do you think that you would be equally likely to call either candidate for an interview? If you think so, then you are not like the U.S. employers who were, in fact, part of our experiment, meaning we did exactly that. We created Facebook profiles, manipulating traits, then we started sending out résumés to companies in the U.S., and we detected, we monitored, whether they were searching for our candidates and whether they were acting on the information they found on social media. And they were. Discrimination was happening through social media for equally skilled candidates.

Now marketers like us to believe that all information about us will always be used in a manner which is in our favor. But think again. Why should that always be the case? In a movie which came out a few years ago, "Minority Report," a famous scene had Tom Cruise walk in a mall while holographic personalized advertising appeared around him. Now, that movie is set in 2054, about 40 years from now, and as exciting as that technology looks, it already vastly underestimates the amount of information that organizations can gather about you, and how they can use it to influence you in a way that you will not even detect.

So as an example, this is another experiment that we are actually running, not yet completed. Imagine that an organization has access to your list of Facebook friends, and through some kind of algorithm they can detect the two friends that you like the most. And then they create, in real time, a facial composite of these two friends. Now, studies prior to ours have shown that people no longer recognize even themselves in facial composites, but they react to those composites in a positive manner. So next time you are looking for a certain product, and there is an ad suggesting you buy it, it will not be just a standard spokesperson. It will be one of your friends, and you will not even know that this is happening.
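As a rough sketch of what "a facial composite" means here: the crudest version is a pixel-level average of two aligned photos. Real morphing systems align facial landmarks before blending; the Pillow-based blend below, with hypothetical file names, only illustrates the idea.

```python
# A naive facial composite: a 50/50 pixel blend of two face photos.
# Production systems align facial landmarks first; this simple average
# is only meant to illustrate the concept.
from PIL import Image

def composite(path_a, path_b, size=(256, 256)):
    a = Image.open(path_a).convert("RGB").resize(size)
    b = Image.open(path_b).convert("RGB").resize(size)
    return Image.blend(a, b, alpha=0.5)  # each output pixel averages the two

# Hypothetical usage with two downloaded friend photos:
# composite("friend_a.jpg", "friend_b.jpg").save("ad_face.jpg")
```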

Now the problem is that the current policy mechanisms we have to protect ourselves from the abuses of personal information are like bringing a knife to a gunfight. One of these mechanisms is transparency: telling people what you are going to do with their data. And in principle, that's a very good thing. It's necessary, but it is not sufficient. Transparency can be misdirected. You can tell people what you are going to do, and then you still nudge them to disclose arbitrary amounts of personal information.

So in yet another experiment, this one with students, we asked them to provide information about their campus behavior, including pretty sensitive questions, such as this one:

[Have you ever cheated in an exam?]

Now, to one group of subjects, we told them, "Only other students will see your answers." To another group of subjects, we told them, "Students and faculty will see your answers." Transparency. Notification. And sure enough, this worked, in the sense that the first group of subjects were much more likely to disclose than the second. It makes sense, right?

But then we added the misdirection. We repeated the experiment with the same two groups, this time adding a delay between the time we told subjects how we would use their data and the time they actually started answering the questions. How long a delay do you think we had to add in order to nullify the inhibitory effect of knowing that faculty would see your answers? Ten minutes? Five minutes? One minute? How about 15 seconds? Fifteen seconds were sufficient to have the two groups disclose the same amount of information, as if the second group now no longer cared about faculty reading their answers.

Now, I have to admit that this talk so far may sound exceedingly gloomy, but that is not my point. In fact, I want to share with you the fact that there are alternatives. The way we are doing things now is not the only way they can be done, and certainly not the best way they can be done.

  • When someone tells you, "People don't care about privacy,"

    當然也不是最好的辦法

  • consider whether the game has been designed

    有人告訴你 「沒人會在乎他的隱私」

  • and rigged so that they cannot care about privacy,

    想想看是否這場遊戲已經遭到設計

  • and coming to the realization that these manipulations occur

    暗中操作 所以他們才不在意隱私權

  • is already halfway through the process

    逐漸發現這些操作手段的已經入侵到那些能夠能夠保護你的方法中

  • of being able to protect yourself.

    逐漸發現這些操作手段的已經入侵到那些能夠能夠保護你的方法中

When someone tells you that privacy is incompatible with the benefits of big data, consider that in the last 20 years, researchers have created technologies to allow virtually any electronic transaction to take place in a more privacy-preserving manner. We can browse the Internet anonymously. We can send emails that can only be read by the intended recipient, not even the NSA. We can even have privacy-preserving data mining. In other words, we can have the benefits of big data while protecting privacy.
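One concrete example of privacy-preserving data mining is differential privacy, where calibrated noise is added to aggregate statistics so that no single person's record can be singled out. Here is a minimal sketch, assuming the standard Laplace mechanism for a counting query; the names are illustrative, not from any specific library.

```python
# A differentially private count: Laplace noise with scale 1/epsilon hides
# any one person's contribution, since adding or removing a single record
# changes a count by at most 1.
import numpy as np

def private_count(records, predicate, epsilon=0.1):
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical usage: how many users are over 30, without exposing anyone.
ages = [23, 35, 41, 29, 52]
print(private_count(ages, lambda a: a > 30))
```

The analyst still gets a usable aggregate while any individual answer stays deniable; that is the sense in which we can have the benefits of big data while protecting privacy.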

Of course, these technologies imply a shifting of cost and revenues between data holders and data subjects, which is why, perhaps, you don't hear more about them.

Which brings me back to the Garden of Eden. There is a second privacy interpretation of the story of the Garden of Eden which doesn't have to do with the issue of Adam and Eve feeling naked and feeling ashamed. You can find echoes of this interpretation in John Milton's "Paradise Lost." In the garden, Adam and Eve are materially content. They're happy. They are satisfied. However, they also lack knowledge and self-awareness. The moment they eat the aptly named fruit of knowledge, that's when they discover themselves. They become aware. They achieve autonomy. The price to pay, however, is leaving the garden. So privacy, in a way, is both the means and the price to pay for freedom.

Again, marketers tell us that big data and social media are not just a paradise of profit for them, but a Garden of Eden for the rest of us. We get free content. We get to play Angry Birds. We get targeted apps. But in fact, in a few years, organizations will know so much about us, they will be able to infer our desires before we even form them, and perhaps buy products on our behalf before we even know we need them.

Now, there was one English author who anticipated this kind of future, where we would trade away our autonomy and freedom for comfort. Even more so than George Orwell, the author is, of course, Aldous Huxley. In "Brave New World," he imagines a society where technologies that we created originally for freedom end up coercing us. However, in the book, he also offers us a way out of that society, similar to the path that Adam and Eve had to follow to leave the garden. In the words of the Savage, regaining autonomy and freedom is possible, although the price to pay is steep.

So I do believe that one of the defining fights of our times will be the fight for the control over personal information, the fight over whether big data will become a force for freedom, rather than a force which will hiddenly manipulate us. Right now, many of us do not even know that the fight is going on, but it is, whether you like it or not. And at the risk of playing the serpent, I will tell you that the tools for the fight are here, the awareness of what is going on, and in your hands, just a few clicks away.

Thank you.

(Applause)
