Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home.

As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led to my being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us, and to talk about how to reclaim our public square.

I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists, and while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest.

The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want.

So what I see happening online today is especially heartbreaking and a much harder problem to tackle. We are being manipulated by the current information ecosystem, entrenching so many of us so far into absolutism that compromise has become a dirty word. Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible. And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own.

So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.

When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now. The people who are sucked down these rabbit holes of social media outrage often feel far harder to break of their ideological mindsets than those vulnerable communities I worked with ever were.

So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try.

I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in elections interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.

Now, I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions, in a business where optimizing engagement and user growth are the two most important metrics for success. There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something, and question their own assumptions before engaging.

The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.

And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on their platform and for polarizing their users. But keeping us engaged is how they make their money.

The modern information environment is crystallized around profiling us and then segmenting us into more and more narrow categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something. These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of persuading their behavior.

Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months.

But here is my biggest takeaway from my time there.

There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place, but as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, they will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society and agreeing to alter the entire product and profit model.

So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views.

And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.

But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law. Because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies. Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking?

I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now. And it bears emphasizing, I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society.

It is time for our governments to step up and do their jobs of protecting our citizenry. And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening.

You see, I want these companies held accountable not for whether an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it. I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and hopefully inspire more people to demand this accountability.

My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.

And my message to the leaders of my former employer Facebook is this: right now, people are using your tools exactly as they were designed, to sow hatred, division and distrust, and you're not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we're heading into our election and, even more concerning, face our biggest potential crisis yet, if the results aren't trusted and if violence breaks out.

So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees are shouting from the rooftops that your policies and your business practices are harming people and democracy. You own your decisions, but you can no longer say that you couldn't have seen it coming.

Thank you.
