
  • BRADLEY HOROWITZ: I want to welcome you all.

  • My name's Bradley Horowitz, I'm VP

  • of Social for Google, Social Product Management.

  • And I'm here today to welcome Sandy Pentland

  • to come in and speak with us.

  • I'm going to be brief.

  • I encourage you all to Google Sandy.

  • You will pull up his long list of credentials,

  • which include academic credentials, business

  • credentials, across many, many disciplines

  • for many, many years.

  • World Economic Forum, Forbes' Most Influential,

  • it goes on and on.

  • So I'm going to try to give you a little anecdote of something

  • that I don't think you'll get on the web.

  • Nothing too embarrassing.

  • I'm a student, both former and current, of Sandy's.

  • Former in the sense that I was a Media Lab Ph.D. student.

  • He was my adviser.

  • And current in the sense that I stay

  • closely attuned to everything that Sandy does.

  • As we were walking over here from building 1900,

  • we were sort of doing what old friends do,

  • which is play the name game and checking in

  • on all the old connections and friends that we share.

  • How's Roz doing?

  • How's Stan doing?

  • How's Ali doing?

  • What about Thad?

  • And we went through them, and turns out

  • everybody's doing fine.

  • You know, many of you are here, actually in the second row,

  • in the front row.

  • Many of you have gone off to become professors

  • at MIT or Berkeley or Georgia Tech.

  • And it was just so great.

  • And thinking about that for a moment,

  • I recognized that I was just walking

  • through one of the vintages, the sort of early '90s vintage

  • of Sandy's students who have all gone off to do great things.

  • And Sandy has consecutively piled

  • on top of that round after round

  • of graduate students and students that he has inspired.

  • And in addition to all of those lists of accomplishments,

  • one of the things that really touches me most

  • about Sandy and his work is that he's

  • such an inspirational educator.

  • He not only has enthusiasm for his own work,

  • he's able to impart that to others

  • and create generations of people that care passionately

  • about technology and science.

  • And it's just so great to be in the company, which we will all

  • get to share for an hour right now,

  • of a person who can inspire and lead that way.

  • And so with that, I'll hand it over to Sandy.

  • Welcome.

  • SANDY PENTLAND: Well, thank you.

  • [CLAPPING]

  • SANDY PENTLAND: Now, I'll have to inspire and cause

  • passion, which is actually part of what

  • I'm going to talk about.

  • So how does that happen?

  • So maybe this is good.

  • So I'm going to talk about two things.

  • One is basic science about who people

  • are, how we use electronic media, how we use face-to-face

  • media, how we evolved as a social species.

  • And then I want to move to how we can use this knowledge

  • to make things better, to have a more

  • sustainable digital ecology, to make government work.

  • Wouldn't that be amazing? [LAUGHS]

  • And so on and so forth.

  • And as Bradley mentioned, I do a lot of things.

  • I thought I'd stick this in.

  • I love this picture.

  • This is the Borgs back in the 1990s.

  • And that's Thad in the front there,

  • Thad Starner, who I think most of you know.

  • And then two other things I do that are of real relevancy

  • here, maybe three.

  • One is, for the last five years,

  • I've run a group-- helped run a group at Davos--

  • around personal data, privacy, and big data.

  • And that's, of course, a very relevant topic for this crowd,

  • but particularly, going forward.

  • And the group includes people like the Chairman

  • of the Federal Trade Commission, the vice president of the EU,

  • Politburo members from China, et cetera, et cetera.

  • So it's a conversation between CEOs of major companies

  • and chief regulators and advocacy groups.

  • And I'll talk about that at the end

  • and where I think that things are going

  • and what you might want to do about it.

  • And I just joined the Google ATAP Board

  • because it used to be owned by Motorola,

  • but when Motorola got sold, they intelligently

  • moved the really creative interesting part over here

  • to Google.

  • And as Bradley mentioned, that started a bunch

  • of companies, which are doing well.

  • So the thing that I'm really concerned about,

  • the thing I'm passionate about, is making the world work better.

  • And a sort of story for this is about 15 years ago,

  • I was setting up a series of laboratories in India.

  • And, you know, we had huge government sponsorship.

  • We had a board of directors, which

  • included some of the brightest, most successful people in the world.

  • And it was a complete disaster.

  • And it had to do with a lot of things.

  • It had to do with all of the sort of macho

  • signaling and charisma in the room with the board of directors.

  • But it also had to do with the way the government failed

  • to work, or did work.

  • And looking back on that, I can sort of

  • see in that premonitions of the US Congress today.

  • All right?

  • So we went and visited the Indian Congress,

  • where we saw people throwing shoes at each other

  • and throwing cash in the air.

  • And we look at the US Congress today,

  • and it's somewhat similar, unfortunately.

  • So I want to make things better.

  • And what occurs to me is if we knew

  • how to make our organizations work,

  • then we could really do things.

  • Like we could solve global warming tomorrow

  • if we all knew how to sort of talk about it rationally,

  • come to a good decision, and then carry that through.

  • And the fact that that sounds like ludicrous fantasy--

  • oh, yeah, everybody agrees.

  • Sure.

  • Not in our lifetime-- tells you just how profound

  • the problem is.

  • And that's why I think one of the most important things

  • that's happened in the last decade, something you've all

  • been part of, is this era of big data,

  • which is not about big at all.

  • It's about personal data.

  • Detailed data about the behavior of every person on Earth,

  • where they go, what they buy, what they say online,

  • all sorts of things like that.

  • Suddenly we could watch people the way,

  • say, you would watch an ant hill or Jane Goodall

  • would watch apes.

  • We can do that, and that has profound impact

  • that is hard to appreciate.

  • I made this little graph, which comes out

  • of-- inspired by Nadav's thesis here--

  • which plots duration of observation.

  • These are social science experiments, and the biggest

  • medical experiments.

  • So this is like the Framingham heart study.

  • 30 years, 30,000 people.

  • But they only talked to people like once every three years.

  • So the bit rate was like one number per month.

  • So you had no idea what these people were doing.

  • They could have been eating fried chicken all the time.

  • You don't know, right?

  • Or most of the things we know about psychology

  • come from down here.

  • This is number of bits per second versus duration.

  • This is a bunch of freshmen in Psych 101

  • filling out some surveys.

  • And that's what we take to be social science,

  • political science, and medical science.

  • But now we have these new ways of doing things.

  • And so what we've done in my group

  • is we've built little badges, like you all

  • have little name badges.

  • And so we can actually know where you go

  • and who you talk to.

  • And we do this with organizations.

  • We'll track everybody for a month.

  • We never listen to the words, but we

  • do know the patterns of communication.

  • And I'll show you a little bit about that.

  • Similarly, we put software in phones,

  • and we look at the patterns of communication

  • within a community.

  • So I go into a community and give everybody brand

  • new phones.

  • And some of the people here have been integrally involved

  • in these experiments.

  • And we'll look at their Facebook activity, their credit card

  • record, their sleep pattern, their communication pattern,

  • who they hang with, who they call,

  • and ask, what do all these communication patterns

  • have to do with outcomes?

  • All right?

  • Do they spend too much?

  • What things do they choose to buy, and so forth.

  • And what you find from the big data,

  • and, of course, modern machine learning sorts of things,

  • is that you can build quantitative predictive models

  • of human behavior, which you all know.

  • But I think you know the wrong part.

  • And I'm going to tell you about the other part that's

  • much stronger than what you typically do, OK?

  • And so you can predict behavior.
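A minimal sketch of the kind of quantitative predictive model being described here: fitting a simple model that predicts a behavioral outcome from communication-pattern features. The feature names, data, and overspending outcome below are hypothetical illustrations, not the actual data or models from Pentland's group.

```python
# Illustrative sketch only: hypothetical features and outcome, synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500  # hypothetical number of study participants

# Hypothetical per-person features drawn from phone/badge-style data:
# calls per day, distinct contacts, fraction of face-to-face time, hours of sleep
X = np.column_stack([
    rng.poisson(8, n),      # calls_per_day
    rng.poisson(15, n),     # distinct_contacts
    rng.uniform(0, 1, n),   # face_to_face_fraction
    rng.normal(7, 1, n),    # sleep_hours
])

# Hypothetical binary outcome (e.g., overspending), loosely tied to the features
logit = 0.05 * X[:, 0] - 0.03 * X[:, 1] - 1.0 * X[:, 2] - 0.2 * X[:, 3] + 1.5
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

# Fit a predictive model and check how well it generalizes to held-out people
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```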

  • And people go, well, wait a second.

  • What about free will?

  • That may not have occurred to you,

  • but that's a traditional thing to ask.

  • And I'll tell you a little bit about that along

  • the way because it turns out that a lot of our behavior

  • is very habitual, and that's the part

  • that we can model mathematically.

  • So the big picture here, and this

  • is part of the reason I got off onto this research,

  • is I go to places like Davos, and you

  • listen to the president of this and the CEO of that.

  • And when they talk about changing policy, talk

  • about doing anything, they use economics metaphors.

  • And the thing about [INAUDIBLE] economic metaphors

  • is that they're all about individuals.

  • So you've heard about rational individuals.

  • And everybody rags on the rational part.

  • I'm not going to do that.

  • I'm going to rag on the individual part, OK?

  • Because I don't think we are individuals.

  • What we desire, the ways we learn to go about doing it,

  • what's valuable, are consensual things.

  • So they actually aren't captured by this sort

  • of model, this independent model.

  • That matters because those interactions are

  • the sources, not only of fads and economic bubbles,

  • but they're really the social fabric that we live in.

  • So everybody knows about the invisible hand--

  • that we are led by an invisible hand

  • to advance the interest of society.

  • What that means is that markets are supposed to allocate things

  • efficiently and fairly, right?

  • If you've thought about it, you know

  • this doesn't work in the real world, [LAUGHS] OK?

  • And the question is, why?

  • So one of the things that-- there's

  • several things to say about this.

  • Most people think that this statement is something

  • that Adam Smith made in "The Wealth of Nations."

  • And I'm just going to [INAUDIBLE]

  • "The Wealth of Nations."

  • But it's not.

  • He made it in a book called "Moral Sentiments," which

  • very few people read.

  • And it went on to say something very different.

  • It went on to say that "it's human nature

  • to exchange not only goods, but ideas, assistance, and favors.

  • And it's these exchanges that guide people

  • to create solutions for the good of the community."

  • So Adam Smith did not believe that markets

  • were socially efficient.

  • He believed that it was the social fabric of relationships

  • that caused the pressures of the market

  • to be allocated where they're needed.

  • And, in fact, a lot of mathematicians