
  • >> Announcer: Live from Austin, Texas

  • it's the Cube.

  • Covering South by Southwest 2017.

  • Brought to you by Intel.

  • Now here's John Furrier.

  • Okay we're back live here at the South by Southwest

  • Intel AI Lounge, this is The Cube's special coverage

  • of South by Southwest with Intel, #IntelAI

  • where amazing starts with Intel.

  • Our next guest is Dr. Dawn Nafus who's with Intel

  • and you are a senior research scientist.

  • Welcome to The Cube.

  • >> Thank you.

  • >> So you've got a panel coming up and you also

  • have a book AI For Everything.

  • And looking at a democratization of AI

  • we had a quote yesterday that,

  • "AI is the bulldozer for data."

  • What bulldozers were in the real world,

  • AI will be that bulldozer for data,

  • surfacing new experiences. >> Right.

  • >> This is the subject of your book, kind of.

  • What's your take on this and what's your premise?

  • >> Right well the book actually takes

  • a step way back, it's actually called

  • Self Tracking, the panel is AI For Everyone.

  • But the book is on self tracking.

  • And

  • it's really about actually

  • getting some meaning out of data

  • before we start talking about bulldozers.

  • So right now we've got this situation where

  • there's a lot of talk about AI's going

  • to sort of solve all of our problems in health

  • and there's a lot that can get accomplished, whoops.

  • But the fact of the matter is

  • that people are still struggling with, geez,

  • like, "What does my Fitbit actually mean, right?"

  • So there's this, there's a real big gap.

  • And I think probably part of what the industry

  • has to do is not just sort of build new great technologies

  • which we've got to do but also start to fill that gap

  • in sort of data education, data literacy,

  • all that sort of stuff.

  • >> So we're kind of in this first generation of AI data

  • you mentioned wearables, Fitbits.

  • >> Dawn: Yup.

  • >> So people are now getting used to this,

  • so it sounds like this integration into lifestyle

  • becomes kind of a dynamic.

  • >> Yeah. >> Why are people grappling

  • >> John: with this, what's your research say about that?

  • >> Well right now with wearables frankly

  • we're in the classic trough of disillusionment. (laughs)

  • You know for those of you listening

  • I don't know if you have sort of wearables

  • in drawers right now, right?

  • But a lot of people do.

  • And it turns out that folks tend to use it,

  • you know maybe about three or four weeks

  • and either they've learned something

  • really interesting and helpful or they haven't.

  • And so there's actually a lot of people

  • who do really interesting stuff to kind of combine it

  • with symptom tracking, location, right

  • other sorts of things to actually

  • really reveal the sorts of triggers for

  • medical issues that you can't find in a clinical setting.

  • It's all about being out in the real world

  • and figuring out what's going on with you.

  • Right, so then when we start to think about

  • adding more complexity into that,

  • which is the thing that AI's good at,

  • we've got this problem of there's only so many data sets

  • that AI's actually any good at handling.

  • And so I think there's going to have to be a moment where

  • sort of people themselves actually start to say,

  • "Okay you know what?

  • "This is how I define my problem.

  • "This is what I'm going to choose to keep track of."

  • And some of that's going to be on a sensor

  • and some of it isn't.

  • Right, and sort of really intervening

  • a little bit more strongly in what

  • this stuff's actually doing.

  • >> You mentioned the Fitbit, and we're seeing

  • a lot of disruption in these areas, innovation

  • and disruption, same thing, good and bad potentially.

  • Autonomous vehicles are a pretty clear case,

  • and everyone knows what Tesla's doing, that's a hot trend.

  • But you mentioned Fitbit, that's a healthcare

  • kind of thing.

  • AI might seem to be a perfect fit for healthcare

  • because there's always alarms going off

  • and all this data flying around.

  • Is that a low hanging fruit for AI?

  • Healthcare?

  • >> Well I don't know if there's any such thing

  • as low hanging fruit (John laughs)

  • in this space. (laughs)

  • But certainly if you're talking about

  • like actual human benefit, right?

  • That absolutely comes to the top of the list.

  • And we can see that in both formal healthcare

  • in clinical settings and sort of imaging for diagnosis.

  • Again I think there's areas to be cautious about, right?

  • You know making sure that there's also

  • an appropriate human check and there's also

  • mechanisms for transparency, right?

  • So that doctors, when there is a discrepancy

  • between what the doctor believes and what the machine says

  • you can actually go back and figure out

  • what's actually going on.

  • The other thing I'm particularly excited about is,

  • and this is why I'm so interested in democratization

  • is that health is not just about,

  • you know, what goes on in clinical care.

  • There are right now environmental health groups

  • who are looking at a slew of air quality data

  • that they don't know what to do with, right?

  • And a certain amount of machine assistance

  • to sort of figure out you know signatures

  • of sort of point source polluters, for example,

  • is a really great use of AI.

  • It's not going to make anybody any money anytime soon,

  • but that's the kind of society that we want to live in right?

  • >> You are the social good angle for sure,

  • but I'd like to get your thoughts 'cause you mentioned

  • democratization and it's kind of a nuance

  • depending upon what you're looking at.

  • Democratization with news and media

  • is what you saw with social media now you got healthcare.

  • So how do you define democratization in your context

  • and what are you excited about?

  • Is that more of

  • freedom of information and data is it

  • getting around gatekeepers and siloed stacks?

  • I mean how do you look at democratization?

  • >> All of the above. (laughs) (John laughs)

  • I'd say there are two real elements to that.

  • The first is making sure that you know,

  • people who are going to use this for more than just business

  • have the ability to actually do it

  • and have access to the right sorts of infrastructures,

  • whether it's the environmental health case

  • or there are actually artists now who use

  • natural language processing to create art work.

  • And people ask them, "Why are you using deep learning?"

  • I said, "Well there's a real access issue frankly."

  • It's also on the side of if you're not the person

  • who's going to be directly using data

  • a kind of a sense of, you know...

  • Democratization to me means being able

  • to ask questions of how the stuff's actually behaving.

  • So that means building in

  • mechanisms for transparency, building in mechanisms

  • to allow journalists to do the work that they do.

  • >> Sharing potentially?

  • >> I'm sorry? >> And sharing as well

  • more data? >> Very, very good.

  • Right absolutely, I mean frankly we still have

  • a problem right now in the wearable space of

  • people even getting access to their own data.

  • There's a guy I work with named Hugo Campos

  • who has an implanted cardiac defibrillator and

  • he's still fighting to get access

  • to the very data that's coming out of his heart.

  • Right? (laughs)

  • >> Is it on SSD, in the cloud?

  • I mean where is it? >> It is in the cloud.

  • It's going back to the manufacturer.

  • And there are very robust conversations about

  • where it should be.

  • >> That's super sad.

  • So this brings up the whole thing that

  • we've been talking about yesterday

  • when we had a mini segment on The Cube is that

  • there are all these new societal use cases

  • that are just springing up that we've never seen before.

  • Self-driving cars with transportation,

  • healthcare access to data, all these things.

  • What are some of the things that you see emerging

  • on that front, tools or approaches that could help

  • either scientists or practitioners or citizens

  • deal with these new critical problems

  • that technology needs to be applied to?

  • I was talking just last week at Stanford with folks

  • that are looking at gender bias in algorithms.

  • >> Right, uh-huh it's real. >> Something I would never

  • have thought of that's an outlier.

  • Like hey, what? >> Oh no, it's happened.

  • >> But it's one of those things where, okay,

  • let's put that on the table.

  • There's all this new stuff coming on the table.

  • >> Yeah, yeah absolutely. >> What do you see?

  • >> So they're-- >> How do we solve that

  • >> John: what approaches?

  • >> Yeah there are a couple of mechanisms

  • and I would encourage listeners and folks in the audience

  • to have a look at a really great report

  • that just came out from the Obama Administration

  • and NYU School of Law.

  • It's called AI Now and they actually propose

  • a couple of pathways to sort of making sure

  • we get this right.

  • So you know a couple of things.

  • You know one is frankly making sure that women

  • and people of color are in the room

  • when the stuff's getting built, right?

  • That helps.

  • You know as I said earlier you know making sure that

  • you know things will go awry.

  • Like it just will, we can't predict how these things

  • are going to work and catching it after the fact

  • and building in mechanisms to be able

  • to do that really matter.

  • So there was a great effort by ProPublica to look at

  • a system that was predicting criminal recidivism.

  • And what they did was they said, "Look you know

  • "it is true that

  • "the thing has the same failure rate

  • "for both blacks and whites."

  • But some hefty data journalism and data scraping

  • and all the rest of it actually revealed that

  • it was producing false positives for blacks

  • and false negatives for whites.

  • Meaning that black people were predicted

  • to commit more crime than white people, right?

  • So you know, we can catch that, right?
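
A minimal sketch of the kind of per-group error check described here, using hypothetical toy numbers rather than the actual COMPAS data ProPublica analyzed: it shows how two groups can share the same overall error rate while one group absorbs mostly false positives and the other mostly false negatives.

```python
from collections import namedtuple

# Hypothetical toy records (not the real COMPAS data): group label,
# whether the person actually reoffended, and whether the model
# flagged them as high risk (1 = yes, 0 = no).
Record = namedtuple("Record", ["group", "reoffended", "flagged_high_risk"])

records = (
    [Record("A", 0, 1)] * 2 + [Record("A", 0, 0)] * 3 +   # group A: 2 false positives
    [Record("A", 1, 1)] * 4 + [Record("A", 1, 0)] * 1 +   # group A: 1 false negative
    [Record("B", 0, 1)] * 1 + [Record("B", 0, 0)] * 4 +   # group B: 1 false positive
    [Record("B", 1, 1)] * 3 + [Record("B", 1, 0)] * 2     # group B: 2 false negatives
)

def error_rates(rows):
    """Return (false positive rate, false negative rate, overall error rate)."""
    fp = sum(1 for r in rows if r.flagged_high_risk and not r.reoffended)
    fn = sum(1 for r in rows if not r.flagged_high_risk and r.reoffended)
    negatives = sum(1 for r in rows if not r.reoffended)
    positives = sum(1 for r in rows if r.reoffended)
    return fp / negatives, fn / positives, (fp + fn) / len(rows)

for group in ("A", "B"):
    rows = [r for r in records if r.group == group]
    fpr, fnr, overall = error_rates(rows)
    print(f"group {group}: FPR={fpr:.2f}  FNR={fnr:.2f}  overall error={overall:.2f}")

# Both groups show the same overall error rate (0.30), yet group A's errors
# are mostly false positives and group B's mostly false negatives, which is
# exactly the asymmetry a per-group breakdown surfaces.
```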

  • And when we build in more of a system of people who

  • have the skills to do it,

  • then we can build stuff that we can live with.

  • >> This is exactly to your point of democratization

  • that I think fascinates me, that I get so excited about.

  • It's almost intoxicating when you think about it

  • technically and also societally, that there's

  • all these new things that are emerging

  • and the community has to work together.

  • Because it's one of those things where

  • there's no, there may be a board of governors out there.

  • I mean who is the board of governors for this stuff?

  • It really has to be community driven.

  • >> Yeah, yeah.

  • >> And NYU's got one, any other examples of communities that

  • are out there that people can participate in or?

  • >> Yup, absolutely.

  • So I think that, you know,

  • certainly collaborating on projects that

  • you actually care about and sort of asking

  • good questions about, is this appropriate

  • for AI or not, right?

  • is a great place to start, reaching out

  • to people who have those technical skills.

  • There are also

  • the Engineering Professional Association

  • ac