
  • (bright music)

  • >> Narrator: Live from Austin, Texas.

  • It's theCUBE, covering South by Southwest 2017.

  • Brought to you by Intel.

  • Now here's John Furrier.

  • >> We're here live in South by Southwest Austin, Texas.

  • Silicon Angle, theCUBE, our broadcast,

  • we go out and extract the signal from noise.

  • I'm John Furrier, I'm here with Naveen Rao,

  • the vice president and general manager of

  • the artificial intelligence solutions group at Intel.

  • Welcome to theCUBE.

  • >> Thank you, yeah.

  • >> So we're here, big crowd here at Intel, Intel AI lounge.

  • Okay, so that's your wheelhouse.

  • You're the general manager of AI solutions.

  • >> Naveen: That's right.

  • >> What is AI? (laughs)

  • I mean--

  • >> AI has been redefined through time a few times.

  • Today AI means generally applied machine learning.

  • Basically ways to find useful structure

  • in data to do something with.

  • It's a tool, really, more than anything else.

  • >> So obviously AI is a mental model,

  • people can understand kind of what's going on with software.

  • Machine learning and IoT are hot areas

  • in the industry, but this really

  • points to a future world where you're seeing software

  • tackling new problems at scale.

  • So cloud computing, what you guys are doing with the chips

  • and software has now created a scale dynamic.

  • Similar to Moore's, but Moore's Law is done for devices.

  • You're starting to see software impact society.

  • So what are some of those game changing impacts

  • that you see and that you're looking at at Intel?

  • >> There are many different thought labors

  • that many of us will characterize as drudgery.

  • For instance, if I'm an insurance company,

  • and I want to assess the risk of 10 million pages of text,

  • I can't do that very easily.

  • I have to have a team of analysts run through,

  • write summaries.

  • These are the kind of problems we can start to attack.

  • So the way I always look at it is

  • what a bulldozer was to physical labor, AI is to data.

  • To thought labor, we can really get through

  • much more of it and use more data

  • to make our decisions better.

  • >> So what are the big game changing things

  • that are going on that people can relate to?

  • Obviously, autonomous vehicles

  • is one that we can all look at and say,

  • "Wow, that's mind blowing."

  • Smart cities is one that you say,

  • "Oh my god, I'm a resident of a community.

  • "Do they have to re-change the roads?

  • "Who writes the software, is there a budget for that?"

  • Smart home, you see Alexa with Amazon,

  • you see Google with their home product.

  • Voice bots, voice interfaces.

  • So the user interface is certainly changing.

  • How is that impacting some of the things

  • that you guys are working on?

  • >> Well, as to the user interface changing,

  • I think that changes the entire dynamic of how people use tools.

  • The easier something is, the more people use it,

  • the more pervasive it becomes,

  • and we start discovering these emergent dynamics.

  • Like an iPod, for instance.

  • Storing music in digital form on

  • small devices was around before the iPod.

  • But when it made it easy to use,

  • that sort of gave rise to the smartphone.

  • So I think we're going to start seeing

  • some really interesting dynamics like that.

  • >> One of the things that I liked

  • about this past week in San Francisco,

  • Google had their big event, their cloud event,

  • and they talked a lot about, and by the way,

  • Intel was on stage with the new Xeon processor,

  • up to 72 cores, amazing compute capabilities,

  • but cloud computing does bring that scale together.

  • But you start thinking about data science

  • has moved into using data, and now you have

  • a tsunami of data, whether it's taking

  • an analog view of the world

  • and having now multiple datasets available.

  • If you can connect the dots, okay, a lot of data,

  • now you have a lot of data plus a lot of datasets,

  • and you have almost unlimited compute capability.

  • That starts to draw in some of the picture a little bit.

  • >> It does, but actually there's one thing missing

  • from what you just described, is that our ability

  • to scale data storage and data collection

  • has outpaced our ability to compute on it.

  • Computing on it is typically some sort

  • of quadratic function, something that grows faster

  • than the amount of data itself.

  • And our compute has really not caught up with that,

  • and a lot of that has been more about focus.

  • Computers were really built to automate streams of tasks,

  • and this sort of idea of going highly parallel

  • and distributed, it's something somewhat new.

  • It's been around a lot in academic circles,

  • but the real use case to drive it home

  • and build technologies around it is relatively new.

  • And so we're right now in the midst of

  • transforming computer architecture,

  • and it's something that becomes a data inference machine,

  • not just a way to automate compute tasks,

  • but to actually do data inference

  • and find useful inferences in data.

  • >> And so machine learning is the hottest trend right now

  • that kind of powers AI, but also there's some talk

  • in the leader circles around learning machines.

  • Data learning from engaged data, or whatever

  • you want to call it, also brings out another question.

  • How do you see that evolving, because do we need to

  • have algorithms to police the algorithms?

  • Who teaches the algorithms?

  • So you bring in this human aspect of it.

  • So how does the machine become a learning machine?

  • Who teaches the machine, is it...

  • (laughs) I mean, it's crazy.

  • >> Let me answer that a little bit with a question.

  • Do you have kids?

  • >> Yes, four.

  • >> Does anyone police you on raising your kids?

  • >> (laughs) Kind of, a little bit, but not much.

  • They complain a lot.

  • I would argue that it's not so dissimilar.

  • As a parent, your job is to expose them to

  • the right kinds of biases, or unbiased data,

  • as much as possible. Experiences are exactly that.

  • I think this idea of shepherding data

  • is extremely important.

  • And we've seen it in solutions that Google has brought out.

  • There are these little unexpected biases,

  • and a lot of those come from just what we have in the data.

  • And AI is no different than a regular intelligence

  • in that way, it's presented with certain data,

  • it learns from that data and its biases are formed that way.

  • There's nothing inherent about the algorithm itself

  • that causes that bias other than the data.

  • >> So you're saying to me that exposing more data

  • is actually probably a good thing?

  • >> It is.

  • Exposing different kinds of data, diverse data.

  • To give you an example from the biological world,

  • children who have never seen people of different races

  • tend to notice more, it's something new and unique,

  • and they'll tease it out.

  • It's like, oh, that's something different.

  • Whereas children who are raised

  • with people of many diverse face types or whatever

  • are perfectly okay seeing new diverse face types.

  • So it's the same kind of thing in AI, right?

  • It's going to hone in on the trends that are coming,

  • and things that are outliers, we're going to call as such.

  • So having good, balanced datasets, the way we collect

  • that data, the way we sift through it

  • and actually present it to an AI is extremely important.

  • >> So one of the most exciting things

  • that I like, obviously autonomous vehicles,

  • I geek out on because, not that I'm a car head,

  • gear head or car buff, but it just,

  • you look at what it encapsulates technically.

  • 5G overlay, essentially sensors all over the car,

  • you have software powering it,

  • you now have augmented reality, mixed reality

  • coming into it, and you have an interface to consumers

  • and their real world in a car.

  • Some say it's a moving data center,

  • some say it's also a human interface

  • to the world, as they move around in transportation.

  • So it kind of brings out the AI question,

  • and I want to ask you specifically.

  • Intel talks about this a lot in their super demos.

  • What actually is Intel doing with the compute

  • and what are you guys doing to make that accelerate faster

  • and create a good safe environment?

  • Is it just more chips, is it software?

  • Can you explain, take a minute to explain

  • what Intel's doing specifically?

  • >> Intel is uniquely positioned in this space,

  • 'cause it's a great example of a full end to end problem.

  • We have in-car compute, we have software,

  • we have interfaces, we have actuators.

  • That's maybe not Intel's suite.

  • Then we have connectivity, and then we have cloud.

  • Intel is in every one of those things,

  • and so we're extremely well positioned

  • to drive this field forward.

  • Now you ask what are we doing in terms of hardware

  • and software, yes, it's all of it.

  • This is a big focus area for Intel now.

  • We see autonomous vehicles as being

  • one of the major ways that people interact

  • with the world, like locality between cars

  • and interaction through social networks

  • and these kinds of things.

  • This is a big focus area, we are working

  • on the in-car compute actively,

  • we're going to lead that, 5G is a huge focus for Intel,

  • as you might've seen at Mobile World Congress

  • and other places.

  • And then the data center.

  • And so we own the data center today,

  • and we're going to continue to do that

  • with new technologies and actually enable

  • these solutions, not just from

  • a pure hardware primitives perspective,

  • but from the software-hardware interaction in full stack.

  • >> So for those people who think of Intel

  • as a chip company, obviously you guys

  • abstract away complexities and put it into silicon,

  • I obviously get that.

  • Google Next this week, one thing I was really impressed by

  • was the TensorFlow machine learning algorithms

  • in open source, you guys are optimizing the Xeon processor

  • to offload, not offload, but kind of take on...

  • Is this kind of the paradigm that Intel looks at,

  • that you guys will optimize the highest performance

  • in the chip where possible, and then to let the software

  • be more functional?

  • Is that a guiding principle, is that a one off?

  • >> I would say that Intel is not just a chip company.

  • We make chips, but we're a platform solutions company.

  • So we sell primitives to various levels,

  • and so, in certain cases, yes, we do optimize for software

  • that's out there because that drives adoption

  • of our solutions, of course.

  • But in new areas, like the car for instance,

  • we are driving the whole stack, it's not just the chip,

  • it's the entire package end to end.

  • And so with TensorFlow, definitely.

  • Google is a very strong partner of ours,

  • and we continue to team up on activities like that.

  • We are talking with Naveen Rao,

  • vice president and general manager of Intel's AI solutions group.

  • Breaking it down for us.

  • This end to end thing is really interesting to me.

  • So I want to just double click on that a little bit.

  • It requires a community to do that, right?

  • So it's not just Intel, right?

  • Intel's always had a great rising tide

  • floats all boats kind of concept

  • over the life of the company, but now, more than ever,

  • it's an API world, you see integration points

  • between companies.

  • This becomes an interesting part.

  • Can you talk up to that point about

  • how you guys are enabling partners to work with,